
[Federal Register: December 5, 2001 (Volume 66, Number 234)]

[Notices]

[Page 63250-63255]

From the Federal Register Online via GPO Access [wais.access.gpo.gov]

[DOCID:fr05de01-78]

DEPARTMENT OF THE INTERIOR

Office of the Secretary

Office of Planning and Performance Management; Agency Information Collection Activities: Submitted for Office of Management and Budget (OMB) Review; Comment Request

AGENCY: Department of the Interior.

ACTION: Notice of new information collection survey.

SUMMARY: To comply with the requirements of the Paperwork Reduction Act (PRA) of 1995, we are submitting to OMB for review and approval an information collection request (ICR) for the Department of the Interior (DOI) to conduct voluntary customer satisfaction surveys to gather input and feedback from the public. The ICR is entitled ``DOI Programmatic Clearance for Customer Satisfaction Surveys.'' We are also soliciting comments from the public on this ICR.

DATES: Please submit written comments by January 4, 2002.

ADDRESSES: You may submit comments directly to the Office of Information and Regulatory Affairs, OMB, Attention: Desk Officer for the Department of the Interior (OMB Control Number 1040-NEW), 725 17th Street, NW., Washington, DC 20503. Mail or handcarry a copy of your comments to the Department of the Interior; Office of Planning and Performance Management; Mail Stop 5258-MIB; 1849 C Street, NW., Washington, DC 20240. If you wish to email comments, the email address is: Norma_Campbell@os.doi.gov. Reference ``DOI Programmatic Clearance for Customer Satisfaction Surveys'' in your email subject line. Include your name and return address in your email message and mark your message for return receipt.

FOR FURTHER INFORMATION CONTACT: Norma Campbell, Office of Planning and Performance Management, telephone (202) 208-1818. You also may contact this office to obtain at no cost a copy of the collection of information that will be submitted to OMB.

SUPPLEMENTARY INFORMATION:

Title: DOI Programmatic Clearance for Customer Satisfaction Surveys.

OMB Control Number: 1040-NEW.

Abstract: The mission of DOI is to protect and provide access to our Nation's natural and cultural heritage and honor our trust responsibilities to Indian Tribes and our commitments to island communities. DOI's Strategic Plan Overview (FY 2000-2005) lays out five goals as a framework for this work: (1) Protect the environment and preserve our Nation's natural and cultural resources; (2) provide recreation for America; (3) manage natural resources for a healthy environment and our strong economy; (4) provide science for a changing world; and (5) meet our trust responsibilities to Indian Tribes and our commitments to island communities. Each bureau's plan also contains goals requiring collaboration with the public--our partners and customers. Part of this communication occurs through occasional surveys of the different users and stakeholders of DOI's products and services.

In the spirit of the PRA, DOI is consolidating its ICRs related to customer surveys for all participating offices and bureaus into one programmatic ICR. This single ICR will ease the public burden by establishing a generic format and set of standards that all customer survey-related collections in DOI would follow. Because the participating bureaus and offices have differing customer and stakeholder groups, there will not be one ``boiler-plate'' approach to customer research. The ICR will describe those differences where apparent. However, where applicable, similar questions will be asked in the surveys of the bureaus and offices to allow better benchmarking throughout DOI.

Background

The Government Performance and Results Act (GPRA) of 1993 (Pub. L. No. 103-62) sets out to ``improve Federal program effectiveness and public accountability by promoting a new focus on results, service quality, and customer satisfaction'' (section 2(b)(3)). In order to fulfill this responsibility, DOI's bureaus and offices must collect data from their respective user groups to (1) better understand the needs and desires of the public and (2) respond to those needs and desires accordingly.

This course of action is fortified by Executive Order (E.O.) 12862 (September 11, 1993) aimed at ``ensuring the Federal Government provides the highest quality service possible to the American people.'' The E.O. discusses surveys as a means for determining the kinds and qualities of service desired by the Federal Government's customers and for determining satisfaction levels for existing service. These voluntary customer surveys will be used to ascertain customer satisfaction with DOI's bureaus and offices in terms of services and products. Previous customer surveys have provided useful information to DOI's bureaus and offices for assessing how well we deliver our services and products, making improvements, and reporting on annual performance goals as set out in GPRA-related documents. The results are used internally, and summaries are provided to OMB on an annual basis and are used to satisfy the requirements and spirit of E.O. 12862.

Furthermore, E.O. 12862 requires agencies to provide a ``means to address customer complaints.'' To that end, bureaus and offices may use customer comment cards as an opportunity for customers to provide feedback to the agencies on the service they have received.

More recently, President Bush's Management Agenda for 2001 calls for citizen-centered government. The

[[Page 63251]]

Secretary of the Interior's August 3, 2001 memorandum, ``Management Excellence and Citizen-Centered Service,'' directs bureaus and offices to focus on citizen-centered governance. The proposed OMB Guideline for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Dissemination by Federal Agencies would require agencies to submit annual reports ``detailing the number and nature of complaints received by the agency regarding agency compliance with these OMB guidelines.'' Comment cards and other survey methods facilitated by this programmatic clearance would provide valuable information to assist DOI's bureaus and offices in following the Administration's guidance.

In addition to GPRA and E.O. 12862, the statutes, regulations, and Secretarial Orders that created each of the bureaus and offices further enhance the need to engage the public and deliver quality products and services to our customers.

The participating DOI bureaus and offices anticipate performing their customer surveys under one ICR. Under this proposal, DOI would request that OMB review the procedures and question areas for these surveys as a program, rather than reviewing each survey individually. Under the procedures proposed here, DOI would conduct the necessary quality control (through a ``secondary office of control'' within DOI), including assurances that the individual survey comports with the guidelines in this proposed programmatic ICR, and submit the particular survey instruments and methodologies for expedited review to OMB.

Participating Bureaus and Offices

The proposed ICR covers most of the bureaus and offices in DOI. However, the National Park Service, which has one of the most mature customer survey programs in the Federal Government, will continue under its own separate clearance given the complexity and specificity of its program. The participating bureaus and offices covered under the proposed ICR include:

Bureau of Indian Affairs

Bureau of Land Management (BLM)

Bureau of Reclamation

US Fish & Wildlife Service

Office of Insular Affairs

Minerals Management Service

Office of the Secretary

Office of Surface Mining

US Geological Survey (USGS)

Current Actions

The request to OMB will be for a 3-year clearance to conduct customer satisfaction surveys in the participating DOI bureaus and offices. USGS and BLM, who have developed customer research programs, are currently operating under 3-year programmatic clearances. Other participating bureaus and offices have handled their ICRs on a case-by- case basis.

For example, under existing approvals, USGS in 2000 surveyed users of the on-line National Atlas, State and Federal land managing and natural resource agencies, customers of the EROS Data Center (digital data and maps), and customers of Earth Science Information Centers (topographic maps, USGS publications). Over the last 3 years, BLM has surveyed users of recreation areas, grazing permittees, oil and gas permittees, stakeholders and partners, and public room users, as well as conducted focus groups with various customer groups. These collections occur through one of six methodologies: (1) Intercept (a customer interacting in person with the person conducting the survey); (2) telephone interviews; (3) mail surveys; (4) web-based surveys; (5) focus groups; and (6) voluntary use of comment cards.

Examples of previously conducted customer surveys are available upon request. Our planned activities in the next 3 fiscal years reflect our increased emphasis on and expansion of these activities throughout DOI.

Methodology

In all customer research, the goal of DOI is to employ the best statistical models that, in turn, will lead to the best data from which sound management decisions can be made. Therefore, an 80 percent response rate goal has been set for all customer surveys, with a 70 percent response rate as a base threshold.

Different user and stakeholder groups function and interact with the respective bureaus and offices in different ways. In order to meet the response rate goal, six different methodologies will be available for use. The methodology will be chosen based on achieving statistical accuracy while keeping the cost as low as possible. The six methodologies that DOI's bureaus and offices will employ are: (1) Intercept, (2) telephone interviews, (3) mail surveys, (4) web-based surveys, (5) focus groups, and (6) comment cards. In all cases, the goal is to achieve the 95 percent confidence level with a sampling error no greater than 5 percent. The total number of respondents sought for each survey will be based on achieving this level. In most cases, the respondent base will be pulled from a randomized sample of the user population, and where necessary, a stratified sample will be used to achieve accurate statistical measures at the appropriate National, State, or regional level. In some cases where the user population is small, the entire population will need to be surveyed.
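
The number of respondents needed at that confidence level can be estimated directly from the sampling error target. The sketch below is a minimal illustration, assuming a simple random sample and the conservative proportion of 0.5; it applies a finite population correction so that very small user populations approach a census, consistent with the note above that small populations may be surveyed in their entirety.

    import math

    def required_sample_size(population_size, z=1.96, margin_of_error=0.05, p=0.5):
        """Respondents needed for a simple random sample at the 95 percent
        confidence level with sampling error no greater than 5 percent."""
        # Sample size for a very large (effectively infinite) population.
        n_infinite = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
        # Finite population correction for smaller user populations.
        n_adjusted = n_infinite / (1 + (n_infinite - 1) / population_size)
        return math.ceil(n_adjusted)

    if __name__ == "__main__":
        for population in (200, 5_000, 250_000):
            print(population, required_sample_size(population))
        # A population of 200 needs roughly 132 completed surveys; very large
        # populations level off near 384.

Where a stratified sample is used, the same target would apply within each National, State, or regional stratum.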

Intercept: In a face-to-face situation, the survey instrument is provided to a respondent, who completes it while on site and then returns it. The survey proctor is prepared to answer any questions the respondent may have about how to fill out the instrument but does not interfere with or influence how the respondents answer the questions. This methodology provides the highest response rate, typically between 80 and 85 percent.

Telephone: Using existing databases, an interviewer will contact customers who have had a specific experience with the agency. The interviewer will call back until the customer has been reached. Once contacted, the survey respondent is given a brief introduction to the survey, including its importance and use. The interviewer will then expeditiously move through the survey questions. When this methodology is employed, the typical response rate is between 70 and 85 percent, depending on the customer group.

Mail: Using existing lists of customer addresses, a three-contact approach based on Dillman's ``Tailored Design Method'' will be employed. The first contact is a cover letter explaining that a survey is coming and why it is important to the agency. The second contact will be the survey instrument itself, along with a postage-paid addressed envelope to return the survey. The third contact will be a reminder postcard sent 10 days after the survey was mailed. Finally, the respondents will receive a letter thanking them for their willingness to participate in the survey and reminding them to return it if they have not already done so. At each juncture, the respondents will be given multiple ways to contact someone with questions regarding the survey (including phone, FAX, web, and email). If the survey has been lost, the respondent can request that another be sent. Electronic mail is sometimes used instead of postal mail to communicate with customers. Although this is a cost-effective mode to survey a large group of people, it does not usually generate the best response rate. Telephone calls to non-respondents can be used to increase response rates.
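
As a rough illustration of this contact sequence, the sketch below lays out the mailings relative to the day the survey instrument is mailed. Only the 10-day gap before the reminder postcard is stated above; the lead time for the cover letter and the timing of the thank-you letter are illustrative assumptions.

    from datetime import date, timedelta

    def mail_contact_schedule(survey_mailed_on: date) -> dict:
        """Mailing dates for the contact sequence described above."""
        return {
            "advance cover letter": survey_mailed_on - timedelta(days=7),        # assumed lead time
            "survey + postage-paid return envelope": survey_mailed_on,
            "reminder postcard": survey_mailed_on + timedelta(days=10),          # 10 days, as stated
            "thank-you/final reminder letter": survey_mailed_on + timedelta(days=21),  # assumed timing
        }

    if __name__ == "__main__":
        for contact, mail_date in mail_contact_schedule(date(2002, 3, 1)).items():
            print(f"{mail_date:%Y-%m-%d}  {contact}")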

Web-based: For products or services that are provided through electronic means, whether e-commerce or web-based information, a web or email survey may be most appropriate. During

[[Page 63252]]

the course of their web interaction, users can volunteer to add their name to a list for future surveys. From this list, a respondent pool will be selected in accordance with the sampling procedures outlined above. An email will be sent to them explaining the need for and importance of the survey, with a web link to the survey. Within 5 days, a follow-up email will be sent to the respondents reminding them to complete the survey. Finally, the respondents will receive an email thanking them for their willingness to participate in the survey and reminding them to complete it if they have not already done so. The respondent will always have the option to submit the survey in paper form, should they elect to do so.

Focus Groups: Some data and information are best collected through more subjective, conversational means. A focus group is an informal, small-group discussion designed to obtain in-depth qualitative information. Individuals are specifically invited to participate in the discussion, whether in person or through technologically enhanced means (i.e., video conferencing, on-line sessions). Participants are encouraged to talk with each other about their experiences, preferences, needs, observations, or perceptions. A moderator whose role is to foster interaction leads the conversation. The moderator makes sure that all participants are encouraged to contribute and that no individual dominates the conversation. Furthermore, the moderator manages the discussion to make sure it does not stray too far from the topic of interest. Focus groups are most useful in an exploratory stage or when the bureaus and offices want to develop a deeper understanding of a program or service.

Using the best in focus group research practices, groups will be constructed to include a cross-section of a given customer group. The questions and additional probes used during the focus groups will be consistent with the ``guideline menu'' discussed below.

Comment Cards: As discussed in the Background section above, agencies have been instructed to provide a means to address customer complaints. To facilitate this, comment cards may be employed. Comment cards, when provided to a customer at the time a product or service is provided, offer customers an excellent means to give the bureaus and offices feedback. A comment card should have a limited number of questions and an opportunity to comment. These comment cards provide managers and service providers with direct, specific, and timely information from their customers about new service problems as they arise, or about extraordinary performance, that could not be obtained through any other means.

Electronic users may be offered the opportunity to complete a comment card via a ``pop-up'' window (or other web-enabled means that may be available). The ``pop-up'' window will not appear for every user; rather, the users will be randomly selected to receive the survey. This practice is widely used in private industry. In other instances, the electronic user may be offered the option to self-select in answering the electronic comment card.
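
A minimal sketch of that random selection follows. The 5 percent sampling rate is an illustrative assumption; the notice states only that electronic users will be randomly selected to receive the pop-up comment card.

    import random

    SAMPLING_RATE = 0.05  # assumed fraction of visits offered the comment card

    def should_offer_comment_card(rng: random.Random) -> bool:
        """Return True for roughly SAMPLING_RATE of independent visits."""
        return rng.random() < SAMPLING_RATE

    if __name__ == "__main__":
        rng = random.Random(42)  # fixed seed so the demonstration is repeatable
        offered = sum(should_offer_comment_card(rng) for _ in range(10_000))
        print(f"Comment card offered on {offered} of 10,000 simulated visits")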

Whether using paper or electronic comment cards, the intent is to provide a feedback mechanism. The data are not intended to be statistically significant. Although questions may include numeric scales, those data should be considered only in an anecdotal fashion and not reported as a significant measure.

Remuneration/Incentives: A great deal of the literature related to customer satisfaction research recommends that incentives, monetary and non-monetary, be used to increase response rates (see D. Dillman publications, specifically Mail and Internet Surveys, 2000). Although bureaus and offices acting in a wholly regulatory role would not seek to provide remuneration to their permittees, bureaus and offices that operate in a more service-related mode may find incentives to be both helpful and appropriate. Therefore, DOI proposes to handle remuneration/incentives on a case-by-case basis as part of the 10-day expedited OMB review. An agency may propose non-monetary incentives, such as a discount at an on-site bookstore, a small souvenir, or complimentary access to a facility/site.

Topic Areas: The participating bureaus and offices propose to survey customers in the following general categories:

Authorized public land uses (i.e., rights-of-way, land management transactions, mining, recreation, oil and gas, grazing, wildlife photographers, hunters, and fishers)

Coal operators

Contractors/vendors

Disabled persons and groups representing disabled persons

Educators/researchers

Environmental groups

Government representatives (State, local, and foreign)

Grant recipients

Indian Tribes/Alaskan Natives/Native Americans

Industry groups (i.e., mining, oil and gas)

Insular governments

Interested publics/special interest groups (i.e., Friends groups for wildlife refuges)

Law enforcement authorities, customs brokers, and brokers' associations

Local communities

Private and public land stakeholders (i.e., hunting, fishing, farming, banking, legal, real estate representatives, and land trust operators)

Public information center users

Scientific data users and technical assistance recipients

State wildlife agencies' representatives

Taxidermists and falconers

Technical training recipients

Trade organizations

Utilities' representatives

Visitors/Recreation

Volunteers (past, present, prospective)

Zoo, aquarium, and botanical garden stakeholders

There are 11 topic areas in which the participating bureaus and offices propose to voluntarily obtain information from their customers and stakeholders. No one survey will cover all the topic areas; rather, this list serves as a ``guideline menu'' from which the agencies would develop their questions. Examples of the types of questions that would be asked under each topic are provided. Under the proposed ICR, the agencies could use these specific questions or develop questions that fit within the generally understood confines of the topic area. Questions may be asked in languages other than English (e.g., Spanish), where appropriate.

The surveys could be designed using one of two generally accepted modes: (a) A statement for which the respondent uses a scaled answer (i.e., strongly agree, strongly disagree, not applicable, etc., based on a Likert scale) or (b) a question that asks for a specific response (i.e., yes/no, demographics, open-ended improvement question, etc.). For questions that use the Likert scale and a preset list of options, the data will be reported in a numeric fashion, including average response and percent favorable. Open-ended questions will be subjected to content analysis and reported accordingly.
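
As an illustration of that reporting approach, the sketch below scores a scaled (Likert-type) question and computes the average response and percent favorable. The 5-point coding and the treatment of the top two categories as ``favorable'' are illustrative assumptions; ``not applicable'' answers are excluded from both measures.

    SCALE = {
        "strongly disagree": 1,
        "disagree": 2,
        "neither agree nor disagree": 3,
        "agree": 4,
        "strongly agree": 5,
    }
    FAVORABLE = {"agree", "strongly agree"}

    def summarize(responses):
        """Average response and percent favorable for one scaled question."""
        scored = [SCALE[r] for r in responses if r in SCALE]  # drops "not applicable"
        average = sum(scored) / len(scored)
        percent_favorable = sum(1 for r in responses if r in FAVORABLE) / len(scored)
        return average, percent_favorable

    if __name__ == "__main__":
        answers = ["agree", "strongly agree", "disagree", "agree", "not applicable"]
        average, favorable = summarize(answers)
        print(f"Average response: {average:.2f}   Percent favorable: {favorable:.0%}")
        # Average response: 3.75   Percent favorable: 75%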

  1. Communication/information/education:

    1. Providing consistent and timely information to the public.

    2. Where did you obtain your information about this site?

    3. Making it easy for people to find out about proposed changes.

      [[Page 63253]]

    4. Educating people about particular processes.

    5. Providing accurate, detailed and affordable maps and brochures.

    6. Providing useful web site, signs, publications, and exhibits.

    7. Charging an appropriate fee for the information/material provided.

    8. The information provided was effective and helpful.

    9. Providing quality web-based information.

    10. Engaging the public in the planning process.

  2. Disability accessibility:

    1. Do you or does someone in your party have a disability?

    2. If yes, how well does the agency make buildings, facilities, and trails accessible to people with disabilities?

    3. Accessibility to the programs and activities that address my needs.

  3. Facilities:

    1. Maintaining roads and trails.

    2. Maintaining a clean recreation site.

    3. Providing entrance/directional signs to sites and facilities.

    4. Providing a facility that is conducive to meeting specific user needs.

  4. Management practices:

    1. Responding to issues and problems in a timely manner.

    2. Providing access to a supervisor to resolve the problem.

    3. Understanding my needs.

    4. If you could make one improvement to XXX service, what would it be?

  5. Resource management:

    1. Providing reasonable access to resources.

    2. The extent to which the natural and cultural resources are protected.

    3. Getting public input when identifying critical areas for conservation.

    4. Preserving water resources and habitat for fish, wildlife, and plants.

  6. Rules, regulations, policies:

    1. Ensuring public awareness of rules and regulations.

    2. Ensuring fair and consistent policies for all users.

    3. The rules, regulations, and policies are clear and in plain language.

    4. Providing adequate protest and appeal policies to resolve issues and disputes.

    5. Adequately enforcing rules and regulations for all users.

  7. Service delivery:

    1. Providing a single point of contact.

    2. The staff I interacted with were courteous and friendly.

    3. The staff I interacted with were knowledgeable about the rules and regulations.

    4. The staff I interacted with were able to answer my questions about natural, historic, and cultural resources.

    5. The staff listened to and considered my ideas.

    6. The training I received provided the information I needed.

    7. The response was timely.

  8. Technical assistance:

    1. Provides unbiased scientific and technical support products and services.

    2. Reflects reasonable pricing.

    3. Quality of the execution of the analysis and interpretation.

    4. Considered alternative interpretations.

    5. Provides useful information.

  9. Program-specific: These questions will reflect the specific details of a program as they pertain to its customer respondents. The questions will be developed to address very specific and/or technical issues related to the program. The questions will be geared toward gaining a better understanding of how to provide specific products and services as well as the priority the public would give to specific program objectives; they will not ask the respondents for their opinions about policies.

  10. Overall satisfaction:

    1. Everything considered, how would you rate your overall satisfaction with the delivery of XXX program or service?

    2. Values my relationship as a customer.

    3. I will contact or visit again for information or services.

    4. I trust XXX agency to do a good job performing XXX mission.

  11. General demographics:

    1. What is your zip code?

    2. How many times have you used this service in the previous 12 months?

    3. How many people are in your group?

    4. What activities did you participate in?

    5. As part of your recreation at this site/area, approximately how much money did you spend in the local community/area (e.g., lodging, equipment, food, fuel, maps/books, tours, guides)?

    6. What was your total household income (before taxes) in 2000 (less than $20,000; $20,000 to $39,999; $40,000 to $59,999; $60,000 to $79,999; $80,000 to $99,999; $100,000 to $119,999; $120,000 or more)?

    7. What is the highest level of education you have completed (some high school or less; high school graduate or GED; business school, trade school, or some college; college graduate; some graduate school; masters, Ph.D., or professional degree)?

    8. What is the primary language spoken at home? (i.e., English, Spanish)

    9. In what ethnic group would you place yourself (Hispanic/Latino or non-Hispanic/Latino)?

    10. In what race would you place yourself (American Indian, Eskimo, Aleut; Asian or Pacific Islander; Black or African American; White; Native Hawaiian)? Select one or more.

    Uses of Data: Chiefly, these data are being collected to improve the service and products that the participating bureaus and offices provide to the public. Managers and program specialists use these data to identify:

    Service needs of customers

    Strengths and weaknesses of services

    Ideas or suggestions for improvement of services from our customers

    Barriers to achieving customer service standards

    Changes to customer service standards

    Baselines to measure change in improving service delivery over time

    Ways to improve public trust in government

    They also use this information to support all aspects of planning, from buildings, roads, and interpretive exhibits, to technical systems. In conducting their management, planning, and monitoring activities, managers also use the information to effectively allocate their limited personnel and financial resources to the highest priority elements.

    While the information will not be used for regulatory development, DOI anticipates that the information obtained could lead to reallocation of resources, revisions in certain agency processes and policies, and development of guidance related to the agency's customer services. Ultimately, these changes should result in improvement in services DOI provides to the public and, in turn, the public perception of DOI.

    In fulfilling the requirements of GPRA, DOI and all of its bureaus and offices have created a Strategic Plan in coordination with their respective publics. GPRA requires DOI to annually report on its progress toward achieving the goals outlined in the Annual Performance Plan. Some of the data collected may be used as the basis or in support of specific performance measures.

    Frequency: The frequency varies by survey.

    Estimated Number and Description of Respondents: See the attached ``Table: Customer Type by Participating Bureau/Office'' for a list of respondents. This table shows the likely groups that would be surveyed by each bureau and office but is not intended to limit the bureaus and offices to such groups.

    [[Page 63254]]

    Estimated Annual Reporting and Recordkeeping ``Hour'' Burden: We estimate that there are approximately 120,000 respondents submitting 120,000 responses (surveys and comment cards) annually. The average public reporting burden for a customer survey is estimated to be 15 minutes per respondent. For comment cards, the average public reporting burden is estimated to be 3 minutes per response. Given these estimates, DOI anticipates a burden of 18,000 hours per year for these proposed collections. We estimate, based on a $15 per hour valuation of volunteer time and the projected burden hours, an approximate aggregate cost to respondents of $270,000. Burden includes the total time, effort, or financial resources expended to generate, maintain, retain, or disclose or provide the information, including: (1) Reviewing instructions; (2) developing, acquiring, installing, and utilizing technology and systems for purposes of collecting, validating, verifying, processing, maintaining, disclosing, and providing information; (3) adjusting the existing ways to comply with any previously applicable instructions and requirements; (4) training personnel to respond to a collection of information; (5) searching data sources; (6) completing and reviewing the collection of information; and (7) transmitting or otherwise disclosing information. Please comment on the accuracy of our estimates and how DOI's bureaus and offices could minimize the burden of the collection of information, including the use of automated techniques.
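
    The arithmetic behind these estimates can be laid out as follows. The totals (approximately 120,000 responses, 18,000 hours, and $270,000 at $15 per hour) come from the paragraph above; the even split between 15-minute surveys and 3-minute comment cards is an assumption chosen because it reproduces the stated hour total.

        # Burden-hour arithmetic for the estimates above. The 60,000/60,000 split
        # between surveys and comment cards is an assumption, not stated in the notice.
        SURVEY_RESPONSES = 60_000        # assumed share of the ~120,000 annual responses
        COMMENT_CARD_RESPONSES = 60_000  # assumed share of the ~120,000 annual responses
        SURVEY_MINUTES = 15              # stated average burden per survey
        COMMENT_CARD_MINUTES = 3         # stated average burden per comment card
        HOURLY_RATE = 15                 # stated valuation of respondent time, in dollars

        hours = (SURVEY_RESPONSES * SURVEY_MINUTES
                 + COMMENT_CARD_RESPONSES * COMMENT_CARD_MINUTES) / 60
        cost = hours * HOURLY_RATE
        print(f"Annual burden: {hours:,.0f} hours, about ${cost:,.0f}")
        # Annual burden: 18,000 hours, about $270,000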

    Estimated Annual Reporting and Recordkeeping ``Non-Hour cost'' Burden: We have identified no ``non-hour costs'' burdens.

    Public Disclosure Statement: The PRA provides that a Federal agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB Control Number. Until OMB approves a collection of information, you are not obligated to respond.

    Comments: Section 3506(c)(2)(A) of the PRA requires each agency ``* * * to provide notice * * * and otherwise consult with members of the public and affected agencies concerning each proposed collection of information * * *'' Agencies must specifically solicit comments to: (a) Evaluate whether the proposed collection of information is necessary for the agency to perform its duties, including whether the information is useful; (b) evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information; (c) enhance the quality, usefulness, and clarity of the information to be collected; and (d) minimize the burden on the respondents, including the use of automated collection techniques or other forms of information technology. Furthermore, we are interested in your comments regarding the need for and appropriateness of remuneration/incentives, or other suggestions you may have to increase response rates.

    To comply with the public consultation process, on August 8, 2001, we published a Federal Register Notice (66 FR 41600) announcing that we would submit this ICR to OMB for approval. The notice provided the required 60-day comment period. No public comments were received.

    If you wish to comment in response to this notice, send your comments directly to the offices listed under the ADDRESSES section of this notice. OMB has up to 60 days to approve or disapprove the information collection but may respond after 30 days. Therefore, to ensure maximum consideration, OMB should receive public comments by January 4, 2002.

    Public Comment Policy: Our practice is to make comments, including names and home addresses of respondents, available for public review during regular business hours. Individual respondents may request that we withhold their home address from the record, which we will honor to the extent allowable by law. There may be circumstances in which we would withhold from the record a respondent's identity, as allowable by law. If you, as a commenter, wish us to withhold your name and/or address, you must state this prominently at the beginning of your comment. However, we will not consider anonymous comments. We will make all submissions from organizations or businesses, and from individuals identifying themselves as representatives of organizations or businesses, available for public inspection in their entirety.

    DOI Information Collection Contact: Office of Planning and Performance Management (202) 208-1818.

    Dated: November 13, 2001.
    Norma J. Campbell,
    Director, Office of Planning and Performance Management.

    Table.--Customer Type by Participating Bureau/Office

    Each entry lists a customer type (with description, where given), followed by the participating bureaus/offices that anticipate surveying that customer type (BIA, BLM, BOR, FWS, Insular Aff, MMS, OAPM, OEP, OSM, USGS).

    Authorized public land uses (ROW; Land Mgmt transactions, min.): BIA
    Coal operators: OSM
    Contractors/vendors (Concessionaires): BIA, FWS, OAPM
    Disabilities: BLM, FWS
    Environmental groups: BLM, FWS, MMS, USGS
    Governments (State, local, foreign): BIA, BLM, BOR, FWS, MMS, USGS
    Grant recipients: OAPM
    Indian Tribes/Alaskan villages: BIA, FWS, OSM, USGS
    Industry groups: BLM, BOR, FWS, MMS, OAPM, USGS
    Insular governments: Insular Aff
    Interested publics (Community and specific-interest groups): BOR, FWS, MMS, OSM, USGS
    Law Enforcement (Forensics, importers/exporters): FWS
    Mining companies: USGS
    Public information centers: BLM, FWS
    Scientific data users (GIS): FWS
    State governments: BIA, BLM, BOR, FWS, MMS, OSM, USGS
    State wildlife agencies (State biologists): BIA, FWS
    Universities/Educators: BOR, FWS, MMS, OAPM, OEP
    Utilities: BOR
    Visitors/Recreation (Visitors to federal land, bird watcher): BLM, BOR, FWS

    [[Page 63255]]

    [FR Doc. 01-30029 Filed 12-4-01; 8:45 am]

    BILLING CODE 4310-RK-P
