Your Tax Dollars at Work
The Internet should serve as the U.S. government’s primary archive
So what was NTIS’s reward for all this yeoman-like effort? It has spent the last year or so fighting for its life as the executive branch hierarchy considers NTIS’s failure to meet what it apparently regards as its prime directive: cost recovery. True, time and tide have reduced the role of federal funding in the nation’s scientific and technical development from the days of the Cold War spending boom. But does that not make full and forceful dissemination of the material we do develop all the more important? Blithely, the eager executioners point to our friend the World Wide Web as a simple solution. And so it may be. But not without effort, organization, commitment, and resources.
But from the lemon comes the lemonade. It would take legislation to destroy NTIS. The problem now lies in Congress’ hands. Responding to pleas and a swirl of controversy over the Commerce Department’s death sentence, Congress looked at the case and decided to defer final judgment until it had a better grasp of the bigger picture. In fact, it has asked the question so many of us have wanted it to ask for some time. No, not “What information policy should the federal government set for the nation?” (though that’s certainly an interesting question). Rather, the one that asks, “What information policy should the federal government set for itself?” Before one tries to be Martha Stewart and tell everyone else how to live, one should first clean one’s own house.
In June, Senator John McCain asked the National Commission on Libraries and Information Science to produce a report for the Senate Committee on Commerce, Science, and Transportation that would serve to review:
[T]he reforms necessary for the federal government’s information dissemination practices. At a minimum, this review should include assessments of the need for: proposing new or revised laws, rules, regulations, missions, and policies; modernizing organization structures and functions so as to reflect greater emphasis on electronic information planning, management, and control capabilities, and the need to consolidate, streamline, and simplify missions and functions to avoid or minimize unnecessary overlap and duplication; revoking NTIS’s self-sufficiency requirement; and strengthening other key components of the overall federal information dissemination infrastructure.

Hallelujah! They’re finally looking in the right direction. They’ve realized that the information revolution that surrounds us all requires breadth of view before detail work. One cannot just muddle through any more. We need a general policy to take advantage of all the potential benefits and avoid the most dangerous pitfalls.
What does the federal government need to do? First, it must adopt a firm worldview and vision of where information is now and where it’s going. It should recognize and commit to the realization that the Internet has established a platform that will, in time, provide universal access to all kinds of information for everyone, including all government agencies and their taxpaying constituencies. The Web represents the ultimate mechanism for disseminating information of all kinds to all people everywhere. With the accelerating expansion of broadband technologies, soon every home and office can expect high-speed, “always-on” Web access. A recent study by Media Metrix indicated that the fastest-growing group of new Internet users is low-income households. Broadband will, in a shorter time than anyone predicted, make the Web a reality for every constituent.
When it comes to issues of dissemination, federal policy should simply require all government agencies to publish all the information they intend to release to the public over the Net. The Net should be the federal government’s primary publishing mechanism, with print or microform or any other format considered secondary. Specific issues of what to release and to whom will always vary from agency to agency and situation to situation, but the rule of producing Net-compatible documentation should stand firm for all public data. As for access issues, any federal Internet/intranet/Web platform should simply include control capabilities such as password requirements, timed release of documents, payment mechanisms, and the like.
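By way of illustration only, here is a minimal sketch in Python — the document fields, dates, and function name are all invented for this example, not drawn from any actual agency system — of the kind of control logic such a platform might apply before serving a document: timed release first, then any password requirement, then payment.

    from datetime import datetime, timezone

    # Hypothetical record for a document an agency intends to release on the Web.
    document = {
        "title": "Annual Research Summary",
        "release_date": datetime(2000, 1, 15, tzinfo=timezone.utc),  # timed release
        "requires_password": False,   # open to the public once released
        "fee_cents": 0,               # payment mechanism, if any
    }

    def may_release(doc, now, user_authenticated=False, fee_paid=False):
        """Apply the platform's control capabilities in order:
        timed release, password requirement, then payment."""
        if now < doc["release_date"]:
            return False              # still embargoed
        if doc["requires_password"] and not user_authenticated:
            return False              # restricted audience
        if doc["fee_cents"] > 0 and not fee_paid:
            return False              # payment required first
        return True

    print(may_release(document, datetime.now(timezone.utc)))

The point is not the particular checks but that every agency platform would expose the same few, well-understood controls rather than inventing its own.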
Federal information policy should make the facilitation of access a general principle of Internet/Web platform development. Whatever current technological features promote greater access (e.g., text-to-speech for the visually impaired, compatibility with all standard browser interfaces, alternative document formats, etc.) should be supported now. Whatever future technological developments come along to promote greater access (e.g., automatic translation, voice commands, wireless delivery, etc.) should be supported whenever they arrive.
Though individual agencies would continue to handle their own information-dissemination responsibilities, a general oversight agency should ensure a certain uniform standard of performance. Perhaps this could become a new role for a Web-oriented NTIS or for the U.S. Government Printing Office (not!), or even the National Archives and Records Administration (NARA). Or it might require the creation of a new agency, perhaps the Federal Web Archive. But the people’s right to see the product of their tax dollars at work and to take advantage of that product to serve their personal, professional, and civic responsibilities should not rise or fall on the exigencies of one agency’s day-to-day operational crises. The central office should provide coordination, set standards, review performance, supplement the needs of smaller agencies with specific assistance, and provide a central access point from which all government agency Web offerings would proceed, as well as link to archived access.
And, following the NTIS tradition, federal information policy should treat knowledge developed on its behalf and with its money as its own, or darn close. If the feds have paid for research, then we all have a right to see it. If the National Institutes of Health (NIH) funds a quarter of the world’s quality medical research, as has been claimed, then the NIH has a perfectly legitimate right to take an active role in ensuring that the research it funds be disseminated swiftly and safely to those who need it. In cases where government agencies have a mission to serve specific needs of specific constituencies, they may even have a duty to expand beyond (strictly) federally funded data. Private sector property should be preserved and protected, but private sector interests must be considered within the context of the public interest.
If the federal government adopted such a policy and implemented it successfully, it would set a model that the whole Internet could follow. What do information professionals everywhere bemoan about today’s Web? The lack of permanence. The lack of reliable ways to make sure that people can reach the good stuff. Federal data may have its problems, but it definitely qualifies as “the good stuff.” By Web standards, it qualifies as Nobel-quality good stuff!
If the feds could establish a permanent, complete archive of all federal documentation and then add a structure that allows agencies and people to work the taxonomies underlying the archive, find their material, and link to it, we would have created a working model that others could follow. Distance learning institutions could join together and build a virtual university library. Multinational corporations could convert the flow of company operational data into electronic company archives. Extranets and Internet connections to in-house data could create new revenue sources for some.
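To make the idea concrete, here is a minimal sketch in Python — the agency labels, subject terms, and example.gov links are all invented placeholders, not a real system — of archive records that carry taxonomy terms and permanent links, along with the simple subject lookup that agencies and citizens would use to find their material and link to it.

    # Hypothetical archive records: each carries its agency, its taxonomy terms,
    # and a permanent link that outside sites could cite without fear of link rot.
    archive = [
        {"agency": "NIH", "subjects": ["medicine", "clinical trials"],
         "permanent_url": "https://archive.example.gov/nih/report-001"},
        {"agency": "DOE", "subjects": ["energy", "preprints"],
         "permanent_url": "https://archive.example.gov/doe/preprint-042"},
    ]

    def find_by_subject(records, term):
        """Return every record filed under the given taxonomy term."""
        return [r for r in records if term in r["subjects"]]

    for record in find_by_subject(archive, "medicine"):
        print(record["agency"], record["permanent_url"])

Stable addresses plus a shared taxonomy are what would let a virtual university library or a corporate archive plug into the same model.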
In the course of building its own Web archive, the federal government would inevitably have to work through the problems of standards, coordination, and implementation. The epic construction effort would provide a broad range of practical experience that others wrestling with Net archive/access problems could use. In turn, the federal contribution would keep the feds in the flow as outside institutions come up with their own improvements on the scheme.
In particular, this effort should help with changing the course of scholarly communication and publication. Already the National Institutes of Health (PubMedCentral) and the Department of Energy (Preprint Network, PubScience) have taken a leadership role in moving scholarly exchange to a Web environment. The private sector has responded by initiating efforts that seemed stalled before government action occurred—e.g., the new Publishers International Linking Association’s (PILA) CrossRef program, which followed the introduction of E-Biomed (the earlier name of PubMedCentral).
But basically, the federal government needs to move to the Web, big time, to ensure the performance of its mission of service to the people of the U.S. The Web offers a once-in-a-millennium opportunity: the chance to get in on something that’s better, faster, and cheaper. As we all know, you’re usually lucky if you get two out of three.
And, as one more benefit, this initiative would also help sustain the U.S.’s world leadership in the growth and development of the Internet and the Web. It could set a model for governments everywhere on how to open their documents to their citizens, while protecting essential government security concerns. The wide circulation of information promotes globalization and an improved world economy. The Net currently exists in a lawless world—and many want it to stay that way—but even societies without laws have their mores and social rules. Beyond Netiquette, the U.S. government could lead by example and sneak up on a larger federal information policy.
Whoops, here comes that world-conquest monster again.
Barbara Quint, co-editor with Paula J. Hane for NewsBreaks, is editor in chief of Searcher, a columnist for Information Today, and a longtime online searcher. Her e-mail address is bquint@mindspring.com.