InfoToday 2002 marked the second year that E-Libraries has existed
as a conference in its own right, after many years as a smaller component
of the National Online Meeting known as IOLS (Integrated Online Library
Systems). Pamela Cibbarelli served as E-Libraries conference chair, and
as in past years her expertise brought diversity and timeliness to the
slate of presentations. The organizing/review committee was made up of Richard
Boss of Information Systems Consultant, Inc.; Marshall Breeding of Vanderbilt
University; and Sharon McKay of MARC Link.
As the name change from IOLS to E-Libraries indicates, the conference
this year was not limited to automation systems, but dealt more broadly
with technology and digital collections in libraries. It was broken down
into two concurrent tracks over the 3 days, covering library resources,
policies, serials issues, e-reference, portals, and databases.
Digital: Are We There Yet?
E-Libraries shared daily keynote sessions with the other two subconferences
under the InfoToday umbrella [see Paula J. Hane's report on p. 1], but
it also featured opening sessions of its own. Tuesday's kickoff session,
"Digital Libraries—The Next Generation," set the tone for the next 3 days,
providing an overview of where we've been and what to expect in the next
few years.
Clifford Lynch, executive director of the Coalition for Networked Information,
began by asking the oft-repeated question, "Are we there yet?" Not yet, he said.
Library management systems have remained "surprisingly static" over the
past 10 to 15 years, and only now are we beginning to enter a period of
instability that indicates progress to truly digital systems. Starting
around 1985, the last time library systems underwent a significant paradigm
shift, functional requirements—linkages, digital resources, and graphical
interfaces—drove the change.
According to Lynch, we can expect functional requirements to shift again
in the coming decade, causing significant changes in library services
and systems. Among the new requirements are the following:
• Links between course management systems and library finding tools
in academic settings
• Portals to content and increased user personalization
• Tools to index licensed information (using standards such as the Open
Archives metadata project)
All of these functions will "have a huge impact on library collections,"
Lynch said, bringing unprecedented issues as to how and what items are
aggregated and who within a given organization is responsible for creating
and maintaining them. In summary, Lynch said that though we may not be
working in digital libraries yet, "big changes are coming."
Policies and Legal Issues
Of course, the proliferation of digital materials available in libraries,
particularly through the Internet, has raised difficult policy questions,
which libraries and the government have struggled to address in recent
years (though not always from the same angle). I attended three sessions
that dealt with the policy and legal issues inherent in managing digital
information.
The first, "Law & Disorder: Making Sense of CIPA," was presented
by George H. Pike, director of the Barco Law Library, assistant professor
of law at the University of Pittsburgh, and co-author of Information
Today's Legal Issues column. Pike explained the complexities of the
Children's Internet Protection Act, from what it really says to the options
libraries have to respond to it. (As of this writing, a three-judge panel
in district court has ruled CIPA unconstitutional. An appeal will now go
to the Supreme Court. See the NewsBreak at https://www.infotoday.com/newsbreaks/nb020610-2.htm.)
In Pike's opinion, the best way to deal with Internet obscenity is to
start by involving the community to establish library standards, as user
buy-in is key to establishing an effective policy. Because it's impossible
to block all material that's harmful to minors, libraries may want to obtain
parental signatures before allowing anyone under a specified age to access the Internet.
A major problem with CIPA, he said, is that there's no exception for
minors, and limiting access to certain sites across the board infringes
on adults' constitutionally protected right to information. Another option
that can cut down on users viewing obscene materials in the library is
to remove access to e-mail, chat, and Usenet groups. Remember, though,
that this is an "all-or-nothing proposition": Legally, you must remove
the information source, not just specific parts of its content.
Directly following Pike, Jesse M. Feder, an associate director at the
U.S. Copyright Office, laid out what he called the "troubling" recent happenings
in the digital copyright world in his session "Copyright Law and the Digital
Rights Agenda." From file-downloading services (such as Napster) to streaming
video and Webcasting, Feder summarized existing intellectual property and
performance rights issues and the several opposing viewpoints as to how
they should be protected in our age of easy, high-quality digital reproduction.
According to Feder (whose opening disclaimer was, "This is a view
from the copyright office, not the view of the copyright office"),
"Issues in the copyright world are as contentious as they ever have been."
The debate over digital copyright often pits copyright owners against consumers,
as evidenced by the proliferation of file sharing on the Web as legitimate
subscription services have foundered. Much of the current argument, Feder
said, deals with whether downloads are defined as being more like performances
or retail sales (as copyrights and royalties belong to different people
in each situation) and how much information subscription services have
to collect from users in order to get those royalties to the right people.
Though Congress attempted to address these legal, technological, and policy
issues with the Digital Millennium Copyright Act in 1998, according to
Feder its major success has been to unite the publishers and performers,
"because both oppose our recommendations."
On day two, addressing some of the issues Lynch had brought up in his
Tuesday kickoff session, Gail Dykstra spoke about digital rights management
in what she called "A Solution in Progress." DRM, she said, is made up
of "Four A's": authentication, authority, access, and accountability. Attention
to these basic components allows digital works to move smoothly and securely
from creators and publishers to retailers and consumers.
Though the traditional focus has been on what it does for publishers,
DRM has promise for users, too, she said. It has, for example, the potential
to "liberate" media from their original contexts to allow users to mix
and match content, and to create and deliver new outbound content. Dykstra
told us that, unfortunately, DRM today is more hype than practice, as it
requires a serious financial commitment and lacks technological standards.
However, we should expect changes in the coming year, as DRM begins to
come embedded in core software applications; pre-Tasini copyright
issues are resolved; and products begin to deliver Web-based seamless links
between rights, value, use, and payment. She recommended that, in the meantime,
your library prepare by asking vendors about their DRM plans, keeping informed
through workshops and market "technology watch" articles, and involving
library administration in policy decisions at all levels.
Looking Ahead
The morning of day two featured a kickoff session on another copyright
issue: "The Debate on Scholarly Publishing." In this panel discussion,
Carol Tenopir, a professor of library science at the University of Tennessee,
and Michael Eisen, an assistant adjunct professor at the University of
California–Berkeley, went head to head over the feasibility of placing
the copyrights to scholarly works in the public domain, where they would
be free of the existing access fees that publishers charge. According to
Eisen, open access to scholarly works maximizes their value, benefiting
both users and researchers. The current economic model in publishing by
which revenues are made through collecting access fees to digital works
is "archaic," he said. Recognizing that publishing takes money, however,
Eisen suggested that publishers be paid in one upfront fee that's built
into research costs. After this, the copyright should be in the public
domain.
Tenopir countered that, while she agreed the current system needs to
evolve, too much unilateral change may not be the best answer. Instead,
she proposed that we use technology to facilitate access alternatives based
on the nature of the work (its discipline, use, etc.). When a corporate
librarian in the audience asked, "Who bears the cost of publishing in the
private sector?" Eisen answered that in his view, the new economic model
should offer publishing services "a la carte," allowing researchers who
edit and prep their articles in-house to pay only for the service of publication.
Tenopir then asked the audience, "Does this make anyone else nervous? What
are the implications of this?" Judging by the number of hands in the air,
the session could have lasted far into the day but unfortunately had to
end unresolved.
Later on the second day, the two-part session "New Services, New Tools"
addressed some ways in which library tools are being brought more closely
in line with how people actually look for information. In "Emerging Generations
of Web Search Tools," Heting Chu of the Palmer School of Library and Information
Science at Long Island University explained the evolution of online searching.
Among the criteria she used to rate Web search tools were information covered,
retrieval approach (content based, etc.), and output (ranked, personalized,
etc.). We're currently in the second generation of Web search, Chu told
us, having moved from basic tools to those that provide multimedia, ranked
results, and that allow simultaneous search and browse. The third generation,
which is still in the making, will feature multimedia and text in the same
results sets, concept- or meaning-based indexing and searching, and personalized
output according to users' specifications.
In "Process for Developing E-Reference Services," Stephen Marvin of
West Chester University discussed how his library sees the Internet as
a tool, rather than as a competitor. Libraries have the advantage of being
both physical and online, he said, and so can be anywhere users need them.
Marvin stressed that his library promotes information fluency over information
literacy, as fluency is more detailed and requires more critical thinking. E-mail
reference, he said, does not change what services the library provides,
just where and how. "It goes where the users are, using what they don't
have—time and patience."
Day Three
The third day of the E-Libraries track provided a nuts-and-bolts view
of managing and developing electronic collections, building Web systems,
and the adoption and distribution of key library automation services. Attendees
chose between sessions in either of the two parallel conference tracks:
Web Portals or Database Creation.
Web Portals Track
The Web Portals track started with a morning session titled "Measuring
Resources and Measuring Performance." This two-part session focused on
the experiences of academic institutions in using Web-based solutions to
manage collections-updating processes and track acquisitions.
The Health Sciences Library at Stony Brook University (New York) turned
a labor-intensive, multi-part, largely manual updating process into a streamlined
database-driven Web updating procedure. Andrew White, Joseph Balsamo, and
Eric Djiva Kamal of the Health Sciences Library described how Stony Brook
created a "single entry point to update all records." The new system maintains
records of electronic and print acquisitions, including journal holdings,
license requirements, and permissions. It also generates the statistics
that the library uses to assess patron needs and make collection decisions.
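The "single entry point" idea Stony Brook described can be pictured as one master record per title that every report draws from, so a correction is made in only one place. A minimal sketch (hypothetical data and field names, not Stony Brook's actual schema):

```python
# Master store: one record per title; all views read from it.
records = {}

def update_record(key, **fields):
    """Create or update the single master record for a title."""
    records.setdefault(key, {}).update(fields)
    return records[key]

def license_report():
    """One of several views generated from the same master records."""
    return {key: rec.get("license") for key, rec in records.items()}

# Initial entry, then a later correction made in one place only.
update_record("JAMA", format="electronic", holdings="1998-", license="site-wide")
update_record("JAMA", holdings="1995-")
```

Because holdings, license terms, and permissions all live in the same record, the statistics and reports the library generates stay consistent with each other.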
Dennis Brunning started off the second session by joking that he wanted
to hire the Stony Brook developers for Arizona State University (ASU).
He described how ASU chose to build its tracking systems on "back-end vendor-supplied
statistics." He differentiated this approach from the front-end in-house
statistics-gathering system designed by Stony Brook. Kurt Murphy, also
of ASU, explained how the university turns this data into insights about
user preferences, patterns, requirements, and behaviors.
Next was "My Chicago Library," which presented the experience of designing
Web-based resources using customer preferences. The University of Illinois at Chicago
used a focus group to inform the design process for the creation of customized
Web portals. Anne Armstrong, Courtney Greene, and Krystal Lewis described
the project as "a product developed by librarians for librarians, providing
a workable alternative to commercial Web portals ... providing a model
for similar collaboration throughout the larger library community."
"Building Flexibility and Accountability in Electronic Resources" was
presented by NASA's Langley Research Center. Librarians at Langley "made
the catalog the authoritative source for maintaining all electronic journal
information." The center developed catalog tagging and created and modified
open-source scripts to collect statistics and check Web links. Gerald Steeman
and Jane Wagner echoed the value of effectively employing Web-use statistics
gathered from vendor and in-house systems. They showed a side-by-side comparison
of vendor-supplied statistics from Elsevier (which display only the number
of sessions logged) vs. the rich detail of Langley's own statistics (which
identify the number of downloaded articles).
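The link-checking scripts Langley described were not shown in detail; the core task, though, is simply visiting each cataloged URL and recording which ones still respond. A rough sketch of that idea (my illustration, not Langley's code):

```python
import urllib.request
import urllib.error

def check_links(urls, timeout=10):
    """Return a dict mapping each URL to True if it responds, False otherwise."""
    results = {}
    for url in urls:
        try:
            # Open the URL; a successful (non-error) response marks the link live.
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                results[url] = 200 <= resp.status < 400
        except (urllib.error.URLError, ValueError, OSError):
            # Malformed URLs, DNS failures, and timeouts all count as dead links.
            results[url] = False
    return results
```

Run periodically over the catalog's electronic-journal URLs, a script like this yields the kind of in-house maintenance data the Langley librarians credited alongside their vendor statistics.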
Database Creation Track
The Database Creation track featured Marshall Breeding of Vanderbilt
University. His workshop, "Constructing Web-Enabled Databases," covered
everything a motivated but inexperienced librarian would need to understand
the concepts, construction, and evaluation of Web-based databases. He is
a gifted communicator who connects with listeners and explains the
"how and why" with ease, authority, and sensibility. Breeding's comprehensive
and authoritative Web site (http://staffweb.library.vanderbilt.edu/breeding/ltg.html)
indexes key resources on library automation. Its bibliography of automation
resources would be a good way to identify key articles and gather general
information on library vendors.
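The pattern at the heart of a Web-enabled database, as Breeding's workshop framed it, is a relational table queried by a Web script. As a hypothetical illustration (the table, titles, and URLs below are invented, not from his talk):

```python
import sqlite3

# A toy holdings table standing in for a library's back-end database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE holdings (title TEXT, url TEXT, license TEXT)")
conn.executemany(
    "INSERT INTO holdings VALUES (?, ?, ?)",
    [("Journal of Examples", "http://example.org/joe", "campus-only"),
     ("Sample Review", "http://example.org/sr", "open")],
)

def search_holdings(conn, keyword):
    """The query a Web form would trigger: titles containing the keyword."""
    cur = conn.execute(
        "SELECT title, url FROM holdings WHERE title LIKE ?",
        ("%" + keyword + "%",),
    )
    return cur.fetchall()
```

A Web front end would pass the user's search term to a function like this and render the rows as links, which is the construction-and-evaluation loop the workshop walked through.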
Sharon McKay's presentation, "Communicating for Excellent Retrospective
Conversions," provided a clear and detailed step-by-step guide for thinking
about and preparing for major library conversion projects. She led the
audience through a series of questions and answers to identify requirements
and diagnose problem areas. Her presentation would be the perfect guide
to creating a request for proposal for a MARC conversion.
Closing Keynote
Pamela Cibbarelli delivered the closing keynote speech, during which
she pulled together library automation statistics to give listeners a review
of all of the major vendors and products. Data for the presentation came
from Library Systems Newsletter.
She presented an "apples-to-apples" comparative profile analysis of
the key vendors that provided insight into the industry and trends, as
well as a "best fit analysis" for potential library clients. Trends identified
by Cibbarelli include the slowing down of the migration between library
automation systems. She also identified the sources of revenue for library
automation vendors: selling software (48 percent), maintenance (29 percent),
and sales from library automation hardware and other services (23 percent).
For each vendor covered, Cibbarelli provided an abbreviated, competitive
SWOT (Strengths, Weaknesses, Opportunities, and Threats) scorecard. Details
included key product developments (new versions of products, as well as
an identification of those companies with stagnant product lines); the
number of installed library sites, with geographic and regional distributions;
types of libraries served (academic, public, special); and the size of
the library subscribing to each library automation vendor.
When asked the inevitable question, "Which system is best?" Cibbarelli
reminded the audience that there is no single "best" system. Libraries
must identify the strengths and niches served by competing vendors and
sometimes by different products from the same company. Whichever product
meets the specific automation requirements of an individual library becomes
the "best product" for that library.
Elisabeth Winter is associate editor of Computers in Libraries.
Her e-mail address is bwinter@infotoday.com.
Gail Dykstra is a consultant in content business development and digital
rights management. Her e-mail address is gail.dykstra@dykstraresearch.com.