Produced
by Key3Media Group, the Seybold San Francisco 2001 conference was held
from September 24-28. It was billed as a conference where the publishing,
design, and media technology communities could learn about recent developments,
exchange ideas, hear experts, and preview the latest technologies. When
I registered to attend the conference in August, my intention was to find
out how the electronic media industry was faring, who had survived the
downturn, what the hot issues were, and what new technologies and products
would be announced that might improve the access or quality of electronic
information. I was amazed at how quickly the landscape had changed since
the previous fall and what a difference a year had made for the industry.
However, in the
case of the Seybold SF Conference, what a difference a few weeks made!
The terrorist attacks occurred just 2 weeks before the conference and had
a big impact on the program, attendance, and overall spirit of the event.
There was even a question of whether to postpone or cancel it altogether.
However, once Seybold decided to go on with the show, the organizers needed
to make program changes quickly, because a number of speakers were unable
or unwilling to travel that week. I was impressed by the quality of the
substitute speakers attracted at such short notice and how smoothly the
conference went under the circumstances.
Nevertheless, the
Sept. 11th attacks changed the tone and focus of the sessions and, basically,
overshadowed the whole event. The opening keynote of the Seybold Summit,
for example, was quickly changed from "The Next Big Thing" to a forum on
how well the media responded to the events in the first few days. Huge
postcards were provided outside the exhibit hall for attendees to write
messages to the people of New York. The attendees also served as reminders
that it was not business as usual. This was especially true for the publishing
industry, since the attacks had a profound financial, physical, and psychological
effect on the media. As Patrick Henly from IDG Ventures
noted, the lost ad revenues and increased costs of covering the attacks
combined with the collapse of the "dot-coms" and downturn in the economy
to create a "perfect storm" that now threatens the industry.
Its demise may have been the Next Big Thing no one wanted to discuss.
Seybold SF ran
for a full week and was organized as four separate conferences along with
concurrent Hot Technology Days. A Cross-Media Publishing conference on
the first 2 days focused on publishing to multiple media with business
and technical tracks. On the third day, the Seybold Summit, featuring emerging
technologies and trends, was held. A Best Practices for Web Publishing
Conference on the fourth day offered design, developer, and corporate tracks,
while a Print Publishing Conference on the fifth and last day covered processes
and return on investment (ROI) issues. The Hot Technology Days covered
a range of technologies that represent the current and future direction
of the industry. Topics included full-day sessions on digital rights management
(DRM), PDF publishing, print-on-demand, wireless publishing, broadband
media, XML in publishing, e-books and e-content, content management systems,
digital asset management, multimedia design, and new color production tools.
With so many choices,
I tried to keep my focus on learning about trends, technologies, and major
issues that would affect the access and quality of information available
to researchers. In addition to attending opening and lunchtime keynote
addresses on "The Future of Content in a Cross-Media World" and "Media-Independent
Publishing," I spent much of the first day in Hot Technology sessions on
Digital Rights Management (DRM). There I discovered that the industry has
not come much closer to solving the issues that hamper the development of e-books
or to convincing the market of their value proposition. Later I attended
an evening keynote entitled "Who Pays, How Much, and What For?," which
raised important questions about access and control of information, as
well as who should pay for content now that the party is over and venture
capital and ad revenues have dried up.
The following day,
I heard Steve Jobs give the most exciting of the Seybold keynotes in describing
a host of new features available on Apple's first upgrade to the Mac OS
X (Version 10.1). The upgrade increases the speed and performance of the
Mac significantly and integrates professional-level DVD, graphic, and developer
tools into the operating system. In demonstrating the scripting tools,
for example, an Apple representative showed how an Illustrator file linked
to the Internet by AppleScript could turn a static drawing into a dynamic,
color-coded map driven by data from the Web. It was exciting enough to
send me to the exhibit hall to see more at the Apple and Adobe booths,
as well as to check out other content management and DRM vendors.
After hearing more
technical details than I could absorb in the "Future of XML," I needed
a break. It was enough for me to understand that XML's acceptance as a
standard among publishers, notwithstanding the squabbling about versions,
seemed to have already made it a basic component of current digital asset
and content management solutions in development. It was reassuring to hear
that so many products now exist to help companies convert their content
to XML formats.
Personally, I found
the Seybold Summit Day one of the most interesting. It opened with a thought-provoking
discussion on how Internet publishers and their technologies performed
in covering the terrorist attacks, as well as the Web's role with respect
to the traditional media in reporting critical news events. This was followed
in the afternoon by heated arguments and rowdy audience participation during
the "Steal This Session" panel discussion on the effect of the Digital
Millennium Copyright Act (DMCA) on the rights of users versus publishers.
However, I found
the lunch keynote on rights management by Jeff Ramos, director of worldwide
marketing for Microsoft, disappointing in comparison to Apple and Adobe's
announcements on the previous day. Microsoft's big announcements were that
the PocketPC 2002 will fully support DRM and that their Reader software
would now allow the downloading of e-books from the Web to a wireless device.
It would also permit the transfer of rights-protected content to up to
four separate devices. He also made a pitch for corporate use of DRM to
control access to internal documents.
Like the next session on "Key Emerging Technologies," which revealed no exciting
new breakthroughs on the horizon relevant to the needs of professional searchers,
Microsoft's announcements were striking for how little new development they
represented and underscored for me the lack of investment in content initiatives
this year. The day ended with a panel discussion entitled "The Next 12 Months"
which confirmed that assumption and presented such a bleak picture that
I skipped the evening fund-raiser and went home depressed.
The next morning's
keynote speaker, Donald Van de Mark, co-founder of Myprimetime Inc. and
former CNN anchor, seemed to have anticipated my malaise from the day before
and gave pointers for "Surviving and Thriving in the New New Economy."
However, I had reached overload and wanted to reflect on what I had heard
over the past 3 days, so I decided to forego the rest of the best practices
publishing sessions after attending an uninspiring one on selecting content.
However, I must admit that I was somewhat tempted by the topics of the
lunch keynotes on the last 2 days, respectively entitled "Why the Web Still
Sucks" and "Learning from Pornography."
The Cross-Media Publishing Conference
In the spirit
of the times, the primary message underlying the Cross-Media Publishing
Conference was that the future of content and the survival of the electronic
publishing industry will be determined by the ability of publishers to
cut costs and develop realistic business models.
Cutting Costs and Expanding Distribution
When publishers
first established a presence on the Web in the mid-'90s, many set up separate
new-media departments that created and managed Web content apart from their
traditional print workflow. This duplication created expenses and redundancies
that can no longer be afforded, according to Juergen Kurz, vice-president
of product management for Quark, Inc. It also limited the ability to leverage
the content by repurposing it for broader distribution. John Van Siclen,
vice-president and chief operating officer of Interwoven, tracked the media
evolution from print to video to Web and noted that the lack of coordination
between the three assumed a unique audience for each medium. His message
was that the same users now access content via many delivery platforms
(i.e., the Web, cell phone, PDA, voice systems, print, etc.), and that
publishers need a consistent presence on all of them. The medium itself
was less relevant to users than the content, he contended; and to meet
this demand, publishers need to deploy content management systems that
facilitate media-independent publishing. This type of publishing allows
content to be created once and delivered in many forms for use on multiple
devices.
Another benefit of a cross-media package for both the publisher and the user is
the ability to do print-on-demand, network publishing. Stephanie Acker-Moy, general
manager of HP.com and enterprise publishing for Hewlett-Packard, noted
that 60 percent of books and magazines are not sold, but still incur print
and storage costs. In contrast, content created once and maintained in
a digital format can be printed where and when needed, as well as tailored
to the customer's experience and needs.
Essentially, publishers
were advised to deploy workflow systems that streamlined the production
process by managing content consistently through the writing, editing,
approval, conversion, testing, and posting stages and to use XML tags to
repurpose content. I found it interesting that in advocating the use of
structured, standardized data formats and metadata, Web publishers were
finally adopting practices advocated by information professionals and database
publishers for many years. Another concept advocated by Web publishers
and experts at Seybold that would hardly startle information professionals
or database publishers was the notion of charging for content.
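The create-once, deliver-many workflow the speakers described is easy to sketch. The example below is my own hypothetical illustration, not any product shown at the conference: an article is marked up once in XML, then rendered as HTML for the Web and as plain text for a device such as a PDA, without touching the source.

```python
import xml.etree.ElementTree as ET

# A hypothetical single-source article, marked up once in XML.
SOURCE = """
<article>
  <title>Seybold Report</title>
  <body>
    <para>Publishers met in San Francisco.</para>
    <para>Cross-media publishing was the theme.</para>
  </body>
</article>
"""

def to_html(root):
    """Render the article for the Web."""
    paras = "".join(f"<p>{p.text}</p>" for p in root.iter("para"))
    return f"<html><h1>{root.findtext('title')}</h1>{paras}</html>"

def to_plain_text(root):
    """Render the same source for a text-only channel (e.g., a PDA feed)."""
    lines = [root.findtext("title"), ""]
    lines += [p.text for p in root.iter("para")]
    return "\n".join(lines)

root = ET.fromstring(SOURCE)
print(to_html(root))
print(to_plain_text(root))
```

Each new delivery channel only needs one more rendering function; the content itself is never duplicated, which is the cost saving the panelists were after.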
Who Pays?
When Michael O'Donnell
revealed in the morning keynote discussion that Salon.com will reposition
itself as a daily newspaper rather than a magazine, he noted that Salon's
strategy will be "to start accustoming the user to pay for content on the
Web," because the banner ads and pop-ups now in use on the Web are not
sufficient to support the cost of content creation. In the evening keynote
discussion, "Who Pays: How Much, and What For," Mark Anderson of
the Strategic News Service noted that in the past only rare and valuable
content could be sold on the Web (a generous description if he was referring
to traditional online services), but now that click-throughs have decreased
to 0.1 percent, ads are failing to subsidize content. "'Information wants
to be free,'" he said, "but people want to get paid."
However, although
the panelists agreed that subscriptions were the best strategy to keep
Web publishers in business, they still felt that the public good required
the availability of some free information. In fact, Daniel Kohn, a general
partner with Skymoon Ventures, thought there was no good way of getting
paid for content once it went on the Web, because all DRM rules are breakable
and as soon as someone gets access, they can copy and redistribute the
content indefinitely without degrading the original. He relegated content
to a nonrival and nonexcludable category and said it should be treated
as a pure public good. However, he did make an exception for movie theaters
or other delivery methods that can control physical access to content and
for online subscription services that can reauthenticate users every time
they access a site. His solution to the public need for information was
to subsidize content with the sale of other goods or to have micro-patrons
or nonprofit foundations pick up the tab. The government might pay for
it, too, but he asked, "Who wants Jesse Helms to pick what music you listen
to?"
Denise Caruso,
executive director of the Hybrid Vigor Institute and a former New York
Times columnist, concurred with the need for free public information,
noting that information has to be able to flow freely to be free. She contended
that the recent copyright and patent laws have upset the balance between
the market and the public domain, which she described as a cultural space
in which we share information, creativity, and ideas. Like an ecosystem,
she felt there must be a balance between the needs of the public and the
content owners. She presented the novel idea that publishers should charge
for non-essential information. "Let people who want pictures of Britney
Spears pay for them," she argued, and use the proceeds and ad money to
subsidize serious content as a public service. "Think about the last 2
weeks of your life," she asked the audience. "Do you want a toll gate every
time you click?"
Caruso also supported
the idea of using government subsidies and philanthropy to pay for information
that fosters artistic creativity, academic research, technological innovation
and political or cultural ideas. She gave examples of non-profits that
are creating public information trusts such as the Center for Public Domain's
ibiblio site [www.ibiblio.org],
a growing archive of public domain knowledge developed in collaboration
with the University of North Carolina at Chapel Hill; and the "Open Directory
Project" [www.DMOZ.org], a human-edited
directory of the Web, constructed and maintained by a community of
volunteer editors. She also commended the work being done by librarians
and scholars to regain control of scholarly communication from commercial
publishers [www.createchange.org]
and supported the concept of a public library of unrestricted, peer-reviewed
scientific journals that would be paid for by government grants to authors
to offset publication costs.
The Seybold Summit
A Defining Moment
The topic of the
opening keynote for the Seybold Summit changed the night before to address
an issue of more immediate relevance to the speakers and attendees: how
well Web publishers and their technologies performed in covering the terrorist
attacks of September 11th. The panelists included editors of Weblogs and
personal Web sites that posted news links, personal accounts, and pictures
from users in real time during the crisis, as well as representatives from
the mainstream media. Neil Chase of CBS Marketwatch and Bruce Koon, an
executive news editor for Knight-Ridder, joined Dave Winer of Scripting
News [www.scripting.com]
and Davenet, Doc Searls of Doc Searls Weblog [www.doc.weblogs.com],
Jason Kottke, editor of Kottke.org [www.kottke.org],
and Rick Smolan, a professional freelance photographer (Life Books, etc.)
to analyze how well the system worked. Generally the panelists agreed that
the business of journalism went away during that time, trivia and ads disappeared,
and both professional and amateur journalists focused on getting out the
story. They defined it as the media's finest hour, performing in the public
interest despite financial losses.
The role that emerged
for the Web during the crisis was to act as an open and broad source of
information and opinion and to address the public's need for real-time
information. On the Web, anyone could bear witness and share information.
Dave Winer described how he had received photos in real time from a Webcam
on the Empire State Building and noted that Weblogs got breaking news out
faster than the big media sites, which typically needed 24 hours to verify
the accuracy of stories. Both Chase and Koon verified that they scan the
Weblogs for news and that, in a number of foreign countries, the Web had
replaced news bureaus and stringers as a source for international information.
However, questions
about the quality of the Weblog as a source of information were raised,
since Weblogs do not check facts and often post unverified rumors and speculation.
In response, Searls pointed out the self-correcting aspect of community-based
Web journalism: "We're constantly checking each other's stuff." In fact,
the interactive nature of the Web was seen as providing an important counterbalance
to the traditional media, as it allows the public to challenge assumptions
and overcome censorship and biases, as well as helping to correct errors
that often occur in big breaking stories.
Essentially, a
picture emerged of a symbiotic relationship between the Weblogs and the
traditional news media. The Webloggers would select and post unverified
information they received from the public in real time during the crisis
and the traditional media would check it for accuracy before they published
it. Likewise, published information from the traditional media was subjected
to commentary and correction by the public on the Web. The Web was also
seen as a way to humanize and personalize broadcast media and to filter
different stories to the top. Dave Winer gave the example of a letter from
an Afghan-American writer that Salon.com posted on its site. The writer
explained the futility of bombing Afghanistan and questioned whether it
might not play into Osama bin Laden's plan to start a holy war. Peter Jennings
saw the letter on Salon.com and interviewed the writer on his program,
giving national exposure to an opinion that might not otherwise have been
articulated in the mainstream media.
The question was
posed as to how the events would have been handled 10 years ago and the
panelists agreed that today's technology made a difference. One panelist
noted the importance of the cell phone on the fourth plane and Koon said
Knight Ridder's reliance on XML was critical in quickly normalizing feeds
coming in from 30 different sources. However, all sites were overwhelmed
by the traffic. It was seen as a new media's trial by fire and the panelists
thought the terrorist attacks could become the defining moment for Web
news, much like the Gulf War was for TV news.
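Koon's point about XML is worth unpacking: when 30 sources deliver stories in 30 different shapes, a per-source mapping onto one common schema lets a newsroom treat them uniformly. The sketch below is my own hypothetical illustration, not Knight Ridder's actual system; the field names and schema are invented.

```python
import xml.etree.ElementTree as ET

# Hypothetical incoming items from two wire sources, each with its own field names.
FEED_A = {"headline": "Markets fall", "text": "Stocks dropped sharply.", "src": "wire-a"}
FEED_B = {"title": "Relief efforts grow", "body": "Volunteers poured in.", "provider": "wire-b"}

# Per-source mappings from raw field names onto a common schema.
MAPPINGS = {
    "wire-a": {"title": "headline", "body": "text", "source": "src"},
    "wire-b": {"title": "title", "body": "body", "source": "provider"},
}

def normalize(item, mapping):
    """Build a <story> element in the common schema from one raw feed item."""
    story = ET.Element("story")
    for target, raw_field in mapping.items():
        ET.SubElement(story, target).text = item[raw_field]
    return story

stories = [normalize(FEED_A, MAPPINGS["wire-a"]), normalize(FEED_B, MAPPINGS["wire-b"])]
for s in stories:
    print(ET.tostring(s, encoding="unicode"))
```

Downstream editing, layout, and Web-publishing code then sees only `<story>` elements, regardless of which wire the item came from; adding a thirty-first source means writing one more mapping, not another pipeline.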
Circumventing Copyright
"Steal
This Session" turned out to be a lively panel discussion on the
ramifications of the Digital Millennium Copyright Act (DMCA). The panelists
raised issues and defined the arguments, but never reached a consensus.
Moderated by Mark Walter of Seybold, the session pitted Allan Adler of
the Association of American Publishers against Robin Gross, an attorney for
the Electronic Frontier Foundation, and Jim Griffin, CEO of Cherry Lane Digital.
Kurt Foss of Planet eBook and Bill Rosenblatt of Giant Step Media balanced
the group with more objective views and observations.
The discussion
focused primarily on a provision of the DMCA that makes it illegal to produce
or sell tools that break encryption protecting digital works against piracy.
The discussion was personalized by the case and presence in the audience
of Dmitry Sklyarov, a Russian programmer who wrote a program that could bypass
security on Adobe's e-book software and was detained in the U.S. after speaking
about it at a conference, even though he had never actually decrypted the product.
His case was still not resolved and many of the participants expected the
case to ultimately end up in the Supreme Court.
Adler presented
the background and rationale for the DMCA provisions from the publishers'
perspective. He noted that in December 1996, 96 nations adopted the WIPO
Copyright Treaty to deal with issues specifically related to digital media,
such as ease of copying, distribution, performance, and display. He claimed
that the basis for the DMCA was the need the U.S. faced to upgrade its
copyright law in order to comply with the treaty. Griffin disagreed and
identified the emergence of streaming technology as a key factor because
it had led Webcasters to ask for the same reuse rights as radio stations,
i.e., blanket, compulsory licenses. He believed that the publishing industry
wanted to control how much Webcasters would have to pay for those rights,
so they lobbied Congress to revise the copyright laws. Robin Gross agreed
that DMCA gave more anti-circumvention rights to publishers than necessary
simply to comply with WIPO and thought that it tipped too far toward copyright
holders by making it illegal not only to decrypt files, but also to own
or develop decryption technology. She also felt the effect was to virtually
eliminate fair use altogether. Since fair use can't be prescriptively defined,
she argued, it can't be machine engineered, and decryption is often the
only way to exercise the right.
Rosenblatt believed
that there were benefits to reverse engineering systems and that vendors
of encryption algorithms should give people permission to "bang on it"
in order to fully test their effectiveness. If results can't be published,
as is now the case, nothing will be learned. However, Adler pointed out
that there are issues of vulnerability, security, and even irresponsibility
if ways to thwart anti-virus protections, for example, are published. While
he agreed that in some cases it might be appropriate to "bang," he felt
it was wrong to publish or sell tools that strip real products of protection
and drew an analogy to the illegality of possessing burglary tools.
Contention also
surrounded the issue of assigning liability. Gross believed that actual
infringers rather than builders of tools should be liable, while Adler
held that since infringers disappear, the industry needs to go after producers
and traffickers in anti-circumvention tools. When it was pointed out that
the tools that break Adobe E-book's encryption are only useful to someone
who has already purchased a book and wants to use it in a new way, Adler
responded that then it became a marketplace issue. If the technology is
overly restrictive and so cumbersome that it affects sales, it will force
changes in licensing restrictions. Griffin shot back that the industry
should "attack the motive, not the mechanism." He gave the example of the
video industry destroying the motive to copy videos by making them so cheap
and available that it is not worth the trouble. As the audience got involved,
the discussion got even more contentious. A good time was had by all.
The Immediate Future
The session entitled
"The Next 12 Months" might have originally intended to feature emerging
trends and technologies. However, in the current economy, it shifted to
an assessment of the economic health of the media market by representatives
from advertising, newspaper, and venture capital firms. The first speaker,
Eve Asbury, senior vice-president and director of print and digital production
for Saatchi & Saatchi, confirmed that Web publishers should not rely
on ad revenues. Dressed entirely in black with exceedingly mournful eyes,
she explained that she was a New Yorker and still too sad to focus on business.
It was her opinion that the ad business would change because of the tragedy
and would become more serious now that we have gone into a war-time economy.
Her assessment was that we would not be out of trouble in the next 12 months.
She thought 2001 would be tough and 2002 slow, with the market finally coming
back in 2003. She noted that tech sector ads had collapsed
entirely and that demand for other Web ads was minimal because they can't prove
any efficiencies to clients.
Dave Cole from
NewsInc. said that the newspaper industry was actually making money a year
ago on classifieds, nationals, and local retail, despite declines in circulation.
However, a drop-off was noticed in November 2000 in employment and housing
classifieds, which have steadily declined ever since. Essentially, newspapers
face the challenge of achieving a balance between their roles as businesses
and as public trusts. The bottom fell out of the ad business after Sept.
11th. At the same time, newspapers published special editions solely for
the public good and publishers such as the Wall St. Journal lost
41 percent of their income. In the next 12 months, he expects no airline
ads, few vacation ads, and few blockbuster movie ads. He also noted that
the newspaper print business has been losing business to new online sources
and sites, such as eBay, that compete for classifieds. With cross-ownership
rules being lifted, survival of newspapers may depend on their ability
to leverage content across different media.
Patrick Henly from
IDG Ventures expected that it would take at least 12 to 18 months before
markets come back and noted that the shake out in the industry is not over.
Currently, most venture capitalists are doing triage to decide which middle-stage
companies should be granted another year of life, so another 25-30 percent
of the market could still shut down. The good news is that there is still
a lot of money for startups because most VCs pulled money out in the first
quarter of 2001, but the bad news is that no one is investing in content
deals, only technology that saves production costs or improves users' experience.
In addition, most long-term investments are being deferred.
For Patrick, the Industry Standard's demise signaled the end of the
party. It was a fine magazine,
had a good business model, and covered a legitimate industry. However,
like the businesses it covered, its expectations were out of whack. He
thought their mistake was to go public when they owned only one magazine,
thereby ceding control to financial types. Otherwise, they might
have hunkered down and weathered the storm. His advice was to stay private
if you want to survive in the magazine business. Patrick described the
current cycle as the worst in memory and getting worse. We bounced back
from previous recessions, he said, but each bounce has been less healthy.
On that note, I crawled out of the room and headed home, worried as much
about the future of information as my own employment prospects.
A Long-Range Forecast
Despite the upbeat
titles of keynotes and sessions planned during better times, the atmosphere
at the Seybold Conference was decidedly downbeat. Everyone was still shell-shocked
by the events 2 weeks before, not to mention the dismal condition of the
industry over the past year. The programs and exhibits offered constant
reminders of layoffs, failed companies, and discontinued products plaguing
the industry. Even the companies still in business may face collapse when
their last round of funding is depleted.
It was pretty clear
to me that the phenomenal growth of content on the Web that we have enjoyed
over the past 5 years is over and that we may face the death of free content
— at least from commercial sources — because of market forces and legal
decisions. Web publishers are struggling for economic survival. With funding
and ad revenues drying up, those who survive will need to find other means
to pay for content. This will require robust DRM solutions that can support
e-commerce, syndication, distribution, and usage metering effectively.
However, according to Bill Rosenblatt of Giant Step Media, DRM vendors
alone had six major layoffs, two bankruptcies, and three discontinued product
lines this year. Companies still in the DRM business have yet to solve critical
problems affecting market acceptance.
We are also losing
existing digital content. Maureen Dorney, an intellectual property lawyer,
confirmed that tracking rights after the Tasini ruling represents
a huge problem for publishers. If they can't negotiate retrospective rights
from authors for older content, it will be removed from electronic distribution.
So what hope is
there? On one hand, the industry is clearly fighting for its life, brought
down by a convergence of bad investments, negative market forces, and unanticipated
political events. On the other hand, the performance of the media during
the crisis and the interrelationship that has developed between the public,
the media, and the Web are both critical and vibrant, as is the growth
of public information collections on the Web. It is also encouraging that
the media is learning lessons about the need for production efficiencies
as well as the importance of indexing and standards. If XML is adopted
and consistently deployed by publishers as the de facto format to create
content, it could ultimately lead to better access to information on the
Web, easier integration and segmentation of content, and wider distribution
of content across media. Also the upside of the market's rejection and
circumvention of DRM solutions is that it may force vendors to develop
methods that protect rights without stifling legitimate use of the information.
We may end up with far fewer content publishers, but these publishers might
also produce better, more accessible information when, where, and how we need it. However, if these publishers don't act quickly to stem losses
and current trends continue, we could face one of the scenarios imagined
at the SCOUG retreat this past summer, one where we enter a dark age of
information with good content simply not being produced or made available
to the public.