When is a Book Not a Book?

I recently wrote a short Gilbane Spotlight article for the EMC XML community site about the state of Iowa going paperless with its Administrative Code publication (the article can be found here). It got me thinking, "When is a book no longer a book?"

Originally the admin code was produced as a 10,000-page loose-leaf publication service containing all the regulations of the state. For the last 10 years it has also appeared on the Web, first as PDFs of pages and, more recently, as independent data chunks in HTML. Now the state has discontinued the commercial printing of the loose-leaf version and relies only on the electronic versions to inform the public. It still produces PDF pages that resemble the printed volumes, intended for local printing of select sections by public users of the information. But the electronic HTML version is being enhanced to improve reusability of the content, to present it in alternative forms, and to integrate it with related materials. Think mashups and improved search capabilities. The content is managed in an XML-based Single Source Publishing (SSP) system that produces all output forms.

I have migrated many, many printed publications to XML SSP platforms. Most follow the same evolutionary path regarding how the information is delivered to consumers. First they are printed. Then a second, electronic copy is produced simultaneously with the print using separate production processes. Then the data is organized in a single database and reformatted to allow editing that can produce both print and electronic versions. Eventually the data gets enhanced and possibly broken into chunks to better enable reusing the content, but the print is still a viable output format. Later, the print is discontinued as the subscription list falls and the print product is no longer feasible. Or the electronic version is so much better that people stop buying the print version.
So back to the original question: when is it no longer a book? Is it when you stop printing pages? Or when you stop producing the content in page-oriented PDFs? Or does it have to do with how you manage and store the information?

Other changes take place in how the information is edited, formatted, and stored that might influence the answer to the question. For instance, if the content is still managed as a series of flat files, like chapters, and assembled for print, it seems to me that it is still a book, especially if it still contains content that is very book-oriented, like tables of contents and other front matter, indexes, and even page numbers. Eventually, the content may be reorganized as logical chunks stored in a database, extracted for one or more output formats, and organized appropriately for each delivery version, as in SSP systems. Print artifacts like TOCs may be completely generated and not stored as persistent objects, or they can be created and managed as build lists or maps (as with DITA). As long as one version is still book-like, IMHO it is still a book.
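To make the distinction concrete, here is a minimal sketch of the idea; the chunk IDs, titles, and text are all invented for illustration and don't come from any actual SSP product. Content is stored once as logical chunks, a build list supplies the ordering (much as a map does in DITA), and book artifacts like the TOC are generated at publish time rather than stored:

```python
# Hypothetical chunk store: each logical chunk is stored exactly once,
# with no page numbers, TOC entries, or other print artifacts.
chunks = {
    "ch1": ("Scope of the Rules", "These rules govern administrative procedure."),
    "ch2": ("Definitions", "Terms used in these rules are defined below."),
}

# A build list (akin to a DITA map): ordering lives here, not in the content.
build_list = ["ch1", "ch2"]

def render_print(build):
    """Page-oriented output: the TOC is generated at build time,
    never stored as a persistent object."""
    toc = "\n".join(f"{i}. {chunks[cid][0]}" for i, cid in enumerate(build, 1))
    body = "\n\n".join(f"{chunks[cid][0]}\n{chunks[cid][1]}" for cid in build)
    return f"Table of Contents\n{toc}\n\n{body}"

def render_html(cid):
    """Electronic output: one stand-alone chunk, no book artifacts at all."""
    title, text = chunks[cid]
    return f"<article><h1>{title}</h1><p>{text}</p></article>"
```

The same stored chunks feed both outputs; only the print rendition reintroduces book-like artifacts, which is one way to frame the question of when the content stops being a "book."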

I would posit that once the printed versions are discontinued, and all electronic versions no longer contain print-specific artifacts, then maybe this is no longer a book, but simply content.

Random House: Creating a 21st Century Publishing Framework

As part of our new report, Digital Platforms and Technologies for Publishers: Implementations Beyond "eBook," we researched and wrote a number of case studies about how major publishing companies are moving to digital publishing. The following is a case study of Random House and its use of Digital Asset Management (DAM) technology from Open Text to create a much more dynamic and agile publishing process.


Random House, Inc. is the world’s largest English-language general trade book publisher. It is a division of Bertelsmann AG, one of the foremost media companies in the world.

Random House, Inc. assumed its current form with its acquisition by Bertelsmann in 1998, which brought together the imprints of the former Random House, Inc. with those of the former Bantam Doubleday Dell. Random House, Inc.’s publishing groups include the Bantam Dell Publishing Group, the Crown Publishing Group, the Doubleday Broadway Publishing Group, the Knopf Publishing Group, the Random House Audio Publishing Group, the Random House Publishing Group, and Random House Ventures.

Together, these groups and their imprints publish fiction and nonfiction, both original and reprints, by some of the foremost and most popular writers of our time. They appear in a full range of formats—including hardcover, trade paperback, mass market paperback, audio, electronic, and digital, for the widest possible readership from adults to young adults and children.

The reach of Random House, Inc. is global, with subsidiaries and affiliated companies in Canada, the United Kingdom, Australia, New Zealand, and South Africa. Through Random House International, the books published by the imprints of Random House, Inc. are sold in virtually every country in the world.

Random House has long been committed to publishing the best literature by writers both in the United States and abroad. In addition to the company’s commercial success, books published by Random House, Inc. have won more major awards than those published by any other company—including the Nobel Prize, the Pulitzer Prize, the National Book Award, and the National Book Critics Circle Award.

Bennett Cerf and Donald Klopfer founded the company in 1925, after purchasing The Modern Library—reprints of classic works of literature—from publisher Horace Liveright. Two years later, in 1927, they decided to broaden the company’s publishing activities, and the Random House colophon made its debut.

Random House first made international news by successfully defending in court the U.S. publication of James Joyce’s masterpiece, Ulysses, setting a major legal precedent for freedom of speech. Beginning in the 1930s, the company moved into publishing for children, and over the years has become a leader in the field. Random House entered reference publishing in 1947 with the highly successful American College Dictionary, which was followed in 1966 by the equally successful unabridged Random House Dictionary of the English Language. It continues to publish numerous reference works, including the Random House Webster’s College Dictionary.

In 1960, Random House acquired the distinguished American publishing house of Alfred A. Knopf, Inc., and, a year later, Pantheon Books, which had been established in New York by European editors to publish works from abroad. Both were assured complete editorial independence—a policy which continues in all parts of the company to this day.

The Open Text Digital Media Group, formerly Artesia, is a leader in enterprise and hosted digital asset management (DAM) solutions, bringing a depth of experience around rich media workflows and capabilities. Open Text media management is the choice of leading companies such as Time, General Motors, Discovery Communications, Paramount, HBO and many more.

When clients work with the Open Text Digital Media Group, they tap into a wealth of experience and the immeasurable value of:

  • A decade of designing, delivering, and implementing award-winning rich media solutions
  • A global client base of marquee customer installations
  • An experienced professional services staff with hundreds of successful implementations
  • A proven DAM implementation methodology
  • Endorsements by leading technology and implementation partners
  • Domain expertise and knowledge across a variety of industries and sectors
  • The global presence and financial strength of Open Text, a leading provider of Enterprise Content Management solutions with a track record of financial growth and stability


Digital Publishing Visionary Profile: Lulu’s Bob Young

As part of our new report, Digital Platforms and Technologies for Publishers: Implementations Beyond "eBook," we interviewed a number of industry visionaries. The following is a summary of a discussion between Lulu’s Bob Young and Gilbane’s Steve Paxhia.

Bob Young: Lulu—Next Steps
Bob Young is the founder and CEO of Lulu.com, a premier international marketplace for new digital content on the Internet, with more than 1.1 million recently published titles and more than 15,000 new creators from 80 different countries joining each week. Founded in 2002, Lulu.com is Young’s most recent endeavor. The success of this company has earned Young notable recognition; he was named one of the “Top 50 Agenda-Setters in the Technology Industry in 2006” and was ranked as the fourth “Top Entrepreneur for 2006,” both by Silicon.com. In 1993, Young co-founded Red Hat, the open source software company that gives hardware and software vendors a standard platform on which to certify their technology. Red Hat has evolved into a Fortune 500 company and chief rival to Microsoft and Sun. His success at Red Hat won him industry accolades, including nomination as one of Business Week’s “Top Entrepreneurs” in 1999. Before founding Red Hat, Young spent 20 years at the helm of two computer leasing companies that he founded. His experiences as a high-tech entrepreneur combined with his innate marketing savvy led to Red Hat’s success. His book, “Under the Radar,” chronicles how Red Hat’s open-source strategy successfully won industry-wide acceptance in a market previously dominated by proprietary binary-only systems. Young has also imparted the lessons learned from his entrepreneurial experiences through his contributions to the books “You’ve GOT to Read This Book!” and “Chicken Soup for the Entrepreneur’s Soul.”

For many years, authors who were unsuccessful in getting their books published by a commercial publishing company could underwrite the costs of publishing their books and sell them through “vanity presses.” It was rare that books published in this manner ever recouped the author’s investment and earned a profit.

Bob Young admits that when he was in college he never fully appreciated the writings of philosopher Jean Paul Sartre. However, one of Sartre’s teachings—“We see the world the way that we expect to see it”—stuck with him. This passage helps explain how established practices and entities become so entrenched. Yet in 2002, Bob Young had an idea that would attack the established policies and practices of the book publishing industry. The industry had consolidated tremendously in the previous decade, and the distribution and retail networks changed dramatically. These changes have had a profound impact on potential authors. The reduction in the number of publishing entities has made it more difficult for authors to get their works published. The publishing company may already have a similar title or be unwilling to take a chance on an unpublished author. Sometimes, a book is written by a prominent author but the market niche is too small for traditional publishers to serve. These phenomena leave a significant number of high-quality books without a publisher.


Digital Platforms and Technologies for Publishers: Implementations Beyond “eBook”

We are very happy to announce that we’ve published our new report, Digital Platforms and Technologies for Publishers: Implementations Beyond "eBook." The 142-page report is available at no charge for download here.

From the Introduction:

Much has changed since we decided to write a comprehensive study on the digital book publishing industry. The landscape has changed rapidly during the past months and we have tried to reflect as many of these changes as possible in the final version of our report. For example:

  • Sales of eBooks finally reached their inflection point in late 2008.
  • Customer acceptance of digital reading platforms, including dedicated reading devices like the Kindle and the Sony Reader and mobile devices like the iPhone and the BlackBerry, has helped accelerate the market for digital products.
  • The Google settlement, once finally approved by the courts, will substantially increase the supply of titles available in digital formats.
  • New publishing technologies and planning processes are enabling publishers and authors to create digital products that have their own set of features that take full advantage of the digital media and platforms. Embedded context-sensitive search and the incorporation of rich media are two important examples.
  • Readers are self-organizing into reading communities and sharing their critiques and suggestions about which books their fellow readers should consider. This is creating a major new channel for authors and publishers to exploit.
  • Print-on-demand and short-run printing continue to make significant advances in quality and their costs per unit are dropping. These developments are changing the economics of publishing and are enabling publishers to publish books that would have been too risky in the previous economic model.
  • Lower publishing and channel costs are making it possible for publishers to offer their digital titles at lower prices. This represents greater value for readers and fair compensation for all stakeholders in the publishing enterprise.

We are privileged to report such a fine collection of best practices. And we are thankful that so many smart people were willing to share their perspectives and vision with us and our readers. We thank our sponsors for their ardent and patient support and hope that the final product will prove worth the many hours that went into its preparation.

We encourage readers of this report to contact us with their feedback and questions. We will be pleased to respond and try to help you find solutions to your own digital publishing challenges!

Go With the (Enterprise 2.0 Adoption) Flow

People may be generally characterized as one of the following: optimists, realists, or pessimists. We all know the standard scenario used to illustrate these stereotypes.

Optimists look at the glass and say that it is partially full. Pessimists remark that the glass is mostly empty. Realists note that there is liquid in the glass and make no value judgment about the level.

The global Enterprise 2.0 community features the same types of individuals. I hear them speak and read their prose daily, noticing the differences in the way that they characterize the current state of the E2.0 movement. E2.0 evangelists (optimists) trumpet that the movement is revolutionary. Doubters proclaim that E2.0 will ultimately fail for many of the same reasons that earlier attempts to improve organizational collaboration did. Realists observe events within the E2.0 movement, but don’t predict its success or demise.

All opinions should be heard and considered, to be sure. In some ways, the position of the realist is ideal, but it lacks the spark needed to create forward, positive momentum for E2.0 adoption or to kill it. A different perspective is what is missing in the current debate regarding the health of the E2.0 movement.

Consider again the picture of the glass of liquid and the stereotypical reactions people have to it. Note that none of those reactions considers flow. Is the level of liquid in the glass rising or falling?

Now apply the flow question to the E2.0 movement. Is it gaining believers or is it losing followers? Isn’t that net adoption metric the one that really matters, as opposed to individual opinions, based on static views of the market, about the success or failure of the E2.0 movement to-date?

The E2.0 community needs to gather more quantitative data regarding E2.0 adoption in order to properly assess the health of the movement. Until that happens, the current, meaningless debate over the state of E2.0 will continue. The effect of that wrangling will be neither positive nor negative — net adoption will show little gain — as more conservative adopters continue to sit on the sideline, waiting for the debate to end.

Anecdotal evidence suggests that E2.0 adoption is increasing, albeit slowly. The surest way to accelerate E2.0 adoption is to go with the flow — to measure and publicize increases in the number of organizations using social software to address tangible business problems. Published E2.0 case studies are great, but until more of those are available, simply citing the increase in the number of organizations deploying E2.0 software should suffice to move laggards off the sideline and on to the playing field.

Busy Week in XML Content Management Market

Holiday weeks can be sleepy weeks in enterprise software news, but this week has seen one significant press release each day in the XML content management market, or component content management (CCM) market if you prefer.

  • On Monday, SDL announced the acquisition of XyEnterprise, and the creation of a new business unit based on XyEnterprise and Trisoft called SDL XySoft.
  • On Tuesday, Really Strategies, the makers of the MarkLogic Server-based system RSuite, announced the acquisition of SaaS CCM provider DocZone.
  • Today, Quark and EMC announced an integration of Quark XML Author with Documentum.

First, the necessary disclosures and caveats. Of the six companies mentioned, we’ve worked with all of them, I believe, and I actually worked for XyEnterprise back in the 1980s and early 1990s. That said, each of these announcements is significant.

SDL, through both organic growth and acquisition, has grown into a substantial business that spans globalization technology, globalization services, CCM technology, and WCM technology. My colleagues Mary Laplante and Leonor Ciarlone know them much better as a company, but I believe it is safe to say that SDL is in a unique position spanning essentially four markets, but four markets that make a great deal of sense under a single umbrella. Product support content managed in a CCM system is the best point of integration for globalization/translation tools. A CCM technology is also an excellent underpinning for a global company’s web presence or web presences (the latter more likely, especially when one considers the need for localized web sites). And services are an essential piece of this puzzle. It’s the rare company that staffs heavily for localization, and even when they do, very few would staff full time to cover all of their language needs. Is SDL in a position to represent one-stop shopping for large companies with complex product content that needs to be localized into many languages? Again, my colleagues could answer that question more precisely, but it’s not a crazy question to ask.

Mary has more on SDL XySoft over in the globalization blog.

The acquisition also breathes new life into XyEnterprise, a company with highly functional, mature technology and excellent executive leadership. We take it as a very positive sign that XyEnterprise CEO Kevin Duffy will become the CEO of the newly combined business unit, reporting to Mark Lancaster, Chairman and CEO of SDL.

The Really Strategies acquisition of DocZone is on a smaller scale, of course, but it is significant in that these two companies represent two leading trends in the CCM marketplace–management of component content in native XML repositories (MarkLogic Server for RSuite and Documentum Content Store for one version of DocZone) and Software as a Service (SaaS). Count me among those who have been skeptical at times about SaaS for CCM, but DocZone, under Dan Dube’s leadership, has made it work. Really Strategies, in the meantime, has developed an impressive CCM offering on top of MarkLogic Server, and they have quietly built up a strong customer list. We think the combined companies complement each other, and the new management team is excellent, with Barry Bealer as CEO, co-founder Lisa Bos as CTO, Ann Michael in charge of services, and Dan Dube as VP Sales and Marketing.

Which brings us to Quark and EMC. Both companies have been developing more CCM capabilities. EMC acquired X-Hive, and a lot of XML expertise along with it. They have since added more XML expertise on both the product management and engineering side. As they have integrated X-Hive into the Documentum platform, they have logically looked to build out more capabilities and applications for vertical markets. The integration with Quark XML Author makes perfect sense for them, giving their customers and prospects a ready mechanism for XML authoring in a familiar editorial tool.

For Quark’s part, the move is a logical and very positive next step. They had previously announced this kind of integration with IBM Content Manager, which has a strong presence in the manufacturing space. With EMC, Quark now has a strong partner in the pharma space. Documentum has long dominated pharma, and Quark XML Author, under Michael Boses and previous owner In.Vision, had built up a long list of pharma customers. Boses and his team know the pharma data structures inside and out, and it will be interesting to see the details of how Quark XML Author will integrate with Documentum and its storage mechanisms. (I am sure both EMC and Quark see the potential as more than just the pharma market–government is also a good target here–but the pharma angle will be fruitful I am sure.)

So, what news is on tap for tomorrow?

SDL Scores with SDL XySoft

SDL continues its ambitious build-out of technology solutions for end-to-end content globalization with its acquisition of XyEnterprise, announced on 29 June. From Gilbane’s perspective, it’s a win all the way around, especially for buyers who continue to seek solutions for the more difficult obstacles to multilingual, multichannel publishing.

The vendors win. The acquisition brings immediate scale to both XyEnterprise and SDL Trisoft. Both companies were having to work really hard to reach the next level, and both were at risk of very slow progress through organic growth. The deep expertise and market focus of each company are highly complementary–SDL Trisoft with DITA and high tech, XyEnterprise with S1000D in aviation and aerospace and a proven track record in commercial publishing. SDL Trisoft gets solid North American support and professional services organizations, and XyEnterprise gains the ability to better serve customers in Europe.

Buyers and customers win. First, the consolidation of two of the leading suppliers of component content management gives buyers a new comfort level with vendor viability. Second, efficient, affordable multilingual, multichannel publishing remains a very expensive obstacle for many global 2000 companies. In Gilbane’s new research on Multilingual Product Content, we identify the multilingual multiplier–costs that are solely the result of producing formatted content in another language. SDL XySoft will be able to address the multiplier problem with tight integration of the XyEnterprise XPP publishing engine, which has been a true differentiator for Xy throughout its history. Third, existing and new customers will benefit from the extensive combined experience that SDL XySoft has in complex, standards-based publishing and content management.

The acquisition is also an opportunity to reinforce the core value propositions for XML and component content management. These technologies and practices sit at the nexus of a set of knotty problems: reusing content across applications, repurposing content for different outputs, and translating content for multiple global audiences. A single-vendor, integrated solution that addresses these problems is more evidence that the market is finally making progress towards overcoming the language after-thought syndrome, identified in Gilbane’s new study. Such solutions support the trend towards the:

". . . steady adoption of content globalization strategies, practices, and infrastructures that position language requirements as integral to end-to-end solutions rather than as ancillary post-processes." — Multilingual Product Content, Gilbane Group, 2009

This acquisition should be relatively easy for SDL to absorb, as there’s already an established business unit into which Xy’s capabilities fit (in contrast to SDL’s acquisitions of Trisoft and Tridion, which were completely new businesses for SDL). In addition, SDL XySoft has a proven leader in former XyEnterprise president and CEO Kevin Duffy. Duffy takes the role of XySoft CEO, reporting directly to SDL Chairman and CEO Mark Lancaster. Duffy managed to build a small niche software company into a respected player in its market, surviving through good and bad times. He now gets his chance to see what’s possible with the resources of a global organization behind him.

See the SDL press release and the XyEnterprise press release for more information. Gilbane’s study on Multilingual Product Content: Transforming Traditional Practices Into Global Content Value Chains will be published on the Gilbane site in mid-July. The report is currently available through study sponsors Acrolinx, Jonckers, Lasselle-Ramsay, LinguaLinx, STAR, Systran, and Vasont.


Assessment of My Enterprise 2.0 Conference Predictions

The Enterprise 2.0 Conference was held last week, in Boston. Prior to the event, I made some predictions as to expected learnings and outcomes from the conference. Today, I will revisit those prognostications to determine their accuracy.

Here is the original list of things that I anticipated encountering at the E2.0 Conference this year. Each prediction is followed by an assessment of the statement’s validity and some explanatory comments:

A few more case studies from end user organizations, but not enough to indicate that we’ve reached a tipping point in the E2.0 market: TRUE The number of case studies presented this year seemed to be roughly the same as last year. That is to say, very few. The best one that I heard was a presentation by Lockheed Martin employees, which was an update to their case study presented last year at E2.0 Conference. It was great to hear the progress they had made and the issues with which they have dealt in the last year. However, I was genuinely disappointed by the absence of fresh case studies. In fact, the lack of new case studies was the number one conference content complaint heard during the event wrap-up session (and, indeed, throughout the show).

An acknowledgement that there are still not enough data and case studies to allow us to identify best practices in social software usage:
TRUE This turned out to be a huge understatement. There are not even enough publicly available data points and stories to allow us to form a sense of where the Enterprise 2.0 market is in terms of adoption, much less of best practices or common success factors. At this rate, it will be another 12-18 months before we can begin to understand which companies have deployed social software and at what scale, as well as what works and what doesn’t when implementing an E2.0 project.

That entrenched organizational culture remains the single largest obstacle to businesses trying to deploy social software:
TRUE The "C" word popped up in every session I attended and was usually heard multiple times per session. The question debated at the conference was a chicken-and-egg one; must culture change to support adoption of E2.0 practices and tools, or is E2.0 a transformational force capable of reshaping an organization’s culture and behaviors? That question remains unanswered, in part because of the lack of E2.0 case studies. However, historical data and observations on enterprise adoption of previous generations of collaboration technologies tell us that leadership must be willing to change the fundamental values, attitudes, and behaviors of the organization in order to improve collaboration. Grassroots evangelism for, and usage of, collaboration tools is not powerful enough to drive lasting cultural change in the face of resistance from leadership.

A nascent understanding that E2.0 projects must touch specific, cross-organizational business processes in order to drive transformation and provide benefit: TRUE I was very pleased to hear users, vendors, and analysts/consultants singing from the same page in this regard. Everyone I heard at E2.0 Conference understood that it would be difficult to realize and demonstrate benefits from E2.0 initiatives that did not address specific business processes spanning organizational boundaries. The E2.0 movement seems to have moved from speaking about benefits in general, soft terms to groping for how to demonstrate process-based ROI (more on this below).

A growing realization that the E2.0 adoption will not accelerate meaningfully until more conservative organizations hear and see how other companies have achieved specific business results and return on investment: TRUE Conference attendees were confounded by two related issues; the lack of demonstrative case studies and the absence of a clear, currency-based business case for E2.0 initiatives. More conservative organizations won’t move ahead with E2.0 initiatives until they can see at least one of those things and some will demand both. People from end user organizations attending the conference admitted as much both publicly and privately.

A new awareness that social software and its implementations must include user, process, and tool analytics if we are ever to build an ROI case that is stated in terms of currency, not anecdotes:
TRUE Interestingly, the E2.0 software vendors are leading this charge, not their customers. A surprising number of vendors were talking about analytics in meetings and briefings I had at the conference, and many were announcing the current or future addition of those capabilities to their offerings at the show. E2.0 software is increasingly enabling organizations to measure the kinds of metrics that will allow them to build a currency-based business case following a pilot implementation. Even better, some vendors are mining their products’ new analytics capabilities to recommend relevant people and content to system users!

That more software vendors have entered the E2.0 market, attracted by the size of the business opportunity around social software:
TRUE I haven’t counted and compared the number of vendors in Gartner’s E2.0 Magic Quadrant from last year and this year, but I can definitely tell you that the number of vendors in this market has increased. This could be the subject of another blog post, and I won’t go into great detail here. There are a few new entrants that are offering E2.0 suites or platforms (most notably Open Text). Additionally, the entrenchment of SharePoint 2007 in the market has spawned many small startup vendors adding social capabilities on top of SharePoint. The proliferation of these vendors underscores the current state of dissatisfaction with SharePoint 2007 as an E2.0 platform. It also foreshadows a large market shakeout that will likely occur when Microsoft releases SharePoint 2010.

A poor opinion of, and potentially some backlash against, Microsoft SharePoint as the foundation of an E2.0 solution; this will be tempered, however, by a belief that SharePoint 2010 will be a game changer and upset the current dynamics of the social software market:
TRUE Yes, there are many SharePoint critics out there and they tend to be more vocal than those who are satisfied with their SharePoint deployment. The anti-SharePoint t-shirts given away by Box.net at the conference sum up the attitude very well. Yet most critics seem to realize that the next release of SharePoint will address many of their current complaints. I heard more than one E2.0 conference attendee speculate on the ability of the startup vendors in the SharePoint ecosystem to survive when Microsoft releases SharePoint 2010.

An absence of understanding that social interactions are content-centric and, therefore, that user generated content must be managed in much the same manner as more formal documents:
FALSE Happily, I was wrong on this one. There was much discussion about user generated content at the conference, as well as talk about potential compliance issues surrounding E2.0 software. It seems that awareness of the importance of content in social systems is quite high among vendors and early adopters. The next step will be to translate that awareness into content management features and processes. That work has begun and should accelerate, judging by what I heard and saw at the conference.

So there are the results. I batted .888! If you attended the conference, I’d appreciate your comments on my perceptions of the event. Did you hear and see the same things, or did the intense after-hours drinking and major sleep deficit of last week cause me to hallucinate? I’d appreciate your comments even if you weren’t able to be at E2.0 Conference, but have been following the market with some regularity.

I hope this post has given you a decent sense of the current state of the Enterprise 2.0 market. More importantly, I believe that this information can help us focus our efforts to drive the E2.0 movement forward in the coming year. We can and should work together to best these challenges and make the most of these opportunities.