Curated for content, computing, and digital experience professionals

Author: Steve Paxhia

Webinar Wednesday: 5 Predictions for Publishers in 2009

Please join me for a webinar sponsored by Mark Logic on Wednesday 2/18/09 at 2pm EST. I’ll be covering my top five predictions for 2009 (and beyond). The predictions come largely from a forthcoming research study, “Digital Platforms and Technologies for Book Publishers: Implementations Beyond eBook,” that Bill Trippe and I are writing. Here are the predictions:

  1. The Domain Strikes Back – Traditional publishers leverage their domain expertise to create premium, authoritative digital products that trump free and informed internet content.
  2. Discoverability Overcomes Paranoia – Publishers realize the value in being discovered online, as research shows that readers do buy whole books and subscriptions based on excerpts and previews.
  3. Custom, Custom, Custom – XML technology enables publishers to cost-effectively create custom products, a trend that has rapidly accelerated in the last six to nine months, especially in the educational textbook segment.
  4. Communities Count – and will exert greater influence on digital publishing strategies, as providers engage readers to help build not only their brands but also their products.
  5. Print on Demand – increases in production quality and cost-effectiveness, leading to larger runs, more short-run custom products and deeper backlists.

I look forward to your questions and comments! Register today at http://bit.ly/WApEW

Blogs help Fire Victims in San Diego

During this year’s Spring Gilbane Conference, we were honored to have Chris Jennewein, Vice President of Internet Operations at Union-Tribune Publishing Co., as one of our panelists on the topic of the role of Social Computing in adding value to traditional print publications. Little did we know that just a few months later, San Diego would be thrust into a major crisis and that the Union-Tribune’s Sign On San Diego would play a large part in helping fire victims find help and support, as well as locate friends and loved ones. In a recent e-mail, he offered these comments:
“We’re using blogs, forums, comments and a ‘people finder’ application to help cover this disaster. Our theory is that different entry points will appeal to different readers. Here are links:

http://sosdfireblog.blogspot.com/
http://helpsandiego.blogspot.com/
http://firesearch.latimes.com/people

The San Diego community appears to be very appreciative of both the round-the-clock coverage and the opportunity to interact.”
This new service goes far beyond what could have been achieved using the company’s website or the traditional newspaper. (They use MindTouch as their platform.) We congratulate Chris and the Union-Tribune for their innovative efforts and are pleased to learn how much Social Computing can help people during times of crisis. Check out the links that Chris provided. My colleague, Geoffrey Bock, and I think that you’ll be impressed!!

Free or Ambient?

It has been a week since the O’Reilly Tools of Change conference adjourned. The topics and presentations were provocative, and there are sure to be some lively debates continuing for months to come…

Several of the keynotes centered on the theme that “information wants to be free”.

Chris Anderson, Editor in Chief of Wired and author of the best seller “The Long Tail,” was one of the early keynoters. He started the debate by announcing that his next book, titled “Free,” would focus upon making a case for providing content to consumers at no charge.

Towards the end of the first day, Jimmy Wales, President of the Wikimedia Foundation, spoke about his new company, Wikia. His goal is for Wikia to do for the rest of the library what Wikipedia did for the encyclopedia section, and to make the assembled knowledge of the world available to the masses for free. And by the way, he’s going to produce a free Google competitor at the same time. (He certainly doesn’t lack for ambition.)

Erin McKean of the Oxford University Press closed the conference with a vivid discussion of “book-shaped objects” and openly questioned whether books were the best information package for the future. As a lexicographer, she weighed in on the “free” debate by saying that it would be better to state that all “information wants to be ambient.” She indicated that in this sense ambient means readily available for use. (Because I had always associated the word ambient with sound, lighting, or atmosphere, I checked several other dictionaries to clarify this sense of ambient. It did not appear. When I went to the OED online, I found that their definition of ambient was neither free nor ambient but available for a mere $29.95/month. Because Erin has yet to post her presentation on the O’Reilly site, you’ll have to trust my memory for this definition.) Her point is well taken; the word “free” is simply too vague. The American Heritage Dictionary (AHD) lists 17 different definitions or senses ranging from “matters of liberty” to “lack of restraint” to “lack of encumbrance” to “provided without consideration or reward.”

Mr. Anderson’s talk seemed to stress the economic meaning of “free.” Thus, we must assume that he means that information or content should be available without consideration or reward. The popular justification for this approach to content valuation is that because the cost of digital distribution is negligible, it is unfair to charge for content or information that is essentially free to deliver. If the distribution-cost method were used to price traditional book content, the cost of printing, paper, and binding would be the determining factor. Of course, that’s not the case. That approach omits many of the costs of publishing content, including reviewing, editing, formatting, proofreading, publicizing, selling, and marketing. And, of course, most authors like to be paid royalties for their work. It also neglects the fact that many readers prefer traditional book formats, and therefore many content elements will eventually be published in several media formats.

Mr. Anderson is the first to admit that his book entitled “Free” won’t actually be free. Okay, the audio book or e-book may be free to those who purchase the printed book. But if you want to purchase only the e-book or the audio book, you’ll be expected to pay for it. Mr. Anderson is also exploring various forms of online books or printed books that would either be sponsored by corporations or supported by (gasp) advertising. While some of these offerings might be free to the reader, it seems that there will be considerations and rewards built somewhere into this project.

He went on to say that he might well be in favor of making the book free because of the publicity benefits to his consulting practice, speaking fees, and the promotion value to his “personal brand”. However, he felt that his publisher would likely object because they are in the business of generating sales and revenues from selling books. (After the session, I heard several people suggest that Mr. Anderson could self-publish his new work and then he would be free to make it free.)

Based upon his work at Wikipedia, Mr. Wales could be seen as a developer of truly free content. The governing organization is a charitable foundation, and all of the authors are volunteers. The costs of supporting the staff and technology infrastructure are paid by donations. However, during his keynote, Mr. Wales was quick to point out two major differences between Wikipedia and Wikia: the scope of Wikia is much broader, and it has been organized as a for-profit entity.

Wikipedia represents the “perfect storm” for a collaborative work or “peering.” According to Wikinomics by Tapscott and Williams, there are three factors necessary to make peering effective: 1) the object of production is information or culture, which keeps the cost of participation low for contributors; 2) tasks can be chunked out into bite-sized pieces that individuals can contribute in small increments and independently of other producers, making their overall investment of time and energy minimal in relation to the benefits they receive in return; and 3) the costs of integrating those pieces into a finished end product, including the leadership and quality control mechanisms, must be low.

While these criteria would apply to a number of other works found in a library, there are many others (novels, monographs, complex texts, dissertations, etc.) that don’t meet these criteria well at all and aren’t likely to be challenged by collaborative works. The costs associated with “wikiing” the rest of the library would likely be enormous. Mr. Wales seems to be banking on advertising dollars to support this effort. If this were the model, these works might be considered free from a consumer pricing perspective, but the advertising revenues and potential profits would have to be deemed consideration in economic terms.

One also wonders whether the volunteer authorship model is extensible. I can see how it would work when the author is writing about a topic that is intensely interesting (as I am doing at this moment). However, a great deal of content is more the result of craftsmanship than inspiration. It seems unlikely that volunteers could be recruited to write “drudge works.”

And as appealing as it may be to write essays or modules on interesting topics for free, I for one would enjoy it even more if I received some consideration for my hard work (no wisecracks, please). As popular as Wikipedia is today, someday it too may face a challenge from some organization (e.g., Google or Microsoft) that might add new features or devise a revenue-sharing model that provides authors with incentives.

I guess my point is that an author’s ideas and created content are products. And like other products, they should have a value proposition and a go-to-market strategy. The valuation/pricing options might include purchase, subscription, sponsorship, syndication, promotion, and free. Depending on the utility, creativity, entertainment value, uniqueness, etc. of the content, one or more of the above models might be appropriate. While “free” is one option, why should it be the preferred choice? Talented people like Messrs. Anderson and Wales create content that is very good and are entitled to receive proper consideration if they so desire.

Jeff Patterson, CEO of Safari Books Online, had some fascinating data on this topic. He studied how their customers, who consume largely technical information, value Safari’s information products versus other content, some of which is “free.” He found that some quality content was indeed free. However, most “free” content was either advertising-supported or was offered in exchange for specific information about the content consumer or their work projects. He asked customers about the tradeoffs that they were willing to make to obtain content for free. If I recall correctly, about ¼ of their customers would rather pay for content than put up with advertising that was inappropriate and/or distracting, 1/3 would rather pay for content than reveal any personal information other than name and e-mail address, and 2/3 would rather pay for content than reveal in-depth information about a project that they were working on. (Patterson’s presentation was one of the best; I hope that he posts his slides.)

Advocating for universally free content might indeed have the unwanted effect of reducing the amount of excellent content that is created by professional authors who depend on their writing as their livelihood. I think that Bruce Chizen, Adobe’s CEO, summed it up nicely when asked by Tim O’Reilly where he stood in this debate. He said that Adobe makes many large investments of human and financial capital in the inventions and products that they produce. While he is all in favor of providing some of their intellectual property to consumers, standards organizations, and society for free, he reserves the right to make the decision as to what should be free. I’m sure that his stakeholders support that position.

I’ll conclude this entry with some thoughts on the provocative concept of ambient information. For many years, content was created for a single purpose and was closely regulated against peripheral usages. With the advent of the digital era came significant opportunities for deriving additional value from content. I will forever be grateful to the senior management of Houghton Mifflin Company, who saw the wisdom of freeing the content of their dictionary from exclusively “booklike objects” and allowed linguists and software engineers to build spelling correction technology. In fact, most of the English-language spelling technology that is used today was derived from their American Heritage Dictionary database. And as I described in an earlier blog entry, Thomson’s retiring CEO Richard Harrington made information ambience central to their core strategy, and judging from their financial statements, they are receiving significant consideration for their efforts!!

Making content accessible to inform, educate, and entertain people is a worthy goal. Making content more ambient will offer content creators and publishers many new opportunities to publicize their work, create goodwill, answer new questions, solve difficult problems, and generate new income streams that are appropriate and commensurate with the value of the content.

Adobe Digital Editions

At O’Reilly’s Tools of Change Conference (TOC), Adobe made a very un-Adobe-like product announcement. Their new Digital Editions is very impressive!! I’ve never been a huge PDF fan because it is so stubbornly page-centric in a world where pages are becoming much less important to the display of content. It has often been awkward and painful to read PDFs on computer screens and handheld devices.

This new technology is based upon the IDPF EPUB standard, which has been developed as the universal distribution format for reflow-centric content. The dynamic layout capability is amazingly agile as it reflows content from large to small screens with excellent speed and seemingly minimal effort. Adobe is currently mum about whether it will be included in the iPhone launch.
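For readers curious what a reflow-centric EPUB package actually contains, here is a minimal sketch, assuming the standard OCF container layout used by EPUB 2: a ZIP file whose first, uncompressed entry is the mimetype, plus a container pointer, a package manifest, and one reflowable XHTML chapter. The file names and metadata are illustrative, and a strict validator would also expect an NCX table of contents, which is omitted here for brevity.

```python
import zipfile

# Assemble a tiny, illustrative EPUB. The reading system, not the file,
# decides how the XHTML content flows onto any given screen.

container_xml = """<?xml version="1.0"?>
<container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
  <rootfiles>
    <rootfile full-path="OEBPS/content.opf" media-type="application/oebps-package+xml"/>
  </rootfiles>
</container>"""

content_opf = """<?xml version="1.0"?>
<package version="2.0" xmlns="http://www.idpf.org/2007/opf" unique-identifier="bookid">
  <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
    <dc:title>Sample Reflowable Title</dc:title>
    <dc:identifier id="bookid">urn:uuid:00000000-0000-0000-0000-000000000000</dc:identifier>
    <dc:language>en</dc:language>
  </metadata>
  <manifest>
    <item id="ch1" href="chapter1.xhtml" media-type="application/xhtml+xml"/>
  </manifest>
  <spine>
    <itemref idref="ch1"/>
  </spine>
</package>"""

chapter1_xhtml = """<?xml version="1.0" encoding="utf-8"?>
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Chapter 1</title></head>
  <body><p>Reflowable content goes here; the reading system decides the layout.</p></body>
</html>"""

with zipfile.ZipFile("sample.epub", "w") as epub:
    # The mimetype entry must come first and must not be compressed.
    epub.writestr("mimetype", "application/epub+zip", compress_type=zipfile.ZIP_STORED)
    epub.writestr("META-INF/container.xml", container_xml, compress_type=zipfile.ZIP_DEFLATED)
    epub.writestr("OEBPS/content.opf", content_opf, compress_type=zipfile.ZIP_DEFLATED)
    epub.writestr("OEBPS/chapter1.xhtml", chapter1_xhtml, compress_type=zipfile.ZIP_DEFLATED)
```

The point of the sketch is simply that EPUB packages plain XHTML rather than fixed pages, which is what lets a reading system reflow the same content across large and small screens.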

Digital Editions has optional DRM capability and will support contextual advertising, subscription, and membership-based business models. It features the expected compatibility with PDF and InDesign CS3.

The functionality and openness to industry standards are a radical departure from many of Adobe’s traditional practices. Bill McCoy, General Manager of Adobe’s ePublishing Business, explains that the Macromedia acquisition played a major role in making this strategic transition possible. This is more evidence that the Macromedia acquisition was one of the better acquisitions in recent memory.

The relatively small download (under 3MB) can be found at: www.adobe.com/products/digitaleditions.

Thomson Learning – What’s next??

Earlier this year, I wrote that the announcement that Thomson Learning was for sale was an indictment of the current fundamentals of most learning market segments. From the perspective of Thomson senior management, the decision to divest seems clear-cut. Consider this comparative financial data:

Thomson Learning vs. All Other Thomson Units:

  • Organic Growth: 4.0% vs. 6.0%
  • Adjusted EBITDA: 24.5% vs. 29.2%
  • Operating Margin: 12.9% vs. 18.9%
  • Electronic Revenues: 36.0% vs. 80.0%
  • Recurring Revenues: 24.0% vs. 82.0%

(Source: Thomson 4th Quarter Investor Presentation)

The percentages of electronic and recurring revenues are particularly at odds with CEO Harrington’s goal of integrating Thomson’s content with their customers’ workflows. After examining this data, combined with declining unit volumes, growing price resistance, and increased government regulation, one wonders what motivated the private equity firms to pay the lofty multiples described in Thad McIlroy’s excellent post earlier this week.

Perhaps they see the opportunity to create more new products that will blend content and technology to add value to the student’s learning experience. Vivid simulations and multimedia can help bring clarity to the explication of complex topics. Linking the appropriate content to solving problems improves student understanding while saving them lots of time and frustration. Making texts searchable and providing fresh links to appropriate Internet sites brings life and exploration opportunities to static textbook content.

Transitioning from a reliance on the sale of books and specific ancillary items to an intellectual property licensing model that is based upon usage metrics and attributes value to all aspects of the course package (including the many package elements currently provided to faculty at no cost) would enable profound changes to the income statement. Revision cycles could be lengthened, sampling and selling costs reduced, and the percentage of recurring revenue increased substantially.

For several years, the potential of such changes has been obvious to industry executives and observers. Why then would the new owners be better able to institute these changes and transitions? The answer is simple: the short-term costs of technology investments, coupled with the transition to a recurring model, would produce some “difficult quarters” for a publicly traded company. The opportunity to retool and restructure while private could create a company that would have excellent recurring revenues and better margins when reintroduced to public markets in a few years.

Should Thomson (and possibly Houghton Mifflin) adopt this strategy, the impact on the rest of the industry could be profound. And if these changes were to take place, authors, students, universities, and the publishing companies would eventually all be winners! Here’s hoping that this deal lends impetus to this industry transition.

A New eCollegey in Higher Ed Publishing??

Pearson made an interesting acquisition yesterday. Their purchase of eCollege continues their corporate foray into Student Information Systems and Course Management. Last year, Pearson acquired PowerSchool and Chancery Software, yielding a very strong position in Student Information Systems for the K-12 market. Clearly, they like these learning infrastructure markets for several good reasons.
1. At present, they seem to be solid businesses with only a few competitors that are poised to grow at rates exceeding their traditional textbook businesses.
2. The acquired customer base brings them many new customers and brings them closer to the students (and parents) who use their instructional products. The information about these students and the ability to reach them with additional product offerings is not to be underestimated in this digital world.
3. As the range of course materials such as content modules, learning software, simulations, educational websites, etc. continues to grow, the value of the course infrastructure technology will increase as well as provide a strategic advantage for integration with their broad range of course materials.
Last week at the Digital Book conference in New York, several speakers agreed that college textbook publishers will look more and more like software publishers over the next ten years. The reasons for this transition will center on using technology to: 1. deliver appropriate content to the student when it is needed to solve homework problems and prepare for tests; 2. integrate traditional material with innovative simulations and learning modules available from communities like MERLOT; 3. add life to static published content by enabling further exploration via web links and domain specific search engines and content repositories.
Pearson is wise to acquire successful software and technology companies to give them the pockets of technical expertise that would take many years to develop within the company. While there may be some culture clashes, this strategy should serve Pearson well and position them to maintain or expand their leadership position in educational publishing.

The News in Retrospect

When I was much younger, I lived in Upstate NY and was vexed by a certain Gannett newspaper whose news wasn’t particularly current. I always said that their motto should be “the news in retrospect”.

Now I do some writing in the form of this blog and am embarrassed to admit that my report on the recent Gilbane Conference in San Francisco would be covered by the same motto. Age makes us humbler with every passing year.

I was very pleased with the quality of presentations in this year’s Publishing Track. In his recent post, Thad McIlroy was much too modest in his depiction of his impressive Future of Publishing Website. The result of almost 10 years of hard work, the site is a fascinating compendium of past and current views of the future of publishing. It is impressive in its scope, organization, and innate wisdom. We were honored to have it released to the public at our conference.

Thad did his usual outstanding job in leading a panel that gave a crisp and concise view of what is possible today in the world of publishing automation. As publishers, Thomson and O’Reilly distinguished themselves with the processes they are using today and products that resulted from those processes. Their willingness to completely rethink their strategies and re-engineer their processes should prove an inspiration to other publishers.

As you can see from my previous post on We Are Smarter Than Me, I am very interested in activities at the intersection of communities and publishing entities. Our panel with representatives of the San Diego Union-Tribune, MERLOT, and Leverage Software gave vivid examples and insights as to how communities can develop valuable new information or enhance traditional information products. Their talks further fueled my curiosity and thinking on this topic.

Bill Rosenblatt led a great panel of representatives from Adobe, Mark Logic, Marcinko Enterprises, and Quark through an excellent discussion of how today’s technology can enable publishers to design and implement processes that support true cross-media publishing. Bill then shared the lessons that were learned in an innovative cross-media strategy project that he did with Consumers Union. He was joined by Randy Marcinko, who cited several clear examples of how the proper processes support cross-media publishing, and by Chip Pettibone, Safari U’s Vice President of Product Development, who dazzled the audience with some of their new products and business models. Their Rough Cuts and Short Cuts product lines are particularly impressive!

Finally, Thad’s posting speaks glowingly of the International Publishing panel. I concur!!
Thanks to all conference panelists and attendees!! Please send me any comments and critiques that would make the next conference more valuable to you.

A Kodak Moment

Last month, I had the privilege of being a guest lecturer at MIT for Howard Anderson and Peter Kurzina’s course entitled “Managing in Crisis”. I prepared a case study about the current status of the College Publishing market. It included concerns about price pressures, used book competition, channel issues, new competitors and new media requirements, pending legislation about pricing practices, and the continued lack of growth in unit sales.

The class was composed of 40 or so very bright students who did an excellent job analyzing the case. One student really captured the essence of the case when she said that it reminded her of the photography industry as the major players struggle with the transition from traditional film and paper products to digital photography.

While Kodak moments have long been associated with joyful celebrations, the aforementioned transition has been anything but a celebration. In fact, it is likely that this Kodak moment will come to exemplify the struggles of a powerful corporation as it strives and perhaps even fails due to its inability to recognize the customer benefits, opportunities, and challenges associated with non-traditional types of photographic media.

Many publishers are struggling with their own “Kodak Moments”, and I think that the transition issues are similar to those of the photography industry. First, Kodak seems to have been focused on the products that people had been buying for years and on preserving the advantages that its film and paper technologies provided. Like many of us who have had market-leading products, they were arrogant about their technology, processes, market position, and quality. While they clearly were aware of digital photography technology, they dismissed digital products because the image quality was significantly poorer. Then they concentrated on turning digital photos into traditional photos. It seems to me that they missed the potential in offering less expensive digital images that could easily be posted on the Internet or e-mailed to relatives.

For many years, publishers’ offerings have been closely related to technological developments in the software and printing industries. For example, software has enabled improvements in authoring and composition, thereby lowering the costs of elegant or complex page designs. Printing technology has made four-color printing much more affordable and has made shorter print runs economical. These changes have been passed along to consumers of information. (While I understand that there are differences between the terms content, information, and intellectual property, I will use the term information to subsume all three for purposes of brevity.) In many cases, they have added value to the customer’s experience, but there are cases where formats were enhanced and color offered because they could be rather than because they were beneficial. In reality, I believe that the net result was that the size, format, and frequency of the traditional economical delivery unit, or EDU (my term), of information (a book, journal, or magazine) were modified by technology advances, but the traditional media form has remained essentially the same for more than 100 years.

The Internet has presented publishers with a radical paradigm shift (I don’t like the term either). All types of publishing entities have had to deal with changes in customer expectations that are easily as profound as those experienced by the photography industry. Customers don’t just want their information to be more timely and less expensive; they also want their information to concisely answer their questions and seamlessly integrate with their workflows or learning styles.

Perhaps the most significant change is the redefinition of the EDU. In the purely print era, there needed to be a certain mass of information to build a product that would be economical to print and sufficiently valuable to consumers to generate a profit. In many cases, it was assumed that relatively few information consumers would use all of the information that was presented in a single EDU. Rather, the scope of the information (or content) had to be broad enough to attract enough customers without being so broad as to make customers feel that they were procuring too much information that wasn’t pertinent to their interests. Hence, we have witnessed a generation of books where authors and market researchers work closely together.

In the digital world, authors and publishers are potentially freed from the strictures of printing economies. Therefore, information currently found in textbooks, references, magazines, and journals can be rendered as short information objects or more comprehensive content modules. Or publishers can produce information objects or content modules that are not anticipated to ever take book form. The objects can be delivered in many ways, including through search engines such as Google. These new EDUs can be purchased or licensed separately, or mixed and matched to create a course of instruction or a personal reference work. One benefit of these nimbler EDUs is that they blend nicely with software to offer increased value in the form of better instruction or more productive workflows.
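To make the mix-and-match idea concrete, here is a minimal sketch of how modular EDUs might be modeled and assembled into a custom course pack. The module fields, titles, prices, and bundle discount are hypothetical illustrations, not any publisher’s actual catalog or pricing model.

```python
from dataclasses import dataclass, field

# Hypothetical "mix and match" EDUs: small content modules that carry their
# own metadata and price, assembled into a custom course pack.

@dataclass
class ContentModule:
    module_id: str
    title: str
    subject: str
    media_type: str     # e.g. "text", "simulation", "video"
    list_price: float   # price when licensed on its own

@dataclass
class CoursePack:
    course_name: str
    modules: list = field(default_factory=list)

    def add(self, module: ContentModule) -> None:
        self.modules.append(module)

    def price(self, bundle_discount: float = 0.20) -> float:
        # Assumed rule: bundled modules sell below the sum of their list prices.
        return round(sum(m.list_price for m in self.modules) * (1 - bundle_discount), 2)

# Assemble a custom pack from two modules of different media types.
stats_pack = CoursePack("Intro Statistics, Custom Edition")
stats_pack.add(ContentModule("ch-probability", "Probability Basics", "statistics", "text", 9.00))
stats_pack.add(ContentModule("sim-sampling", "Sampling Simulation", "statistics", "simulation", 6.00))
print(stats_pack.price())  # 12.0
```

The sketch is only meant to show the structural point: once content is chunked into modules with their own identifiers and metadata, assembling a course or personal reference work becomes a selection and pricing exercise rather than a printing decision.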

The availability of these more compact EDUs will likely spawn many debates concerning academic traditions and learning methodologies that we have come to hold dear. It has long been the practice for students to read and master significant quantities of information with the expectation that many of the specific facts will fade from memory, leaving a general understanding of the topic. And many people are considered well read because they have plowed through many traditional EDUs (books). The questions will be: Could one become well educated or well read by learning to explore topics of interest through smaller EDUs? And what blend of contextual and specific information delivers the best and most productive intellectual outcome? There will be some interesting face-offs between technology-enabled active exploration and discovery of information, which allows students to pursue topics that they find interesting, and the more structured mastery of a set of information presented in book form. Of course, it is not an either/or proposition, as the methods must eventually be blended to enable meaningful knowledge acquisition.

The digital world has also created a demand for information that is developed and delivered as rapidly as possible. Where traditional publishers often justified their value by guaranteeing the accuracy and authority of their published information, many of today’s information consumers are willing to trade authority for velocity of information and now rely upon other information consumers to tell them what information is the most accurate and useful. Individuals now actively participate in communities that generate and evaluate large quantities of information objects and content modules. The Wikipedia/Wikimedia organization and the MERLOT community are excellent examples of communities that produce quality information modules.

While many information consumers may still prefer to consume their information in print form, they may now wish to print their own copies or to create and purchase custom versions produced by rapidly improving print-on-demand technology. To many publishers, the perfectly formatted page has become almost an art form. They consider those pages to have many of the same aesthetic values that Kodak attributed to images produced via their traditional film technologies. Because customers rarely have the choice of formats, it is difficult to gauge the value that they derive from “perfect pages” vs. potentially less expensive, simpler pages. Chip Pettibone of O’Reilly Publishing reported at the recent Gilbane Conference that when readers of e-books were offered the choice between a simple HTML design and a faithful rendition of the original book page, 50% chose the HTML version, and that population seemed to be growing. Because books and computer screens represent quite different form factors, the value of the perfect page can actually limit rather than enhance the effective presentation of information in digital formats. Therefore, rather than trying to maintain the integrity of the printed page, modern publishers are designing their content to be presented equally well in a variety of media forms. Publishers that cling to the page metaphor are putting their futures in jeopardy.

This paradigm shift is replete with challenges and opportunities. Many traditional reference products (including Microsoft’s Encarta) have been decimated by new products created in the Internet era. Newspapers and magazines have had to adapt to the challenges of multiple media environments by creating online products as they have seen their traditional readership dwindle. Journal publishers have had to derive new models to serve their subscriber base. Many categories of trade books now include websites with fancy multimedia elements and discussion groups. I think that some of the most exciting and interesting challenges and opportunities will be found in the world of educational publishing. As witnessed by the decisions of major publishing conglomerates to divest their educational publishing operations, the challenges of mastering the Internet paradigm shift are both daunting and expensive. To succeed, new generations of products will need to be built to take advantage of technology, as opposed to being web versions of existing products. Business models will need to be revised and channel strategies re-engineered. One important outcome will be increased information accessibility for readers and learners with disabilities.

Over the next few years, the publishing industry will witness many Kodak moments…. Hopefully the majority will be the old fashioned Kodak moments of victory and celebration.

