Curated for content, computing, and digital experience professionals

Year: 2010

eBook Publishers: Welcome to Apple Heaven, but Caveat Emptor

Vice President & Lead Analyst Ned May, our Outsell colleague, wrote an excellent Outsell Insights piece titled “Long Tail Publishers Now Have An Easy Route to Apple’s iBookstore.” The piece covers Apple’s selection of two conversion services publishers can use to get book titles into the iBookstore. Here’s a key part:

Any publisher looking to get their books on the Apple platform now has an option to have their works converted to the appropriate format by one of two approved firms – Jouve and Innodata Isogen. Further, a publisher can initiate and complete the conversion in a relatively seamless fashion via the Apple iTunes Store site.

Whether these Apple-vetted companies will really offer a more rationalized conversion process than other such services remains to be seen, but May is right that in the controlled universe of the Apple platform, this sure won’t hurt. 

What remains unlikely, at least for the present, is that book content conversion—even under such optimal arrangements—will prove as easy and inexpensive as a publisher might hope. Standards-based ebook formats—ePub, primarily—are not fully standard in the real world, in that various ereader platforms render the content in different ways. Not that this should be a surprise, given the history of technology standards being interpreted flexibly, not to mention the propensity of hardware makers to differentiate their devices from those of other manufacturers by adding features or capabilities. Apple’s application of the ePub standard falls well within this tradition, although what differentiates the Apple approach from most other efforts is the monotheistic rigor applied: There is but one Apple, and thou shalt not have other platforms… well, you get the point.

And Apple works, it must be said. The iPad is a nice ereader (among other things), and the App Store offers the means to sell kinds of content that ePub can’t really handle (as yet, but ePub 3.0 is coming, with enhancements, as it were). Publishers seeking salvation within Apple World are, with the addition of Jouve and Innodata Isogen to the priesthood, one step closer to the promised land. But the reality is that even standards-based ePub format work—the aforementioned Apple-targeted conversion offerings—is hardly going to be simple, easy, or push-button. The $20 conversion fee per title presupposes an existing digital content file so clean and consistently formatted as to be virtually faith-based, but not scientifically likely. There is a good deal of quality assurance and other tweaking that any publisher should be budgeting for, even in paradise.
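To make that quality-assurance point concrete, here is a minimal sketch of the kind of automated structural check a publisher might run over converted EPUB files before submission. It uses only Python’s standard library; the checks shown (mimetype entry, container.xml, declared OPF) are a small subset of what a full validator such as EpubCheck covers, and the file path is a hypothetical example.

```python
import sys
import zipfile
import xml.etree.ElementTree as ET

CONTAINER_NS = {"c": "urn:oasis:names:tc:opendocument:xmlns:container"}

def basic_epub_checks(path):
    """Run a few structural sanity checks on an EPUB package.
    Returns a list of problem descriptions (empty means the basics look OK)."""
    problems = []
    with zipfile.ZipFile(path) as z:
        names = z.namelist()

        # 1. The OCF packaging rules expect a 'mimetype' file as the first entry.
        if not names or names[0] != "mimetype":
            problems.append("'mimetype' is not the first entry in the archive")
        elif z.read("mimetype").strip() != b"application/epub+zip":
            problems.append("'mimetype' does not contain 'application/epub+zip'")

        # 2. META-INF/container.xml must point to the package (OPF) document.
        if "META-INF/container.xml" not in names:
            problems.append("missing META-INF/container.xml")
        else:
            root = ET.fromstring(z.read("META-INF/container.xml"))
            rootfile = root.find(".//c:rootfile", CONTAINER_NS)
            opf_path = rootfile.get("full-path") if rootfile is not None else None
            if not opf_path:
                problems.append("container.xml does not declare a rootfile")
            elif opf_path not in names:
                problems.append(f"declared package document '{opf_path}' is missing")

    return problems

if __name__ == "__main__":
    # Usage (hypothetical file): python epub_checks.py converted_title.epub
    for issue in basic_epub_checks(sys.argv[1]) or ["no basic structural problems found"]:
        print(issue)
```

Checks like these catch only the mechanical problems; the editorial and typographic tweaking still requires human eyes.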

There remain many other important issues for book publishers to consider, starting with whether Apple World is enough. If not, then Apple’s offer of easy iBookstore supply via iTunes-based ebook production eases only one part of a much larger market. Ned May astutely identifies the larger hope of Apple’s ebook conversion partners: that publishers will seek wider ebook platform targets and therefore turn to these same conversion services for help implementing more fundamental digital workflow improvements, such as an XML-early model. Such a model helps the publisher output to a number of current ebook formats, and leaves it better prepared for new or expanded ebook standard formats across the larger range of ereader-specific display demands.
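As a rough illustration of what “XML-early” buys a publisher, here is a toy single-source sketch: one simplified, hypothetical chapter XML is transformed into EPUB-style XHTML and into the flatter HTML that more constrained ereader targets tolerate. Real workflows use richer schemas (DocBook, TEI, or a house DTD) and proper transformation tooling such as XSLT; this only shows the one-source, many-outputs shape of the approach.

```python
import xml.etree.ElementTree as ET

# A deliberately tiny, hypothetical "canonical" chapter format.
CHAPTER_XML = """
<chapter id="ch1">
  <title>Welcome to Apple Heaven</title>
  <para>Standards-based formats are not fully standard in practice.</para>
  <para>Plan for per-platform quality assurance.</para>
</chapter>
"""

def to_epub_xhtml(chapter):
    """Emit XHTML suitable for an EPUB spine item (semantic headings and paragraphs)."""
    title = chapter.findtext("title")
    paras = "\n".join(f"    <p>{p.text}</p>" for p in chapter.findall("para"))
    return (
        '<?xml version="1.0" encoding="utf-8"?>\n'
        '<html xmlns="http://www.w3.org/1999/xhtml">\n'
        f"  <head><title>{title}</title></head>\n"
        f"  <body>\n    <h1>{title}</h1>\n{paras}\n  </body>\n</html>\n"
    )

def to_basic_html(chapter):
    """Emit the flatter HTML that more constrained ereader targets tolerate."""
    title = chapter.findtext("title")
    paras = "\n".join(f"<p>{p.text}</p>" for p in chapter.findall("para"))
    return f"<html><body><b>{title}</b><br/>\n{paras}\n</body></html>\n"

if __name__ == "__main__":
    chapter = ET.fromstring(CHAPTER_XML)
    print(to_epub_xhtml(chapter))   # one source...
    print(to_basic_html(chapter))   # ...many outputs
```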

For Apple, making it easier for book publishers to pursue the Apple ebook devices makes sense, but book publishers need to think long term, and the Apple-controlled system of ebook production and sales is, at present anyway, a nice short-term solution. Innodata and Jouve are thinking long term, with their Apple status providing a very good entrée for selling broader services to publishers.

But as apps and ebooks become more standardized through improvements in the base functionality of ePub and the settling out of ereader device differences, and as distribution of both content files and associated marketing content becomes more rational, publishers may be attracted to more profitable sales channels and find their Apple-centricity a barrier. Currently, agency models—touted by Apple early on—leave 30% of the ebook price with the channel, and this sounds a lot better than traditional publishing wholesale rates that leave 50% of gross revenues on the table. It is true that current channels such as Amazon offer the very important value of supporting the discovery, sale, and download of a title. But as ebook formatting and associated bibliographic content work is taken up by the publishers, the real service of existing ebook channels—iBookstore, Amazon, Google eBooks, etc.—will become simply transactional (purchase) and file management (download of titles and presentation of bibliographic content) in nature. Such transactional and file management services will be highly automated, and hence low-cost to provide, and it is very likely that 30% going to such services will come to seem far too much to pay, with Amazon and the other currently dominant ebook sales channels either competing on margin or getting beat.

Sometime soon, as book publishers gain more control over their content workflows, use XML for multiple ebook format production, and better manage bibliographic and other channel-supporting metadata, even 30% margin going to online ebook retailers will be too much. There is an opportunity for content vending sites that gain sufficient revenue only from transactional fees, perhaps in the 5-10% range.
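The arithmetic behind those percentages is simple, but worth seeing side by side. Here is a back-of-the-envelope sketch comparing what a publisher keeps per title under wholesale, agency, and the hypothetical transactional-fee channels discussed above; the $12.99 list price is just an assumed example.

```python
def publisher_net(list_price, channel_share):
    """Revenue retained by the publisher after the channel takes its share."""
    return list_price * (1.0 - channel_share)

price = 12.99  # assumed example ebook list price

for label, share in [
    ("wholesale (~50% to channel)", 0.50),
    ("agency (30% to channel)", 0.30),
    ("transactional (10% to channel)", 0.10),
    ("transactional (5% to channel)", 0.05),
]:
    print(f"{label:32s} publisher keeps ${publisher_net(price, share):5.2f}")
```

On a $12.99 title, the difference between a 30% and a 5–10% channel fee is roughly $2.60–$3.25 per copy, which is the margin pressure the paragraph above anticipates.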

Apple is making iBookstore ebook and apps production easier for publishers, but sticking to the Apple way may make it harder for publishers to succeed more widely in the longer run.

 

Focused on Unifying Content to Reduce Information Overload

A theme running through the sessions I attended at Enterprise Search Summit and KMWorld 2010 in Washington, DC last month was the diversity of ways in which organizations are focused on getting answers to stakeholders more quickly. Enterprises deploying content technologies, all with enterprise search as the end game, seek to narrow search results accurately to retrieve and display the best and most relevant content.

Whether the process is referred to as unified indexing, federating content, or information integration, each reflects a similar focus among the vendors I took time to engage with at the conference. Each vendor is positioned to solve different information retrieval problems, and each was selected to underscore what I have tried to express in my recent Gilbane Beacon, Establishing a Successful Enterprise Search Program: Five Best Practices, namely that a strategic business need must be established first. The best practices include understanding how existing technologies and content structures function in the enterprise before settling on any one product or strategy, and the essential step of conducting a proof of concept (POC) or pilot project to confirm a product’s suitability for the targeted business challenge.

These products, in alphabetic order, are all notable for their unique solutions tailored to different audiences of users and business requirements. All embody an approach to unifying enterprise content for a particular business function:

Access Innovations (AI) was at KMWorld to demonstrate the aptly named product suite, Data Harmony. AI’s products cover a continuum of tools to build and maintain controlled vocabularies (AKA taxonomies and thesauri), add content metadata through processes tightly integrated with the corresponding vocabularies, and support search and navigation. Its vocabulary and content management tools can be layered to integrate with existing CMS and enterprise search systems.

Attivio, a company providing a platform solution known as Active Intelligence Engine (AIE), has developers specializing in open source tools for content retrieval, with excellent retrieval results as the end point. AIE is a platform for enterprises seeking to unify structured and unstructured content across the enterprise and from the web. By leveraging open source components, Attivio provides its customers with a platform that can be developed to enhance search for a particular solution, including bringing Web 2.0 social content into unity with enterprise content for further business intelligence analysis.
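To make the unified-indexing idea a bit more tangible, here is a deliberately tiny sketch of an inverted index that accepts both unstructured text and structured records under one roof. This is a conceptual toy, not Attivio’s AIE or any vendor’s API; real platforms add field-aware ranking, security trimming, connectors, and much more.

```python
from collections import defaultdict

class UnifiedIndex:
    """Toy inverted index holding unstructured text and structured records together."""

    def __init__(self):
        self.postings = defaultdict(set)  # term -> set of doc ids
        self.docs = {}

    def add_document(self, doc_id, text, fields=None):
        """Index free text plus any structured field values under one doc id."""
        self.docs[doc_id] = {"text": text, "fields": fields or {}}
        terms = text.lower().split()
        for value in (fields or {}).values():
            terms.extend(str(value).lower().split())
        for term in terms:
            self.postings[term].add(doc_id)

    def search(self, query):
        """Return doc ids containing every query term, regardless of source."""
        sets = [self.postings.get(t, set()) for t in query.lower().split()]
        return set.intersection(*sets) if sets else set()

if __name__ == "__main__":
    idx = UnifiedIndex()
    # Unstructured: an intranet page; structured: a CRM row.
    idx.add_document("wiki-42", "Quarterly search program review notes")
    idx.add_document("crm-7", "", fields={"customer": "Acme", "status": "renewal review"})
    print(idx.search("review"))  # both 'wiki-42' and 'crm-7' surface in one query
```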

Since I was first introduced to Coveo in 2007, the company has steadily marched into a dominant position across vertical industries with its efficiently packaged and reasonably priced enterprise search solutions. Their customers are always enthusiastic presenters at KMWorld, representing a population of implementers who seek to make enterprise search available to users quickly and with a minimum of fuss. This year, Shelley Norton from Children’s Hospital Boston did not disappoint; she ticked off the steps in an efficient selection, implementation, and deployment process for getting enterprise search up and running smoothly to deliver trustworthy and accurate results to the hospital’s constituents. I always value and respect customer storytelling.

Darwin Awareness Engine was named the KMWorld Promise Award Winner for 2010. Since their founder is local to our home base and a frequent participant in the Boston KM Forum (KMF) meetings, we are pretty happy for their official arrival on the scene and the recognition. It was just a year ago that they presented the prototype at the KMF. Our members were excited to see the tool exposing layers of news feeds to home in on topics of interest and to see what was aggregated and connected in really “real-time.” Darwin’s content presentation is unique in that the display reveals relationships and patterns among topics in the Web 2.0 sphere that become suddenly apparent through their visual connections in the display architecture. The public views are only an example of what a very large enterprise might reveal about its own internal communications through social tools within the organization.

The newest arrival, RAMP, was introduced to me by Nate Treloar in the closing hours of KMWorld. Nate came to this start-up from Microsoft and the FAST group and is excited about the new venture. Neither exhibiting nor presenting, Nate was eager to reach out to analysts and potential partners to share the RAMP vision for converting speech from audio and video feeds into reliable, searchable text. This would enable audio, video, and other content to finally be unified and searched by its “full text” on the Web in a single pass. Today, we depend on explicit metadata supplied by the contributors of non-text content. Having long awaited excellence in speech-to-text indexing for search, I was “all ears” during our conversation and look forward to seeing more of RAMP at future meetings.
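The idea is easy to sketch, even if the hard part is the speech recognition itself. In the toy example below, a transcript produced by a hypothetical transcribe_audio stand-in (not RAMP’s API or any real engine) is folded into the same corpus as ordinary text documents, so a plain text search can surface the audio file too.

```python
def transcribe_audio(path):
    """Hypothetical stand-in for a speech-to-text service; returns transcript text."""
    # A real pipeline would call a speech recognition engine here.
    return "panel discussion on enterprise search and knowledge management"

def build_text_corpus(text_docs, media_files):
    """Fold transcripts of audio/video into the same searchable corpus as text."""
    corpus = dict(text_docs)
    for path in media_files:
        corpus[path] = transcribe_audio(path)
    return corpus

corpus = build_text_corpus(
    {"press-release.txt": "Kindle for the Web announced"},
    ["kmworld-keynote.mp3"],  # hypothetical media file name
)
hits = [doc for doc, text in corpus.items() if "enterprise search" in text]
print(hits)  # the audio file now surfaces in a plain text search
```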

Whatever the strategic business need, the ability to deliver a view of information that is unified, cohesive and contextually understandable will be a winning outcome. With the Beacon as a checklist for your decision process, information integration is attainable by making the right software selection for your enterprise application.

Google eBooks Shows Up, with Amazon for the Web on Its Heels

Google eBooks made its awaited debut this past Monday, and the very next day Amazon presented its own related news.  The upshot, basically, is that ebook formats are now more widely applicable across more devices, be they ereaders, smartphones, or PCs (the Web).

While the news is probably more important in terms of expanding publishing channels for ebooks, the trend is a positive one for book publishers struggling with ebook formatting issues. Unfortunately, in some ways, Google and Amazon’s newest announcements will further confuse the already confusing state of ebook formatting. See Ebook Formats are Starting to Make More Sense… So be Prepared to Remain Confused during the Evolution, in the Gilbane Publishing Practice blog, for more.

Ebook Formats are Starting to Make More Sense… So be Prepared to Remain Confused during the Evolution

Google Editions, now Google eBooks, has finally shown up, and as was widely anticipated, really will be an ebook game-changer. Technically speaking, Google eBooks is not revolutionary, but the promised transparency of ebook formats across a variety of reading environments (Sony and Barnes and Noble ereaders, desktops and notebooks, and various smartphone devices) is a comfort for both publishers and readers. There have been a number of other efforts—some, perhaps, more vapor than real—that promise much the same thing. Baker & Taylor’s Blio, for example, offers many similar benefits.  PDF and ePub, too, claim a degree of trans-device applicability, and then there is the “apps” approach.

And now, again, there is Amazon. Right on the heels of Google’s announcement of Google eBooks, Amazon has followed the tried-and-true marketing playbook maneuver—also of value to yacht racing, I’m told—of taking the wind out of an opponent’s sails. By expanding Kindle for the Web, Amazon now “…enable[s] anyone with access to a web browser to buy and read full Kindle books—no download or installation required,” according to their December 7, 2010 press release. Amazon’s proprietary .AZW format, with its new, wider platform use, is combating the Google eBook promise of making ebooks easier to use “from the cloud.” Most likely, what Amazon is really trying to do is maintain its position relative to the marketplace’s desire to make ebook buying easier. Here’s Amazon’s positioning, from the aforementioned press release, on the matter:

“Kindle for the Web makes it possible for bookstores, authors, retailers, bloggers or other website owners to offer Kindle books on their websites and earn affiliate fees for doing so,” said Russ Grandinetti, Vice President, Kindle Content. “Anyone with access to a web browser can discover the seamless and consistent experience that comes with Kindle books. Kindle books can be read on the $139 third-generation Kindle device with new high-contrast Pearl e-Ink, on iPads, iPod touches, iPhones, Macs, PCs, BlackBerrys and Android-based devices. And now, anywhere you have a web browser. Your reading library, last page read, bookmarks, notes, and highlights are always available to you no matter where you bought your Kindle books or how you choose to read them.”

You could easily swap Google for Amazon, plus one or two particular details, and be reading the Google press release from the day before. Google’s challenge will be to make buying ebooks as easy as Amazon does, and neither the history of Google’s generally clunky interfaces nor the first iteration of Google eBooks’ own web site convinces me that this will be a slam-dunk for Google.

Still, it is easy to see why publishers are happy with Google’s entry into ebooks, since it helps further shift selling options for ebooks away from the Amazon-centric model. As much as the ebook market—especially for trade publishers—has taken off in large part because Amazon got out in front and put real money up to make the market happen, more sales options and fewer format-specific constraints for readers will help everyone.

You can rest assured that book publishers will continue to struggle to get ebook formats right for some time to come, despite the generous hype in these recent announcements. Kindle format limitations will still force publishers to balance quality and production costs, and ePub—heading for its 3.0 version sometime—will still hang up on the differing feature sets of different ereader devices.

That is just one reason why Outsell’s Gilbane Group Publishing Practice is developing a second study, following on the heels of A Blueprint for Book Publishing Transformation: Seven Essential Systems to Re-Invent Publishing. Ebooks, Apps, and Formats: The Practical Issues will look at the topic of content formats, because this remains a central concern for book publishers pursuing digital publishing programs and involves a wide-ranging set of issues from editorial and production to distribution and ecommerce.

With Google eBooks and Amazon’s Kindle for the Web, ebook formats and their marketplace continue to evolve. It sure ain’t “push-button” yet, however.

Stay tuned. Drop a note.

Content and the Next-Generation Portal Experience

Last week I was pleased to have my second paper published here at Gilbane, “Content and the Next-Generation Portal Experience,” which you can now register for and download (for free) from the Beacon area of our website.

For many organizations, access to back office services is becoming an essential part of the experience they need to provide their website visitors. Their external websites form the front line of customer service, and their intranets play a vital role in employee engagement, as expectations rise for both audiences about what they can do over the web. In the paper I discuss how a portal infrastructure can be a natural fit for providing this blend of relevant services and content, and how there is an opportunity for organizations to shift their portal infrastructure from internal workhorse to a contemporary services interface.

The downside, as many organizations have discovered, is that a portal implementation can come at the cost of the primary fuel of web engagement: good quality, fresh, relevant content. In the paper I look at the reasons behind that and suggest a possible solution: adding a contemporary web content management system.

Like any enterprise integration, the fusing together of a portal platform and a WCM has its own risks, principally that the resulting solution does nothing to improve the lot of the content author, since it has the potential to expose these business users to multiple interfaces and complex processes. In the paper I go on to look at how to avoid and mitigate these risks, with advice on some key attributes organizations should look for when selecting the WCM system.

I hope you enjoy the paper and I’d very much like to hear your feedback – either here or you can find me on Twitter (@iantruscott)

 

The paper is now available from the Beacon area of our website and from e-Spirit, who sponsored the paper. You can also register for a webinar that e-Spirit will be hosting on 10th February 2011 during which I will be talking through the main points of the paper.  

 

Google Grabs Aardvark Social Search

Aardvark, a social media search engine, has announced that it has been acquired by Google. Aardvark is now a tool available in Google Labs, and will remain free of cost and fully functional. Aardvark’s defining characteristic as a search engine is that once a user’s question has been input, it searches that user’s social network and attempts to identify a connection who could best answer the question. Under Google Labs, Aardvark is expected to be further developed. http://vark.com/
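As a rough illustration of that routing idea—and emphatically not Aardvark’s actual algorithm—here is a toy sketch that picks the contact whose declared expertise tags best overlap a question. Real systems also weigh availability, strength of the social tie, and past answer quality; the names and tags below are made up.

```python
import string

def best_contact(question, contacts):
    """Pick the contact whose expertise tags overlap the question the most.
    A toy stand-in for social question routing."""
    words = set(
        question.lower().translate(str.maketrans("", "", string.punctuation)).split()
    )
    scored = [
        (len(words & {tag.lower() for tag in tags}), name)
        for name, tags in contacts.items()
    ]
    score, name = max(scored)
    return name if score > 0 else None  # None when no one matches at all

contacts = {
    "Alice": ["python", "search", "indexing"],
    "Bob": ["cycling", "travel", "tokyo"],
}
print(best_contact("What is the best indexing approach for search?", contacts))  # -> Alice
```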

Coherence and Augmentation: KM-Search Connection

This space is not normally used to comment on knowledge management (KM), one of my areas of consulting, but a recent conference gives me an opening to connect the dots between KM and search. Dave Snowden and Tom Stewart always have worthy commentary on KM and as keynote speakers they did not disappoint at KMWorld. It may seem a stretch but by taking a few of their thoughts out of context, I can synthesize a relationship between KM and search.

KMWorld, Enterprise Search Summit, SharePoint Symposium, and Taxonomy Boot Camp moved to Washington, D.C. for the 2010 fall conference earlier this month. I attended to teach a workshop on building a semantic platform, and to participate in a panel discussion wrapping up the conference with two other analysts, Leslie Owen and Tony Byrne, with Jane Dysart moderating.

Comments from the first and last keynote speakers of the conference inspired my final panel comments, counseling attendees to lead by thoughtfully leveraging technology only to enhance knowledge. But there were other snippets that prompt me to link search and KM.

Tom Stewart’s talk was entitled Knowledge Driven Enterprises: Strategies & Future Focus, which he couched in the context of achieving a “coherent” winning organization. He explained that reaching the coherence destination requires understanding the different types of knowledge and how we need to behave to attain each type (e.g., “knowable complicated” knowledge calls for experts and research; “emergent complex” knowledge calls for leadership and “sense-making”).

Stewart describes successful organizations as those in which “the opportunities outside line up with the capabilities inside.” He explains that those “companies who do manage to reestablish focus around an aligned set of key capabilities” use their “intellectual capital” to identify their intangible assets: human capability, structural capital, and customer capital. They build relationship capital from among these capabilities to create a coherent company. Although Stewart does not mention “search,” it is important to note that one means of identifying intangible assets is well-executed enterprise search with associated analytical tools.

Dave Snowden also referenced “coherence” (messy coherence), even as he spoke about how failures tend to be more teachable (memorable) than successes. If you follow Snowden, you know that he founded Cognitive Edge and has developed a model for applying cognitive learning to help build resilient organizations. He has taught complexity analysis and sense-making for many years, and his interest in human learning behaviors is deep.

To follow the entire thread of Snowden’s presentation on “The Resilient Organization,” follow this link. I was particularly impressed with his statement about the talk, “one of the most heart-felt I have given in recent years.” It was one of his best, but two particular comments bring me to the connection between KM and search.

Dave talked about technology as “cognitive augmentation,” its only truly useful function. He also puts forth what he calls the “three Golden rules: Use of distributed cognition, wisdom but not foolishness of crowds; finely grained objects, information and organizational; and disintermediation, putting decision makers in direct contact with raw data.”

Taking these fragments of Snowden’s talk, a technique he seems to encourage, I put forth a synthesized view of how knowledge and search technologies need to be married for consequential gain.

We live and work in a highly chaotic information soup, one in which we are fed a steady diet of fragments (links, tweets, analyzed content) from which we are challenged as thinkers to derive coherence. The best knowledge practitioners will leverage this messiness by detecting weak signals and seeking out more fragments, coupling them thoughtfully with “raw data” to synthesize new innovations, whether they be practices, inventions, or policies. Managing shifting technologies, changing information inputs, and learning from failures (our own, our institution’s, and others’) contributes to building a resilient organization.

So where does “search” come in? Search is a human operation and begins with the workforce. Going back to Stewart, who commented on the need to recognize different kinds of knowledge, I posit that different kinds of knowledge demand different kinds of search. This is precisely what so many “enterprise search” initiatives fail to deliver: implementers fail to account for all the different kinds of search—search for facts, search for expertise, search for specific artifacts, search for trends, search for missing data, and so on.

When Dave Snowden states that “all of your workforce is a human scanner,” this could also imply the need for multiple, co-occurring search initiatives. Just as each workforce member brings a different perspective and capability to sensory information gathering, so too must enterprise search be set up to accommodate all the different kinds of knowledge gathering. And when Snowden notes that “There are limits to semantic technologies: Language is constantly changing so there is a requirement for constant tuning to sustain the same level of good results,” he is reminding us that technology is only good for cognitive augmentation. Technology is not “plug ’n play”; you cannot simply install it and reap magical cognitive insights. It requires constant tuning to adapt to new kinds of knowledge.

The point is one I have made before: it is the human connection, the human scanner, and human understanding of all the kinds of knowledge we need that bring coherence to an organization. The better we balance these human capabilities, the more resilient we will be, and the better skilled at figuring out what kinds of search technologies really make sense for today. And tomorrow we had better be ready for another tool, for new fragments and new knowledge synthesis.


