Curated for content, computing, and digital experience professionals

Year: 2010

Content and the Next-Generation Portal Experience

Last week I was pleased to have my second paper published here at Gilbane, “Content and the Next-Generation Portal Experience,” which you can now register for and download (for free) from the Beacon area of our website.

For many organizations, access to back office services is becoming an essential part of the experience they need to provide their website visitors. Their external websites form the front line of customer service, and their intranets play a vital role in employee engagement, as both audiences’ expectations of what they can do over the web rise. In the paper I discuss how a portal infrastructure can be a natural fit for providing this blend of relevant services and content, and how organizations have an opportunity to shift their portal infrastructure from internal workhorse to contemporary services interface.

The downside, as many organizations have discovered, is that a portal implementation can come at the cost of the primary fuel of web engagement – good quality, fresh, relevant content. In the paper I look at the reasons behind that and suggest a possible solution: adding a contemporary web content management (WCM) system.

Like any enterprise integration, the fusing together of a portal platform and a WCM has its own risks, principally that the resulting solution does nothing to improve the lot of the content author, since it has the potential to expose these business users to multiple interfaces and complex processes. In the paper I go on to look at how to avoid and mitigate these risks, with advice on some key attributes organizations should look for when selecting a WCM system.

I hope you enjoy the paper, and I’d very much like to hear your feedback – either here or on Twitter (@iantruscott).


The paper is now available from the Beacon area of our website and from e-Spirit, who sponsored it. You can also register for a webinar that e-Spirit will host on 10th February 2011, during which I will talk through the main points of the paper.


Google Grabs Aardvark Social Search

Aardvark, a social media search engine, has announced that it has been acquired by Google. Aardvark is now a tool available in Google Labs, and will remain free of cost and fully functional. Aardvark’s defining characteristic as a search engine is that once a user submits a question, it searches that user’s social network and attempts to identify a connection who could best answer the question. Under Google Labs, Aardvark is expected to be further developed. http://vark.com/

Coherence and Augmentation: KM-Search Connection

This space is not normally used to comment on knowledge management (KM), one of my areas of consulting, but a recent conference gives me an opening to connect the dots between KM and search. Dave Snowden and Tom Stewart always have worthy commentary on KM, and as keynote speakers they did not disappoint at KMWorld. It may seem a stretch, but by taking a few of their thoughts out of context, I can synthesize a relationship between KM and search.

KMWorld, Enterprise Search Summit, SharePoint Symposium and Taxonomy Boot Camp moved to Washington D.C. for the 2010 Fall Conference earlier this month. I attended to teach a workshop on building a semantic platform, and to participate in a panel discussion to wrap up the conference with two other analysts, Leslie Owen and Tony Byrne with Jane Dysart moderating.

Comments from the first and last keynote speakers of the conference inspired my final panel comments, in which I counseled attendees to lead thoughtfully, leveraging technology only to enhance knowledge. But there were other snippets that prompt me to link search and KM.

Tom Stewart’s talk, entitled “Knowledge Driven Enterprises: Strategies & Future Focus,” was couched in the context of achieving a “coherent” winning organization. He explained that reaching the coherence destination requires an understanding of the different types of knowledge and how we need to behave to attain each type (e.g., “knowable complicated” knowledge calls for experts and research; “emergent complex” knowledge calls for leadership and “sense-making”).

Stewart describes successful organizations as those in which “the opportunities outside line up with the capabilities inside.” He explains that those “companies who do manage to reestablish focus around an aligned set of key capabilities” use their intellectual capital to identify their intangible assets: human capability, structural capital, and customer capital. They build relationship capital from among these capabilities to create a coherent company. Although Stewart does not mention “search,” it is important to note that one means of identifying intangible assets is well-executed enterprise search with associated analytical tools.

Dave Snowden also referenced “coherence” (“messy coherence”), even as he spoke about how failures tend to be more teachable (memorable) than successes. If you follow Snowden, you know that he founded Cognitive Edge and has developed a model for applying cognitive learning to help build resilient organizations. He has taught complexity analysis and sense-making for many years, and his interest in human learning behaviors is deep.

To follow the entire thread of Snowden’s presentation on “The Resilient Organization,” follow this link. I was particularly impressed with his statement about the talk: “one of the most heart-felt I have given in recent years.” It was one of his best, but two particular comments bring me to the connection between KM and search.

Dave talked about technology as “cognitive augmentation,” its only truly useful function. He also put forth what he calls the three golden rules: “Use of distributed cognition, wisdom but not foolishness of crowds; finely grained objects, information and organizational; and disintermediation, putting decision makers in direct contact with raw data.”

Taking these fragments of Snowden’s talk, a technique he seems to encourage, I put forth a synthesized view of how knowledge and search technologies need to be married for consequential gain.

We live and work in a highly chaotic information soup, one in which we are fed a steady diet of fragments (links, tweets, analyzed content) from which we are challenged as thinkers to derive coherence. The best knowledge practitioners will leverage this messiness by detecting weak signals and seek out more fragments, coupling them thoughtfully with “raw data” to synthesize new innovations, whether they be practices, inventions or policies. Managing shifting technologies, changing information inputs, and learning from failures (our own, our institution’s and others) contributes to building a resilient organization.

So where does “search” come in? Search is a human operation and begins with the workforce. Going back to Stewart, who commented on the need to recognize different kinds of knowledge, I posit that different kinds of knowledge demand different kinds of search. This is precisely what so many “enterprise search” initiatives fail to deliver. Implementers fail to account for all the different kinds of search: search for facts, search for expertise, search for specific artifacts, search for trends, search for missing data, and so on.

When Dave Snowden states that “all of your workforce is a human scanner,” this could also imply the need for multiple, co-occurring search initiatives. Just as each workforce member brings a different perspective and capability to sensory information gathering, so too must enterprise search be set up to accommodate all the different kinds of knowledge gathering. And when Snowden notes that “there are limits to semantic technologies: Language is constantly changing so there is a requirement for constant tuning to sustain the same level of good results,” he is reminding us that technology is only good for cognitive augmentation. Technology is not “plug ’n play”: you cannot simply install it and reap magical cognitive insights. It requires constant tuning to adapt to new kinds of knowledge.

The point is one I have made before: it is the human connection, the human scanner, and human understanding of all the kinds of knowledge we need that bring coherence to an organization. The better we balance these human capabilities, the more resilient we will be, and the better skilled at figuring out which kinds of search technologies really make sense for today. And tomorrow we had better be ready for another tool, new fragments, and new knowledge synthesis.

What is Your Ebook Format?

Bill Trippe and I were speaking with someone at a mid-sized education publisher the other day, and we heard a well-informed, articulate series of complaints about the Kindle format. The frustration behind these comments was palpable: Kindle is where so much of the early and encouraging growth of the ebook market has happened, but the E-ink display and .AZW format are really not good for any content beyond straightforward text constrained within a narrative paragraph structure. While such a constraint works fine for many titles, any publisher producing educational, professional, STM, or other even moderately complex content has to compromise far too much.

Book publishers still not committing to the ebook market certainly like the news of its potential—Forrester’s James McQuivey, projecting that the ebook market will hit $3 billion in sales sometime soon, is perhaps the latest word—but these same book publishers, who are, after all, the ones having to do the work, find themselves wondering if they can get there from here. Hannah Johnson, at Publishing Perspectives, posted a blog titled “Forrester’s James McQuivey Says Digital Publishing is About Economics, Not Format.” It covers McQuivey’s projection of ebook sales growth and his emphasis on economics, not formats, when assessing ebooks’ future.

McQuivey’s point is right, of course, although it isn’t a startling conclusion, but one more on a par with pointing out that, for print books, it matters very little whether a title comes out in hard cover or paperback, or in one trim size over another. Still, amid today’s ebook hysteria, it remains valuable to point out this sensible perspective.

In book publishing, the main consideration is producing a book that is of strong interest to readers, while also making sure that those readers can get their hands on the title in ways that produce sufficient monetary gain for the publisher. The only reason ebook formats are such a concern at the moment is that the question of formats is a new one that book publishers are still struggling to figure out, even while the marketplace for any and all such formats remains nascent.

The Gilbane Group has been in the business of helping companies with all kinds of content—including publishers of many stripes—more effectively manage their content and get it to those who need it, at the right time, in the right form. Our recent 277-page study, A Blueprint for Book Publishing Transformation: Seven Essential Processes to Re-invent Publishing (which is free for download, by the way), discusses the issue of ebook formats and makes the point that book publishers need to move toward a digital—and, preferably, XML-based—workflow as early in the editorial and production process as possible, so that all the formats the publisher may want to produce now and in the years ahead can be realized much more efficiently and much less expensively. One section in our industry forecast chapter is titled, “Ebook Reader Devices in Flux, But so What?”
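
The single-source idea behind an XML-based workflow can be sketched in a few lines: one XML master is transformed into as many output renditions as the publisher needs. This is only an illustrative sketch; the `<book>`/`<chapter>`/`<para>` vocabulary here is hypothetical, standing in for a real markup standard such as DocBook or DITA.

```python
# Minimal single-source sketch: one XML master, multiple output formats.
# The <book>/<chapter>/<para> markup is a hypothetical stand-in for a
# real vocabulary (DocBook, DITA, etc.).
import xml.etree.ElementTree as ET

BOOK_XML = """
<book title="Example Title">
  <chapter title="Chapter One">
    <para>Opening paragraph.</para>
  </chapter>
</book>
"""

def to_html(root):
    """Render the XML master as simple HTML."""
    parts = [f"<h1>{root.get('title')}</h1>"]
    for ch in root.findall("chapter"):
        parts.append(f"<h2>{ch.get('title')}</h2>")
        parts.extend(f"<p>{p.text}</p>" for p in ch.findall("para"))
    return "\n".join(parts)

def to_plain_text(root):
    """Render the same master as plain text for a minimal-format device."""
    parts = [root.get("title")]
    for ch in root.findall("chapter"):
        parts.append(ch.get("title"))
        parts.extend(p.text for p in ch.findall("para"))
    return "\n\n".join(parts)

root = ET.fromstring(BOOK_XML)
html = to_html(root)        # one rendition...
text = to_plain_text(root)  # ...and another, from the same source
```

The point is the shape of the workflow, not the code: each new target format is one more renderer over the same master, rather than a separate hand-maintained edition.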

But good strategic planning in book publishing doesn’t necessarily resolve each and every particular requirement for market success, and given the confused state of ebook formats and their varying production demands, we’re developing our next study to drill down on this very issue. Working title: Ebooks, Apps, and Formats: The Practical Issues.

Stay tuned.  Drop a line.


Publishers are the Masters of Publishing

As absurd as the title of this entry sounds, there is a point to it, especially when you consider all the theories being bandied about regarding the consequences of book publishing’s encounter with ebooks specifically, and digital publishing generally. The sheer range of such theories is impressive, from the “print is dead” silliness (unless, perhaps, you are casting well into the future) to much more reasonable suppositions that book publishers as they are today may be in danger of disintermediation tomorrow (or, rather, 5, 10, or more years down the road), as digital technologies may engender significant shifts in the supply and value chains presently in place. Plenty of compelling evidence that real alternatives will exist can be found in our newest study on book publishing and ebooks, A Blueprint for Book Publishing Transformation: Seven Essential Processes to Re-invent Publishing, now available for free download. One such statistic comes from Bowker’s analysis of the book industry, which reports that the number of new fiction titles being traditionally published dropped significantly from 2008 to 2009, while the overall number of book titles published, including fiction, exploded due to non-traditional publishing efforts such as “self-publishing” and ebooks.

Don’t write off book publishers yet, however. We see that book publishers across all segments—from trade to educational, STM, and professional—have been making good progress, especially in approaching digital workflow as a necessary process improvement, including the use of XML for content creation and management within such workflows. While the industry as a whole still can’t be thought of as fast moving (after all, many book publishers still take a year or two to get a signed book into the hands of readers), speculation that these “dinosaurs” are doomed is simply unsupportable.

Blueprint provides an in-depth look at publisher responses to digital mandates and identifies winning strategies for ebook technologies, processes, and systems. One of the sponsors of this multi-sponsor study is the Book Industry Study Group (BISG), and in a letter to its members about the recently published Blueprint, BISG’s Executive Director, Scott Lubeck, writes:

We all hear a great deal about change in our industry, but very little on how to accomplish this in a constructive way. The key to managing change is not mastering technology—however important that may seem to be—but rather in mastering process.  If you don’t understand the processes that underlie and drive your businesses you can’t change them let alone improve them, you can only watch them collide, and in the worst cases implode, as new opportunities emerge or new competencies are required.

It is the very fact that book publishing entails many processes that places the industry in the captain’s chair, even as self-publishing has its role to play, with more and more services available to self-publishers that reflect the wide range of processes (think promotion and marketing, for one) involved in book publishing. Book publishers know their business processes, and there is little that is simple about most publishing processes. Lubeck writes of one element of publishing that is key to improving many publishing processes:

Good process and process awareness produce enormous value in the book industry value chain. The most salient example to my mind is metadata. Metadata is not one thing: there are bibliographic metadata, production metadata, marketing metadata, product metadata, just for starters; and all metadata maps to the core publishing processes that produce it. If you want to improve metadata—and you better had in order to succeed in the digital world—you have to understand and improve these processes. The technologies follow.
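
Lubeck’s observation that each category of metadata maps to the process that produces it can be made concrete with a small sketch. The category names, owning processes, and field names below are hypothetical illustrations, not any industry schema:

```python
# Illustrative sketch: each metadata category is produced and maintained by
# a different publishing process. Categories, processes, and fields here are
# hypothetical examples, not a real standard such as ONIX.
from dataclasses import dataclass, field

@dataclass
class MetadataRecord:
    category: str          # e.g. "bibliographic", "production", "marketing"
    owning_process: str    # the process that produces and maintains it
    fields: dict = field(default_factory=dict)

title_metadata = [
    MetadataRecord("bibliographic", "editorial",
                   {"isbn": "978-0-000-00000-0", "title": "Example Title"}),
    MetadataRecord("production", "production",
                   {"trim_size": "6x9", "page_count": 277}),
    MetadataRecord("marketing", "marketing",
                   {"bisac_codes": ["COM000000"], "audience": "professional"}),
]

# Improving a metadata category means improving the process that owns it,
# not just patching the records after the fact.
by_process = {m.owning_process: m.category for m in title_metadata}
```

The design point mirrors the quote: because each record carries its owning process, a metadata quality problem can be traced back to the process responsible for it, which is where the fix belongs.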

Now, I’m not going to suggest that book publishing as a whole—and more so in some segments than others—has no significant challenges. Many book publishing companies carry high debt loads, and overall margins can be modest. And long-established industries—think music recording—can all too easily be their own worst enemies, refusing to respond to changing market realities, and there’s no guarantee that book publishing won’t be equally stupid.

But it looks good so far, perhaps because publishing has long struggled with debt and margins and has been desperate to reduce costs and increase revenue. Digital technologies, when applied in service of sound publishing processes, serve these ends quite effectively.

Let us know what you think. Leave a comment. Drop a line.
