The Gilbane Advisor

Curated for content, computing, data, information, and digital experience professionals

We Are Smarter Than Me – Report

Last fall, Martin Clifford, CEO of the web community juggernaut Wis.dm, informed me that I was hopelessly out of date regarding the phenomenon of web communities and hinted that, due to my advanced years, I might never comprehend the impact of many-to-many publishing. It’s true that most of my experience is in traditional forms of one-to-many publishing. However, I’ve always loved a good challenge, so I began exploring the role of communities in the creation of content. Early in my explorations, I came across the We Are Smarter Than Me project, a joint effort of Pearson Educational Publishing, Wharton, MIT, and Shared Insights. The goal was to form a community that would write a book about how communities could change and enhance the way that companies do business. I tuned into the “Buzz” to get a sense of the passion of the participants, and then I joined the community and contributed a small section on the importance of word-of-mouth in the marketing of services. I followed the project’s progress and waited eagerly to see what would happen when the many-to-many model was invoked to produce a traditional business book.

To hear firsthand accounts of the project, I traveled to the Community 2.0 conference in Las Vegas. Barry Libert of Shared Insights and Tim Moore of Pearson Educational Publishing presented a fascinating progress report, and a conversation with co-founder Jon Spector (soon to be CEO of the Conference Board) filled in some additional details.

The participants are to be congratulated for commissioning the project as a pure experiment. As Mr. Moore said, “I just wanted to see what would happen.” As one might imagine, the interaction between web communities and large, esteemed institutions presented some interesting challenges. Not surprisingly, the first significant issue arose when Pearson faxed their contract to Shared Insights. While the contract was entirely appropriate for traditional author teams, indemnification clauses took on entirely new meaning when the work of hundreds or thousands of author/contributors would be scrutinized. The prolonged wrangling broke the project’s early momentum. It was assumed that the Academic Dream Team of Pearson’s business authors and the faculties of Wharton and MIT would produce numerous thoughtfully written content modules. Surprisingly, none of the authors or professors chose to participate in the project. The project team reverted to Plan B by sending participation invitations to a large list of people affiliated with the sponsoring institutions. The response was enthusiastic and the community began to grow. Current membership is approximately 3,500, with 650 individual wiki posts.

As active participation increased, the project team learned another important lesson. Suddenly the community wanted to take over the project leadership and asked the project team to step aside. Even though the project team knew a lot about community dynamics, they weren’t ready for their own community to be so assertive and found it difficult to relinquish control. When they did step back, the community flourished.

How did the book by community turn out? One speaker reported that the journey was more interesting than the destination, meaning that the content created was plentiful but uneven in quality and style. To yield an acceptable business book, it would be necessary to hire an accomplished professional author who would also handle the fact-checking process.

Here are the open questions and lessons learned from this project:

  1. Why didn’t the authors and professors participate?
    Possible explanations included:
    Generation Gap- Authors and profs didn’t grow up with MySpace or Facebook. Web communities are foreign to their professional milieu.
    Status Issues- They are used to being the authority and weren’t willing to have their writings publicly challenged. And they have already made their reputations, so they have little status to gain.
    No Financial Benefit- Their time is very valuable and they expect to be paid for their efforts.
    Lack of Passion or Connection with the Project- Community participation is not their avocation nor were they passionate about the topic.
    Those who did participate did so out of a passion for the topic and seemed most motivated by the opportunity to build their reputation within the community. For many members, community participation is one of their hobbies, and they seemed not to desire any remuneration for their contributions.
    Observation- Just like in the early days of the Internet, there is currently more cachet attached to eyeballs and recognition than to traditional financial rewards. However, there are significant costs to forming, hosting, and moderating communities, and the work of communities can be very valuable to companies of all sorts. New business models are emerging that will manage the costs and reflect the value of the contributions.
  2. Given the uneven content and need to bring in a professional author, should anyone even try to write another book by committee?
    It depends on the type of book! Wikipedia has demonstrated that this model is very effective in creating a comprehensive reference work. (I suppose that some purists would argue that Wikipedia isn’t really a book but rather a collection of content modules.) For traditional authored book projects, communities might play a valuable role in helping authors research topics that are outside their primary expertise and in reviewing the author’s work for accuracy and clarity.
  3. Will there be instances where community created content modules will compete with traditional published works?
    Given the Google world that we live in, consumers of information often seek a terse answer to a specific question. And there is a definite trend toward the integration of content with information consumers’ workflows. For these information consumers, a well-structured repository of content modules is potentially more valuable than traditional books.
  4. So was the project aiming at the wrong goal?
    Perhaps! Old habits die hard, and many people in my generation have books to thank for a lot of their professional knowledge. Maybe the goal of the project should have been to develop an outstanding repository of content modules and resources that could become an authoritative source of information about communities and their role in changing and enhancing the ways that companies do business. In the long run, the mission-critical task is creating outstanding intellectual property. Creating multiple media versions of that IP will allow publishers to reach a wider range of customers.
  5. Will the many-to-many content model put traditional publishers out of business?
    There is much more opportunity than risk for publishers.
    Most of us would agree that we already suffer from information overload. Communities have the potential to raise that overload to an even higher level. Information consumers want to know that the content they are reading is accurate and authoritative. This has been the primary domain of publishers for many years. If publishers find new ways to harness the wisdom of crowds in creating new content and improving existing content, their future is bright. If not, someone else will seize the opportunity. And if they trivialize new methods of content creation as being less pure and authoritative than their time-tested editorial processes, they will face serious consequences. If you’re not convinced, just ask your favorite encyclopedia publisher!!

That is my report on many-to-many versus one-to-many content creation models. Now I’m trying to figure out whether the few-to-few model refers to custom publishing or to an underperforming web community.

It’s the Process, Not the Words: Autodesk Case Study

Leonor summed up Gilbane’s perspective on the real challenge in content globalization in her entry of January 19:

We’ve found that the problem for organizations is less about the act of translation itself, and more about aligning the business processes that support it.

The hard part of globalization isn’t translating one phrase into another. The core problem is the inefficiency in how we do the translating and in how we move words from creation to consumption by their target audience.

Our latest Content Technology Works case study describes how Autodesk, a major software company with worldwide sales of US $1.6 billion, recognized that better processes and higher levels of automation are the critical elements of a scalable globalization strategy. More words in translation memory were an important outcome of its initiatives, but the real benefit to Autodesk is greater competitive advantage as a worldwide software company.
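
Translation memory, mentioned above, is a simple idea that rewards good process: segments translated once are stored and reused, so only genuinely new text goes out for (paid) human translation. Below is a minimal, generic sketch of that lookup in TypeScript; it is an illustration of the concept under assumed data shapes, not Autodesk’s actual tooling.

```typescript
// Minimal sketch of what "words in translation memory" buys you: segments
// translated once are reused instead of being sent out for translation again.
// Generic illustration only; not Autodesk's actual tooling.

type Locale = string;

// Keyed by target locale, then by exact source segment.
const translationMemory = new Map<Locale, Map<string, string>>();

function rememberTranslation(locale: Locale, source: string, target: string): void {
  if (!translationMemory.has(locale)) {
    translationMemory.set(locale, new Map());
  }
  translationMemory.get(locale)!.set(source, target);
}

// Split segments into reusable hits and misses that still need a translator.
function applyTranslationMemory(
  locale: Locale,
  segments: string[],
): { reused: Map<string, string>; needsTranslation: string[] } {
  const memory = translationMemory.get(locale) ?? new Map<string, string>();
  const reused = new Map<string, string>();
  const needsTranslation: string[] = [];
  for (const segment of segments) {
    const hit = memory.get(segment);
    if (hit !== undefined) {
      reused.set(segment, hit);
    } else {
      needsTranslation.push(segment);
    }
  }
  return { reused, needsTranslation };
}

// Example: only the second (new) segment would go out for paid translation.
rememberTranslation("fr-FR", "Save your drawing before closing.", "Enregistrez votre dessin avant de fermer.");
const result = applyTranslationMemory("fr-FR", [
  "Save your drawing before closing.",
  "Your drawing has unsaved changes.",
]);
console.log(result.needsTranslation);
```

The more consistently the authoring and review processes feed that store, the higher the reuse rate, which is why aligning the surrounding business processes matters more than the individual act of translation.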

Minette Norman, Senior Software Systems Manager, Worldwide Localization at Autodesk, shares insights in a webinar on April 25 at 1:00 pm ET. Registration is now open.

Adobe, IBM, Microsoft and Oracle Executives to Participate in Keynote Panel at Gilbane San Francisco 2007

The Gilbane Group and Lighthouse Seminars announced that executives from Adobe, IBM, Microsoft and Oracle will participate in the Gilbane San Francisco 2007 keynote panel, “Content Technology Industry Update,” on Wednesday, April 11th at 8:30 a.m. at the Palace Hotel. Taking place April 10-12, the Gilbane Conference San Francisco has greatly expanded its educational program, including sessions focused on web and other enterprise content management applications, enterprise search and information access technologies, publishing technology, wikis, blogs and collaboration tools, and globalization and translation technology. The “Content Technology Industry Update” keynote panel will focus on the most important strategic issues technical and business managers need to consider for both near- and long-term success in managing content and content technologies in the context of enterprise applications. The keynote panel discussion is completely interactive (i.e., no presentations). With six tracks and 35 sessions to choose from, attendees can take part in a program focused on the latest content management technologies, presented by experienced content management practitioners, consultants, and technologists. http://gilbanesf.com/conference_grid.html

Public Alpha of Apollo Debuts on Adobe Labs

Adobe Systems Incorporated (Nasdaq: ADBE) announced that the first public alpha version of Apollo is now available for developers on Adobe Labs. Apollo is the code name for a cross-operating system application runtime that allows web developers to leverage their existing skills in HTML, JavaScript and Ajax, as well as Adobe Flash and Adobe Flex software to build and deploy rich Internet applications (RIAs) on the desktop. Apollo provides people with direct access to Internet applications built with HTML, JavaScript, Flash and PDF without the need to open a browser, offering more reliable interaction with content. With Apollo, people can launch applications directly from their desktops and interact with them offline. When a network connection is available, newly created or changed content can synchronize. The first version of Apollo for developers includes a free SDK that provides a set of command line tools for developing and working with Apollo applications. Web developers can use the Integrated Development Environment (IDE) of their choice, including Adobe tools such as Eclipse-based Flex Builder, Flash, and Dreamweaver to build Apollo applications. The alpha version of the Apollo application runtime, required to run Apollo applications, and the Apollo SDK are available immediately as free downloads from Adobe Labs. The Apollo SDK is available in English. The Apollo runtime and SDK are offered for both Windows and Macintosh operating systems, and future versions will be available for Linux. http://www.adobe.com/go/apollo
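
The launch-from-desktop, work-offline, synchronize-when-connected behavior described above is essentially an offline-first pattern. Here is a minimal sketch of that pattern in TypeScript; it uses standard browser connectivity events and a hypothetical /sync endpoint rather than any Apollo-specific API, so treat it as an illustration of the idea rather than Apollo code.

```typescript
// Minimal sketch of the offline-first pattern: queue local edits while
// disconnected and push them when connectivity returns. Uses standard
// browser events; the /sync endpoint is a placeholder.

type PendingChange = { id: string; body: string; modifiedAt: number };

const pendingChanges: PendingChange[] = [];

// Record an edit locally; it will be synchronized later if we are offline.
function saveChange(change: PendingChange): void {
  pendingChanges.push(change);
  if (navigator.onLine) {
    void flushPendingChanges();
  }
}

// Push all queued edits to the server, keeping any that fail for a retry.
async function flushPendingChanges(): Promise<void> {
  while (pendingChanges.length > 0) {
    const change = pendingChanges[0];
    try {
      await fetch("/sync", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(change),
      });
      pendingChanges.shift(); // synchronized successfully
    } catch {
      break; // still offline or server unreachable; retry on the next 'online' event
    }
  }
}

// When the network comes back, synchronize whatever accumulated offline.
window.addEventListener("online", () => void flushPendingChanges());
```

An actual Apollo application would achieve the same effect with whatever local storage and networking facilities the runtime exposes.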

Good Books, 5 Ways

Is the Caravan Project the right new distribution model for trade publishers? The basic offering is compelling: simultaneous access to print, print-on-demand, eBook, chapter eBook, and digital audio versions of titles. The Caravan Project’s publishers include university presses like Yale University Press and nonprofit publishers like Beacon Press.

The Washington Post has a very good and comprehensive article about the project and its executive director, Peter Osnos.

Osnos, a fast-talking, silver-haired man of 63, has been in publishing almost precisely as long as Politics and Prose has been in business. He left The Washington Post, where he’d been a reporter and editor, for Random House in 1984. Ten years ago he founded Public Affairs, which specializes in the kind of serious nonfiction titles that don’t require six-figure advances to acquire.

Over the years, he became all too familiar with the chief bane of a moderate-size publisher’s existence: the difficulty of getting the right number of books into bookstores at the right time. The advent of digital books, along with greatly improved print-on-demand technology, seemed to offer new ways to address this distribution problem, so a couple of years ago, after stepping down as head honcho at Public Affairs, he began to wrestle with it independently.

The nonprofit Caravan Project — which is supported by the MacArthur, Carnegie and Century foundations — is the result.

To start the experiment, Osnos recruited seven nonprofit publishers, among them academic presses such as Yale and the University of California and independents such as the Washington-based Island Press. Each was to designate titles on its spring 2007 list that would be published in a number of formats simultaneously.

The intriguing idea of the Caravan Project, to me, is that it is directed at bookstores, with the goal of providing a common platform for them to sell the various formats. The marriage of print distribution with POD is a natural one, of course, and Ingram, which is the backbone of the Caravan Project, has exactly the infrastructure for that. But adding eBooks and digital audio is distinctly different, and it gives booksellers the opportunity to be the human conduit for this kind of buying. The potential here is to give booksellers an enormous inventory of product where potentially nothing is truly out of stock.

Of course, the Caravan Project is a finite effort, with seven publishers providing a subset of their current catalogs, but the goal of the project is to try the new model, and see how it impacts the business. According to the Post, Borders sees the potential. “This could be a pilot for what all publishers end up doing eventually,” agrees Tom Dwyer, director of merchandising at Borders. Right now, Dwyer adds, bigger publishers are mainly focused on ‘digitizing all their content.’ But when it comes to distribution, he says, he’s sure they’re “planning something in this direction.”

I think they are too. I blogged about the eBook widget wars recently over at my own blog. The real story there is not the widgets themselves, but the mechanisms for digitization, access, and distribution behind those widgets. The Caravan Project is an interesting effort, and one that publishers should watch closely.

Trying to Take the High Road

My last blog was in reaction to two recent vendor experiences. One vendor had just briefed me on an enterprise search offering; the other had been ignoring my client’s efforts to get software support and training and to have bug reports addressed. The second blogged a patronizing reaction: “So Lynda should not feel too bad. I know its (sic) frustrating to deal with vendors but not all vendors are the same and she certainly hasn’t tried us all.”

With dozens of vendors offering search tools, it was fair to assume that I haven’t tried them all. However, having used search engines of all types since 1974, both as a researcher and as an analyst, I have a pretty good sense of what’s out there. Having evaluated products for clients, and for embedded use in products I brought to market over more than 20 years, it doesn’t take me long with a new product to figure out where the problems are. I also talk to a lot of vendors and search users, and I read more reports and evaluations than I can count. The evidence about any one product’s strengths and weaknesses piles up pretty quickly. “Searching” for stuff about search has been my career, and I do make it my business to keep score on products.

For the time being, I’m going to continue to hold my counsel on naming the search tools I’ve experienced. Instead, in this blog I’ll focus on keeping buyers informed about search technologies in general. My work as a consultant is about helping specific clients look at the best and most appropriate options for the search problems they are trying to solve and guiding their selection process. Here is some quick generic guidance on making your first search tool choice:

  • If you have not previously deployed an enterprise search solution in your domain for the corpus of content you plan to search, do not begin with the highest-priced licenses. They often also involve the most costly and lengthy implementations, and it will take many months to know whether a solution will work for you over the long haul.
  • Do begin with one or more low-cost solutions to learn about search, search product administration, and search engine tuning. This helps you discover what issues and problems are likely to arise, and it will inform you about what to expect (or want) in a more sophisticated solution. You may even discover that a lower-cost solution will do just fine for the intended application.
  • Do execute hundreds of searches yourself on a corpus of content with which you are very familiar. You want to learn whether you can actually find all the content you know is there, and how the results are returned and displayed (a minimal batch-query sketch follows this list).
  • Do have a variety of potential searchers test-drive the installed product over a period of time. Review the search logs to get a sense of how they approach searching, then debrief them about their experiences and whether their search expectations were met.
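
As mentioned in the third bullet, a batch of known-answer queries is an easy way to turn “hundreds of searches” into a repeatable check. The sketch below, in TypeScript, assumes a hypothetical HTTP search endpoint and response shape; the queries and document IDs are placeholders you would replace with content you know is in your corpus.

```typescript
// Minimal sketch of a batch search check, assuming a hypothetical endpoint
// that returns JSON of the form { results: [{ id: string }] }. Each test case
// pairs a query you would really run with documents you know exist in the
// corpus, so missing hits are easy to spot.

type TestCase = { query: string; expectedDocIds: string[] };

const testCases: TestCase[] = [
  { query: "2006 expense policy", expectedDocIds: ["policy-2006-expenses"] },
  { query: "apollo sdk release notes", expectedDocIds: ["apollo-alpha-notes"] },
];

async function search(query: string): Promise<string[]> {
  const response = await fetch(
    `https://search.example.internal/query?q=${encodeURIComponent(query)}&limit=20`,
  );
  const data = (await response.json()) as { results: { id: string }[] };
  return data.results.map((r) => r.id);
}

async function runEvaluation(): Promise<void> {
  let found = 0;
  let expected = 0;
  for (const testCase of testCases) {
    const returnedIds = await search(testCase.query);
    for (const docId of testCase.expectedDocIds) {
      expected += 1;
      if (returnedIds.includes(docId)) {
        found += 1;
      } else {
        console.log(`MISS: "${testCase.query}" did not return ${docId}`);
      }
    }
  }
  console.log(`Known documents found: ${found}/${expected}`);
}

void runEvaluation();
```

Rerun the same set after each round of tuning, or against each candidate product, so the comparison stays apples to apples.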

It is highly unlikely that the first enterprise search product you procure will be the best and final choice. Experience will give you a much better handle on the next selection. It is certainly true that not all vendors or products are the same, but you need to do serious, reality-based evaluations to learn your most important differentiators.

Enterprise 2.0 & Content

Dan Farber has nicely pulled together a couple of points in a post that suggest the inevitability of “Enterprise 2.0”.

Dan references a post by Euan Semple that has been picked up by Ross Mayfield, Tim O’Reilly and others, and a post of his own where he reports on some of Don Tapscott’s research: “…the 80 million Net generation young adults coming into the workplace will want to be part of an engage and collaborate model rather than command and control.”

In addition to the demographic fundamentals, there is something of a parallel here with the evolution of information technology, where the rigidly structured data in relational databases is now dwarfed by the unstructured or semi-structured content in content repositories and websites, and with the increasingly distributed IT function.

(rigidly) structured data -> unstructured data or content
(rigidly) structured organization -> unstructured organization

Do these parallels make Enterprise 2.0 more certain? Well, the fundamentals (the demographics and the new expectations and behavior) are true in a very real sense already. But of course this doesn’t mean that any particular Enterprise 2.0 products or technologies or best practices or methodologies or organizational reengineering will work. Dion Hinchcliffe has an extended thoughtful response that reinforces the fact that wikis etc. are proliferating behind the firewall, but also cautions that enterprise IT is a complex and controlled environment where enterprise 2.0 tools need to find a post-adolescent home.

Communities – Why Should You Care?

I was pleased to attend the inaugural Community 2.0 conference this week. Sponsored by Shared Insights, it was an impressive gathering. Here are some of the highlights:

– John Hagel, the author of Net Gain (and other best sellers), gave his perspective on what has happened in the 10 years since he first wrote on the importance of communities to companies.

His equation for the benefits of communities is as follows: shared ideas + shared discussions + shared relationships = shared meaning and shared motivation. This leads to higher customer loyalty and to feedback that can help facilitate the development of better products and services in the future.

He feels that companies often lack the skill sets required to support successful communities; the key skills lacking are moderating, archiving, and attracting participants. He also feels that companies are often afraid to give up control of the community to the participants, which is counterproductive.

Like all business practices, communities should be measured. He recommends calculating ROA (return on attention), ROI (return on information), and ROS (return on skills) as the best measures of the impact of communities on the business in general. Space doesn’t permit complete descriptions of these measures. Mr. Hagel’s blog and reading list can be found at www.johnhagel.com.

Ben McConnell, author of “Church of the Customer,” gave a fascinating keynote on the importance of word of mouth in marketing and the importance of communities in generating positive word of mouth. He also reported that only 1 percent of community participants actually contribute entries. However, that can be a large number!! For example, 68,682 individuals contributed to Wikipedia in just one month, and 11,420 contributed to Microsoft’s Channel 9 in a similar time frame. It is amazing how many people are willing to invest their time (while receiving no remuneration) to create information that will be reviewed and scrutinized by many peer reviewers. More examples can be found at ChurchoftheCustomer.com.

Similar statistics were reported during subsequent sessions:

About.com reports that it has 600 community sites with coverage of over 60,000 topics.
Shawn Gold of MySpace reported some staggering usage figures: they currently have 165 million profiles online that generate 60 billion page views per month, and 40,000 videos are being added to MySpace each day.

The conference finished with a report on the We Are Smarter Than Me project. That will be the subject of another blog entry in the very near future!!

Communities have the potential to help publishers and publishing professionals create new and different products and to improve the quality of their future products through greatly increased customer feedback. Cases and opportunities will be presented at the forthcoming Gilbane Conference in San Francisco, April 10-12.
