The Gilbane Advisor

Curated for content, computing, and digital experience professionals


Is Adobe Really Going to Mars?

I’m becoming concerned about whether Adobe is really serious about Mars. My evidence:

1. The FAQ has not been updated since 27 Oct 2006.

2. The FAQ ends with:

Q: When will the Mars format be frozen for 1.0?
A: A date for this has not yet been set.
Q: When will the Mars plug-in be available?
A: It is planned to be available before the end of the year.

This all seems very tentative.

As Joe Wilcox observes here:
“Adobe’s competitive response to XPS makes sense. PDF’s heritage predates the populist Web, and Adobe created the format for the purpose of mimicking paper documents. In the 21st century, however, digital documents are often containers that likely will never be printed. Paper’s relevance — and so the need to mimic — has greatly diminished.”
All of this is, on the surface, true (except perhaps the “makes sense” part). Does tossing high-fidelity page-oriented PDF into an XML container really address this issue?
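For concreteness: Mars represents a PDF document as a set of XML and SVG parts packaged in a ZIP archive. Here is a minimal sketch of peeking inside such a container, assuming a hypothetical sample.mars file and ordinary ZIP packaging; this is an illustration, not Adobe’s tooling:

```python
import zipfile

# A Mars document is a ZIP package whose parts describe the PDF in XML,
# with page content expressed as SVG. "sample.mars" is a hypothetical
# file name used only for illustration.
with zipfile.ZipFile("sample.mars") as package:
    for name in package.namelist():
        size = package.getinfo(name).file_size
        print(f"{name} ({size} bytes)")
```

The point of the sketch is that the container is ordinary ZIP-plus-XML; the page-oriented, print-fidelity semantics of PDF travel along inside it unchanged, which is exactly the question above.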

I think even more significant is Adobe’s clearly stated, and obviously honestly intended, plan to make PDF an ISO standard. My cynical blog entry is here. But, as Adobe and others point out, subsets of PDF, very useful subsets I’d say, are already ISO standards or on their way: “Several trade-specific subsets of the PDF spec are either ISO approved or in the approval process, including PDF/X for printers, PDF/E for engineers, PDF/A for archivists and PDF/UA for making documents compliant with Section 508 regulations.” (http://www.eweek.com/article2/0,1895,2088277,00.asp)

These aspects of PDF, including, of course, the entire spec, only serve to enshrine as standards that which makes PDF uniquely the “Portable Document Format.” PDF as an XPS competitor is not uniquely PDF in the historic sense of the format, and with the entire PDF spec submitted to ISO, Mars would need to succeed within that standards process. Do we really think it can?

I think that Adobe is far more interested in Apollo (http://labs.adobe.com/technologies/apollo/). Although Apollo is a very different beast than Mars, I believe Adobe knows that its future lies much more with this kind of technology than on the not-very-hospitable planet Mars.

New Research on Enterprise Social Software Use

Finally, there is some quantitative research on enterprise use of blogs, wikis, tagging, etc., to complement the very informal surveys we have taken and the work done at the University of Massachusetts. Reports from Forrester (CIOs Want Suites For Web 2.0) and McKinsey (How businesses are using Web 2.0: A McKinsey Global Survey) published this week provide interesting, though not surprising, data. The McKinsey report is free with registration, and the Forrester report isn’t expensive.

I haven’t read the Forrester report (119 CIOs surveyed), but the executive summary focuses on their finding that most CIOs want to buy enterprise social software in suite form from large vendors rather than from the smaller specialist software vendors. This finding is of course totally predictable, but it raises two interesting issues. First, just what are all the larger vendors, as well as the midsize ones (e.g., content management vendors), doing about all this? (Short answer: all are doing something, but the details are often vague.) Second, what will be lost or gained in the process of force-fitting the “engage and collaborate” functions and culture into the “command and control” (last week’s post) of top-down IT directives?

The McKinsey report (2,847 executives, 44% C-level) found “widespread but careful interest” in “Web 2.0 technologies,” and that executives consider them strategic and plan to invest in them. I think their conclusion might be a little overly conservative given their findings. For example, 77% of retail and 74% of high tech respondents plan to increase investment in these technologies. Note, however, that McKinsey counts web services as a “Web 2.0” technology, which not everyone would agree with.

See comments on these reports from Nick Carr, who points out where the Forrester and McKinsey findings differ. And see Richard MacManus’ comments on what the Forrester findings mean for the startups in this space.
For a couple of vendor perspectives, Socialtext’s Ross Mayfield covers these findings here, and FAST’s Hadley Reynolds talks about some similar research they have been working on with the Economist here.
Also (while not commenting on these reports) Andrew McAfee provides some info on how he is seeing enterprises using these technologies.

We Are Smarter Than Me: A Report

Last fall, Martin Clifford, CEO of the web community juggernaut Wis.dm, informed me that I was hopelessly out of date regarding the phenomenon of web communities and hinted that, due to my advanced years, I might never comprehend the impact of many-to-many publishing. It’s true that most of my experience is in traditional forms of one-to-many publishing. However, I’ve always loved a good challenge, so I began my exploration of the role of communities in the creation of content. Early in my explorations, I came across the We Are Smarter Than Me project, a joint effort of Pearson Educational Publishing, Wharton, MIT, and Shared Insights. The goal was to form a community that would write a book about how communities could change and enhance the way that companies do business. I tuned into the “Buzz” to get a sense of the passion of the participants, and then I joined the community and contributed a small section on the importance of word-of-mouth in the marketing of services. As the project progressed, I watched eagerly to see what would happen when the many-to-many model was invoked to produce a traditional business book.

To hear first-hand accounts of the project, I traveled to the Community 2.0 conference in Las Vegas. Barry Libert of Shared Insights and Tim Moore of Pearson Educational Publishing presented a fascinating progress report, and a conversation with co-founder Jon Spector (soon to be CEO of the Conference Board) filled in some additional information.

The participants are to be congratulated for commissioning the project as a pure experiment. As Mr. Moore said, “I just wanted to see what would happen.” As one might imagine, the interaction between web communities and large, esteemed institutions presented some interesting challenges. Not surprisingly, the first significant issue arose when Pearson faxed its contract to Shared Insights. While the contract was entirely appropriate for traditional author teams, indemnification clauses took on entirely new meaning when the work of hundreds or thousands of author/contributors would be scrutinized. The prolonged wrangling broke the project’s early momentum. It was assumed that the Academic Dream Team of Pearson’s business authors and the faculties of Wharton and MIT would produce numerous thoughtfully written content modules. Surprisingly, none of the authors or profs chose to participate in the project. The project team reverted to Plan B by sending participation invitations to a large list of people affiliated with the sponsoring institutions. The response was enthusiastic and the community began to grow. Current membership is approximately 3,500, with 650 individual wiki posts.

As active participation increased, the project team learned another important lesson. Suddenly the community wanted to take over the project leadership and asked the project team to step aside. Even though the project team knew a lot about community dynamics, they weren’t ready for their own community to be so assertive and found it difficult to relinquish control. When they did step back, the community flourished.
How did the book by community turn out? One speaker reported that the journey was more interesting than the destination, meaning that the content created was plentiful but uneven in quality and style. To yield an acceptable business book, it would be necessary to hire an accomplished professional author, who would also handle the fact-checking process.

Here are the open questions and lessons learned from this project:

  1. Why didn’t the authors and professors participate?
    Possible explanations included:
    Generation Gap: Authors and profs didn’t grow up with MySpace or Facebook. Web communities are foreign to their professional milieu.
    Status Issues: They are used to being the authority and weren’t willing to have their writings publicly challenged. And their reputations are already made, so they have little status to gain.
    No Financial Benefit: Their time is very valuable and they expect to be paid for their efforts.
    Lack of Passion or Connection with the Project: Community participation is not their avocation, nor were they passionate about the topic.
    Those who did participate did so out of a passion for the topic and seemed most motivated by the opportunity to build their reputations within the community. For many members, community participation is one of their hobbies, and they seemed not to desire any remuneration for their contributions.
    Observation: Just as in the early days of the Internet, there is currently more cachet attached to eyeballs and recognition than to traditional financial rewards. However, there are significant costs to forming, hosting, and moderating communities. And the work of communities can be very valuable to companies of all sorts. New business models are emerging that will manage the costs and reflect the value of the contributions.
  2. Given the uneven content and need to bring in a professional author, should anyone even try to write another book by committee?
    It depends on the type of book! Wikipedia has demonstrated that this model is very effective in creating a comprehensive reference work. (I suppose that some purists would argue that Wikipedia isn’t really a book but rather a collection of content modules.) For traditionally authored book projects, communities might play a valuable role in helping authors research topics that are outside their primary expertise and in reviewing the author’s work for accuracy and clarity.
  3. Will there be instances where community created content modules will compete with traditional published works?
    Given the Google world that we live in, consumers of information often seek a terse answer to a specific question, and there is a definite trend toward integrating content into information consumers’ workflows. For these information consumers, a well-structured repository of content modules is potentially more valuable than traditional books.
  4. So was the project aiming at the wrong goal?
    Perhaps! Old habits die hard, and many people in my generation have books to thank for a lot of their professional knowledge. Maybe the goal of the project should have been to develop an outstanding repository of content modules and resources that could become an authoritative source of information about communities and their role in changing and enhancing the ways that companies do business. In the long run, the mission-critical task is creating outstanding intellectual property. Creating multiple media versions of that IP will allow publishers to reach a wider range of customers.
  5. Will the many-to-many content model put traditional publishers out of business?
    There is much more opportunity than risk for publishers.
    Most of us would agree that we already suffer from information overload. Communities have the potential to raise that overload to an even higher level. Information consumers want to know that the content they are reading is accurate and authoritative. This has been the primary domain of publishers for many years. If publishers find new ways to harness the wisdom of crowds in creating new content and improving existing content, their future is bright. If not, someone else will seize the opportunity. And if they trivialize new methods of content creation as being less pure and authoritative than their time-tested editorial processes, they will face serious consequences. If you’re not convinced, just ask your favorite encyclopedia publisher!
That is my report on many-to-many versus one-to-many content creation models. Now I’m trying to figure out whether the few-to-few model refers to custom publishing or to an underperforming web community.

It’s the Process, Not the Words: Autodesk Case Study

Leonor summed up Gilbane’s perspective on the real challenge in content globalization in her entry of January 19:

We’ve found that the problem for organizations is less about the act of translation itself, and more about aligning the business processes that support it.

The hard part of globalization isn’t translating one phrase to another. The core problem is the inefficiencies associated with how we do the translating, with how we move words from creation to consumption by their target audience.

Our latest Content Technology Works case study describes how Autodesk, a major software company with worldwide sales of $1.6 billion US, recognized that better processes and higher levels of automation are the critical elements of a scalable globalization strategy. More words in translation memory were an important outcome of its initiatives, but the real benefit to Autodesk is greater competitive advantage as a worldwide software company.
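For readers new to the term, a translation memory is essentially a growing lookup table of previously translated segments: the more source/target pairs it holds, the less content has to be translated (and paid for) again. A toy sketch of the idea, exact-match only and with made-up example segments; real TM systems add fuzzy matching, segment metadata, and TMX interchange:

```python
# Toy translation memory: prior source segments mapped to their
# translations. These entries are made up for illustration.
tm = {
    "Click OK to continue.": "Cliquez sur OK pour continuer.",
    "File not found.": "Fichier introuvable.",
}

def translate(segment):
    """Return a stored translation, or None if a human must translate."""
    return tm.get(segment)

for s in ["Click OK to continue.", "Save your work."]:
    hit = translate(s)
    print(f"{s!r} -> {hit if hit else 'NEEDS TRANSLATION'}")
```

Every segment that hits the memory is one a translator does not have to touch, which is why a growing memory is an outcome worth measuring.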

Minette Norman, Senior Software Systems Manager, Worldwide Localization, at Autodesk shares insights in a webinar on April 25, 1:00 pm ET. Registration is now open.

Adobe, IBM, Microsoft and Oracle Executives to Participate in Keynote Panel at Gilbane San Francisco 2007

The Gilbane Group and Lighthouse Seminars announced that executives from Adobe, IBM, Microsoft and Oracle will participate in the Gilbane San Francisco 2007 keynote panel, “Content Technology Industry Update,” on Wednesday, April 11th at 8:30 a.m. at the Palace Hotel. Taking place April 10-12, the Gilbane Conference San Francisco has greatly expanded its collection of educational programs, including sessions focused on web and other enterprise content management applications, enterprise search and information access technologies, publishing technology, wikis, blogs and collaboration tools, and globalization and translation technology. The “Content Technology Industry Update” keynote panel will focus on the most important strategic issues technical and business managers need to consider for both near- and long-term success in managing content and content technologies in the context of enterprise applications. The keynote panel discussion is completely interactive (i.e., no presentations). With six tracks and 35 sessions to choose from, attendees can participate in a conference program covering the latest content management technologies, led by experienced content management practitioners, consultants, and technologists. http://gilbanesf.com/conference_grid.html

Public Alpha of Apollo Debuts on Adobe Labs

Adobe Systems Incorporated (Nasdaq: ADBE) announced that the first public alpha version of Apollo is now available for developers on Adobe Labs. Apollo is the code name for a cross-operating system application runtime that allows web developers to leverage their existing skills in HTML, JavaScript and Ajax, as well as Adobe Flash and Adobe Flex software to build and deploy rich Internet applications (RIAs) on the desktop. Apollo provides people with direct access to Internet applications built with HTML, JavaScript, Flash and PDF without the need to open a browser, offering more reliable interaction with content. With Apollo, people can launch applications directly from their desktops and interact with them offline. When a network connection is available, newly created or changed content can synchronize. The first version of Apollo for developers includes a free SDK that provides a set of command line tools for developing and working with Apollo applications. Web developers can use the Integrated Development Environment (IDE) of their choice, including Adobe tools such as Eclipse-based Flex Builder, Flash, and Dreamweaver to build Apollo applications. The alpha version of the Apollo application runtime, required to run Apollo applications, and the Apollo SDK are available immediately as free downloads from Adobe Labs. The Apollo SDK is available in English. The Apollo runtime and SDK are offered for both Windows and Macintosh operating systems, and future versions will be available for Linux. http://www.adobe.com/go/apollo

Good Books, 5 Ways

Is the Caravan Project the right new distribution model for trade publishers? The basic offering is compelling: simultaneous access to print, print-on-demand, eBook, chapter eBook, and digital audio versions of titles. The Caravan Project’s publishers include university presses like Yale University Press and nonprofit publishers like Beacon Press.

The Washington Post has a very good and comprehensive article about the project and its executive director, Peter Osnos.

Osnos, a fast-talking, silver-haired man of 63, has been in publishing almost precisely as long as Politics and Prose has been in business. He left The Washington Post, where he’d been a reporter and editor, for Random House in 1984. Ten years ago he founded Public Affairs, which specializes in the kind of serious nonfiction titles that don’t require six-figure advances to acquire.

Over the years, he became all too familiar with the chief bane of a moderate-size publisher’s existence: the difficulty of getting the right number of books into bookstores at the right time. The advent of digital books, along with greatly improved print-on-demand technology, seemed to offer new ways to address this distribution problem, so a couple of years ago, after stepping down as head honcho at Public Affairs, he began to wrestle with it independently.

The nonprofit Caravan Project — which is supported by the MacArthur, Carnegie and Century foundations — is the result.

To start the experiment, Osnos recruited seven nonprofit publishers, among them academic presses such as Yale and the University of California and independents such as the Washington-based Island Press. Each was to designate titles on its spring 2007 list that would be published in a number of formats simultaneously.

To me, the intriguing idea of the Caravan Project is that it is directed at bookstores, with the goal of providing a common platform for them to sell the various formats. The marriage of print distribution with POD is a natural one, of course, and Ingram, which is the backbone of the Caravan Project, has exactly the infrastructure for that. But adding eBooks and digital audio is distinctly different, and it gives booksellers the opportunity to be the human conduit for this kind of buying. The potential here is to give booksellers an enormous inventory of product where potentially nothing is truly out of stock.

Of course, the Caravan Project is a finite effort, with seven publishers providing a subset of their current catalogs, but the goal of the project is to try the new model and see how it impacts the business. According to the Post, Borders sees the potential. “This could be a pilot for what all publishers end up doing eventually,” agrees Tom Dwyer, director of merchandising at Borders. Right now, Dwyer adds, bigger publishers are mainly focused on “digitizing all their content.” But when it comes to distribution, he says, he’s sure they’re “planning something in this direction.”

I think they are too. I blogged about the eBook widget wars recently over at my own blog. The real story there is not the widgets themselves, but the mechanisms for digitization, access, and distribution behind those widgets. The Caravan Project is an interesting effort, and one that publishers should watch closely.

Trying to Take the High Road

My last blog post was in reaction to two recent vendor experiences. One vendor had just briefed me on an enterprise search offering; the other had been ignoring my client’s efforts to get software support and training and to get responses to bug reports. The second blogged a patronizing reaction: “So Lynda should not feel too bad. I know its (sic) frustrating to deal with vendors but not all vendors are the same and she certainly hasn’t tried us all.”

With dozens of vendors offering search tools, it was fair to assume that I haven’t tried them all. However, having used search engines of all types since 1974, both as a researcher and as an analyst, I have a pretty good sense of what’s out there. Having evaluated products for clients, and for embedded use in products I brought to market over more than 20 years, it doesn’t take me long with a new product to figure out where the problems are. I also talk to a lot of vendors and search users, and I read more reports and evaluations than I can count. The evidence about any one product’s strengths and weaknesses piles up pretty quickly. “Searching” for stuff about search has been my career, and I do make it my business to keep score on products.

For the time being, I’m going to continue to hold my counsel on naming the search tools I’ve experienced. Instead, in this blog I’ll focus on keeping buyers informed about search technologies in general. My work as a consultant is about helping specific clients look at the best and most appropriate options for the search problems they are trying to solve and about guiding their selection process. Here is some quick, generic guidance on making your first search tool choice:

  • If you have not previously deployed an enterprise search solution in your domain for the corpus of content you plan to search, do not begin with the highest-priced licenses. They often also entail the most costly and lengthy implementations, and it will take many months to know whether a solution will work for you over the long haul.
  • Do begin with one or more low-cost solutions to learn about search, search product administration, and search engine tuning. This helps you discover what issues and problems are likely to arise, and it will inform you about what to expect (or want) in a more sophisticated solution. You may even discover that a lower-cost solution will do just fine for the intended application.
  • Do execute hundreds of searches yourself on a corpus of content with which you are very familiar. You want to learn if you can actually find all the content you know is there, and how the results are returned and displayed.
  • Do have a variety of types of potential searchers test-drive the installed product over a period of time. Review the search logs to get a sense of how they approach searching (a minimal sketch of one way to mine such a log follows this list); then debrief them about their experiences and whether their search expectations were met.
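As promised, here is a minimal sketch of that log review, assuming a hypothetical tab-delimited log with one search per line, query then hit count. Every search product logs in its own format, so the parsing would need adjusting:

```python
from collections import Counter

queries = Counter()   # how often each query is issued
zero_hit = Counter()  # queries that returned nothing

# Hypothetical format: one search per line, "query<TAB>hit_count".
with open("search.log", encoding="utf-8") as log:
    for line in log:
        parts = line.rstrip("\n").split("\t")
        if len(parts) != 2 or not parts[1].isdigit():
            continue  # skip malformed lines
        query, hits = parts
        queries[query] += 1
        if int(hits) == 0:
            zero_hit[query] += 1

print("Most frequent queries:", queries.most_common(10))
print("Frequent zero-result queries:", zero_hit.most_common(10))
```

Frequent zero-result queries show exactly where searcher vocabulary and your content (or its indexing) part company, and they make concrete prompts for the debriefing conversations.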

It is highly unlikely that the first enterprise search product you procure will be the best and final choice. Experience will give you a much better handle on the next selection. It is certainly true that not all vendors or products are the same but you need to do serious reality-based evaluations to learn your most important differentiators.

