Curated for content, computing, and digital experience professionals

Month: March 2007 (Page 1 of 5)

Adobe Analyst Meetings

Frank, Mary, Tony, and I attended the Adobe analyst meetings in New York this week. To say that Adobe had a lot to share would be an understatement, though this was my first time attending this kind of event, so I have nothing to compare it to. That said, I was impressed with the progress Adobe has made in several key areas: bringing Macromedia and its products into the fold, building out a much more compelling offering and clearer message with LiveCycle, and further solidifying its commanding presence in the creative tools space.

Along with the major focus on Creative Suite 3 (which was announced on Monday) and LiveCycle, Adobe executives also spent a fair bit of time on Software as a Service, discussing offerings like Adobe Document Center, which I have been playing around with since yesterday morning, and Acrobat Connect, née Breeze, their web conferencing software. (And without having evaluated Acrobat Connect in detail, I have to say, as someone who is on webinars all the time, that Connect is by far the easiest product I have ever used. It also seems to load like any other URL, but perhaps that is because I already have Acrobat Professional on my system; I’m not sure.)

A few other things that caught my eye:

  • Kuler is, well, cool. It is a collaborative online application that allows users to discover and upload color themes that can be used with Creative Suite tools. Ryan Stewart has a nice writeup over at ZDNet.
  • Apollo is impressive. We saw a number of demos, including the eBay one that has been written about (see here), and you can see a video of a demo here. The coolest Apollo demo, by far, is the one you can’t currently see: the Buzzword word processor from Virtual Ubiquity; their website will tell you they have gone back underground after the alpha release. Again, Ryan Stewart has a nice overview and screen shots over at ZDNet. And there seems to be uptake for Apollo in the broad developer community. According to CTO Kevin Lynch, as of Wednesday the 28th, 30,000 people had downloaded the client since it was posted on March 19.
  • There’s a new beta of Acrobat 3D available for download. I have looked at the manufacturing space a fair bit over the last couple of years, and few sectors seem to involve more meaningful technical interchange than manufacturers, their suppliers, and their customers do. PDF files are everywhere in these applications, so a more functional 3D Acrobat makes all the sense in the world.
  • Adobe’s efforts to be more active in the standards world with PDF are clearly paying off. While we were there, they announced a win with the mortgage industry’s MISMO standards.

Lots to digest, but I came away impressed.

More on “engage and collaborate” vs. “command and control”

In response to a semi-rhetorical question I posed in my post on Enterprise 2.0 research last week, Niall Cook comments:

You ask: “…what will be lost or gained in the process of force-fitting the ‘engage and collaborate’ functions and culture into the ‘command and control’ of top-down IT directives?”

Simple. The users.

Well, yes, but it is more complex than that. Just as there are good and not-so-good uses of, e.g., wikis (or any technology, of course) in enterprises, there are also good and not-so-good uses of policies, procedures, and organizational structures. While I agree that there is usually way too much command and control, there are situations where it is just what you want (nuclear plant safety procedures, etc.). We are still in the early days of figuring out where and how all these 2.0 technologies can be usefully applied, and what corporate culture changes will result.

Part of the debate is continuing with a bit of back and forth between Andrew McAfee and Tom Davenport.

Google and Microsoft debate Enterprise Search in keynote at Gilbane San Francisco

Join us on April 11, 8:30 am at the Palace Hotel in San Francisco for Gilbane San Francisco 2007

We have expanded our opening keynote to include a special debate between Microsoft and Google on Enterprise Search and Information Access, in addition to our discussion on all content technologies with IBM, Oracle & Adobe.

You still have time to join us for this important and lively debate at the Palace Hotel, April 11. The keynote is open to all attendees, even those only planning to visit the technology showcase. The full keynote runs from 8:30am to 10:15am followed by a coffee break and the opening of the technology showcase, and now includes:

Keynote Panel: Content Technology Industry Update PART 2
Google and Microsoft are competing in many areas on many levels. One area in which both are ramping up quickly is enterprise search. In this part of the opening keynote, we bring their senior product managers face to face to answer our questions about their plans and what this means for enterprise information access and content management strategies.

Moderator: Frank Gilbane, Conference Chair, CEO, Gilbane Group, Inc.
Jared Spataro, Group Product Manager, Enterprise Search, Microsoft
Nitin Mangtani, Lead Product Manager, Google Search Appliance, Google

See the complete keynote description.

Gilbane San Francisco 2007
Content management, enterprise search, localization, collaboration, wikis, publishing …
Complete conference information is at

Adobe Announces Adobe Creative Suite 3

Adobe Systems Incorporated (Nasdaq:ADBE) announced the Adobe Creative Suite 3 product line. Adobe’s new Creative Suite 3 line-up unites Adobe and Macromedia products to provide designers and developers with options for all facets of print, web, mobile, interactive, film, and video production. There are six all-new configurations of Adobe Creative Suite 3: Creative Suite 3 Design Premium and Design Standard editions; Creative Suite 3 Web Premium and Web Standard editions; Creative Suite 3 Production Premium; and Creative Suite 3 Master Collection, which combines 12 of Adobe’s new design and development applications in a single box. The majority of Adobe Creative Suite 3 editions will be available as Universal applications for both PowerPC and Intel-based Macs and will support Microsoft Windows XP and Windows Vista. Customers will experience increased levels of performance and speed running Creative Suite 3 natively on Intel-based Macintosh systems and the latest Windows hardware. Customers can choose from six all-new suites or full-version upgrades of 13 stand-alone applications, including Photoshop CS3, Photoshop CS3 Extended, InDesign CS3, Illustrator CS3, Flash CS3 Professional, Dreamweaver CS3, Adobe Premiere Pro CS3, and After Effects CS3. Each edition of Adobe Creative Suite 3 integrates a different configuration of Adobe’s creative products: Adobe Creative Suite 3 Design Premium delivers an essential toolkit for print, web, interactive, and mobile design, while Adobe Creative Suite 3 Design Standard focuses on professional print design and production. Adobe Creative Suite 3 Web Premium combines the web design and development tools, and Adobe Creative Suite 3 Web Standard serves the professional web developer. Adobe Creative Suite 3 Production Premium is a post-production solution for video professionals.
Lastly, Adobe Creative Suite 3 Master Collection combines 12 new creative applications in one box, enabling customers to design across all media – print, web, interactive, mobile, video and film. Creative Suite 3 Design Premium and Standard, and Creative Suite 3 Web Premium and Standard will begin shipping in April 2007. Creative Suite 3 Production Premium and Creative Suite 3 Master Collection for Mac OS X on Intel-based systems and for Microsoft Windows XP and Windows Vista platforms will begin shipping worldwide in the third quarter of 2007. Estimated street price for the Creative Suite 3 Design Premium is US$1799, US$1599 for Creative Suite 3 Web Premium, US$1699 for Creative Suite 3 Production Premium, and US$2499 for Creative Suite 3 Master Collection. There are upgrade paths available for customers.

What is Under the Hood?

Last week I began this entry, reconsidered how to make the point, and tucked it away. Today I unearthed an article I had not gotten around to putting into my database of interesting and useful citations. Lisa Nadile, in The ABCs of Search Engine Marketing in CIO Magazine, hits the nail on the head with this statement: “Each search engine has its own top-secret algorithm to analyze this data…” It is tongue-in-cheek, so you need to read the whole article to get the humor. Ms. Nadile’s article is geared to Internet marketing, but the comments about search engines are just as relevant for enterprise search.

I may be an enterprise search analyst, but there are a lot of things I don’t know about the guts of current commercial search tools. Some things I could learn if I were willing to spend months studying patents and expensive reports; other things are protected as trade secrets. I will never know what is under the hood of most products. Thirty years ago I knew a lot about relatively simple concepts like b-tree indexes and hierarchical, relational, networked, and associative data structures for the products I used and developed.
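Those index structures are far simpler than any vendor’s secret sauce, but the core idea behind full-text search is easy to sketch. Here is a minimal inverted index in Python, an illustrative toy of my own devising (the `build_index` and `search` names, and the sample documents, are mine, not any product’s API):

```python
from collections import defaultdict

def build_index(docs):
    """Map each term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every query term (boolean AND)."""
    results = None
    for term in query.lower().split():
        postings = index.get(term, set())
        results = postings if results is None else results & postings
    return results or set()

docs = {
    1: "enterprise search tools index content repositories",
    2: "relational and hierarchical data structures",
    3: "enterprise content management and search",
}
index = build_index(docs)
print(sorted(search(index, "enterprise search")))  # -> [1, 3]
```

Real engines layer tokenization rules, stemming, relevance ranking, and security trimming on top of a structure like this, and that is exactly where the trade secrets live.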

My focus has shifted to results and usability. My client has to be able to find all the content in their content repository or crawled site. If not, it had better be easy to discover why, and simple to take corrective action with the search engine’s administration tools, if that is where the problem lies. If the corpus of content to be searched is likely to grow to hundreds of thousands of documents, I also care about hardware resource requirements, performance (speed), and scalability. And, if you have read previous entries, you already know that I care a lot about service and business relationships with the vendor, because that is crucial to long-term success. No amount of “whiz bang” technology will overcome a lousy client/vendor relationship.

Finding out what is going on under the hood with some imponderable algorithms isn’t really going to do me or my client any good when evaluating search products. Either the search tool finds stuff the way my client wants to find it, or it doesn’t. Whether it’s “black art,” trade secret, or “patent protected,” few of us would really understand the secret sauce anyway.

Is Adobe Really Going to Mars?

I’m becoming concerned whether Adobe is really serious about Mars. My evidence:

1. The FAQ has not been updated since 27 Oct 2006.

2. At the end of the FAQ it reads:

Q: When will the Mars format be frozen for 1.0?
A: A date for this has not yet been set.
Q: When will the Mars plug-in be available?
A: It is planned to be available before the end of the year.

This all seems very tentative.

As Joe Wilcox observes here:
“Adobe’s competitive response to XPS makes sense. PDF’s heritage predates the populist Web, and Adobe created the format for the purpose of mimicking paper documents. In the 21st century, however, digital documents are often containers that likely will never be printed. Paper’s relevance — and so the need to mimic — has greatly diminished.”
All of this is, on the surface, true (except perhaps the “makes sense” part). Does tossing high-fidelity page-oriented PDF into an XML container really address this issue?

I think even more significant is Adobe’s clearly stated, and obviously honestly intended, desire to make PDF an ISO standard. My cynical blog entry is here. But, as Adobe and others point out, subsets of PDF, very useful subsets I’d say, are already ISO standards: “Several trade-specific subsets of the PDF spec are either ISO approved or in the approval process, including PDF/X for printers, PDF/E for engineers, PDF/A for archivists and PDF/UA for making documents compliant with Section 508 regulations.”

These aspects of PDF, including, of course, the entire spec, only serve to enshrine as standards that which makes PDF uniquely the “Portable Document Format.” PDF as an XPS competitor is not uniquely PDF, in the historic sense of the format, and with the entire PDF spec submitted to ISO, Mars needs to succeed within this process. Do we really think it can?

I think that Adobe is far more interested in Apollo. Although this is a very different beast than Mars, I believe Adobe knows that its future lies much more with this kind of technology than it does on the not-very-hospitable planet Mars.

New Research on Enterprise Social Software Use

Finally there is some quantitative research on enterprise use of blogs, wikis, tagging, etc. to complement the very informal surveys we have taken, and the work done at the University of Massachusetts. Reports from Forrester (CIOs Want Suites For Web 2.0) and McKinsey (How businesses are using Web 2.0: A McKinsey Global Survey) published this week provide interesting, though not surprising, data. The McKinsey report is free with registration, and the Forrester report isn’t expensive.

I haven’t read the Forrester report (119 CIOs), but the executive summary focuses on their finding that most CIOs want to buy enterprise social software in suite form from large vendors rather than from the smaller specialist software vendors. This finding is of course totally predictable, but it raises two interesting issues. First, just what are all the larger vendors, as well as the midsize ones (e.g., content management vendors), doing about all this? (Short answer: all are doing something, but the details are often vague.) Second, what will be lost or gained in the process of force-fitting the “engage and collaborate” functions and culture into the “command and control” (last week’s post) of top-down IT directives?

The McKinsey report (2847 executives, 44% C-level) found “widespread but careful interest” in “Web 2.0 technologies,” and that they are considered strategic and will be invested in. I think their conclusion might be a little overly conservative given their findings. For example, 77% of retail and 74% of high tech respondents plan to increase investment in these technologies. Note, however, that McKinsey includes web services as a “Web 2.0” technology, which not everyone would agree with.

See comments on these reports from Nick Carr, who points out where the Forrester and McKinsey findings differ. And see Richard MacManus’ comments on what the Forrester findings mean for the startups in this space.
For a couple of vendor perspectives, Socialtext’s Ross Mayfield covers these findings here, and FAST’s Hadley Reynolds talks about some similar research they have been working on with the Economist here.
Also (while not commenting on these reports) Andrew McAfee provides some info on how he is seeing enterprises using these technologies.

We Are Smarter Than Me– Report

Last fall, Martin Clifford, CEO of the web community juggernaut, informed me that I was hopelessly out of date regarding the phenomenon of web communities and hinted that due to my advanced years I might never comprehend the impact of many-to-many publishing. It’s true that most of my experience is in traditional forms of one-to-many publishing. However, I’ve always loved a good challenge, so I began my exploration of the role of communities in the creation of content. Early in my explorations, I came across the We Are Smarter Than Me project, the joint effort of Pearson Educational Publishing, Wharton, MIT, and Shared Insights. The goal was to form a community that would write a book about how communities could change and enhance the way that companies do business. I tuned into the “Buzz” to get a sense of the passion of the participants, and then I joined the community and contributed a small section on the importance of word-of-mouth in the marketing of services. As the project progressed, I watched and waited eagerly to see what would happen when the many-to-many model was invoked to produce a traditional business book.

To hear firsthand accounts of the project, I traveled to the Community 2.0 conference in Las Vegas. Barry Libert of Shared Insights and Tim Moore of Pearson Educational Publishing presented a fascinating progress report, and a conversation with co-founder Jon Spector (soon to be CEO of the Conference Board) filled in some additional information.

The participants are to be congratulated for commissioning the project as a pure experiment. As Mr. Moore said, “I just wanted to see what would happen.” As one might imagine, the interaction between web communities and large, esteemed institutions presented some interesting challenges. Not surprisingly, the first significant issue arose when Pearson faxed its contract to Shared Insights. While the contract was entirely appropriate for traditional author teams, indemnification clauses took on entirely new meaning when the work of hundreds or thousands of author/contributors would be scrutinized. The prolonged wrangling broke the project’s early momentum. It was assumed that the academic dream team of Pearson’s business authors and the faculties of Wharton and MIT would produce numerous thoughtfully written content modules. Surprisingly, none of the authors or profs chose to participate. The project team reverted to Plan B by sending participation invitations to a large list of people affiliated with the sponsoring institutions. The response was enthusiastic and the community began to grow. Current membership is approximately 3500, with 650 individual wiki posts.

As active participation increased, the project team learned another important lesson. Suddenly the community wanted to take over the project leadership and asked the project team to step aside. Even though the project team knew a lot about community dynamics, they weren’t ready for their own community to be so assertive and found it difficult to relinquish control. When they did step back, the community flourished.

How did the book-by-community turn out? One speaker reported that the journey was more interesting than the destination, meaning that the content created was plentiful but uneven in quality and style. To yield an acceptable business book, it would be necessary to hire an accomplished professional author who would also handle the fact-checking process.

Here are the open questions and lessons learned from this project:

  1. Why didn’t the authors and professors participate?
    Possible explanations included:
    Generation gap: Authors and profs didn’t grow up with MySpace or Facebook. Web communities are foreign to their professional milieu.
    Status issues: They are used to being the authority and weren’t willing to have their writings publicly challenged. And they have already made their reputations, so they have little status to gain.
    No financial benefit: Their time is very valuable and they expect to be paid for their efforts.
    Lack of passion or connection with the project: Community participation is not their avocation, nor were they passionate about the topic.
    Those who did participate did so out of a passion for the topic and seemed most motivated by the opportunity to build their reputation within the community. For many members, community participation is one of their hobbies. And they seemed not to desire any remuneration for their contributions.
    Observation: Just as in the early days of the Internet, there is currently more cachet attached to eyeballs and recognition than to traditional financial rewards. However, there are significant costs to forming, hosting, and moderating communities. And the work of communities can be very valuable to companies of all sorts. New business models are emerging that will manage the costs and reflect the value of the contributions.
  2. Given the uneven content and the need to bring in a professional author, should anyone even try to write another book by committee?
    It depends on the type of book! Wikipedia has demonstrated that this model is very effective in creating a comprehensive reference work. (I suppose that some purists would argue that Wikipedia isn’t really a book but rather a collection of content modules.) For traditional authored book projects, communities might play a valuable role in helping authors research topics outside their primary expertise and in reviewing the author’s work for accuracy and clarity.
  3. Will there be instances where community created content modules will compete with traditional published works?
    Given the Google world that we live in, consumers of information often seek a terse answer to a specific question. And there is a definite trend toward the integration of content with information consumers’ workflows. For these information consumers, a well-structured repository of content modules is potentially more valuable than traditional books.
  4. So was the project aiming at the wrong goal?
    Perhaps! Old habits die hard, and many people in my generation have books to thank for a lot of their professional knowledge. Maybe the goal of the project should have been to develop an outstanding repository of content modules and resources that could become an authoritative source of information about communities and their role in changing and enhancing the ways that companies do business. In the long run, the mission-critical task is creating outstanding intellectual property. Creating multiple media versions of that IP will allow publishers to reach a wider range of customers.
  5. Will the many-to-many content model put traditional publishers out of business?
    There is much more opportunity than risk for publishers.
    Most of us would agree that we already suffer from information overload. Communities have the potential to raise that overload to an even higher level. Information consumers want to know that the content they are reading is accurate and authoritative. This has been the primary domain of publishers for many years. If publishers find new ways to harness the wisdom of crowds in creating new content and improving existing content, their future is bright. If not, someone else will seize the opportunity. And if they trivialize new methods of content creation as being less pure and authoritative than their time-tested editorial processes, they will face serious consequences. If you’re not convinced, just ask your favorite encyclopedia publisher!!
That is my report on many-to-many versus one-to-many content creation models. Now I’m trying to figure out whether the few-to-few model refers to custom publishing or to an underperforming web community.

© 2024 The Gilbane Advisor
