Curated for content, computing, and digital experience professionals

Month: November 2006 (Page 2 of 3)

What’s Wrong with Web 2.0

In a word, “expectations”. There is nothing wrong with the moniker itself, but when used as if it were a thing-in-itself, as something concrete, it inevitably becomes misleading. This is not something to blame solely on marketing hype – people crave simple labels; marketers are just accommodating us. We need to take a little responsibility for asking what such labels really mean. When forced to reduce Web 2.0 to something real, you end up with AJAX. There is also nothing wrong with AJAX or its components. The problem is overestimating what it can do for us.

Bill Thompson’s post yesterday on The Register’s Developer site, “Web 2.0 and Tim O’Reilly as Marshal Tito”, is perhaps a little overstated, but it is useful reading for VCs and IT strategists. Here’s a sample:

Web 2.0 marks the dictatorship of the presentation layer, a triumph of appearance over architecture that any good computer scientist should immediately dismiss as unsustainable. … Ajax is touted as the answer for developers who want to offer users a richer client experience without having to go the trouble of writing a real application, but if the long term goal is to turn the network from a series of tubes connecting clients and servers into a distributed computing environment then we cannot rely on Javascript and XML since they do not offer the stability, scalability or effective resource discovery that we need.

Next Week’s Gilbane Conference & Events Blog

You may have noticed we haven’t posted here as much about Gilbane Boston as we have about our previous events. That’s because we now have a special Events Blog. Although we will still mention important announcements here, we plan to keep posts on this blog focused on analysis and industry discussions, which may include opinions on conference sessions, etc. The Events Blog, on the other hand, is set up so that any of our conference team can post anything, including our event announcements, press releases, deadlines, discount offers, and all other promotional news. Other useful permanent event links are our Conference Overview page, and our Speaker Submission and Guidelines page. See the latest post on Gilbane Boston at

Why CMS Professionals should care about composition

On August 15, 2006 another Gilbane blogger, Rita Warren, queried whether a marriage between CMS and CRM made sense. “Circa 1996… it was all about one-to-one customer communications. That (broad) vision was apparently too hard to realize back then! Maybe it’s possible now.”

Well, circa 2006 it’s still all about one-to-one, but I think we understand what that means a lot better. One-to-one customer communications are not only possible, but they are happening in many small and large businesses. In most cases they are not coming from major CRM implementations à la Siebel – they are coming from composition tools. Many composition vendors now position their products as personalization or customer communications management offerings. If you look at some of the case studies from the composition vendors included in my last entry, you will find examples of communications such as statements, enrollment books, and invoices that tailor messaging, educational content, product content, document format, and delivery channel based on customer data or stated preferences.

Okay – so as a CMS professional why should you care about composition tools? Several reasons:

  • Personalization is a beast that feeds on content. Lots and lots of content. Many composition experts have never even heard of taxonomy – CMS architects needed!
  • Many composition tools have rudimentary content capabilities – but integration with “real” content management tools is necessary to feed the beast – CMS integrators needed!
  • High volume composition tools are getting to the point where they can serve printed and electronic transactional channels equally well and are starting to move upstream into driving personalized web content. CMS and composition tools are not on the same path – visionaries needed!

It’s only a matter of time before some of the composition vendors decide that they should be in the CMS business. Personally, I think that trying to tightly couple those capabilities with composition would be a bad idea. Composition tools are complicated enough as it is. CMS vendors who have been trying to deliver the holy grail of print and web content management across document types are not there yet. I find it hard to believe that a composition solution would leapfrog over the current CMS vendors. I suppose this is one instance where it would be nice to be proved wrong.

Meanwhile, an easier path to integrating current CMS technology for managing web and print content with leading high-volume composition tools would be welcome. Document Sciences has worked with Documentum and a few others. GMC Software has partnered with Interwoven a couple of times, and Exstream and Metavante have both partnered with IBM OnDemand. I have also seen a number of Exstream–Vignette combos. Few vendors have broad and established content management partnerships, and the market is ripe for this kind of collaboration.

Content Globalization Workflows: Struggling or Streamlining?

In preparation for our panel on Content Globalization Workflows on Thursday November 30th at our Boston conference, we have created a survey to gauge how organizations are dealing with increasing market demand for localized content.

We hope to see you at this session. But whether you join us or not, contribute to it by answering our survey questions. We’ll publish the results in a blog entry after the conference, including the results from our audience survey. Give us your input and you’ll be eligible to win a free conference pass for one of our future conferences!

Here is a short URL to the survey you can share with others:

Here’s what we’d like to know:

  1. Which issue is your most pressing business driver for providing localized content to your customers?
  2. Who is responsible for purchasing translation software in your organization?
  3. What is the most difficult challenge within your localization processes?
  4. Do you have one or more content/document management systems in house?
  5. Do you have one or more translation management systems in house?
  6. If you do not have a translation management system in house, who do you work with to manage your translation processes?
  7. If you have both a content/document and a translation management system in house, are they integrated?
  8. If the systems are integrated, select the most appropriate description of the integration.

Web 2.0, 3.0 and so on

The recent Web 2.0 conference predictably accelerated some prognostication on Web 3.0. I don’t think these labels are very interesting in themselves, but I do admit that the conversations about what they might be, if they had a meaningful existence, expose some interesting ideas. Unfortunately, they (both the labels and the conversations) also tend to generate a lot of over-excitement and unrealistic expectations, both in terms of financial investment and doomed IT strategies. Dan Farber does his usual great job of collecting some of the thoughts on the recent discussion in “Web 2.0 isn’t dead, but Web 3.0 is bubbling up”.

One of the articles Dan links to is a New York Times article by John Markoff, where John basically equates Web 3.0 with the Semantic Web. Maybe that’s his way of saying very subtly that there will never be a Web 3.0? No, he is more optimistic. Dan also links to Nick Carr’s post welcoming Web 3.0, but even Carr is gentler than he should be.

But here’s the basic problem with the Semantic Web – it involves semantics. Semantics are not static, language is not static, science is not static. Rules are not static either, though at least in some cases syntax and logical systems have longer shelf lives.

Now, you can force a set of semantics to be static and enforce their use – you can invent little worlds and knowledge domains where you control everything, but there will always be competition. That’s how humans work, and that is how science works as far as we can tell. Humans will break both rules and meanings. And although the Semantic Web is as much (or more) about computers as about humans, the more human-like we make computers, the more they will break rules, change meanings, and invent their own little worlds.

This is not to say that the goal of a Semantic Web hasn’t generated, and won’t continue to generate, good ideas and useful applications and technologies – RDF itself is pretty neat. Vision is a good thing, but vision and near-term reality require different behavior and belief systems.
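For readers who haven’t looked at RDF, its core idea is simply that knowledge is expressed as subject–predicate–object triples. The toy sketch below (plain Python, no real RDF library – in practice you would use something like rdflib, and the names and namespace prefixes here are invented for illustration) also shows the competing-vocabularies problem described above: a query only finds facts stated in the vocabulary it asks about.

```python
# Toy illustration of RDF's core idea: facts as
# (subject, predicate, object) triples. All identifiers below
# are invented for illustration, not real vocabularies.

triples = {
    ("ex:TimBL", "ex:worksAt", "ex:W3C"),
    # The same fact asserted with a rival vocabulary's predicate:
    ("ex:TimBL", "rival:employer", "ex:W3C"),
}

def query(predicate):
    """Return all (subject, object) pairs asserted with this predicate."""
    return {(s, o) for (s, p, o) in triples if p == predicate}

# Each query sees only the facts expressed in its own vocabulary;
# nothing in the data itself says the two predicates mean the same thing.
print(query("ex:worksAt"))
print(query("rival:employer"))
```

Reconciling the two predicates requires someone to assert (and everyone to agree) that they are equivalent – which is exactly the kind of static, enforced semantics the post argues humans keep breaking.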

Content Management Debate in 2 Weeks

In preparation for our opening keynote at our Boston conference in a couple of weeks, we have created a survey to help fuel the debate on the future of content management. The list below is what we are starting with. Of course we won’t have time to address all these questions. That is why we want you to tell us which of them are the most interesting to you. To vote, simply go to the online survey. There is also a spot on the survey where you can add questions we haven’t listed. One lucky voter will win a free conference pass to one of our future events. By the way, you don’t have to attend the event to vote, and we, and others, will be blogging about the results of the questions.
Here is a short URL to the survey you can share with others:

  1. What are the top 3 technologies that must be considered in any content management strategies in the next 12-24 months?
  2. How will the new SharePoint Server’s CM capability affect the CM market?
  3. Are search "platforms" going to replace CMSs as the primary user entrance to content repositories?
  4. Is there such a thing as "Web 2.0"? Is there a "Content Management 2.0"? If so, what are they?
  5. How will Blog and Wiki tools be used in enterprise content applications? How are they being used today?
  6. Does social software tagging or "folksonomy" have a role to play in enterprise content applications?
  7. What is the number one advantage, and the number one disadvantage of each of the approaches represented on the panel (ECM suite, CM application, infrastructure CM, hosted CM, open source CM)?
  8. How is widespread adoption of RSS/Atom going to affect content delivery? And what does this mean to enterprise content management or publishing strategies?
  9. If we had this same panel next year, which of the companies on the panel would not be here? Why?
  10. Which other technologies associated with Vista and Office 2007 are important for enterprise content or publishing applications?
  11. Are there authoring tools on the horizon that are both user-friendly and capable of authoring for both electronic and print output?
  12. How will Oracle’s acquisition of Stellent change Oracle’s approach to CM solutions and their relationship with their CM partners?
  13. What is the future of software as a service, and is it appropriate for enterprise content applications like content management, authoring, etc.?
  14. How are translation and localization requirements affecting content management strategies and what changes in technology and strategic direction can we expect in next year or two?
  15. Is there any breakthrough search technology on the horizon that will affect intranet or extranet applications in the next 18 months?
  16. How will the tension between content control and collaboration be resolved? Or will it?
  17. Are there any breakthrough classification or metadata tagging technologies on the horizon that will significantly impact content management strategies?
  18. Is there a future for stand-alone BPM products? Or will they be integrated into ECM and other enterprise applications?
  19. What infrastructure technologies might we see in the next 18 months that will affect enterprise content applications?

Cast your vote!

SEC to Ease SOX Reporting

A story on page 1 of the Nov 10 Wall Street Journal reports that the SEC is re-evaluating its interpretation of Section 404 of the SOX rule, which dictates internal review and external auditing of financial reporting systems.
Citing pressure from “the nation’s business lobby,” the SEC is taking steps to allow a “more flexible reading,” and intends to “propose guidance . . . to help companies and auditors interpret Section 404 in a way likely to save them time and money.” The new guidance is expected next month.
See “Business Wins Its Battle to Ease A Costly Sarbanes-Oxley Rule” for details.

Managing CM Projects: There must be a better way!

I know I’m not alone here when I say that content management projects are HARD to do right. I’ve been doing a lot of thinking about this subject – particularly around the project management aspect of it. All is rosy at the beginning when you put together a master project plan, but it seems that within a very short time you’re facing scope creep, budget overruns, schedule slips, and stress! There must be a better way.

Over the span of my career (the last 15 years or so), a project manager has become standard fare for any corporate IT project. As I’m sure many of you recall, however, there was a time when having a person dedicated as a “Project Manager” was seen as an unnecessary frill. Thankfully, that’s changed (in most places).

When you embark on something as complex as a major content management initiative, not only do you need a dedicated project manager, you need a highly skilled and experienced project manager. That’s another issue.

The good news is that organizations like the Project Management Institute (PMI) have their PMP certification, which I believe sets a really good foundation for project managers. But even PMP certified individuals who have managed other types of system implementations or software development projects may struggle with a content project.

So what’s different about content management projects?



© 2023 The Gilbane Advisor
