Curated for content, computing, and digital experience professionals

Year: 2009

Ontopia 5.0.0 Released

The first open source version of Ontopia has been released, which you can download from Google Code. This is the same product as the old commercial Ontopia Knowledge Suite, but with an open source license and with the old license key restrictions removed. The new version has been created not just by the Bouvet employees who have always worked on the product, but also by open source volunteers. In addition to bug fixes and minor changes, the main new features in this version are:

  • Support for TMAPI 2.0.
  • The new tolog optimizer.
  • The new TologSpy tolog query profiler.
  • The net.ontopia.topicmaps.utils.QNameRegistry and QNameLookup classes have been added, providing lookup of topics using qnames.
  • Ontopia now uses the Simple Logging Facade for Java (SLF4J), which makes it easier to switch logging engines, if desired.

http://www.ontopia.net/
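For a sense of what qname-based lookup buys you, here is a minimal sketch of the idea in Python. The class shape and URIs below are illustrative only, not Ontopia's actual Java API: the point is simply that a registry binds short prefixes to base URIs so a compact name like "dc:title" can be expanded to a full subject identifier.

```python
# Minimal sketch of qname-based lookup: a registry maps prefixes to
# base URIs, and a qname like "dc:title" expands to a full identifier
# that could then be used to look up a topic.
class QNameRegistry:
    def __init__(self):
        self._prefixes = {}

    def register(self, prefix, uri):
        """Bind a short prefix to a base URI."""
        self._prefixes[prefix] = uri

    def resolve(self, qname):
        """Expand 'prefix:localname' into a full URI."""
        prefix, local = qname.split(":", 1)
        return self._prefixes[prefix] + local

registry = QNameRegistry()
registry.register("dc", "http://purl.org/dc/elements/1.1/")
print(registry.resolve("dc:title"))  # http://purl.org/dc/elements/1.1/title
```

The appeal is ergonomic: queries and application code can refer to topics by short, readable names instead of long identifier URIs.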

Digital Platforms and Technologies for Publishers: Implementations Beyond “eBook”

We are very happy to announce that we’ve published our new report, Digital Platforms and Technologies for Publishers: Implementations Beyond “eBook.” The 142-page report is available at no charge for download here.

From the Introduction:

Much has changed since we decided to write a comprehensive study on the digital book publishing industry. The landscape has changed rapidly during the past months and we have tried to reflect as many of these changes as possible in the final version of our report. For example:

  • Sales of eBooks finally reached their inflection point in late 2008.
  • Customer acceptance of digital reading platforms, including dedicated reading devices like the Kindle and the Sony Reader and mobile devices like the iPhone and the BlackBerry, has helped accelerate the market for digital products.
  • The Google settlement, once finally approved by the courts, will substantially increase the supply of titles available in digital formats.
  • New publishing technologies and planning processes are enabling publishers and authors to create digital products that have their own set of features that take full advantage of the digital media and platforms. Embedded context-sensitive search and the incorporation of rich media are two important examples.
  • Readers are self-organizing into reading communities and sharing their critiques and suggestions about which books their fellow readers should consider. This is creating a major new channel for authors and publishers to exploit.
  • Print-on-demand and short-run printing continue to make significant advances in quality and their costs per unit are dropping. These developments are changing the economics of publishing and are enabling publishers to publish books that would have been too risky in the previous economic model.
  • Lower publishing and channel costs are making it possible for publishers to offer their digital titles at lower prices. This represents greater value for readers and fair compensation for all stakeholders in the publishing enterprise.

We are privileged to report such a fine collection of best practices. And we are thankful that so many smart people were willing to share their perspectives and vision with us and our readers. We thank our sponsors for their ardent and patient support and hope that the final product will prove worth the many hours that went into its preparation.

We encourage readers of this report to contact us with their feedback and questions. We will be pleased to respond and try to help you find solutions to your own digital publishing challenges!

Go With the (Enterprise 2.0 Adoption) Flow

People may be generally characterized as one of the following: optimists, realists, or pessimists. We all know the standard scenario used to illustrate these stereotypes.

Optimists look at the glass and say that it is partially full. Pessimists remark that the glass is mostly empty. Realists note that there is liquid in the glass and make no value judgment about the level.

The global Enterprise 2.0 community features the same types of individuals. I hear them speak and read their prose daily, noticing the differences in the way that they characterize the current state of the E2.0 movement. E2.0 evangelists (optimists) trumpet that the movement is revolutionary. Doubters proclaim that E2.0 will ultimately fail for many of the same reasons that earlier attempts to improve organizational collaboration did. Realists observe events within the E2.0 movement, but don’t predict its success or demise.

All opinions should be heard and considered, to be sure. In some ways, the position of the realist is ideal, but it lacks the spark needed to create forward, positive momentum for E2.0 adoption or to kill it. A different perspective is what is missing in the current debate regarding the health of the E2.0 movement.

Consider again the picture of the glass of liquid and the stereotypical reactions people have to it. Note that none of those reactions considers flow. Is the level of liquid in the glass rising or falling?

Now apply the flow question to the E2.0 movement. Is it gaining believers or is it losing followers? Isn’t that net adoption metric the one that really matters, as opposed to individual opinions, based on static views of the market, about the success or failure of the E2.0 movement to-date?

The E2.0 community needs to gather more quantitative data regarding E2.0 adoption in order to properly assess the health of the movement. Until that happens, the current, meaningless debate over the state of E2.0 will continue. The effect of that wrangling will be neither positive nor negative; net adoption will show little gain as more conservative adopters continue to sit on the sidelines, waiting for the debate to end.
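The net adoption metric suggested here is simple arithmetic: organizations gained minus organizations lost over each period, with the trend mattering more than any single static snapshot. A toy sketch, with invented figures purely for illustration:

```python
# Net adoption "flow" over successive periods: gains minus losses.
# The quarterly figures here are invented purely to illustrate the metric.
def net_adoption(gained, lost):
    return [g - l for g, l in zip(gained, lost)]

quarterly_gains = [40, 55, 60]   # orgs newly deploying E2.0 tools
quarterly_losses = [10, 12, 15]  # orgs abandoning deployments

flow = net_adoption(quarterly_gains, quarterly_losses)
print(flow)  # [30, 43, 45] -- positive and rising: the "glass" is filling
```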

Anecdotal evidence suggests that E2.0 adoption is increasing, albeit slowly. The surest way to accelerate E2.0 adoption is to go with the flow — to measure and publicize increases in the number of organizations using social software to address tangible business problems. Published E2.0 case studies are great, but until more of those are available, simply citing the increase in the number of organizations deploying E2.0 software should suffice to move laggards off the sideline and on to the playing field.

It Takes Work to Get Good-to-Great Enterprise Search

It takes patience, knowledge, and analysis to tell when search is really working. For the past few years I have seen a trend away from doing any of the “dog work” needed to get search solutions tweaked and tuned to ensure compliance with genuine business needs. People get cut, budgets get sliced, and projects get dumped because (fill in the excuse), and the message “enterprise search doesn’t work” gets promoted. Here’s the secret: when enterprise search doesn’t work, the chances are it’s because people aren’t working on what needs to be done. Everyone is looking for a quick fix, a shortcut, a “no thinking required” solution.

This plays out in countless variations but the bottom line is that impatience with human processing time and the assumption that a search engine “ought to be able to” solve this problem without human intervention cripple possibilities for success faster than anything else.

It is time for search implementation teams to get realistic about the tasks that must be executed and the milestones to be reached. Teams must know how they are going to measure success and reliability, then stick with it, insisting that everyone agrees on the requirements before throwing in the towel at the first executive anecdote that the “dang thing doesn’t work.”

There are a lot of steps to getting even an out-of-the-box solution working well. But none is more important than paying attention to these:

  • Know your content
  • Know your search audience
  • Know what needs to be found and how it will be looked for
  • Know what is not being found that should be

The operative verb here is know, and to really know anything takes work: brain work; iterative, analytical, thoughtful work. When I see reactions from IT like “we’re done,” “no error messages, good,” or “all these returns satisfy the query” upon running a search query that returns any results at all, my reaction is:

  • How do you know the search engine was really looking in all the places it should?
  • What would your search audience be likely to look for and how would they look?
  • Who is checking to make sure these questions are being answered correctly?
  • How do you know if the results are complete and comprehensive?

It is the last question that takes digging and perseverance. It is pretty simple to look at search results and see content that should not have been retrieved and figure out why it was. Then you can tune to make sure it does not happen again.

To make sure you didn’t miss something takes systematic “dog work,” and you have to know the content. This means starting with a small body of content that it is possible for you to know thoroughly. Begin with content representative of what your most valued search audience would want to find. Presumably, you have identified these people through establishing a clear business case for enterprise search. (This is not something for the IT department to do, but for the business team that is vested in having search work for their goals.) Get these “alpha worker” searchers to show you how they would go about trying to find the stuff they need to get their work done every day, and to share some of what they consider the most valuable documents they have worked with over the past few years. (Yes, years – you need to work with veterans of the organization whose value is well established, as well as with legacy content that is still valuable.)

Confirm that these seminal documents are in the path of the search engine for the index build; see what is retrieved when they are searched for by the seekers. Keep verifying by looking at both content and results to be sure that nothing is coming back that shouldn’t and that nothing is being missed. Then double the content with documents on similar topics that were not given to you by the searchers, even material that they likely would never have seen that might be formatted very differently, written by different authors, and more variable in type and size but still relevant. Re-run the exact searches that were done originally and see what is retrieved. Repeat in scaling increments and validate at every point. When you reach points where content is missing from results that should have been found using the searcher’s method, analyze, adjust, and repeat.
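The verification loop described above (know which documents should come back for each query, re-run the same searches as the corpus grows, and flag anything missed) amounts to a recall check. A minimal sketch, with an invented toy corpus and a naive keyword matcher standing in for the real engine:

```python
# Sketch of the validation loop: for each test query, compare the
# documents actually retrieved against the documents a subject-matter
# expert says should be found, and report what was missed.
def validate(queries, retrieve, expected):
    report = {}
    for q in queries:
        found = set(retrieve(q))
        should_find = set(expected[q])
        report[q] = {
            "recall": len(should_find & found) / len(should_find),
            "missed": sorted(should_find - found),
        }
    return report

# Toy corpus: doc id -> text (invented for illustration)
corpus = {
    "d1": "annual safety audit procedures",
    "d2": "safety training schedule",
    "d3": "quarterly budget review",
}

def retrieve(query):  # naive keyword match stands in for the engine
    return [d for d, text in corpus.items() if query in text]

expected = {"safety": ["d1", "d2"], "budget": ["d3"]}
report = validate(["safety", "budget"], retrieve, expected)
print(report["safety"]["recall"])  # 1.0 -- nothing missed for this query
```

As the article prescribes, you would then double the corpus with look-alike material, re-run the exact same queries, and watch whether recall holds; any drop tells you precisely which seminal documents fell out of the results and need analysis.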

A recent project revealed to me how willing testers can be to accept mediocre results: once it became apparent how closely content must be scrutinized and peeled back to determine its relevance, they had no time for that and did not care how bad the results were because they had a pre-defined deadline. Adjustments may call for refinements in the query formulation that require an API to make it more explicit, or the addition of better category metadata with rich cross-references to cover vocabulary variations. Too often this type of implementation discovery becomes a reason to shut down the project, because all the options require human resources and more time. Before you begin, know that this level of scrutiny will be necessary to deliver good-to-great results; set that expectation with your team and management, so that when adjustments are needed, the additional work to get it right will be acceptable to them. Just don’t blame it on the search engine – get to work, analyze, and fix the problem. Only then can you let search loose on your top target audience.

Busy Week in XML Content Management Market

Holiday weeks can be sleepy weeks in enterprise software news, but this week has seen one significant press release each day in the XML content management market, or component content management (CCM) market if you prefer.

  • On Monday, SDL announced the acquisition of XyEnterprise, and the creation of a new business unit based on XyEnterprise and Trisoft called SDL-XySoft.
  • On Tuesday, Really Strategies, the makers of the MarkLogic Server-based system RSuite, announced the acquisition of SaaS CCM provider DocZone.
  • Today, Quark and EMC announced an integration of Quark XML Author with Documentum.

First, the necessary disclosures and caveats. Of the six companies mentioned, we’ve worked with all of them, I believe, and I actually worked for XyEnterprise back in the 1980s and early 1990s. That said, each of these announcements is significant.

SDL, through both organic growth and acquisition, has grown into a substantial business that spans globalization technology, globalization services, CCM technology, and WCM technology. My colleagues Mary Laplante and Leonor Ciarlone know them much better as a company, but I believe it is safe to say that SDL is in a unique position spanning essentially four markets, but four markets that make a great deal of sense under a single umbrella. Product support content managed in a CCM system is the best point of integration for globalization/translation tools. A CCM system is also an excellent underpinning for a global company’s web presence, or web presences (the latter more likely, especially when one considers the need for localized web sites). And services are an essential piece of this puzzle. It’s the rare company that staffs heavily for localization, and even those that do would rarely staff full time to cover all of their language needs. Is SDL in a position to represent one-stop shopping for large companies with complex product content that needs to be localized into many languages? Again, my colleagues could answer that question more precisely, but it’s not a crazy question to ask.

Mary has more on SDL XySoft over in the globalization blog.

The acquisition also breathes new life into XyEnterprise, a company with highly functional, mature technology and excellent executive leadership. We take it as a very positive sign that XyEnterprise CEO Kevin Duffy will become the CEO of the newly combined business unit, reporting to Mark Lancaster, Chairman and CEO of SDL.

The Really Strategies acquisition of DocZone is on a smaller scale, of course, but it is significant in that these two companies represent two leading trends in the CCM marketplace: management of component content in native XML repositories (MarkLogic Server for RSuite and the Documentum Content Store for one version of DocZone) and Software as a Service (SaaS). Count me among those who have been skeptical at times about SaaS for CCM, but DocZone, under Dan Dube’s leadership, has made it work. Really Strategies, in the meantime, has developed an impressive CCM offering on top of MarkLogic Server, and they have quietly built up a strong customer list. We think the combined companies complement each other, and the new management team is excellent, with Barry Bealer as CEO, co-founder Lisa Bos as CTO, Ann Michael in charge of services, and Dan Dube as VP of Sales and Marketing.

Which brings us to Quark and EMC. Both companies have been developing more CCM capabilities. EMC acquired X-Hive, and a lot of XML expertise along with it. They have since added more XML expertise on both the product management and engineering side. As they have integrated X-Hive into the Documentum platform, they have logically looked to build out more capabilities and applications for vertical markets. The integration with Quark XML Author makes perfect sense for them, giving their customers and prospects a ready mechanism for XML authoring in a familiar editorial tool.

For Quark’s part, the move is a logical and very positive next step. They had previously announced this kind of integration with IBM Content Manager, which has a strong presence in the manufacturing space. With EMC, Quark now has a strong partner in the pharma space. Documentum has long dominated pharma, and Quark XML Author, under Michael Boses and previous owner In.Vision, had built up a long list of pharma customers. Boses and his team know the pharma data structures inside and out, and it will be interesting to see the details of how Quark XML Author will integrate with Documentum and its storage mechanisms. (I am sure both EMC and Quark see the potential as more than just the pharma market–government is also a good target here–but the pharma angle will be fruitful I am sure.)

So, what news is on tap for tomorrow?

SDL Scores with SDL XySoft

SDL continues its ambitious build-out of technology solutions for end-to-end content globalization with its acquisition of XyEnterprise, announced on 29 June. From Gilbane’s perspective, it’s a win all the way around, especially for buyers who continue to seek solutions for the more difficult obstacles to multilingual, multichannel publishing.

The vendors win. The acquisition brings immediate scale to both XyEnterprise and SDL Trisoft. Both companies were having to work very hard to reach the next level, and both were at risk of very slow progress through organic growth. The deep expertise and market focus of each company are highly complementary: SDL Trisoft with DITA and high tech, XyEnterprise with S1000D in aviation and aerospace and a proven track record in commercial publishing. SDL Trisoft gets solid North American support and professional services organizations, and XyEnterprise gains the ability to better serve customers in Europe.

Buyers and customers win. First, the consolidation of two of the leading suppliers of component content management gives buyers a new comfort level with vendor viability. Second, efficient, affordable multilingual, multichannel publishing remains a very expensive obstacle for many global 2000 companies. In Gilbane’s new research on Multilingual Product Content, we identify the multilingual multiplier–costs that are solely the result of producing formatted content in another language. SDL XySoft will be able to address the multiplier problem with tight integration of the XyEnterprise XPP publishing engine, which has been a true differentiator for Xy throughout its history. Third, existing and new customers will benefit from the extensive combined experience that SDL XySoft has in complex, standards-based publishing and content management.

The acquisition is also an opportunity to reinforce the core value propositions for XML and component content management. These technologies and practices sit at the nexus of a set of knotty problems: reusing content across applications, repurposing content for different outputs, and translating content for multiple global audiences. A single-vendor, integrated solution that addresses these problems is more evidence that the market is finally making progress towards overcoming the language afterthought syndrome, identified in Gilbane’s new study. Such solutions support the trend towards the:

“. . . steady adoption of content globalization strategies, practices, and infrastructures that position language requirements as integral to end-to-end solutions rather than as ancillary post-processes.” — Multilingual Product Content, Gilbane Group, 2009

This acquisition should be relatively easy for SDL to absorb, as there’s already an established business unit into which Xy’s capabilities fit (in contrast to SDL’s acquisitions of Trisoft and Tridion, which were completely new businesses for SDL). In addition, SDL XySoft has a proven leader in former XyEnterprise president and CEO Kevin Duffy. Duffy takes the role of XySoft CEO, reporting directly to SDL Chairman and CEO Mark Lancaster. Duffy managed to build a small niche software company into a respected player in its market, surviving through good and bad times. He now gets his chance to see what’s possible with the resources of a global organization behind him.

See the SDL press release and the XyEnterprise press release for more information. Gilbane’s study on Multilingual Product Content: Transforming Traditional Practices Into Global Content Value Chains will be published on the Gilbane site in mid-July. The report is currently available through study sponsors Acrolinx, Jonckers, Lasselle-Ramsay, LinguaLinx, STAR, Systran, and Vasont.

Lucid Imagination and ISYS Partner on Lucene/Solr

Lucid Imagination and ISYS Search Software announced a strategic partnership. The agreement enables Lucid Imagination to provide solutions that combine its core Lucene and Solr expertise with the ISYS File Readers document filtering technology. The flexibility of the architecture allows enterprises to develop sophisticated, purpose-built search solutions. By offering ISYS File Readers as part of its Lucene/Solr solutions, Lucid Imagination gives users and developers out-of-the-box capability to find and extract virtually all of the content and formats that exist in their enterprise environment. The Lucid Imagination Web site serves as a knowledge portal for the Lucene community, offering a wide range of information and resources, along with LucidFind, an information retrieval application that helps developers and search professionals access the information they need to design, build, and deploy Lucene- and Solr-based solutions. http://www.lucidimagination.com
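The division of labor in such a pairing is that format-specific filters normalize documents to plain text, which the search engine then indexes. A generic sketch of that extract-then-index pipeline, where the filters and the inverted index are toy stand-ins rather than the actual ISYS or Lucene APIs:

```python
# Toy extract-then-index pipeline: per-format "filters" reduce documents
# to plain text before they reach the indexer, mirroring how document
# filters feed a search engine.
def filter_txt(raw):
    return raw  # plain text passes through unchanged

def filter_csv(raw):
    return raw.replace(",", " ")  # treat delimiters as whitespace

FILTERS = {".txt": filter_txt, ".csv": filter_csv}

def build_index(files):
    """Build an inverted index: term -> set of filenames containing it."""
    index = {}
    for name, raw in files.items():
        ext = name[name.rfind("."):]
        text = FILTERS[ext](raw)
        for term in text.lower().split():
            index.setdefault(term, set()).add(name)
    return index

files = {"notes.txt": "solr deployment notes", "parts.csv": "solr,schema,fields"}
index = build_index(files)
print(sorted(index["solr"]))  # ['notes.txt', 'parts.csv']
```

The value of a rich filter library is visible even in this sketch: without the CSV filter, the terms in parts.csv would never reach the index at all.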

Assessment of My Enterprise 2.0 Conference Predictions

The Enterprise 2.0 Conference was held last week, in Boston. Prior to the event, I made some predictions as to expected learnings and outcomes from the conference. Today, I will revisit those prognostications to determine their accuracy.

Here is the original list of things that I anticipated encountering at the E2.0 Conference this year. Each prediction is followed by an assessment of the statement’s validity and some explanatory comments:

A few more case studies from end user organizations, but not enough to indicate that we’ve reached a tipping point in the E2.0 market: TRUE The number of case studies presented this year seemed to be roughly the same as last year. That is to say, very few. The best one I heard was a presentation by Lockheed Martin employees, an update to the case study they presented at last year’s E2.0 Conference. It was great to hear the progress they had made and the issues they have dealt with in the last year. However, I was genuinely disappointed by the absence of fresh case studies. In fact, the lack of new case studies was the number one conference content complaint heard during the event wrap-up session (and, indeed, throughout the show).

An acknowledgement that there are still not enough data and case studies to allow us to identify best practices in social software usage:
TRUE This turned out to be a huge understatement. There are not even enough publicly available data points and stories to allow us to form a sense of where the Enterprise 2.0 market is in terms of adoption, much less of best practices or common success factors. At this rate, it will be another 12-18 months before we can begin to understand which companies have deployed social software and at what scale, as well as what works and what doesn’t when implementing an E2.0 project.

That entrenched organizational culture remains the single largest obstacle to businesses trying to deploy social software:
TRUE The “C” word popped up in every session I attended and usually was heard multiple times per session. The question debated at the conference was a chicken and egg one; must culture change to support adoption of E2.0 practices and tools, or is E2.0 a transformational force capable of reshaping an organization’s culture and behaviors? That question remains unanswered, in part because of the lack of E2.0 case studies. However, historical data and observations on enterprise adoption of previous generations of collaboration technologies tell us that leadership must be willing to change the fundamental values, attitudes, and behaviors of the organization in order to improve collaboration. Grassroots evangelism for, and usage of, collaboration tools is not powerful enough to drive lasting cultural change in the face of resistance from leadership.

A nascent understanding that E2.0 projects must touch specific, cross-organizational business processes in order to drive transformation and provide benefit: TRUE I was very pleased to hear users, vendors, and analysts/consultants singing from the same page in this regard. Everyone I heard at E2.0 Conference understood that it would be difficult to realize and demonstrate benefits from E2.0 initiatives that did not address specific business processes spanning organizational boundaries. The E2.0 movement seems to have moved from speaking about benefits in general, soft terms to groping for how to demonstrate process-based ROI (more on this below).

A growing realization that the E2.0 adoption will not accelerate meaningfully until more conservative organizations hear and see how other companies have achieved specific business results and return on investment: TRUE Conference attendees were confounded by two related issues; the lack of demonstrative case studies and the absence of a clear, currency-based business case for E2.0 initiatives. More conservative organizations won’t move ahead with E2.0 initiatives until they can see at least one of those things and some will demand both. People from end user organizations attending the conference admitted as much both publicly and privately.

A new awareness that social software and its implementations must include user, process, and tool analytics if we are ever to build a ROI case that is stated in terms of currency, not anecdotes:
TRUE Interestingly, the E2.0 software vendors are leading this charge, not their customers. A surprising number of vendors were talking about analytics in meetings and briefings I had at the conference, and many were announcing the current or future addition of those capabilities to their offerings at the show. E2.0 software is increasingly enabling organizations to measure the kinds of metrics that will allow them to build a currency-based business case following a pilot implementation. Even better, some vendors are mining their products’ new analytics capabilities to recommend relevant people and content to system users!

That more software vendors have entered the E2.0 market, attracted by the size of the business opportunity around social software:
TRUE I haven’t counted and compared the number of vendors in Gartner’s E2.0 Magic Quadrant from last year and this year, but I can definitely tell you that the number of vendors in this market has increased. This could be the subject of another blog post, and I won’t go into great detail here. There are a few new entrants that are offering E2.0 suites or platforms (most notably Open Text). Additionally, the entrenchment of SharePoint 2007 in the market has spawned many small startup vendors adding social capabilities on top of SharePoint. The proliferation of these vendors underscores the current state of dissatisfaction with SharePoint 2007 as an E2.0 platform. It also foreshadows a large market shakeout that will likely occur when Microsoft releases SharePoint 2010.

A poor opinion of, and potentially some backlash against, Microsoft SharePoint as the foundation of an E2.0 solution; this will be tempered, however, by a belief that SharePoint 2010 will be a game changer and upset the current dynamics of the social software market:
TRUE Yes, there are many SharePoint critics out there and they tend to be more vocal than those who are satisfied with their SharePoint deployment. The anti-SharePoint t-shirts given away by Box.net at the conference sum up the attitude very well. Yet most critics seem to realize that the next release of SharePoint will address many of their current complaints. I heard more than one E2.0 conference attendee speculate on the ability of the startup vendors in the SharePoint ecosystem to survive when Microsoft releases SharePoint 2010.

An absence of understanding that social interactions are content-centric and, therefore, that user generated content must be managed in much the same manner as more formal documents:
FALSE Happily, I was wrong on this one. There was much discussion about user generated content at the conference, as well as talk about potential compliance issues surrounding E2.0 software. It seems that awareness of the importance of content in social systems is quite high among vendors and early adopters. The next step will be to translate that awareness into content management features and processes. That work has begun and should accelerate, judging by what I heard and saw at the conference.

So there are the results. I batted .888! If you attended the conference, I’d appreciate your comments on my perceptions of the event. Did you hear and see the same things, or did the intense after hours drinking and major sleep deficit of last week cause me to hallucinate? I’d appreciate your comments even if you weren’t able to be at E2.0 Conference, but have been following the market with some regularity.

I hope this post has given you a decent sense of the current state of the Enterprise 2.0 market. More importantly, I believe that this information can help us focus our efforts to drive the E2.0 movement forward in the coming year. We can and should work together to best these challenges and make the most of these opportunities.
