Curated for content, computing, and digital experience professionals

Author: Leonor Ciarlone

OpenText Bolts Through the Hummingbird Door

Update 7/24: Let the talks begin… With the review of the Symphony bid now officially postponed, the door widens for what could be an interesting bidding war in the ECM market. Asking for at least 10 cents more per share than OpenText has offered, Hummingbird has rescheduled the Symphony bid review for August 18th while negotiations take place. Considering that both suitors are already shareholders (OpenText holds 22.3% versus Symphony’s 18%), it is unlikely that either will back out without some amount of drama.
Update 7/13: According to a press release, the Hummingbird Board of Directors will not issue a recommendation on the OpenText bid before July 25. Until that time, it is advising Hummingbird shareholders to “take no action” and to support the Symphony acquisition. Interestingly, the review of the Symphony deal will take place July 21. We’ll keep you posted.
Clearly the door was open. Although I called myself “stunned” when the original bid for Hummingbird was not a technology-to-technology play, I remain so, given that OpenText was not the player I thought “most likely to acquire.” In fact, it was no secret that OpenText was one of the players “most likely to be acquired!” I’m thinking the final yearbook for the class of 2006 may have more surprises.
The OpenText bid is a “lock-up” agreement, which, according to Information World Review, means that Hummingbird shareholders agree to a deposit from OpenText and agree not to withdraw from the deal (subject to timing and regulatory compliance issues).
Aside from the many debates to be had on the consolidation effect of this deal, Hummingbird shareholders and financial analysts must certainly be gratified by the 20% premium of the OpenText bid over Symphony’s. More at Image and Data Manager Online, CMS Watch, and Bloomberg.
Whether this deal happens is still up for grabs. OpenText’s bid is due by the end of the week. We’ll keep you posted.

The ECM and BPM Intersection: Defining “More Than Simple” Workflow

As a former glue person, I spent numerous hours trading acronyms and definitions with IT analysts on the subject of data and process modeling in the content versus data worlds. Circa 1999, my friend Bob Boeri and I even went so far as to relate logical data models and data dictionaries to DTD structures, using Near and Far Designer as analogous to the more entrenched data modeling tools.

Our goal was to create “common ground” between IT’s deep but solely data-centric view of business applications and the needs of various business units whose focus was decidedly document-centric. Once our “data is content and content is data” analogy was mantra, we had an easier time with subsequent process modeling discussions; i.e., “what we want our content to do with your data” and vice versa. (Reminiscent of those “how did your chocolate get into my peanut butter?” commercials.)
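To give that analogy a concrete flavor (reconstructed from memory rather than from our actual models), the same “customer” chunk expressed as a DTD content model reads almost like an entity definition from a data dictionary:

    <!-- The document world's "logical model": a hypothetical content structure -->
    <!ELEMENT Customer  (Name, AccountId, Address)>
    <!ELEMENT Name      (#PCDATA)>
    <!ELEMENT AccountId (#PCDATA)>
    <!ELEMENT Address   (#PCDATA)>

Read it as you would a data dictionary entry – entity, attributes, types – and the common ground appears.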

In the ECM and BPM intersection, those discussions are once again becoming commonplace as more and more complex business processes require hybrid combinations of unstructured content, structured XML content, and traditional data from back-end systems. Hence, information analysts who work with IT and business units must define a common knowledge base of process modeling requirements, flows, and techniques.

More than simple workflow (a.k.a. “create, edit, approve, publish”), process models for functions such as compliance, claims processing, and contract management need to combine data-centric techniques with the content-centric, human-driven interactions these functions require. In fact, just as data sources are now hybrids, so too are the processes that require, manipulate, and share them. The BPM suite market is increasingly adding simple document management functionality at the business monitoring level to account for content-centric requirements.

More interesting is the market’s approach to workflow, which still appears either data-centric or document/content-centric in terms of its standard modeling languages. In fact, a BPM suite vendor’s architecture choice for process modeling and execution is also a clue to its data versus content strengths via support for XPDL (XML Process Definition Language) versus BPEL (Business Process Execution Language). Highlights (a brief sketch of the contrast follows the list):

  • XPDL – initiated and managed by the Workflow Management Coalition (WfMC), XPDL is decidedly human workflow-centric and more oriented for document-driven processes. No surprise that workflow and document management pure-plays, and some ECM players with BPM modules, have strong XPDL modeling and processing engines. More info at the WfMC.
  • BPEL – originally submitted to OASIS by IBM, Microsoft, and BEA, BPEL is decidedly data-centric and more oriented for straight-through processing. No surprise that platform and middleware vendors entering the BPM suite market have strong BPEL modeling and processing engines. More info at BPELSource and OASIS.

One of the more significant questions at the ECM and BPM intersection is, “Where is the best of both worlds in terms of process modeling for complex workflow that is a human- and data-driven hybrid?”
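To make the contrast concrete, here is a minimal, hypothetical sketch; the process names and participants are invented, and only core elements of XPDL 1.0 and BPEL4WS 1.1 are shown. The XPDL fragment models a human review step with a named performer, while the BPEL fragment models machine-to-machine, straight-through processing:

    <!-- XPDL sketch: a human-performed activity in a claims process -->
    <Package xmlns="http://www.wfmc.org/2002/XPDL1.0">
      <WorkflowProcesses>
        <WorkflowProcess Id="ClaimReview" Name="Claim Review">
          <Activities>
            <Activity Id="ReviewClaim" Name="Review Claim Documents">
              <Performer>ClaimsAdjuster</Performer> <!-- a person, not a service -->
            </Activity>
          </Activities>
        </WorkflowProcess>
      </WorkflowProcesses>
    </Package>

    <!-- BPEL sketch: the same claim settled service-to-service, no human step -->
    <process name="ClaimSettlement"
             xmlns="http://schemas.xmlsoap.org/ws/2003/03/business-process/">
      <sequence>
        <receive partnerLink="client" operation="submitClaim"
                 variable="claim" createInstance="yes"/>
        <invoke partnerLink="backOffice" operation="settleClaim"
                inputVariable="claim"/>
      </sequence>
    </process>

The point of the contrast: XPDL makes the human participant a first-class modeling element, while BPEL orchestrates service invocations. Hybrid, human- and data-driven processes need both.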

The ECM and BPM Intersection

We’ve been monitoring activity in the BPM market with an eye on the connections between ECM and BPM technologies as they apply to content-centric business processes and applications. The evolution of BPM suites has been particularly interesting and in many ways, analogous to the patterns that formed the current ECM suite market. Technology convergence, vendor consolidation, a full slate of interchangeable acronyms, and rising levels of market confusion surrounding the definitive list of suite-level components are all evident as the BPM suite market continues to define itself. Sound familiar?

BPM suites are clearly an emerging market. Broadly defined by the ability to model, execute, simulate, and optimize business processes, these suites consolidate technologies such as analytical modeling, rules design and execution, workflow, data aggregation, and process optimization into a single platform vision. Numerous pure-play BPM providers within each technology segment are evolving toward “the vision” in different ways.

I am positive that this is not a “never the twain shall meet” situation when it comes to content strategies and ECM technologies. Process and content are siblings; it is only a matter of time before many of the isolated technologies that support both merge in a more tangible manner than simple workflow. This kind of ECM and BPM intersection is more complex than the traditional integration of the BPM market’s straight-through processing (STP) expertise with data-centric, transactional content. Rather, it will be an emerging focus on what we view as process content, or content that travels through a complex, human-driven, interactive, and iterative lifecycle.

EMC’s acquisition of ProActivity is a tangible indicator of this evolving intersection, demonstrated as well by BEA’s acquisition of Fuego, FileNet’s ongoing investment in its BPM components, the progression of DM/BPM players such as Global 360 and Hyland Software, and Lombardi Software’s integration with Microsoft Office. Stay tuned for more as the market heads toward cohesive vertical and horizontal solutions — critical for both traction and helping the user community understand implementation value. We’ll keep you posted.

Digital Content: Federal Focus on Music

There is an amendment being considered in the House of Representatives to tackle the digital music phenomenon yet again. Called the “Section 115 Reform Act of 2006,” it focuses specifically on “licenses for digital uses of musical works.” Having gotten my 11-year-old daughter her coveted iPod at the end of May, I took particular notice. The draft discussion is dated May 12.

According to Chris Lindquist over at the CIO Magazine blog, the bill “could make it necessary to acquire licenses for every digital copy of content, even cached, network, and RAM buffer reproductions.” Makes one wonder what bills may be in store for digital content in general, not just in the music industry…

Hummingbird Acquisition: Open Door or Not?

See updated links for 5/30 and 5/31 below.
Friday was a busy day and I did not see the press release on the Hummingbird acquisition until about 3pm. Curiosity killed that cat, and I took some time to listen to the archived conference call (find the number in the “Conference Call” section of the Hummingbird press release).

I got more than I bargained for on a Friday afternoon. Not surprised to find an audience of financial analysts, I was more than a bit surprised to hear comments such as “stunned,” “ridiculous,” and “questionable as to fiduciary responsibility.” Certainly, many of the financial analysts asked (redundant) questions in the manner that reporters would use, i.e., slow, steady, and determined to get an answer. Others, however, were more emotional than I’ve ever experienced on an acquisition- or earnings-type call, or from financial analysts for that matter. Some analysts advised shareholders to “vote against this” with vigor. It got so interesting that I realized I had listened to the entire call without intending to.

I must say I too was “stunned” at the announcement because the acquisition was not a technology-to-technology play. I have followed Hummingbird for years and think they have done a great job educating the market on ECM as well as expanding a very tangible beachhead in the legal vertical. So *my* stunned was that I thought it would be… well, just someone else! Just who is Symphony Technology Group? According to its Web site, it is a strategic holding company. According to Hummingbird, it was the only *serious* bidder they spoke to about an acquisition, and talks began in February.

The “open door or not” title describes the crux of the emotion on the part of the financial analysts. In essence, Hummingbird described a process in which Symphony approached Hummingbird; Hummingbird did not solicit other bids from other financial or technology vendors. At the same time, however, Hummingbird was repeatedly adamant in stating that Symphony was the only serious bidder. Clearly there was at least one more.

In response to repeated analysts’ opinions that the valuation was extremely low, that the company was worth far more, and that the bidding process should have been more open, the Hummingbird response was: “The door is now open, other bidders can come to the table; we were not shopping – we did not put ourselves on the block.” The “or not” part of the “is the door open?” question is that, simultaneously, Hummingbird stated that the process will move swiftly and that the company is confident Symphony has no other technology company holdings that overlap Hummingbird’s expertise in ECM. Also according to Hummingbird, “nothing has changed in our company” and there are no management contracts in place with Symphony for the deal.

The full disclosure documents with the deal details will be available tomorrow, Tuesday 5/30, according to Hummingbird. I am sure more blog entries will add to my report. I’ll update this entry with links I find tomorrow.

Reuters has weighed in…

Tony Byrne from CMSWatch has weighed in…

Tuesday 5/30 Update:
Computer Business Review Online has weighed in…

Wednesday 5/31 Update:
Canada.com’s National Post has weighed in… Note the quote: “Fred Sorkin and Barry Litwin, Hummingbird’s chairman and chief executive, respectively, own about 12% of the outstanding shares and don’t want the company bought by another technology firm. ‘The company might cut [our] products,’ said Mr. Sorkin, although all offers will be entertained. ‘This leads to distraction and lack of value creation.’”

Arrangement Agreement Papers Available… The site is www.sedar.com, “the official site that provides access to most public securities documents and information filed by public companies and investment funds with the Canadian Securities Administrators (CSA) in the SEDAR filing system.” Search for Public Companies = Hummingbird + Date Filed = May 26, 2006 if you are interested. Curiously, the Document Type is listed as “Other”.

More on Microsoft

One of the publications I find very useful and always relevant is Knowledge@Wharton, produced by the Wharton School at the University of Pennsylvania. Check out this timely article, Microsoft’s Multiple Challenges: Is its Size a Benefit or a Burden?

Excerpt: “Microsoft announces that it will spend about $2 billion to fend off rivals such as Google and thwart Sony’s video game ambitions, and the company loses more than $30 billion in market capitalization in a day. Fair trade or overreaction?” My favorite quote? Wharton finance professor Andrew Metrick’s comment that “$2 billion to Microsoft is like a pimple on an elephant.”

Certainly true, but the investment demonstrates the range of rivals Microsoft faces in retaining its “titan” status for the long term.

Tuning in to Web 2.0: The SafariU Channel

Hopefully some of you tuned in to our webinar yesterday and have had a chance to read the companion whitepaper. My radio theme – or podcast if you are so inclined – for the title of this blog is intentional. In fact, I also toyed with “Mixing Content and Web 2.0” to illustrate “the remix factor” — an intrinsic part of the Web 2.0 “engaging the user” vision and one of the reasons why professors call O’Reilly Media’s SafariU “revolutionary.”

Remixing. Familiar to your teenagers and made famous by iTunes, but not a word well known in corporate circles. Using Web services and MarkLogic Server, O’Reilly delivers a user interface that allows higher education professors to reassemble – or remix – sections and chapters from a vast library of O’Reilly and partner books to, in CJ’s words, suit their needs. Suit their needs. Since when do software applications suit user needs without the word “customization” being part of the equation?

In terms of content applications and Web 2.0, since now. Is this analogous to the radio industry’s evolution? Absolutely. Can it provide new revenue for publishers through a compelling product? Definitely. Ian Krantz over at the Really Strategies blog continues the conversation. And CJ Rayhill, O’Reilly’s Chief Information Officer and General Manager of O’Reilly’s Education Division, is obviously the source.
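Under the covers, this is the kind of operation an XQuery engine like MarkLogic Server makes natural. Here is a hypothetical sketch (the document and element names are invented; this is not O’Reilly’s actual code) of assembling a professor’s custom coursepack from selected chapters:

    (: Hypothetical remix: pull three chapters, in the professor's chosen
       order, into a new coursepack document. :)
    <coursepack>
    {
      for $id in ("xml-basics", "xquery-intro", "web-services")
      return doc("library.xml")//chapter[@id = $id]
    }
    </coursepack>

Because the source chapters live as XML, the “remix” is a query, not a copy-and-paste editing project.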

Yesterday, the webinar audience asked me what parts of the SafariU story are universally applicable. Read on to see what I said. Also, feel free to submit questions and comments here about what you read and hopefully listened to about the SafariU case study. (I will let you know when the archive is available). Let’s continue the conversation!

What O’Reilly success factors are universal?

“When the Gilbane team evaluates a customer story as a potential CTW case study, we specifically look for elements of the deployment that would benefit other adopters of content technologies. So, how can we generalize O’Reilly’s success? Here are a few key factors that are universal.

First, we could not agree more with the Web 2.0 principle that data (including what O’Reilly calls “atomized content”) is the next Intel Inside. Having spent over 20 years researching and writing about content technologies, The Gilbane Report has consistently focused on how content technology can be used for enterprise business applications and how content and computing will evolve. Today, the power of “content as a corporate asset” is clearly one of the success factors for a myriad of business applications, commercial products, and community and government services. The same can be said for the rising intersection between content, collaboration, and community – technology is enabling it and SafariU has clearly delivered it.

Secondly, as XML enjoys its eighth birthday this month, its application to gold source content is evident throughout many industries. Although XML was applied largely to data exchange during its first five years, the more recent years demonstrate the value of content intelligence, flexibility, and reuse as enabled by XML and sister standards like XQuery. This value is reaping significant ROI for those making the commitment and investment.

Finally, O’Reilly is engaging its customers in new ways while simultaneously delivering strategic improvements to higher education. Their approach demonstrates the power of CJ’s infrastructure quote describing MarkLogic Server, which gives O’Reilly the ability to single-source both their content and infrastructure, expand into higher education today, and move into more verticals in the near future. These are universal factors that you can take back as input to your own content strategies.” – Leonor Ciarlone

“DITA” Help

I had flashbacks as I sat in the DITA session at the Boston Gilbane Conference. True flashbacks. Back to the days of creating a complex automated compilation “system” to create context-sensitive help for a Windows-based manufacturing control application. Partnered with an object-oriented developer who had better things to do than “play nice” with a technical writer, we managed to build a routine based on Word macros, RTF, Excel, and DLLs to output coded Microsoft help files linked directly to RC files. Convoluted, but it made us proud.
The flashback was not about the coding, although I felt compelled to document the story. It was more about the writing methodology developed with my fellow technical writers. All about standard topics, we developed a core set of help panels based on chunking information into concepts, procedures, reference info (UI and dialog box help), and glossary items. We developed a simple hypertext strategy with non-negotiable rules for what should link to what – and when. (Ended up with a nice triangle graphic for a cheatsheet.) It worked so well that I wrote and delivered a help standards paper for ACM in… 1993. Still lives!

So, back to the DITA session, which was excellent – CM4, featuring IBM and Autodesk – two real-life and useful stories from implementers in the documentation trenches. Bill wrote about DITA in practice back in October, noting that Adobe techdoc’ers are also DITA users.

And finally, back to the point of writing methodologies (a.k.a. a content strategy component), which I believe is one of the key drivers of the rapid adoption of DITA. DITA = topics = chunking. It is as much a methodology as it is a technology. Information Mapping, Inc., well-known to techdoc folks as a longtime proponent of information organization = usability, clearly agrees. They have rolled their methodology quite nicely into Content Mapper, blending DITA in as well. Their entry into the authoring software market, full of vendors with equally strong heritage, is a good sign for those following the pulse of ECM as strategy (more on that later).
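If you have not seen a DITA topic, here is a minimal, hypothetical task topic (the id and text are invented) that shows how directly the standard encodes that concept/procedure/reference chunking discipline:

    <!-- A minimal DITA task topic: a "procedure" chunk, one step per <step> -->
    <task id="print-help-topic">
      <title>Printing a help topic</title>
      <taskbody>
        <steps>
          <step><cmd>Open the topic you want to print.</cmd></step>
          <step><cmd>Choose File &gt; Print.</cmd></step>
        </steps>
      </taskbody>
    </task>

Concepts and reference material get their own topic types (<concept> and <reference>), which is exactly the chunking we enforced by hand back in 1993.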

Takeaways? Information architecture is hot. Technical writer with online help expertise = DITA fan. Getting information from those in the trenches is key — check out What’s New at Gilbane.com and register for a discussion on real-world DITA adoption on January 11th.


© 2024 The Gilbane Advisor
