
Month: January 2008 (Page 2 of 5)

Our “New/Old” XML Practice

Today we announced our new “XML Technologies & Content Strategies” consulting service. The service will be led by Lead Analyst Bill Trippe, who is joined by Mary Laplante and Leonor Ciarlone. See the press release, and Bill’s introductory post on the practice’s new blog. Bill, Mary, and Leonor all have long and deep experience in this area and make an exceptionally strong team. You can reach them at:

You’ll note the “New/Old” in this post’s title. Many readers will know that this is because we have always been involved in XML consulting, and before it existed we were involved in SGML consulting, which of course is where XML came from. In fact, though we have changed the name of the company a couple of times, our original company was formed in 1986 to advise organizations like the DoD, Department of Commerce, Lockheed, Fidelity, American Airlines, and many more, on the use of descriptive markup languages and meta-languages like SGML. I first met Bill in 1987 when he was at Mitre investigating SGML. You can still read many of our monthly reports from the ’90s that cover markup technologies, although Tim Bray, who edited the Gilbane Report in the late ’90s and is one of the authors of the XML standard, didn’t write much about XML then since it was still in “stealth” mode. It was also important then to stay neutral about standards, which obviously would have been tough for Tim at the time.

So if we’ve been doing this all along, what’s new? In short, critical mass, information infrastructure, and demand. The sheer volume of XML being created is reaching a level that demands enterprise strategic attention. XML is already part of many organizations’ information infrastructure whether they know it or not. And while many of our consulting clients are focused on specific applications, there are also many who are looking at the big picture and really want to understand what information encoded in XML can do strategically for their business. More from today’s press release:

Gilbane’s XML Technologies and Content Strategies Practice is designed for IT and business managers who need to gain control of critical content, increase collaboration across enterprise applications, improve efficiencies through faster and more flexible information distribution between business partners and customers, and implement new business models that can keep pace with today’s internet-speed competitive requirements. The amount of XML content being generated today is staggering, as large infrastructure providers like Microsoft, IBM, Google, Oracle, and others offer tools and technologies that generate and manage XML information. While many organizations are taking advantage of XML within departmental applications, most companies are not even close to taking advantage of the XML information being created and utilized by popular applications including office software and database repositories. Significantly, many executives are unaware of the XML content and data that are untapped assets within their organizations.

Welcome to XML Technologies and Content Strategies

As Frank noted in our main blog and in the related press release, this blog is part of our launch this week of a new practice focused on the technologies, strategies, and best practices associated with using XML in content management. With this focus on XML, the new practice is broad–XML is fundamental to so many aspects of content management. Yet the focus on XML also compels us to look at content management through a certain lens. This begins with the vendor offerings, where nearly every platform, product, and tool has to meet anywhere from a few to myriad XML-related requirements. As XML and its related standards have evolved and matured, evaluating this support has become a more complex and considered task. The more complex and feature-rich the offering, the more difficult the task of evaluating its support.

And indeed, the offerings are becoming more complex, especially among platform vendors like Microsoft, IBM, and Oracle. Looking at SharePoint means evaluating it as a content management platform, but also looking specifically at how it supports technologies like XML forms interfaces, XML data and content feeds, and integration with the XML schemas underlying Microsoft Word and Excel. It also means looking at SOA interfaces and XML integration of Web Parts, and considering how developers and data analysts might want to utilize XML schema and XSLT in SharePoint application development. Depending on your requirements and applications, there could be a great deal more functionality for you to evaluate and explore. And that is just one platform.
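To make the XML-data-feed point a little more concrete, here is a minimal, hypothetical sketch of pulling items out of an XML content feed with Python’s standard library. The feed structure and element names below are invented for illustration, not taken from any particular platform:

```python
import xml.etree.ElementTree as ET

# A hypothetical XML content feed, similar in spirit to the feeds a
# platform can expose. The element names here are invented.
feed_xml = """
<feed>
  <item><title>Q1 Report</title><author>M. Laplante</author></item>
  <item><title>Spec Draft</title><author>B. Trippe</author></item>
</feed>
"""

root = ET.fromstring(feed_xml)

# Extract (title, author) pairs from each item in the feed.
items = [(item.findtext("title"), item.findtext("author"))
         for item in root.findall("item")]

for title, author in items:
    print(f"{title} by {author}")
```

The point is not this particular snippet, of course, but that once content is in XML, any tool in the stack can consume it; evaluating a platform means asking how well it exposes and accepts such feeds.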

But understanding the vendor–and open source–offerings is only one piece of the XML content management puzzle. Just as important as choosing the right tools are the strategic issues in planning for and later deploying these offerings. Organizations often don’t spend enough time asking and answering the biggest and most important questions. What goals do they have for the technology? Cost savings? Revenue growth? Accelerated time to market? The ability to work globally? These general business requirements need to then be translated into more specific requirements, and only then do these requirements begin to point to specific technologies. If XML is part of the potential solution, organizations need to look at what standards might be a fit. If you produce product support content, perhaps DITA is a fit for you. If you are a publisher, you might look at XML-based metadata standards like XMP or PRISM.

Finally, XML doesn’t exist in a content management vacuum, removed from the larger technology infrastructure that organizations have put in place. The platforms and tools must integrate well with technologies inside and outside the firewall; this is especially true as more software development is happening in the cloud and organizations are more readily embracing Software as a Service. One thing we have learned over the years is that XML is fundamental to two critical aspects of content management—for the encoding and management of the content itself (including the related metadata) and for the integration of the many component and related technologies that comprise and are related to content management. Lauren Wood wrote about this in 2002, David Guenette and I revisited it a year later, and the theme recurs in numerous Gilbane writings. The ubiquitous nature of XML makes the need for strategies and best practices more acute, and also points to the need to bring together the various stakeholders–notably the business people who have the content management requirements and the technologists who can help make the technology adoptions successful. Projects have the best chance of succeeding when these stakeholders are brought together to reach consensus first on business and technical requirements, and, later, to reach consensus on technology and approach.

As Frank noted, this is “New/Old” news for all of us involved with the new practice. I first discussed SGML with Frank in 1987 when I was at Mitre and responsible for a project to bring new technology to bear on creating specifications for government projects. Frank had recently launched his technology practice, Publishing Technology Management. Leonor was a client at Factory Mutual when I worked for Xyvision (now XyEnterprise) in the early 1990s. And I probably first met Mary at a GCA (now IDEAlliance) event during my Xyvision days and when she worked for a competitor, Datalogics. We are, in the polite vernacular of the day, seasoned professionals.

So welcome to the new blog. Watch this space for more details as we announce some of the offerings and initiatives. I plan to blog actively here, so please add the RSS feed if you prefer to digest your material that way. If you have ideas or suggestions, don’t hesitate to post here or contact me or any of the other analysts directly. We look forward to the interaction!

Search Adoption is a Tricky Business: Knowledge Needed

Enterprise search applications abound in the technology marketplace, from embedded search to specialized e-discovery solutions to search engines for crawling and indexing the entire intranet of an organization. So, why is there so much dissatisfaction with results and heaps of stories of buyer’s remorse? Are we on the cusp of a new wave of semantic search options or better ways to federate our universe of content within and outside the enterprise? Who are the experts on enterprise search anyway?

You might read this blog because you know me from the knowledge management (KM) arena, or from my past life as the founder of an integrated enterprise library automation company. In the KM world a recurring theme is the need to leverage expertise, which is best done in an environment where it is easy to connect with the experts; that seems to be a dim prospect in many enterprises. In the corporate library world the intent is to aggregate and filter a substantive domain of content, expertise, and knowledge assets on behalf of the specialized interests of the enterprise, a function too often treated as legacy infrastructure. Librarians have long been innovators at adopting and leveraging advanced technologies, but they have also been a concentrating force for facilitating shared expertise. In fact, special librarians excel at providing access to experts.

We are drowning in technological options, not the least of which is enterprise search with its complexity of feature-laden choices. However, it is darned hard to find instances of full search tool adoption, or users who love the search tools delivered on their intranets. So, I am adopting my KM and library science modes to elevate the discussion about search to a decidedly non-technical conversation.

I really want to learn what you know about enterprise search: what you have learned, discovered, and experienced over the past two or three years. This blog and the work I do with The Gilbane Group are about getting readers to the best and most appropriate search solutions that can make positive contributions in their enterprises. Knowing who is using what, where it has succeeded, and what problems and issues were encountered gives me information I can use to communicate those experiences in aggregate. I am reaching out to you, and those you refer, to complete a five-minute survey to open the door to more discussion. Please take the survey now; you will then have the option to get the resulting details in my upcoming research study on enterprise search.

Just to prove that I still follow exciting technologies as well, I want to relay a couple of news items. First is a recent category in search, “active intelligence,” adopted as Attivio’s tag line. Attivio is a start-up led by Ali Riaz and officially launched this week from Newton, MA. Then, to get a steady feed of all things enterprise search from guru Steve Arnold, check out his new blog, a lead-up to the forthcoming Beyond Search: What to Do When Your Search Engine Doesn’t Work, to be published by The Gilbane Group. You’ll be transported from the historical, to the here and now, to the newest tools on his radar screen as you page from one blog entry to another.

EMC Announces Captiva eInput 2.0

EMC Corporation (NYSE:EMC) announced its newest distributed document capture solution that offers advances in Web-based distributed capture, EMC Captiva eInput 2.0. eInput 2.0 is designed to make the scanning and indexing of paper documents from remote offices faster and easier, automating the classification of documents, extraction of data, and validation of information directly from a Web browser. Captiva eInput works as an extension to the Captiva InputAccel platform, delivering distributed capture with the EMC Documentum platform to address transactional content management (TCM) applications, such as loan processing, insurance claim processing, invoice processing, new account enrollment, and case management. In addition, working with InputAccel, eInput integrates with a wide array of back-end systems, including enterprise content management (ECM), business process management (BPM), and other enterprise applications. Together, these solutions enable organizations to add distributed capture capabilities to their existing business processes and information infrastructure.

Would Margaret Fuller have joined TAUS?

More than likely. One of her more famous quotes was: “If you have knowledge, let others light their candles with it.”

If Fuller were still alive, would social networking have forged a connection somehow with Jaap van der Meer, Director of the Translation Automation User Society, otherwise known as TAUS? I’d bet money on it.

Long known as a language industry pioneer and visionary, van der Meer directs the TAUS-driven call for knowledge sharing as the driver of change for the translation industry. Efforts such as:

    • spearheading a language data sharing initiative, including a planned platform for cross-industry sharing
    • providing executive forums to discuss and design new translation business models, and
    • creating a roadmap for sharing translation memories

are just a few examples of the innovation within this proactive organization.

Since November 2004, TAUS has managed to bring together more than sixty companies that exchange use cases, best practices, and technology roadmaps specific to the language industry. The resulting membership is far from a “weighted” crowd; rather, it is a well-rounded collection of end users, service providers, and technology vendors with a shared interest in change.

Shared vision. Common goals. Concrete results. No more secret languages.

This mission statement for 2008 defines how TAUS plans to drive an “agenda of change” that stimulates innovation, automation, and collaboration for the industry. Impressive goals, to say the least. Find out more by requesting a copy of the TAUS Annual Plan 2008. It is an excellent read.

The Social Language

Although it is already mid-January, I would still like to wish everyone a very good 2008! It definitely looks to be an interesting year.

Back to blogging, after a very long pause. The reason was my major geographical transition: after 8 very nice years in Boston, we returned to bilingual Finland and the very multilingual European Union last autumn. The time required for a trans-Atlantic move is not to be underestimated!

Leonor’s interview with Director General Lonnroth about languages in the EU is an excellent description of the world on this side of the Atlantic. On a very personal note, I love tuning in to YLE Mondo radio every time I am driving; it is a local station broadcasting news from several different countries. I even get NPR! I listen to German, French, Spanish, and Italian news, and notice the differences not just in the language but also in the content. Even more fun is listening to news from Australia and South Africa, which really changes the world perspective. A good reminder that from Africa or Australia, many things do look different than they do from the US or from Europe. How lovely it would be to understand what they say in Chinese, Japanese, or Arabic, to name just a few languages!

Anyway, things are finally starting to find their places in our new home, so I am back to blogging. We had a wonderful Gilbane conference in Boston at the end of November; it got so many ideas going in my head, especially about the social aspects of content, search, collaboration – and of course language. The question “Where are languages in social media?” was asked at the conference, and the first answer was along the lines of: gee, that is a tough thing to solve. True – and yet I am convinced that we will begin to see very new types of tools and solutions. It was interesting to note that several examples were given of how, in corporations, social media enabled people to find a speaker of a language inside the organization. “Through our collaboration tool, we found someone who speaks Japanese and can check our translations.” “We realized someone in our German office could translate the materials we needed.” Language skills become yet another kind of expertise to be shared in communities.

Another interesting point was how often machine translation (MT) and its usefulness came up. With the amount of user-generated information exploding, there is no chance to human-translate everything. Could this be the real coming of age of MT?

I spoke with one multilingual service provider who said that they have started receiving requests for checking user-generated content on corporate community sites. Interesting. I would guess that the need for automated checking of “bad words” increases as more content on corporate sites comes not from employees but from anyone on the web. Enterprise search has to be multilingual too, and there is always room to improve.
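A first cut at the kind of automated “bad word” checking mentioned above could be as simple as matching tokens against per-language word lists. Here is a minimal sketch; the word lists, function name, and sample content are placeholders for illustration, and a real moderation system would need to handle inflection, obfuscation, and context:

```python
# Hypothetical per-language lists of disallowed words. Exact-match
# lookup is far too crude for production use, but it shows the basic
# shape of language-aware content checking.
BLOCKLISTS = {
    "en": {"badword"},
    "de": {"schimpfwort"},
}

def flag_content(text: str, lang: str) -> list[str]:
    """Return the disallowed words found in `text` for language `lang`."""
    words = text.lower().split()
    return [w for w in words if w in BLOCKLISTS.get(lang, set())]

print(flag_content("This contains a badword here", "en"))  # ['badword']
```

Even this toy version makes the multilingual point: the check only works if you know which language the content is in, which is itself a non-trivial problem for user-generated text.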

As Leonor pointed out: collaboration yields knowledge. That knowledge is multilingual.

Oracle to Acquire BEA Systems

Oracle Corporation (NASDAQ: ORCL) and BEA Systems (NASDAQ: BEAS) announced they have entered into a definitive agreement under which Oracle will acquire all outstanding shares of BEA for $19.375 per share in cash. The offer is valued at approximately $8.5 billion, or $7.2 billion net of BEA’s cash on hand of $1.3 billion. The Board of Directors of BEA Systems has unanimously approved the transaction. It is anticipated to close by mid-2008, subject to BEA stockholder approval, certain regulatory approvals and customary closing conditions.

Coveo Updates Enterprise Search Technology

Coveo Solutions Inc. announced that it has added new capabilities to its Coveo Enterprise Search technology. The new capabilities include:

    • greater scalability, with support for Windows 64-bit operating systems, which improves search indexing and query performance and expands access to more powerful servers
    • new connectors for Symantec Enterprise Vault v2 that offer more flexibility and control
    • out-of-the-box integration with Microsoft Exchange and Symantec Enterprise Vault email archives, allowing integrated search across all corporate email content
    • a re-factored connector that enables indexing and search of massive email archives
    • an enhanced connector that improves performance and overall user experience, is easier to deploy, and delivers a CRM search interface out-of-the-box
    • improved performance and precision of indexing and searching
