Curated for content, computing, and digital experience professionals

Month: July 2005

Oracle Buys Context Media

Oracle has bought the assets of Context Media. The content integration technology will complement Oracle’s Content Services 10g and collaboration products. Details of the acquisition have not been officially announced.

WebSideStory Debuts Active Dashboard with Launch of HBX 3.0

WebSideStory (NASDAQ:WSSI) announced the launch of HBX Analytics 3.0, the latest version of its web analytics service. HBX 3.0 introduces Active Dashboard, a web analytics tool that enables marketers to conduct real-time business scenario analysis to forecast the effect of web site changes. Offered as an extended service through WebSideStory’s professional services team, Active Dashboard gives marketers a graphical representation of their web metrics and how those metrics affect their business objectives. HBX 3.0 is the cornerstone of the WebSideStory Active Marketing Suite, a collection of integrated, on-demand digital marketing applications that includes web analytics, site search, web content management, and keyword bid management.

Aligning Expectations With XBRL’s Maturity

A couple of days ago I wrote about an instance of XBRL’s leaping over the market chasm to see use in a no-nonsense, pragmatic, “early majority” application. This isn’t just idle marketing chatter. The question of where XBRL stands along the technology adoption curve is one that any organization or company thinking about using XBRL needs to be asking. Just how mature is this technology? How big a bet can you put on it? And if you do make a bet, what steps do you need to take to hedge it?

Just in case some readers are not familiar with Geoff Moore’s work on how technologies get adopted, here is a picture of what we are talking about.

The gist of Moore’s argument is that the movement from Early Adopters to the mainstream market is discontinuous. The things that attract Early Adopters to a technology are not the same things that matter to the primary market. This isn’t just a small problem. Many promising technologies never make it across the chasm. They get hyped in the technology press and by the people who are excited because the technologies are elegant or innovative–but many of these technologies fall into the chasm and never get to the point where they make a difference in the daily operations of most companies.

Working on both sides of the chasm to effect a crossing is a tricky problem. You need Early Adopters to get a new technology started. Early Adopters are the organizations, and the people within them, who see a chance to use a new technology to build an entirely new market or to gain sudden, overwhelming competitive advantage. They are visionaries. They invest in the new technology when no one else will because they hope to change the rules of the game. Early Adopters are willing to take big risks in order to get a shot at big returns. Without Early Adopters, new technologies would never get out of the lab.

Early Majority buyers have a very different view of risk. They are managers, not visionaries, and are interested in new technologies when the technologies are the only way to reach key objectives. Early Majority buyers are not wholly risk averse, but they do need the reassurance that comes from seeing a variety of vendors offering a technology. They have no interest in being the first and only company out on a technology frontier. They also need to see a clear, near term business case before they invest in a new technology. As I noted in my article a couple of days ago, the adoption of XBRL for banking call reports is a good example of a pragmatic, early majority application. Risks can be controlled, there will be a substantial, certain payoff from Call Report Modernization, and XBRL is clearly the best way to do the job.

Not surprisingly, movement across the chasm does not happen all at once. Even as the first, highly focused applications move across the chasm, there continues to be a lot of visionary Early Adopter activity over on the left side. That is certainly true of XBRL today.

This business of being on both sides of the chasm can work out fine so long as everyone knows what side they are operating on. But you run into trouble when a buyer on the right side of the chasm, from the Early Majority, ends up with a solution that belongs somewhere over on the left side, in the world of the Early Adopters.

Last week’s XBRL conference in Washington presented an example of just such a straddling of the chasm. Barry Ward, Vice President and Head of Financial Reporting at ING Insurance Americas, U.S. Financial Services, described an XBRL initiative–a “proof of concept” effort–that his company undertook over the past year. ING sought to streamline internal reporting, eliminate rekeying of data, improve audit trails, enhance understanding of financial results within the company, and automate the creation of the many state reports that the company must produce. (Each state regulates the insurance business in its own way, with its own reporting requirements.) Ward noted that mere external reporting was NOT of interest as a stand-alone effort at ING since this would not improve efficiency, but would instead actually add another step to the reporting process. Internal use of XBRL, on the other hand, looked as if it could save money.

The U.S. unit of ING was upgrading its accounting software, and the vendor (left unnamed by Mr. Ward–later identified as a major ERP vendor) claimed to offer XBRL support as part of the general ledger program. This appeared to present an opportunity for ING to see if it could use XBRL to reduce financial reporting costs while improving the quality and utility of the reports.

Unfortunately, the project did not go well. One problem was that the general ledger vendor had not fully implemented XBRL, but had, instead, just hard-coded a set of XBRL tags into the system. So, there was no easy way for ING to update the system to make use of more recent work on the XBRL-GL taxonomy or to extend it to address the unique problems faced by the insurance industry.

On top of this problem, there was no XBRL presentation component built into the system, undercutting the whole purpose of the effort, which was to enable more flexible reporting. And then there were problems with things such as getting the system to deal with data from more than a single year.

Ward also noted that ING discovered that its financial reporting structure was more complicated than what the vendor’s XBRL system could deal with. This was partly, once again, a limitation of what Ward characterized as “first generation” software tools, but was also an indication that ING needed to spend more time on up-front analysis and design than had been expected.

In Mr. Ward’s words, the end result of all this was that the XBRL proof of concept project was “put on hold for further evaluation of the approach.”

This was a painful story, though an interesting and important one. Just as interesting was the response from the XBRL experts in the audience. They were particularly critical of ING’s reliance on its general ledger software vendor for the solution. In different ways, the XBRL experts said that there ARE firms that could help ING address the problems it wants to solve. What ING should have done, in their view, was work with the much smaller companies specializing in XBRL, rather than with its mainstream accounting system vendor.

This story and reaction took me back 20 years, to when some early SGML implementations were hard coded to a particular DTD and when the only really capable SGML vendors were the little, specialist companies. Then, as now, there were stories of failed projects, first generation software, and too much complexity.

Looking back on those SGML stories, it is clear to me now that the problem was NOT one of immature software or poor planning, but of a mismatch between buyer expectations and the state of that early market. On the vendor side (I was among them), we were releasing new products every six months–sometimes more often than that–and moving ahead quickly. Most of our resources were going into development, so there was no 24/7, worldwide customer support. Customers with problems got on the phone with a developer and we released new code that fixed the problem, and probably added a few new features while we were at it. We were the “go to” companies if you wanted the most flexible, cutting edge SGML solutions–much like the companies that the XBRL experts were recommending to ING.  We owned the market on the left side of the chasm.

But some of the companies and government agencies wanting to use SGML came to the market with very different expectations. They were accustomed to software products that were thoroughly tested and stable before commercial release, backed by worldwide 24/7 support. They were accustomed to systems supported by service organizations, where training was available and where new releases came regularly, with predictable results. They were trying to buy the kinds of systems that you find only on the right side of the chasm.

ING’s hopes and plans, matched against the response from the XBRL audience last week, show this same kind of straddle over the chasm. ING is a worldwide company, running 24/7, wanting a system to handle its most critical information. Why should it be surprising that ING sought to work with its principal software vendor, which offers worldwide support and professional services? How could ING reconcile the size and critical importance of its accounting problems with the capabilities of a 25-person XBRL firm?

ING’s problem — and XBRL’s problem — is that ING wanted to build an Early Majority application with what is still primarily an Early Adopter’s technology.

Which is why the FDIC’s Call Report Modernization project is such an important example. It is an early bridge across the chasm that should be able to work because the problem can be constrained and because the need to change processes and software among users can be minimized. It is the kind of XBRL project that can be put to real use right now, generating real returns.

When you are looking at your own XBRL project, ask yourself whether it is more like the Call Report Modernization project–nicely constrained–or whether it is more the kind of broader, more ambitious project that ING hoped to undertake. If it is the latter, be careful. You should probably think in terms of a pilot project–something where the focus is on learning rather than on operational outcomes.

One last thought — technology matures rapidly, particularly when it has as much energy and investment behind it as XBRL does. The constraints on this market that apply today could be very different in six months.  This is precisely why it is important to pay such careful attention to what is happening on either side of the chasm and to what is moving across it.

Typefi Announces Version 2.5 of the Typefi Publishing System

Typefi Systems, Inc. has announced the release of version 2.5 of its Typefi Publishing System (TPS). TPS version 2.5 provides new capabilities for automated scriptless publishing, workflow management, and content management. TPS v2.5 uses intelligent layout technology to capture design and presentation rules and then automatically lay out any combination of text, tables, illustrations, inline images, and other highly formatted elements. The publishing system is compatible with Adobe InDesign CS and InDesign CS2. Scriptless composition allows designers to create templates within InDesign that are automated through the TPS Designer plug-ins. Using layout techniques and a high-speed batch composition engine, TPS can then use that template to produce thousands of print-ready pages every hour, without custom code or scripting. At the heart of TPS is an XML-based content management system. With bi-directional converters between Word, InDesign, and the central XML repository, TPS delivers XML content production without requiring any specific XML knowledge from end users. All text and styling changes made to the typeset pages can be automatically extracted back to the central XML repository and propagated to all other editions, including the source Word documents. This ensures that the next edition, and the repurposed and archived content, always mirrors the printed content while remaining in a form suited to re-use. Because the XML is synchronized across the system from source documents to output documents, document versions published to a web site, e-book, or CD will always contain the last-minute changes made to the printed pages. TPS is an integrated client/server solution that can be installed on a single workstation or across a network. All components of TPS run on both Microsoft Windows XP and Apple Macintosh OS X.

SealedMedia & Informative Graphics to Deliver Integrated Digital Rights Management

Informative Graphics Corp. (IGC) announced that SealedMedia Inc. is working with IGC to deliver an integrated E-DRM solution. The joint development project incorporates IGC’s content sealed format (CSF) with Visual Rights controls for CAD, image, and non-Microsoft formats. SealedMedia will market the integration, scheduled for completion in Q4 2005. SealedMedia integrates with existing business systems to deliver complete protection of an organization’s digital information. It supports document formats such as e-mail (Microsoft Outlook, Lotus Notes, and Novell GroupWise), Microsoft Word, Microsoft Excel, Microsoft PowerPoint, Adobe PDF, and HTML, in addition to image, audio, and video formats. IGC’s Brava! visualization and collaboration products, Net-It content publishing products, and ModelPress 3D publishing and viewing software support CAD/engineering and office formats, and are integrated with content management solutions such as Documentum, Open Text, and Microsoft SharePoint.

CM on the Airwaves

My colleague Leonor Ciarlone and I will be guest speakers on a technology radio show on Tuesday, September 6, at 1:00 pm EDT. The broadcast is the last in a series of shows on content management. The series starts on August 2 and runs on six consecutive Tuesdays. Check out the complete broadcast schedule for more information.

Our topic is content management systems, the current market landscape for solutions, and buyer considerations when choosing technology. Other topics covered in the series include managing content as an asset, the role of CM professionals, and creating structured content.
Burning questions that we should address? Add a comment, or write directly to Leonor or to me.

XBRL and the Chasm

On Tuesday of last week XBRL-US sponsored a set of presentations in Washington, D.C. focused on “XBRL in Government and Industry.” The conference was hosted by the Federal Deposit Insurance Corporation (FDIC), which was appropriate since it was the FDIC that was the source of some of the most significant XBRL activity announced at the conference.

Here is the news: By October 1 of this year, the more than 8,300 banks submitting Call Report data to the FDIC, the Federal Reserve System, and the Office of the Comptroller of the Currency will be required to do so using XBRL. Because most banks submit these reports using software and services supplied by a handful of vendors, this requirement will not bring about changes in the internal operations of most banks. The initiative does, however, represent a significant application of XBRL, and opens the door to greater reuse of data and simplification of workflows for other regulatory reporting requirements. It is also a good example of the kinds of broad improvement in financial information communication and processing that XBRL enables.

Under the current system, Call Reports are submitted in a number of formats, including physical transfer of magnetic media. The agencies currently spend a substantial amount of time converting these data. Data checking and validation are done only at the tail end of the process, within the government agencies, which means that when errors or omissions turn up, they must be communicated back to the banks so that corrections can be made. In the current operating model, most of the regulators’ investment in collecting and analyzing these data goes into conversion and validation, leaving much less time for actual analysis.

The new system, implemented as part of an interagency consortium called the Federal Financial Institutions Examination Council (FFIEC), will allow the three agencies to share a single secure information repository. Each agency will use XBRL’s ability to provide a unique semantic identity for each incoming data element to enable the agency to extract the information it needs from the repository, in the form that it needs it.
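The idea of giving each data element a semantic identity can be sketched in a few lines of code. This is a simplified illustration, not the actual FFIEC system: real XBRL facts live in XML instance documents, and the concept names and periods below are invented for the example. The point is that once each value carries its own concept name and reporting context, any consumer can pull exactly the facts it needs from a shared pool:

```python
# Simplified sketch of XBRL-style tagged facts. Each fact carries a
# concept name, a reporting context, and a unit, so its meaning does
# not depend on where it appears in a report. Names are hypothetical.
facts = [
    {"concept": "TotalDeposits", "context": "Q2-2005", "unit": "USD", "value": 1_250_000},
    {"concept": "TotalLoans",    "context": "Q2-2005", "unit": "USD", "value": 980_000},
    {"concept": "TotalDeposits", "context": "Q1-2005", "unit": "USD", "value": 1_100_000},
]

def extract(facts, concept, context):
    """Select every fact matching a concept and reporting period."""
    return [f["value"] for f in facts
            if f["concept"] == concept and f["context"] == context]

print(extract(facts, "TotalDeposits", "Q2-2005"))  # [1250000]
```

Each agency can run its own `extract` queries against the same repository, pulling the same underlying facts into whatever shape it needs, without any format conversion.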

Just as important, the use of XBRL coupled with Internet-based report submission allows the agencies to specify validation suites that can be run by the banks producing the information. The validation semantics are all expressed in XBRL, and so can be communicated in a reliable, standard way to the vendors, ensuring uniform performance across vendors and allowing the agencies to extend or improve validation without having to rely on the vendors. This is the kind of thing that XBRL excels at, and it plays an important role here. Moving validation closer to the source of the information will allow banks to catch their own errors, decreasing the time investment made by all parties in the submission process. The regulatory agencies will be able to spend less time on conversion, validation, and clean up and more time on analysis, all while reducing the cost of government operations.
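The value of expressing validation rules as shareable data, rather than burying them in each vendor's code, can also be sketched briefly. This is an illustrative analogy, not how XBRL formulas are actually encoded: the rules and concept names below are invented. The key property is that the same rule definitions can be distributed to every filer and run before submission, so errors are caught at the source:

```python
# Hypothetical sketch: validation rules expressed as data, the way
# XBRL lets agencies publish them, so every filer's software runs the
# same checks before submission. Concept names are invented.
rules = [
    ("TotalAssets must equal TotalLiabilities plus Equity",
     lambda r: r["TotalAssets"] == r["TotalLiabilities"] + r["Equity"]),
    ("TotalDeposits must be non-negative",
     lambda r: r["TotalDeposits"] >= 0),
]

def validate(report, rules):
    """Return the description of every rule the report violates."""
    return [desc for desc, check in rules if not check(report)]

# A report with an imbalance: 350 + 100 != 500.
report = {"TotalAssets": 500, "TotalLiabilities": 350,
          "Equity": 100, "TotalDeposits": 300}
print(validate(report, rules))
# ['TotalAssets must equal TotalLiabilities plus Equity']
```

Because the rule list is plain data, the agencies can extend or tighten it and redistribute it without waiting for each vendor to ship new code, which is exactly the uniformity the FFIEC design relies on.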

This matter of the cost of data collection was a consistent theme across the presentations by government agencies at the conference. In general, financial and statistical information currently arrives in heterogeneous formats, often with the added problem of having low information value. The numbers are all present in a report, but before the advent of XBRL there was no easy, consistent, reliable way to enable a machine to know what the numbers represent. As one example of the cost of the current way of doing business, Mark Carney of the Commodity Futures Trading Commission spoke in terms of spending over 80 cents of every dollar invested in data collection on conversion. This is precisely the kind of problem that XBRL is designed to solve.

For more information on the FFIEC’s Call Report Modernization program and about the creation and operation of this Central Data Repository, see

In my view, there are a number of interesting facts and associated implications about this Call Report initiative. The first is that this is a mandatory initiative. XBRL is not an option, it is a requirement. XBRL has reached a level of maturity and universality sufficient to allow a set of government agencies to make it the required standard for an important application.

A related observation is that this maturity and universality has developed outside the U.S., and is only now coming back into this country of XBRL’s origin. Mike Bartell, CIO for the FDIC, made the point this way: “If we are truly serious about disclosure and transparency, we need to move aggressively toward the adoption of XBRL. The U.S. has not been a leader in this area. It is time to step up the pace. We are buried in data, and we are not making better decisions because of that.”

It is also important to note that this early, significant development in the marketplace for XBRL tools and services is the kind of “early majority,” “pragmatist” application that Geoffrey Moore described so well in his classic marketing text, Crossing the Chasm. The pragmatists are the first technology adopters on the right side of the chasm, after crossing it. They seek narrowly focused solutions that solve specific, pressing problems, adopting a new technology when it is the only thing that will do the job. The payoff must be clear and substantial–no pie in the sky or grand visions here–and the costs must be easily contained. Pragmatists adopt new technology because it will be a sure win for a particular problem, not because it will change the world.

In this case, where the actual XBRL will be produced by a handful of vendors, it is easy to see how to contain the costs associated with XBRL production. Most banks won’t know and won’t care that the call reports are being submitted in XBRL. Further, the payoff, in terms of reduced conversion and validation costs, is clear and compelling. The fact that XBRL is an open, international standard adds to its pragmatist appeal. It is exactly the right tool for this niche application.

Getting an application across the chasm–away from the early adopters and technology aficionados and over on the other side, where the mainstream applications live–is a big step. This is real evidence of the utility and potential impact of XBRL. It is also important to remember that such early leaps across the chasm are always, of necessity, narrow, tailored, niche applications. It is the narrow applications that get across first. This is an important one. What comes next?


© 2024 The Gilbane Advisor
