Archive for January 2009

XML in Everyday Things

If you didn’t follow the link below to Bob DuCharme’s response to my January 13 posting on Why it is Difficult to Include Semantics in Web Content, you should read it. Bob does a great job describing the tools in use for including semantics in Web content. Bob is a very smart guy. I like to think the complexity of his answer is a good illustration of my point that adding semantics is not easy. Anyway, his response is clearly worth reading and can be found at http://www.snee.com/bobdc.blog/2009/01/publishers-and-semantic-web-te.html.

Also, I have known Bob for some time. I am reminded that a while back he wrote an interesting article about XML data produced by his TiVo device (see http://www.xml.com/pub/a/2006/02/15/hacking-the-xml-in-your-tivo.html). I was intrigued by how XML had begun to pop up in everyday things.

Ever since that TiVo article, I think of Bob every time XML pops up in unexpected everyday places (it’s better than associating him with a trauma). Once in a while I get a glimpse of XML data in a printer control file, in Web page source code, or as an export format for some software, but that sort of thing is to be expected. We all have seen examples at work or in commercial settings, but to find XML data at home in everyday devices and applications has always warmed my biased heart.

Recently I was playing a game of Sid Meier’s Civilization IV (all work and no play and so on…) and I noticed while it was booting up a game that one of the messages said "Reading XML Files". My first thought was "Bob would like to see this!" Then I was curious to see how XML was being used in game software. A quick Google search turned up a Wikipedia entry (http://en.wikipedia.org/wiki/Civilization_IV#cite_note-10) that says "More game attributes are stored in XML files, which must be edited with an external text editor or application." Apparently players can "tweak simple game rules and change or add content. For instance, they can add new unit or building types, change the cost of wonders, or add new civilizations. Players can also change the sounds played at certain times or edit the play list for your soundtrack."

I poked around in the directories and found schemas describing game units, events, and the like, along with configuration data instances describing the artifacts and activities used in the game. A user could, if they wanted to, make a specific building very cheap to buy, for instance, or have the game play their favorite music instead of what comes with it. That is, if they know how to edit XML data. I think I just found a way to add many hours of enjoyment to an already great game.
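To give a feel for what such a tweak looks like, here is a hypothetical fragment in the spirit of the game’s building definitions. The element names are illustrative guesses rather than the game’s actual schema, so treat it as a sketch, not a working mod:

    <!-- Hypothetical building definition; element names are illustrative. -->
    <BuildingInfo>
      <Type>BUILDING_LIBRARY</Type>
      <Description>Library</Description>
      <!-- Lowering the cost makes this building very cheap to acquire. -->
      <iCost>10</iCost>
      <!-- Pointing this at your own file swaps in your favorite music. -->
      <SoundFile>Sounds/Buildings/MyFavoriteSong.mp3</SoundFile>
    </BuildingInfo>

Change a value, save the file, and the game picks up the new rule the next time it reads its XML files at startup.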

I wonder how much everyday XML is out there just waiting for someone to tweak it and optimize it to make something work better. A thermostat, a refrigerator, or a television perhaps.

Podcast on Structured Content in the Enterprise

The idea of structured content has traditionally been associated with product documentation, but this is beginning to change. A brand new podcast on The Business Value of Structured Content, featuring Bill Trippe, Lead Analyst at The Gilbane Group, and Bruce Sharpe, XMetaL Founding Technologist at JustSystems, looks at why many companies are beginning to realize that structured content is more than just a technology for product documentation – it’s a means to add business value to information across the whole enterprise.

Applied to departmental assets such as marketing website content, sales training materials, or technical support documents, structured content can be used to grow revenue, reduce costs, and mitigate risks, ultimately leading to an improved customer experience.

Listen to the podcast and gain important insight on how structured content can

  • break through the boundaries of product documentation
  • help organizations meet high user expectations for when and where they can access content
  • prove to be especially valuable in our rough economic times
  • …and more!

Gilbane San Francisco pre-conference workshops posted

The main conference program will be published in a week or two, but the half-day pre-conference workshop descriptions for June 2nd have been posted:
How to Select a Web Content Management System
Instructor: Seth Gottlieb, Principal, Content Here
Making SharePoint Work in the Enterprise
Instructor: Shawn Shell, Principal, Consejo, Inc.
Managing the Web: The Fundamentals of Web Operations Management
Instructor: Lisa Welchman, Founding Partner, Welchman Pierpoint
Getting Started with Business Taxonomy Design
Instructors: Joseph A. Busch, Founder and Principal, & Ron Daniel, Principal, Taxonomy Strategies LLC
Sailing the Open Seas of New Media
Instructor: Chris Brogan, President, New Marketing Labs, LLC

Webinar: Making the Business Case for SaaS WCM

Updated April 9, 2009: View the recorded webinar.
January 27, 2009, 2:00 pm ET
When customer experience becomes increasingly important even as budgets are tightening, the SaaS value proposition – faster time to results, reduced dependency on IT resources, predictable costs – can be especially compelling. If your organization wants or needs to move ahead with web business initiatives in today’s uncertain economic climate, you’re probably investigating software-as-a-service solutions for web content management.
But SaaS WCM is fundamentally different from licensing software (open source or proprietary) and installing it on your own servers, which means the process of evaluating solutions is different too. It’s not all apples when SaaS is on the short list, but rather apples and oranges. This webinar explores the implications for technology acquisition. How do you make a business case that enables your organization to fairly evaluate all options and make the best decision for the business?
Join us for a lively discussion with Robert Carroll from Clickability. Register today. Presented by Gilbane. Sponsored by Clickability. Based on a new Gilbane Beacon entitled Communicating SaaS WCM Value.

Open Government Initiatives will Boost Standards

Following on Dale’s inauguration day post, Will XML Help this President?, we have today’s invigorating news that President Obama is committed to more Internet-based openness. The CNET article highlights some of the most compelling items from the two memos, but I am especially heartened by this statement from the memo on the Freedom of Information Act (FOIA):

I also direct the Director of the Office of Management and Budget to update guidance to the agencies to increase and improve information dissemination to the public, including through the use of new technologies, and to publish such guidance in the Federal Register.

The key phrases are "increase and improve information dissemination" and "the use of new technologies." This is in keeping with the spirit of the FOIA: the presumption is that information (and content) created by or on behalf of the government is public property and should be accessible to the public. This means that the average person should be able to easily find government content and readily consume it – two challenges that the content technology industry grapples with every day.

The issue of public access is in fact closely related to the issue of long-term archiving of content and information. One of the reasons I have always been comfortable recommending XML and other standards-based technology for content storage is that the content and data will outlast any particular software system or application. As the administration looks to make government more open, it should and likely will look at standards-based approaches to information and content access.

Such efforts will include core infrastructure such as servers and storage, but also a wide array of supporting hardware and software falling into three general categories:

  • Hardware and software to support the collection of digital material. This ranges from hardware and software for digitizing and converting analog materials to software for cataloging digital materials with metadata, systems to support data repositories, and software for indexing the digital text and metadata.
  • Hardware and software to support the access to digital material. This includes access tools such as search engines, portals, catalogs, and finding aids, as well as delivery tools allowing users to download and view textual, image-based, multimedia, and cartographic data.
  • Core software for functions such as authentication and authorization, name administration, and name resolution.

Standards such as PDF/A have emerged to give governments a ready format for long-term archiving of routine government documents. But a collection of PDF/A documents does not in and of itself equal a useful government portal. There are many other issues of navigation, search, metadata, and context left unaddressed. This is true even before you consider the wide range of content produced by the government: pictorial, audio, video, and cartographic data are obvious examples, but so is the wide range of primary source material that comes out of areas such as medical research, energy development, public transportation, and natural resource planning.

President Obama’s directives should lead to interesting and exciting work for content technology professionals in the government. We look forward to hearing more.

Will XML Help this President?

I’m watching the inauguration activity today all day (not getting much work done) and getting caught up in the optimism and history of it all. And what does this have to do with XML, you ask? It’s a stretch, but I am giddy from the festivities, so bear with me please. I think there is a big role for XML and structured technologies in this paradigm shift, albeit XML will be quietly doing its thing in the background as always.

In 1986, when SGML, XML’s precursor, was being developed, I worked for the IRS in Washington. I was green, right out of college. My boss, Bill Davis, said I should look into this SGML stuff. I did. I was hooked. It made sense. We could streamline the text applications we were developing. I helped write the first DTD in the executive branch (the first real government one was the ATOS DTD from the US Air Force, but that was developed slightly before the SGML standard was confirmed, so we always felt we were pretty close to creating the actual first official DTD in the federal government). Back then we were sending tax publications and instructions to services like CompuServe and BRS, each with their own data formats. We decided to try to adopt structured text technology and single-source publishing to make data available in SGML to multiple distribution channels. And this was before the Web. That specific system has surely been replaced, but it saved time and enabled us to improve our service to taxpayers. We thought the approach was right for many government applications and should be repeated by other agencies.

So, back to my original point. XML has replaced SGML and is now being used for many government systems, including electronic submission of SEC filings and FDA applications, and for the management of many government records. XML has been mentioned as a key technology in the overhaul that is needed in the way the government operates. Obama also plans to create a cabinet-level CTO position, part of whose mission will be to promote inter-agency cooperation through the interchange of content and data between applications, formatted according to a common taxonomy. He also intends to preserve the open nature of the Internet and its content, facilitate publishing important government information and activities on the Web in open formats, and enhance the national information system infrastructure. Important records are being considered for standardization, such as health and medical records, as well as many other ways we interact with the government. Sounds like a job, at least in part, for XML!

I think it is great and essential that our leaders understand the importance of smartly structured data. There is already a lot of XML expertise throughout the various government offices, as well as a strong spirit of cooperation on which we can build. Anyone who has participated in industry schema application development, or other common vocabulary design efforts, knows how hard it is to create a “one-size-fits-all” data model. I was fortunate enough to participate briefly in the development and implementation of SPL, the Structured Product Labeling schema (see http://www.fda.gov/oc/datacouncil/spl.html) for FDA drug labels, which are submitted to the FDA for approval before a drug product can be sold. This is a very well defined document type that has been in use for years. It still took many months and masterful consensus building to finalize this one schema. And it is just one small piece in the much larger information architecture. It was a lot of effort from many people within and outside the government. But now it is in place, working and being used.
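For readers curious what such a schema instance looks like, here is a greatly simplified, hypothetical outline of an SPL document. SPL is built on the HL7 v3 framework, so real instances carry far more required metadata and coding; treat this as a schematic sketch from memory, not a valid submission:

    <!-- Hypothetical, heavily abridged SPL-style label; not a valid instance. -->
    <document xmlns="urn:hl7-org:v3">
      <id root="..."/>                     <!-- unique document identifier -->
      <code code="..." codeSystem="..."/>  <!-- LOINC code for the label type -->
      <title>EXAMPLEDRUG (exampledrug) tablet</title>
      <effectiveTime value="20090101"/>
      <component>
        <structuredBody>
          <component>
            <section>
              <title>INDICATIONS AND USAGE</title>
              <text>Narrative label content goes here.</text>
            </section>
          </component>
        </structuredBody>
      </component>
    </document>

Even in this toy form you can see why consensus was hard: every section, code, and identifier had to be agreed on by regulators, manufacturers, and systems vendors alike.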

So, I am bullish on XML in the government these days. It is a mature, well understood, powerful technology with wide adoption, and there are many established civilian and defense examples across the government. I think there is a very big role for XML and related technology in the aggressive, sweeping change promised by this administration. Even so, these things take time.

Forrester on Community Platforms

Forrester Sr. Analyst Jeremiah Owyang discusses the findings of their latest report on community platforms, “Forrester Wave: Community Platforms, Q1 2009,” on his blog. He also provides a lot of information about their methodology, including how they reduced the number of companies included from 100 to 9. The full report is only for Forrester clients, but Jeremiah provides a summary, which you can read here. Here’s a snip from his post:

What did we find? First of all, this is still a very young market, with the average tenure of a company being just a few years in community. Despite the immaturity, we evaluated nine and were impressed with Jive Software and Telligent Systems who lead the pack because of their strong administrative and platform features and solution offerings.

Next, a group of vendors ranked as strong performers: KickApps and Pluck enable large Web sites to quickly scale with social features. Also in the strong performer category, Awareness, Lithium Technologies, and Mzinga enable brands to build branded communities while LiveWorld offers brands agency-like services. While Leverage Software is not on par with the others in the category, they are ideal for medium-sized businesses and due to their cost-effective platform could have a strong position during this economic downturn.

Webinar: Ingersoll Rand, Club Car’s Strategy for Multilingual Product Documentation

Tuesday, February 3rd, 2009: 11am EST / 10am CST / 8am PST
In the manufacturing industry, the pace of innovation in multinational product design and engineering can create a gulf between product availability and multilingual product documentation delivery. The result can negatively affect customer satisfaction, regulatory compliance programs, and global perception of product quality.

In this webinar, you’ll learn how the technical publications group at Ingersoll Rand, Club Car has closed this gap by:

  • Introducing manufacturing innovation into technical publications processes.
  • Collaborating with sales support to maintain and increase customer satisfaction.
  • Automating links between authoring, localization/translation, and publishing with technologies such as XML and translation memory.
  • Increasing the volume of multilingual product documentation without raising costs.

Join us to hear first-hand experience and best practices advice from Jeff Kennedy, Manager of Engineering Information and Systems at Ingersoll Rand, Club Car. He will be joined by Gilbane Senior Analyst Karl Kadie and Sajan Chief Marketing Officer Vern Hanzlik; the webinar discussion is a companion to Gilbane’s Club Car case study.

Register today. Moderated by Gilbane Group. Hosted by Sajan.