
September 2009

Reflections on Gov 2.0 Expo and Summit

O’Reilly’s Gov 2.0 events took place last week. We’ve had some time to think about what the current wave of activity means to buyers and adopters of content technologies.

Both the Expo and Summit programs delivered a deluge of examples of exciting new approaches to connecting consumers of government services with the agencies and organizations that provide them.

  • At the Expo on Sept 8, 25 speakers from organizations including NASA, the TSA, the US EPA, the City of Santa Cruz, the Utah Department of Public Safety, and the US Coast Guard provided five-minute overviews of their 2.0 applications in a fast-paced, sometimes dizzying format.
  • Sunlight Labs sponsored an Apps for America challenge whose finalists combined federal content available on Data.gov with open source software in some intriguing applications, including DataMasher, which lets you mash up sources such as statistics on high school graduates and guns per household.
  • The Summit on Sept 9 and 10 featured more applications plus star-status speakers including Aneesh Chopra, the US’s first CTO, operating out of the Office of Science and Technology Policy; Vinton Cerf, currently VP and Chief Internet Evangelist at Google; and Mitch Kapor.

A primary program theme was “government as platform,” with speakers suggesting and debating just what that means. There was much thoughtful discussion, if not consensus. Rather than recap it here, we suggest interested readers search the Twitter hashtags #gov20e and #gov20s for comments.

From the first speaker on, we were struck by the rapid pace of change in government action and attitudes toward content and data sharing. Our baseline for comparison is Gilbane’s last conference on content applications within government and non-profit agencies, held in June 2007. In presentations and casual conversations with attendees then, it was clear that most organizations were operating as silos; there was little sharing or collaboration within or among organizations, and many attendees expressed frustration that this was so. When we asked what could be done to fix the problem, we distinctly remember one person saying that connecting with other content managers just within her own agency would be a huge improvement.

Fast forward a little over two years to last week’s Gov 2.0 events. Progress towards internal collaboration, inter-agency data sharing, and two-way interaction between government and citizens is truly remarkable. At least three factors have created a perfect storm of conditions: the current administration’s vision and mandate for open government, broad acceptance of social interaction tools at the personal and organizational level, and technology readiness in the form of open source software that makes it possible to experiment at low cost and risk.

Viewing the events through Gilbane’s content-centric lens, we offer three takeaways:

  • Chopra indicated that the formal Open Government directives to agencies, to be released in several weeks, will include the development of “structured schedules” for making agency data available in machine-readable format. As Tim O’Reilly said while interviewing Chopra, posting “a bunch of PDFs” will not be sufficient for alignment with the directives (see the sketch after this list). As a result, agencies will accelerate the adoption of XML and the transformation of publishing practices to manage structured content. As large buyers of content technologies and services, government agencies are market influencers. We will be watching carefully for the impact of Open Government initiatives on the broader landscape for content technologies.
  • There was little mention of the role of content management as a business practice or technology infrastructure. This is not surprising, given that Gov 2.0 wasn’t about content management. And while the programs comprised lots of show-and-tell examples, most were very heavy on show and very light on tell. But it does raise a question about how these applications will be managed, governed, and made sustainable and scalable. Add in the point above, that structured content is now poised for wider adoption, and demand for XML-aware content management solutions will follow. Look for more discussion as agencies begin to acknowledge their content management challenges.
  • We didn’t hear a single mention of language issues in the sessions we attended, leaving us to wonder whether non-native English speakers who are eligible for government services will be disenfranchised in the move to Open Government.
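
To make the machine-readable requirement concrete, here is a minimal sketch of the alternative to “a bunch of PDFs.” The dataset, element names, and figures below are invented for illustration; the point is that a structured record like this can be parsed, validated, and mashed up by applications such as DataMasher, while the same numbers locked in a PDF cannot.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Hypothetical agency dataset; element names and figures invented for illustration -->
    <dataset id="hs-graduates-by-state" updated="2009-09-01">
      <record>
        <state>State A</state>
        <graduates>10000</graduates>
      </record>
      <record>
        <state>State B</state>
        <graduates>20000</graduates>
      </record>
    </dataset>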

All in all, these were thought-provoking, well-executed events. For details, videos of the sessions are available on the Gov 2.0 site.

Component Content Management and the Global Content Value Chain: An Interview with Suzanne Mescan of Vasont

First in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

Recently we had an opportunity to catch up with Suzanne Mescan, Vice President of Marketing for Vasont Systems. Vasont is a leading provider of component content management systems built upon XML standards. Suzanne spoke with us about the global content value chain (GCVC) and important findings from the research.

 Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?)  
 
Mescan: We are the “manage” phase of the GCVC, providing component content management solutions that include multiple automatic and user-defined content reuse capabilities, project management, built-in workflow, integrated collaborative review, translation management, support for any DTD, and much more.
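
To illustrate the kind of component reuse Mescan describes, here is a minimal sketch in DITA, one of many DTDs a component content management system might support; the file name and IDs are hypothetical, and this is a generic illustration rather than Vasont’s specific mechanism. A component such as a warning is written once and referenced wherever it is needed, so a correction to the source propagates to every document that reuses it.

    <!-- shared-warnings.dita: a topic holding reusable components (hypothetical) -->
    <topic id="shared_warnings">
      <title>Shared warnings</title>
      <body>
        <note id="laser_warning" type="warning">Do not stare into the laser aperture.</note>
      </body>
    </topic>

    <!-- elsewhere, a topic reuses the component by reference rather than by copy -->
    <note conref="shared-warnings.dita#shared_warnings/laser_warning"/>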
 
Gilbane: Why did you choose to sponsor the Gilbane research? 
 
Mescan: As part of the GCVC, we felt it was important, both for us and for organizations looking to change and enhance their product content strategies, to understand the positive trends and direction of the industry from beginning to end. Sponsorship enabled this research to take place through The Gilbane Group, an organization that has its finger on the pulse of this space.
 
Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?
 
Mescan: The most interesting result in the report was that terminology management ranked highest among approaches to standardizing content creation, yet for half of the respondents it is still a manual process based on a spreadsheet. And as the report observes, “paper-based style guidelines and glossaries did little to encourage real adoption.” Given that 80% of respondents consider terminology key to global customer experience, brand management, and quality and consistency, it is surprising that terminology management, like other content creation standardization practices, remains such a manual process.
 
For more about current terminology management practices, see “Achieving Quality at the Source” on page 28 of the Gilbane report. You can also read about how Vasont customer Mercury Marine is deploying content management as part of its global content value chain. Download the study for free.  

Ecordia Releases Content Analysis Tool for Search Engine Optimization

Ecordia has announced the availability of its new predictive content analysis application, the Ecordia Content Optimizer. Designed for copywriters, journalists, and SEO practitioners, the application provides automated intelligence and recommendations for improving the structure of content prior to publishing. Available for free, this turn-key web application provides a number of features to aid writers in creating and validating content, including:

  • advanced keyword research during authoring;
  • detailed scoring of content based on 15 proven SEO techniques;
  • automated recommendations on how to improve content for search engines;
  • intelligent keyword extraction that compares content to popular search terms;
  • keyword analysis that scores keyword usage based on 5 statistical formulas.

The Ecordia Content Optimizer has been in beta development for over a year and is currently in use by a number of SEO practitioners. Its content analysis capabilities are ideally suited for: web publishers who want to improve quality scores for landing pages used in PPC campaigns; SEO professionals who want to validate and review content prior to publishing; blog sites that want to improve the quality of ads served by contextual ad networks; and PR practitioners who want to optimize press releases prior to publishing. The Ecordia Content Optimizer is licensed on a per-user monthly subscription. http://www.ecordia.com/

Avantstar Releases Transit Solutions 9

Avantstar, Inc. has announced the release of Transit Solutions 9, file conversion and web publishing software that takes content originally designed for print or offline use, such as policies and procedures information, and publishes it as fully formatted web content. Transit Solutions is designed to help IT departments and web content managers keep up with the high volume of website changes generated every day, not only by automatically converting and publishing hundreds of file types as HTML, but also by recognizing and automatically creating links within documents (not just among them). Transit also publishes entire information folders on a preset schedule. New features in Transit Solutions 9 include:

  • a redesigned Edit Template window that combines preview, publication hierarchy, and properties into one window, making it easier to access commonly used features for editing web page templates;
  • enhanced support for embedded content in Microsoft Word 2007, Excel 2007, and PowerPoint 2007;
  • support for merged cells in Excel and tables within tables;
  • enhanced conversion support for the StarOffice and OpenOffice suites.

http://www.avantstar.com

Conversations with Globalization Solution Providers

The research for Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices Into Global Content Value Chains was supported by seven companies with proven track records in content globalization. Their technologies and services are used by market-leading companies to create competitive advantage with multilingual content.

One of the goals of this blog is to provide buyers and adopters with a variety of perspectives on content globalization strategies, practices, and solutions. The Multilingual Product Content study is authored from our own analyst perspective, drawing on the results of the research. The user perspective is captured in the profiles included in the report; they describe the global content value chains deployed at Adobe, BMW Motorrad, Cisco Systems, HP, Mercury Marine, and the New York City Department of Education.

To bring the solution supplier perspective into the mix, over the next month or so we’ll publish a series of brief interviews with study sponsors Acrolinx, Jonckers, Lasselle-Ramsay, LinguaLinx, STAR Group, Systran, and Vasont Systems. A representative from each company answers three questions:

  1. What role does your company play in the global content value chain?
  2. Why did you elect to sponsor Gilbane’s research?
  3. What was the most compelling or interesting result to come out of the research?

Readers will be able to comment on the interviews and ask questions of the supplier. We’ll post answers that are appropriate for sharing.

Our first interview with Suzanne Mescan from Vasont will be published next week.

MadCap Lingo 3.0 for Authors and Translators Released

MadCap Software has announced that MadCap Lingo 3.0 is now available. MadCap Lingo, the XML-based, fully integrated translation memory system (TMS) and authoring tool, eliminates the need for file transfers to complete translation, preserving valuable content and formatting to deliver a consistent experience across multiple languages. With version 3.0, MadCap Lingo adds a new Project Packager function that bridges the gap between authors and translators who use other TMS software. Using the Project Packager, authors should be able to work with translators to streamline the translation process, track completion status, and obtain more accurate project cost estimates. MadCap Lingo 3.0 also features a new TermBase Editor for creating databases of reusable translated terms, and enhanced translation memory. Through integration between MadCap Lingo and MadCap’s authoring and multimedia applications, MadCap hopes to offer a powerful integrated authoring and localization workflow.

Project Packager is designed for authors who need their documentation translated into another language but work with a translator who relies on a TMS tool other than MadCap Lingo. Using Project Packager, the author creates a MadCap Lingo project with all the files that require translation, bundles it in a ZIP file, and sends it to the translator. MadCap Lingo displays a list of all files that need to be translated, going beyond text to include skins, glossaries, search filter sets, and much more. As a result, the author can ensure that the translator receives all of the files requiring translation. This should streamline the process while enabling more accurate translation project estimates, helping translators avoid accidentally underestimating project costs based on an incomplete file count, and protecting authors from unexpected cost overruns.

Once the translation is complete, the translator sends back a ZIP file with the content. The author merges the translated files in MadCap Lingo, which is used to confirm the completeness of the translation. The author can then run statistical reports showing, for each project and file, what has and has not been translated, how many words and segments have been translated or still need to be translated, and much more. Finally, the author can export the MadCap Lingo project to a range of outputs, such as a Flare project file for online and print publishing, a Word document, or even a Darwin Information Typing Architecture (DITA) file.
The key new features of MadCap Lingo 3.0 are:

  • a new TermBase Editor that enables translators to create and manage concept-oriented, multilingual terminology databases (“termbases”), making it significantly easier to reuse translated terms;
  • the ability to import and export Term Base eXchange (TBX) files, an open, XML-based standard for exchanging structured terminological data;
  • an Apply Suggestions to Project function that makes it possible to view and automatically apply translation memory suggestions to an entire project, rather than just one segment, saving hours of effort;
  • a dynamic help window pane lock that keeps the current help topic frozen in place while the translator moves around the MadCap Lingo interface, making it easier to follow steps or other information in the Help topic;
  • a minimize-to-system-tray option;
  • multiple file support, which allows multiple files (for example HTM, HTML, XML, DITA, or DOC) to be selected when creating a new MadCap Lingo project.

http://www.madcapsoftware.com/
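
For readers unfamiliar with TBX, here is a minimal sketch of a termbase entry, trimmed to its skeleton; the entry and subject field are hypothetical, and real TBX files carry additional header and administrative metadata. The structure suggests why an XML termbase travels between tools more reliably than a spreadsheet glossary.

    <!-- Hypothetical two-language termbase entry in TBX -->
    <martif type="TBX" xml:lang="en">
      <martifHeader>
        <fileDesc><sourceDesc><p>Sample termbase</p></sourceDesc></fileDesc>
      </martifHeader>
      <text>
        <body>
          <termEntry id="entry-001">
            <descrip type="subjectField">marine engines</descrip>
            <langSet xml:lang="en">
              <tig><term>propeller shaft</term></tig>
            </langSet>
            <langSet xml:lang="de">
              <tig><term>Propellerwelle</term></tig>
            </langSet>
          </termEntry>
        </body>
      </text>
    </martif>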

Open Text Outlines Vignette Strategy and Product Plans

Open Text Corporation has announced plans to expand its suite of Web solutions, with products and technologies from the recently acquired Vignette playing a central role. Offered as part of the Open Text ECM Suite, the solutions should help organizations establish deeper connections with customers and use the Web as a channel for new revenues. As part of this strategy, Open Text will leverage technologies from both Vignette and its existing solutions to deliver new next-generation Web offerings. Vignette Content Management will form the foundation of Open Text’s Web business solutions, providing personalized, multi-channel capabilities integrated with the Open Text ECM Suite. Open Text has said it will continue to develop both Vignette Content Management and Open Text Web Solutions as complementary offerings to meet the full range of WCM needs. Open Text will release Vignette Content Management version 8.0 and Vignette Portal version 8.0 in the second half of 2009. The recently released Web Solutions 10.0 will be followed by Web Solutions 10.1 in the first half of 2010. Within 24 months, Open Text will launch an offering that combines key strengths of Web Solutions with the Vignette Content Management platform. Vignette Social Media Solutions (Community Applications and Community Services) will be the basis of a new Social Marketplace offering, which will be added to Open Text’s Social Media solutions, targeting Internet-style social media interactions and adding social content capabilities to existing Web sites. Current Open Text WCM customers will be able to take advantage of the new Social Marketplace package. In addition, Vignette Collaboration will continue to be enhanced as part of the underlying Vignette Social Media technology stack. http://www.opentext.com/


