The W3C’s Web Performance Working Group is developing the Navigation Timing Specification, which defines 20 “fine-grained” measurements covering just about every aspect of a web user’s navigation behavior. The working draft is now in the “last call for comments” phase; once finalized, browsers that implement it will expose those 20 measurements for every page visited. http://test.w3.org/webperf/specs/NavigationTiming/
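To make the forthcoming metrics concrete, the sketch below derives a few phase durations from a hypothetical set of Navigation Timing attributes (Python used purely for illustration; the attribute names follow the draft spec, but the timestamp values are invented):

```python
# Hypothetical sample of Navigation Timing attributes, in milliseconds
# since the epoch. Names follow the draft spec; values are invented.
timing = {
    "navigationStart": 1000,
    "domainLookupStart": 1010,
    "domainLookupEnd": 1030,
    "connectStart": 1030,
    "connectEnd": 1070,
    "requestStart": 1070,
    "responseStart": 1150,
    "responseEnd": 1200,
    "loadEventEnd": 1900,
}

def phase(start, end):
    """Duration of one navigation phase, in milliseconds."""
    return timing[end] - timing[start]

dns = phase("domainLookupStart", "domainLookupEnd")   # 20 ms DNS lookup
tcp = phase("connectStart", "connectEnd")             # 40 ms TCP connect
ttfb = phase("navigationStart", "responseStart")      # 150 ms to first byte
total = phase("navigationStart", "loadEventEnd")      # 900 ms full page load
```

The point of the spec is that these raw timestamps come from the browser itself, so page owners can compute durations like these for real visits rather than relying on synthetic tests.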
Language afterthought syndrome refers to the pattern of treating language requirements as secondary considerations within content strategies and solutions. Global companies leak money and opportunity by failing to address language issues as integral to end-to-end solutions rather than as ancillary post-processes. Examples abound: source and translated content that should be reusable, but isn’t; retrofitting content to meet regulatory requirements in different regions; revenue lost because product and marketing content isn’t ready at launch time; desktop publishing costs incurred solely to reformat content in multiple languages. The list goes on and on.
One of the most effective defenses against language afterthought syndrome is baking language requirements into the technology acquisition process, thereby embedding support into the infrastructure as it’s designed, developed, and built out. OCLC (Online Computer Library Center) recognized this opportunity when it embarked on an ambitious transformation of its web content globalization practices. Debra Lewis, web content manager at OCLC, and our friend Andrew Lawless, principal at Dig-IT Consulting, shared their experiences in a terrific session at Gilbane Boston 2010 entitled “Next Thing You Know — You’re Global!”
The presentation delivered by Deb and Andrew is available on the Gilbane conference website (follow the link and click on slides for session E3). Highlights include Deb’s characterization of the signs of stress. On the production side:
Spend more time finding “creative solutions” than creating new content or managing site strategy
Use features of your CMS in ways not originally intended
Can’t upgrade to new releases without corrupting your pages
On the business side:
Localization addressed at the point of publication
Turnaround for day-to-day edits increases, which strains relationships with internal clients
Distributed authors “give up” and relinquish editing rights
Team stress increases
These stress points led OCLC to commit resources to evolving its global web content strategy. Deb and Andrew then walked our audience through OCLC’s three-phased transformation:
Get a translation service provider
Get a new CMS that would scale
Get a translation management system
The portion of the presentation on selecting a web CMS with well-defined multilingual requirements will be especially valuable to any organization wanting to eliminate the negative impacts of language afterthought syndrome. Deb and Andrew described OCLC’s selection process and timeline, CMS selection criteria, prioritized globalization features, key standards that would need to be supported, text and language requirements, and requirements for integration with translation workflows.
Many global companies are now rearchitecting their web strategies for global presence and audience engagement. We see this as a major technology and investment trend for 2011. The insight offered by OCLC couldn’t be more timely. The organization’s experience offers a treasure trove of guidance for companies that are evaluating new web content management systems with language requirements among their priorities.
Thanks to Deb and Andrew for a great contribution to Gilbane Boston.
Hosted by Kodak, this event was lively and informative.
As Kodak noted in its announcement:
"The future of book publishing is irrevocably changing. With the advent of e-books and other ongoing changes in the retail marketplace, the ability to print books as efficiently as possible becomes even more important. Book publishers, manufacturers, authors, distributors and other key stakeholders in the book value chain are all impacted by the industry’s fast-changing business environment, all seeking to improve efficiencies and develop new markets and revenue opportunities."
Alfresco announced the immediate availability of Alfresco Enterprise 3.4 for download. This release features a more robust content platform for building content-rich applications, along with a more social user interface for collaboration and document management. Developers and companies can use the platform to build applications in which enterprise content is “social-ready”: shared, collaborated on, and syndicated across the web, while also being captured for compliance, retention, and control. Built on open standards such as CMIS and JSR-168, Alfresco Enterprise 3.4 is a content platform that can coexist with social business systems to help manage and retain social content. Key new capabilities in the 3.4 release include: user interface enhancements that make document management more social; folder-based actions for simple workflow, along with advanced workflow using jBPM; distributed content replication; collaborative web authoring; and integration with enterprise portals and social software. http://www.alfresco.com/
The gradual upturn from the worst economic conditions in decades is reason for hope. A growing economy, coupled with continued adoption of enterprise software in spite of the tough economic climate, keeps me tuned to what is transpiring in this industry. Rather than being cajoled into believing that “search” has become commodity software, which it hasn’t, I want to comment on the wisdom of Jill Dyché and her Anti-predictions for 2011 in a recent Information Management Blog. There are important lessons here for enterprise search professionals, whether you have already implemented or plan to soon.
Taking her points out of order, I offer a bit of commentary on those that have a direct relationship to enterprise search. Based on past experience, Ms. Dyché predicts some negative outcomes but with a clear challenge for readers to prove her wrong. As noted, enterprise search offers some solutions to meet the challenges:
No one will be willing to shine a bright light on the fact that the data on their enterprise data warehouse isn’t integrated. It isn’t just the data warehouse that lacks integration; critical structured and unstructured content is scattered across all the applications that house it. This does not have to be the case. Several state-of-the-art enterprise search products that are not tied to a specific platform or suite of products do a fine job of federated indexing across disparate content repositories. In a matter of weeks or a few months, a search solution can be deployed to crawl, index, and search multiple sources of content. Furthermore, newer search applications can be tested before purchase for out-of-the-box suitability in pilot or proof-of-concept (POC) projects. Organizations that are serious about integrating content silos have no excuse for not taking advantage of these easier-to-deploy search products.
Even if they are presented with proof of value, management will be reluctant to invest in data governance. Combat this entrenched bias with a strategy to overcome the lack of governance; a cost-cutting argument is unlikely to change minds. Risk, however, is an argument that will resonate, particularly when bolstered with examples: customers lost due to poor performance or inadequate support services; sales lost because qualifying questions could not be answered in time; legal or contract disputes that could not be defended because critical supporting documents were inaccessible; or maintenance revenue lost because renewal information sent to clients was incomplete, inaccurate, or late. One simple example is the consequence of not maintaining a concordance of customer name, contact, and address changes. When content repositories cannot recognize that the customer labeled Marion University at one address is the same as the customer labeled University of Marion at another, communications become embarrassing and, even worse, costly. Governance of processes like naming conventions and standardized labeling enhances the value and performance of every enterprise system, including search.
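As a minimal illustration of the naming problem, even a crude normalized token-set comparison can tell that the two labels above refer to the same customer (a hypothetical Python sketch; real master data matching involves far more, such as address resolution and fuzzy string similarity):

```python
def normalize(name):
    """Reduce a customer name to a canonical set of lowercase tokens,
    dropping connector words so that word order doesn't matter."""
    stopwords = {"of", "the", "at"}
    return frozenset(t for t in name.lower().split() if t not in stopwords)

# The two labels from the example above resolve to the same key...
a = normalize("Marion University")
b = normalize("University of Marion")
same = (a == b)  # True

# ...while a genuinely different institution does not.
other = normalize("Marion College")  # != a
```

Without a governed naming convention (or a matching step like this at indexing time), a federated search treats the two labels as unrelated records.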
Executives won’t approve new master data management or business intelligence funding without an ROI analysis. This ties in with the first item because many enterprise search applications include excellent tools for business intelligence, analytics, and advanced tracking and evaluation of content resource use. Usage tracking is an excellent way to understand who is searching, for what types of data, and with what language. These supporting functions are built into enterprise search applications and add no additional cost to product licenses or implementation. Look for enterprise search applications delivered with tools that any business manager can employ on an ad hoc basis.
Developers won’t track their time in any meaningful way. This is probably true because many managers are poorly equipped to evaluate what goes into software development. However, in this era of open source adoption, particularly for enterprise search, organizations that commit to using Lucene or Solr (open source search) must be clear on the cost of building these tools into functioning systems for their specialized purposes. Whether development is done internally or by a third party, it is essential to place strong boundaries around each project and deployment, with specifications that stage development through milestones and change orders. “Free” open source software is neither free nor cost-effective when an open meter for “time and materials” is running.
Companies that don’t characteristically invest in IT infrastructure won’t change any time soon. So, the silo-ed projects will beget more silo-ed data. Because the adoption rate for new content management applications is so high, and their ease of deployment encourages them to multiply like rabbits, it is probably futile to try to stanch their proliferation. This is an important area for governance: detect redundancy, perform analytics across silos, and call attention to obvious waste and duplication of content and effort. Newer search applications that can crawl and index a multitude of formats and repositories easily support efforts to monitor and evaluate what search results reveal. Given a little encouragement to report redundant and replicated content, every user becomes a governor over waste. Play on people’s natural inclination to complain when they feel overwhelmed by messy search results by setting up a simple, one-click reporting mechanism that automatically issues a report or sets a flag in a log file when a search reveals a problem.
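That one-click reporting mechanism could be as simple as appending a structured flag to a log file whenever a user reports messy results. A hypothetical Python sketch (the field names and function are invented for illustration):

```python
import json
from datetime import datetime, timezone

def flag_redundancy(query, duplicate_ids, log_path=None):
    """Build a structured flag recording that a user reported
    redundant copies of the same content in a result set.
    If log_path is given, append the flag to the file as a JSON line."""
    entry = {
        "query": query,
        "duplicates": sorted(duplicate_ids),
        "flagged_at": datetime.now(timezone.utc).isoformat(),
    }
    if log_path is not None:
        with open(log_path, "a", encoding="utf-8") as f:
            f.write(json.dumps(entry) + "\n")
    return entry

# One click on a "report duplicates" button produces one log entry.
flag = flag_redundancy("travel policy", ["doc-42", "doc-17"])
```

Periodically rolling these flags up by query or document ID gives the governance team a ranked list of the worst redundancy offenders, generated by the users themselves.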
It is time to stop treating enterprise search like a failed experiment and instead, leverage it to address some long-standing technology elephants roaming around our enterprises.
Adobe Systems Incorporated announced Adobe Technical Communication Suite 3, the latest version of its single-source authoring and multi-device publishing toolkit for creating and publishing standards-compliant technical information and training material. The new version enables technical writers, help authors, and instructional designers to author, enrich, manage, and publish content to multiple channels and devices. Adobe also introduced new versions of the suite’s core components: Adobe FrameMaker 10, a template-based authoring and publishing solution for technical content, and Adobe RoboHelp 9, an HTML and XML help, policy, and knowledgebase authoring and publishing solution. Adobe Photoshop CS5, Adobe Captivate 5, and Adobe Acrobat X Pro round out the suite, adding image editing, eLearning and demo creation, and dynamic PDF functionality. New features in Technical Communication Suite 3:
RoboHelp integration: Import FrameMaker content into RoboHelp with support for FrameMaker books. Directly link DITA (Darwin Information Typing Architecture) maps, automatically convert table and list styles, and publish multiple RoboHelp outputs from within the native authoring environment.
Dynamic “single-click” publishing: Create standards-compliant XML and DITA 1.2 content and output it to multiple formats, including print, PDF, Adobe AIR, WebHelp, EPUB, XML, and HTML, and deliver it to a wide range of mobile devices, such as eReaders, smartphones, and tablets. Improve search engine optimization via enhanced metadata tagging of published content.
Expanded multimedia capabilities: Take advantage of more than 45 video and audio formats, and engage audiences by adding 3D models, training demos, and simulations.
FrameMaker 10 standards support: Take advantage of significantly enhanced XML/DITA authoring capabilities in FrameMaker 10, an early adopter of industry standards including DITA 1.2.
Usability enhancements: Work with standards-compliant, prebuilt tools and templates designed for easier authoring. Use utilities like Auto Spell Check, Highlight Support, scrolling for lengthy dialogs, and enhanced Find and Replace.
Content management system (CMS) connectors: Integrate with leading content management systems, including Documentum and MS SharePoint, included in FrameMaker 10 at no additional cost.
The new offering enables enterprises to streamline publishing workflows and reduce localization costs by leveraging the enhanced SDL Author Assistant in FrameMaker 10. Users can also automatically schedule and publish content to multiple channels and screens, and gain analytical insight into content usage for effective optimization. http://www.adobe.com/
Gilbane’s webinar calendar was laden with at-your-desk educational opportunities during the final quarter of 2010. Here’s a quick round-up of the events on content globalization:
Cisco’s Localization Journey: Capitalizing on Global Opportunity. We talked with Tim Young, Senior Operations Manager at Cisco, about the company’s transition from localization and translation silos to a centralized shared services platform. Young’s presentation was chock full of great metrics. Gilbane will publish an in-depth case study in February.
Game-Changing Approaches to Engaging Global Audiences and Managing Brand. The online version of our presentation at Localization World in Seattle. We shared insights into how leading practitioners are improving and advancing their global content value chains for marketing content, drawing on the research for our upcoming report on multilingual marketing content.
And although this webinar on Delivering Compelling Customer Experiences with DITA and CCM wasn’t specifically about content globalization, it examined next-generation XML applications and how global companies are realizing new value with smart content. The case studies covered in the webinar and in Gilbane’s Smart Content report touch on XML for localization and translation.
Our new year’s resolution is to get back to regular blogging. We’ll start with an easy but time-sensitive post.
After three years in Berlin, Localization World moves to Barcelona this year. The event takes place 14-16 June.
The theme of this year’s conference is innovation. Based on what we saw happening with content globalization practices throughout the second half of 2010, innovation is top-of-mind for all industry constituents. Services business models are evolving, driven by strategic collaboration among buyers and sellers of translation services. Technologies for automating the manual tasks associated with content globalization are maturing rapidly. Gilbane’s research shows steady progress towards overcoming language afterthought syndrome, as more and more companies realize that one or two key investments can stem the money drain caused by redundant processes. Innovation, indeed.