Blog | Gilbane.com

Unifying the Global Content Value Chain: An Interview with Lasselle Ramsay

Second in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

We spoke with Joan Lasselle, President of Lasselle Ramsay. Lasselle Ramsay is a service provider that designs solutions for content and learning that align how users work with the information needed to achieve business results. We talked with Joan about her company, why they supported the research, and what surprised her about the results.

Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?) 

Lasselle: Lasselle Ramsay is a professional service provider, not a reseller or technology integrator. We focus on helping companies develop new product content. Our work spans the value chain, ranging from engineering (at the point of origin), to technical marketing and technical documentation, to learning organizations and support teams. We also look at the extended value chain, which includes partners, suppliers (like translation service providers), and customers.

We encourage our clients to operate in both the strategic and tactical domains, providing them with a strategic vision, and helping implement an infrastructure that can deliver structured and unstructured multilingual content.

Gilbane: Why did you choose to sponsor the Gilbane research?

Lasselle: One of our goals as a service provider is to add value at each stage across the chain. This research study enables us to discover and share the experience and perspective of industry leaders with Lasselle Ramsay clients. We chose this particular study because of the in-depth research, as well as Gilbane’s domain expertise and independence.

Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?

Lasselle: Gilbane’s report sheds light on two key issues that our clients face: the need to address content within the context of larger business trends [referred to as megatrends in the study], and the importance of process improvements. First, companies today are challenged repeatedly to address adverse economic pressures at the same time they respond to the megatrends, such as the evolving basis of competitive advantage. The report makes clear that companies must take measures to address these megatrends in their content practices, or risk being left behind. Even in the face of negative economics and an endless and escalating flood of new data, they cannot sit back and wait. Second, the report illustrates how organizations can benefit from improving cross-functional processes. In many companies, for example, engineering and tech pubs each have their own authoring, content management, translation, and publishing, and neither group shares any processes or tools. What a lost opportunity! Just think of how much they could lower costs and speed time to market if they coordinated processes and collaborated on process improvements.

For insights into the megatrends that are shaping content globalization practices, see “Market Context” on page 9 of the report. You can also read about how Lasselle Ramsay contributed to global content value chain development at Hewlett-Packard. Download the study for free.


Follow CMS Companies and Communities on Twitter

I moved our list of CMS Companies and Communities to a Twitter list. It’s a much easier way to share the list!


Content Management Trends and Topics at Upcoming Conference

We are ramping up for our annual Boston conference, and the program is mostly complete. Our tagline this year is "Content, Collaboration & Customers", and as usual, we’ll be discussing a wide range of related topics and covering all the important trends. Four areas we are paying extra attention to are:

Managing enterprise social content. This should not be a surprise. The increasing use of social software in business and government environments for both internal and customer communications means more content, of a different kind, to be managed.

Managing enterprise mobile content. Smartphones are replacing notebooks and desktops as clients for many enterprise applications, and complementing them for even more. Mobile is another enterprise channel with unique content requirements.

SharePoint & Office 2010 and web content management. As the SharePoint surge continues with the upcoming release of 2010, early signs point to increased emphasis on web content management and integration between WCM, Office and SharePoint. How will this affect the content management market?

E-government & transparency. We are seeing a lot of activity here among both state and federal agencies, and there are special content management challenges that in many (most?) projects mean integrating new technologies and practices (e.g., social software) with established information management approaches (e.g., XML, XBRL).

Stay tuned for updates, or follow the conference on Twitter at http://twitter.com/gilbaneboston.



Reflections on Gov 2.0 Expo and Summit

O’Reilly’s Gov 2.0 events took place last week. We’ve had some time to think about what the current wave of activity means to buyers and adopters of content technologies.

Both the Expo and Summit programs delivered a deluge of examples of exciting new approaches to connecting consumers of government services with the agencies and organizations that provide them. 

  • At the Expo on Sept 8, 25 speakers from organizations like NASA, TSA, US EPA, the City of Santa Cruz, the Utah Department of Public Safety, and the US Coast Guard provided five-minute overviews of their 2.0 applications in a sometimes dizzying, fast-paced format.
  • Sunlight Labs sponsored an Apps for America challenge that featured finalists who combined federal content available on Data.gov and open source software in some intriguing applications, including DataMasher, which enables you to mash up sources such as stats on numbers of high school graduates and guns per household.
  • The Summit on Sept 9 and 10 featured more applications plus star-status speakers including Aneesh Chopra, the US’s first CTO, operating within the White House Office of Science and Technology Policy; Vinton Cerf, currently VP and chief internet evangelist at Google; and Mitch Kapor.

A primary program theme was "government as platform," with speakers suggesting and debating just what that means. There was much thoughtful discussion, if not consensus. Rather than report it all here, we suggest interested readers search the Twitter hashtags #gov20e and #gov20s for comments.

From the first speaker on, we were immediately struck by the rapid pace of change in government action and attitude about content and data sharing. Our baseline for comparison is Gilbane’s last conference on content applications within government and non-profit agencies in June 2007. In presentations and casual conversations with attendees, it was clear that most organizations were operating as silos. There was little sharing or collaboration within and among organizations. Many attendees expressed frustration that this was so. When we asked what could be done to fix the problem, we distinctly remember one person saying that connecting with other content managers just within her own agency would be a huge improvement.

Fast forward a little over two years to last week’s Gov2.0 events. Progress towards internal collaboration, inter-agency data sharing, and two-way interaction between government and citizens is truly remarkable. At least three factors have created a perfect storm of conditions: the current administration’s vision and mandate for open government, broad acceptance of social interaction tools at the personal and organizational level, and technology readiness in the form of open source software that makes it possible to experiment at low cost and risk.

Viewing the events through Gilbane’s content-centric lens, we offer three takeaways:

  • Chopra indicated that the formal Open Government directives to agencies, to be released in several weeks, will include the development of "structured schedules" for making agency data available in machine-readable format. As Tim O’Reilly said while interviewing Chopra, posting "a bunch of PDFs" will not be sufficient for alignment with the directives. As a result, agencies will be accelerating the adoption of XML and the transformation of publishing practices to manage structured content. As large buyers of content technologies and services, government agencies are market influencers. We will be watching carefully for the impact of Open Government initiatives on the broader landscape for content technologies.
  • There was little mention of the role of content management as a business practice or technology infrastructure. This is not surprising, given that Gov2.0 wasn’t about content management. And while the programs comprised lots of show-and-tell examples, most were very heavy on show and very light on tell. But it does raise a question about how these applications will be managed, governed, and made sustainable and scalable. Add in the point above: structured content is now poised for wider adoption, creating demand for XML-aware content management solutions. Look for more discussion as agencies begin to acknowledge their content management challenges.
  • We didn’t hear a single mention of language issues in the sessions we attended, leaving us to wonder whether non-native English speakers who are eligible for government services will be disenfranchised in the move to Open Government.

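To make the "bunch of PDFs" versus machine-readable contrast concrete, here is a minimal sketch of what publishing structured agency data can look like. The dataset fields, values, and function name below are entirely hypothetical illustrations, not drawn from any actual directive or agency feed; the sketch simply shows a record expressed as XML using Python’s standard library.

```python
# Hypothetical sketch: expressing an agency data record as machine-readable
# XML rather than an opaque document. Field names are illustrative only.
import xml.etree.ElementTree as ET

def record_to_xml(record):
    """Convert a dict of data fields into an XML string with one
    child element per field."""
    root = ET.Element("record")
    for field, value in record.items():
        child = ET.SubElement(root, field)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_doc = record_to_xml({"agency": "EPA", "dataset": "air-quality", "year": 2009})
print(xml_doc)
# → <record><agency>EPA</agency><dataset>air-quality</dataset><year>2009</year></record>
```

Unlike a PDF, output like this can be parsed, validated, and recombined by downstream applications, which is what makes mashups of the Data.gov variety possible.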
All in all, thought-provoking, well-executed events. For details, videos of the sessions are available on the Gov2.0 site.


Component Content Management and the Global Content Value Chain: An Interview with Suzanne Mescan of Vasont

First in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

Recently we had an opportunity to catch up with Suzanne Mescan, Vice President of Marketing for Vasont Systems. Vasont is a leading provider of component content management systems built upon XML standards. Suzanne spoke with us about the global content value chain (GCVC) and important findings from the research.

Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?)
 
Mescan: We are the “manage” phase of the GCVC, providing component content management solutions that include multiple automatic and user-defined content reuse capabilities, project management, built-in workflow, integrated collaborative review, translation management, support for any DTD, and much more.
 
Gilbane: Why did you choose to sponsor the Gilbane research? 
 
Mescan: As part of the GCVC, we felt it was important for us, and for those organizations looking to change and enhance their product content strategies, to understand the positive trends and direction of the industry from beginning to end. Sponsorship enabled this research to take place through The Gilbane Group, a firm with its finger on the pulse of this space.
 
Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?
 
Mescan: The most interesting result in the report was that terminology management ranked highest among approaches to standardizing content creation, yet for half of the respondents it is still a manual, spreadsheet-based process. As the report notes, “paper-based style guidelines and glossaries did little to encourage real adoption.” Given that 80% of respondents consider it key to global customer experience, brand management, and quality and consistency, it is surprising that terminology management, like other content creation standardization practices, remains so manual.
 
For more about current terminology management practices, see "Achieving Quality at the Source" on page 28 of the Gilbane report. You can also read about how Vasont customer Mercury Marine is deploying content management as part of its global content value chain. Download the study for free.  

Conversations with Globalization Solution Providers

The research for Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices Into Global Content Value Chains was supported by seven companies with proven track records in content globalization. Their technologies and services are used by market-leading companies to create competitive advantage with multilingual content.

One of the goals of this blog is to provide buyers and adopters with a variety of perspectives on content globalization strategies, practices, and solutions. The Multilingual Product Content study is authored from our own analyst perspective, drawing on the results of research. The user perspective is captured in the profiles included in the report; they describe the global content value chains deployed at Adobe, BMW Motorrad, Cisco Systems, HP, Mercury Marine, and New York City Department of Education.

To bring the solution supplier perspective into the mix, over the next month or so we’ll publish a series of brief interviews with study sponsors Acrolinx, Jonckers, Lasselle Ramsay, LinguaLinx, STAR Group, Systran, and Vasont Systems. A representative from each company answers three questions:

  1. What role does your company play in the global content value chain?
  2. Why did you elect to sponsor Gilbane’s research?
  3. What was the most compelling or interesting result to come out of the research?

Readers will be able to comment on the interviews and ask questions of the supplier. We’ll post answers that are appropriate for sharing.

Our first interview with Suzanne Mescan from Vasont will be published next week.


Some Excellent Coverage of our Digital Publishing Report

Over at TeleRead, David Rothman has a really fine writeup discussing our digital publishing report. He summarizes some of our key points about asset management and flexibility, but also raises some interesting related issues about DRM and the risks of "publishers as mixmasters."

My thanks to David for his thoughtful response.


Gilbane at Localization World Silicon Valley

Mary Laplante, Senior Analyst, speaks on the topic of Overcoming Language Afterthought Syndrome:

Gilbane’s 2009 research on multilingual content indicates that global companies are making steady progress towards overcoming language afterthought syndrome – a pattern of treating language requirements as secondary considerations within their content strategies and solutions. This presentation delivers insight into how market-leading companies are adopting content globalization strategies, practices, and infrastructures that position language requirements as integral to end-to-end solutions rather than as ancillary post-processes. The session is designed for content and language professionals and managers who need to know how to bring capabilities like automated translation management, terminology management, multilingual multichannel publishing, and global content management into the mainstream. Takeaways include data and case studies that can be used in business cases to move language requirements out of the back room once and for all.

Localization World Silicon Valley, 20-22 October, Santa Clara Convention Center


Enterprise 2.0 is Neither a Crock Nor the Entire Solution

Dennis Howlett has once again started a useful and important debate, this time with his Irregular Enterprise blog post entitled Enterprise 2.0: what a crock. While I am sympathetic to some of the thinking he expressed, I felt the need to address one point Dennis raised and a question he asked.

I very much agree with this statement by Dennis:

"Like it or not, large enterprises – the big name brands – have to work in structures and hierarchies…"

However, I strongly disagree with his related contention ("the Big Lie" as he terms it) that:

"Enterprise 2.0 pre-supposes that you can upend hierarchies for the benefit of all."

Dennis also posed a question that probably echoes what many business leaders are asking:

"In the meantime, can someone explain to me the problem Enterprise 2.0 is trying to solve?"

Below is the comment that I left on Dennis’ blog. It begins to answer the final question he asked and to address my disagreement with his contention that Enterprise 2.0 advocates seek to create anarchy. Is my vision for the co-existence of structured and recombinant organizational and work models clear and understandable? Reasonable and viable? If not, I will expand my thoughts in a future post. Please let me know what you think.

Enterprise 2.0 is trying to solve problems on a couple of levels.

From a technology standpoint, E2.0 is addressing the failure of existing enterprise systems to provide users with a way to work through exceptions in defined business processes during their execution. E2.0 technology does this by helping the user identify and communicate with those who can help deal with the issue; it also creates a discoverable record of the solution for someone facing a similar issue in the future.

From an organizational and cultural perspective, E2.0 is defining a way of operating for companies that reflects the way work is actually accomplished — by peer-to-peer interaction, not through command and control hierarchy. Contrary to your view, E2.0 does not pre-suppose the destruction of hierarchy. Correctly implemented (philosophy and technology), E2.0 provides management a view of the company that is complementary to the organization chart.

Addendum: See this previous post for more of my perspective on the relationship of structured and ad hoc methods of working.


XML’s Role in Digital Archives

I have a new post over at EMC’s Community site, "Preserving Electronic Public Records: Lessons from the Washington State Digital Archives." This is part of our ongoing series for EMC on the use of ECM and XML in the public sector.
