
Unifying the Global Content Value Chain: An Interview with Lasselle Ramsay

Second in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

We spoke with Joan Lasselle, President of Lasselle Ramsay. Lasselle Ramsay is a service provider that designs content and learning solutions, aligning how users work with the information they need to achieve business results. We talked with Joan about her company, why they supported the research, and what surprised her about the results.

Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?) 

Lasselle: Lasselle Ramsay is a professional service provider, not a reseller or technology integrator. We focus on helping companies develop new product content. Our work spans the value chain, ranging from engineering (at the point of origin), to technical marketing and technical documentation, to learning organizations and support teams. We also look at the extended value chain, which includes partners, suppliers (like translation service providers), and customers.

We encourage our clients to operate in both the strategic and tactical domains, providing them with a strategic vision, and helping implement an infrastructure that can deliver structured and unstructured multilingual content.

Gilbane: Why did you choose to sponsor the Gilbane research?

Lasselle: One of our goals as a service provider is to add value at each stage across the chain. This research study enables us to discover and share the experience and perspective of industry leaders with Lasselle Ramsay clients. We chose this particular study because of the in-depth research, as well as Gilbane’s domain expertise and independence.

Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?

Lasselle: Gilbane’s report sheds light on two key issues that our clients face: the need to address content within the context of larger business trends [referred to as megatrends in the study], and the importance of process improvements. First, companies today are repeatedly challenged to address adverse economic pressures at the same time they respond to the megatrends, such as the evolving basis of competitive advantage. The report makes clear that companies must address these megatrends in their content practices or risk being left behind. Even in the face of a difficult economy and an endless, escalating flood of new data, they cannot sit back and wait. Second, the report illustrates how organizations can benefit from improving cross-functional processes. In many companies, for example, engineering and tech pubs each have their own authoring, content management, translation, and publishing processes, and the two groups share no tools or workflows. What a lost opportunity! Just think of how much they could lower costs and speed time to market if they coordinated and collaborated on process improvements.

For insights into the megatrends that are shaping content globalization practices, see “Market Context” on page 9 of the report. You can also read about how Lasselle Ramsay contributed to global content value chain development at Hewlett-Packard. Download the study for free.

eTouch Releases SamePage 4.2

eTouch has announced the release of SamePage version 4.2, with a customizable dashboard, stronger email integration, and advanced plug-in features designed to meet the needs of a remote, social-media-friendly workforce. SamePage 4.2 includes a number of new features important to remote workers and knowledge workers, including:

  • Advanced email integration that lets remote workers more easily “wikify” content from any smartphone
  • A dynamic, portal-like dashboard with drag-and-drop widgets that can be personalized at the company and individual levels, eliminating the need for a separate intranet portal product
  • A wide selection of new plug-ins that can be customized for each enterprise, including tag cloud and project member plug-ins, among others
  • More detailed analytics, usage, and content reports for administrators
  • Ratings on blog posts and more blog analytics
  • Support for Microsoft Office 2007
  • WYSIWYG enhancements to further simplify the user interface
  • Increased privileges for the system user or administrator to manage pages and projects

http://www.etouch.net/

Content Management Trends and Topics at Upcoming Conference

We are ramping up for our annual Boston conference, and the program is mostly complete. Our tagline this year is “Content, Collaboration & Customers”, and as usual we’ll be discussing a wide range of related topics and covering all the important trends. Four areas we are paying extra attention to are:

Managing enterprise social content. This should not be a surprise. The increasing use of social software in business and government environments for both internal and customer communications means more content, of a different kind, to be managed.

Managing enterprise mobile content. Smartphones are replacing notebooks and desktops as clients for many enterprise applications, and complementing them for even more. Mobile is another enterprise channel with unique content requirements.

SharePoint & Office 2010 and web content management. As the SharePoint surge continues with the upcoming release of 2010, early signs point to increased emphasis on web content management and integration between WCM, Office and SharePoint. How will this affect the content management market?

E-government & transparency. We are seeing a lot of activity here among both state and federal agencies, and there are special content management challenges that in many (most?) projects mean integrating new technologies and practices (e.g., social software) with established information management approaches (e.g., XML, XBRL).

Stay tuned for updates, or follow the conference on Twitter at http://twitter.com/gilbaneboston.

Jive Announces Jive Market Engagement

Jive has announced Jive Market Engagement, a new solution that combines the power of social media monitoring with Social Business Software. Jive’s Market Engagement Solution aims to help organizations implement a unified social media strategy for interacting with customers. The solution helps socialize observations from Twitter, Facebook, blogs, and other online sources so companies can move faster in the moment of pain or the moment of opportunity. While a small percentage of organizations use monitoring tools, most companies rely on a patchwork of alerts, people, and email to stay on top of conversations. These approaches may help organizations listen in on the river of conversations occurring across the social web, but they don’t allow companies to measure, share, and engage with insights in a productive and timely way. With Jive Market Engagement, organizations can listen, measure, and share insights to identify issues in real time, collaborate internally with colleagues, and ultimately engage in active dialogues with customers and influencers. The Jive Market Engagement Solution is designed to help organizations proactively monitor brand or product issues and competitive threats; enable quick collaboration on appropriate responses or interventions; and elevate and broaden the social conversations with the company. The “war room” environment allows for rapid decision-making by the right people within the organization. Observations are consolidated into market summary reports, which Jive calls “Viewpoints.” Viewpoints can be shared within an organization to foster collaboration and to develop and implement appropriate responses. Jive Market Engagement also provides the ability to analyze the effectiveness of those responses. www.jivesoftware.com/

Adobe to Acquire Omniture

Adobe Systems Incorporated (Nasdaq:ADBE) and Omniture, Inc. (Nasdaq:OMTR) have announced a definitive agreement for Adobe to acquire Omniture in a transaction valued at approximately $1.8 billion on a fully diluted equity-value basis. Under the terms of the agreement, Adobe will commence a tender offer to acquire all of the outstanding common stock of Omniture for $21.50 per share in cash. By combining Adobe’s content creation tools and clients with Omniture’s web analytics, measurement, and optimization technologies, Adobe intends to deliver solutions that enhance engaging experiences and e-commerce across all digital content, platforms, and devices. For designers, developers, and online marketers, an integrated workflow, with optimization capabilities embedded in the creation tools, will streamline the creation and delivery of relevant content and applications. This optimization should help advertisers, advertising agencies, publishers, and e-tailers realize greater ROI from their digital media investments and improve their end users’ experiences. http://www.adobe.com http://www.omniture.com

OpenLogic and Nuxeo Partner on Open Source Enterprise Content Management Stack

OpenLogic, Inc. and Nuxeo have announced a partnership to provide top-to-bottom support for an ECM stack comprising the Nuxeo Enterprise Platform, the JBoss application server, and the PostgreSQL database. By supporting a specific stack of technologies, Nuxeo and OpenLogic are offering businesses of all sizes support for an open source ECM alternative that rivals the depth of functionality provided by proprietary vendors. OpenLogic is adding Nuxeo to the OpenLogic Certified Library, which contains a wide range of open source applications and infrastructure. Both OpenLogic and Nuxeo will sell and provide front-line support for the fully integrated Nuxeo enterprise content management stack, including JBoss and PostgreSQL, at a variety of service levels. http://www.openlogic.com http://www.nuxeo.com
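As a rough illustration of what “top-to-bottom” support spans, here is a minimal smoke-test sketch of the stack’s tiers in Python. The hostnames are hypothetical and the ports are common defaults (PostgreSQL on 5432, JBoss HTTP on 8080, with the Nuxeo web application deployed on JBoss); this is an assumption-laden sketch, not anything shipped by either vendor.

```python
# Check that each tier of the ECM stack described above is reachable:
# PostgreSQL (storage), JBoss (application server), and the Nuxeo web
# application deployed on JBoss. Hosts are placeholders; ports are defaults.
import socket

TIERS = [
    ("PostgreSQL", "db.example.com", 5432),   # default PostgreSQL port
    ("JBoss",      "app.example.com", 8080),  # default JBoss HTTP port
    ("Nuxeo",      "app.example.com", 8080),  # Nuxeo runs inside JBoss here
]

def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, host, port in TIERS:
    status = "up" if reachable(host, port) else "DOWN"
    print(f"{name:<10} {host}:{port} {status}")
```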

Reflections on Gov 2.0 Expo and Summit

O’Reilly’s Gov 2.0 events took place last week. We’ve had some time to think about what the current wave of activity means to buyers and adopters of content technologies.

Both the Expo and Summit programs delivered a deluge of examples of exciting new approaches to connecting consumers of government services with the agencies and organizations that provide them.

  • At the Expo on Sept 8, 25 speakers from organizations like NASA, TSA, US EPA, the City of Santa Cruz, the Utah Department of Public Safety, and the US Coast Guard provided five-minute overviews of their 2.0 applications in a sometimes dizzying, fast-paced format.
  • Sunlight Labs sponsored an Apps for America challenge that featured finalists who combined federal content available on Data.gov and open source software in some intriguing applications, including DataMasher, which enables you to mash up sources such as stats on numbers of high school graduates and guns per household.
  • The Summit on Sept 9 and 10 featured more applications plus star-status speakers, including Aneesh Chopra, the US’s first CTO, operating within the Office of Science and Technology Policy; Vinton Cerf, currently VP and evangelist at Google; and Mitch Kapor.

A primary program theme was “government as platform,” with speakers suggesting and debating just what that means. There was much thoughtful discussion, if not consensus. Rather than recap it here, we suggest interested readers search the Twitter hashtags #gov20e and #gov20s for comments.

From the first speaker on, we were immediately struck by the rapid pace of change in government action and attitude about content and data sharing. Our baseline for comparison is Gilbane’s last conference on content applications within government and non-profit agencies in June 2007. In presentations and casual conversations with attendees, it was clear that most organizations were operating as silos. There was little sharing or collaboration within and among organizations. Many attendees expressed frustration that this was so. When we asked what could be done to fix the problem, we distinctly remember one person saying that connecting with other content managers just within her own agency would be a huge improvement.

Fast forward a little over two years to last week’s Gov 2.0 events. Progress towards internal collaboration, inter-agency data sharing, and two-way interaction between government and citizens is truly remarkable. At least three factors have created a perfect storm of conditions: the current administration’s vision and mandate for open government, broad acceptance of social interaction tools at the personal and organizational level, and technology readiness in the form of open source software that makes it possible to experiment at low cost and risk.

Viewing the events through Gilbane’s content-centric lens, we offer three takeaways:

  • Chopra indicated that the formal Open Government directives to agencies, to be released in several weeks, will include the development of “structured schedules” for making agency data available in machine-readable format. As Tim O’Reilly said while interviewing Chopra, posting “a bunch of PDFs” will not be sufficient for alignment with the directives. As a result, agencies will be accelerating the adoption of XML and the transformation of publishing practices to manage structured content (a minimal sketch of what “machine-readable” means in practice follows this list). As large buyers of content technologies and services, government agencies are market influencers. We will be watching carefully for the impact of Open Government initiatives on the broader landscape for content technologies.
  • There was little mention of the role of content management as a business practice or technology infrastructure. This is not surprising, given that Gov 2.0 wasn’t about content management. And while the programs comprised lots of show-and-tell examples, most were very heavy on show and very light on tell. But it does raise a question about how these applications will be managed, governed, and made sustainable and scalable. Add in the point above, that structured content is now poised for wider adoption, and demand for XML-aware content management solutions will follow. Look for more discussion as agencies begin to acknowledge their content management challenges.
  • We didn’t hear a single mention of language issues in the sessions we attended, leaving us to wonder whether non-native English speakers who are eligible for government services will be disenfranchised in the move to Open Government.
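To make the “machine-readable, not a bunch of PDFs” point concrete, here is a minimal sketch in Python of republishing tabular agency data as XML. The field names echo the DataMasher example above and are purely illustrative; they are not any agency’s actual schema.

```python
# Convert tabular (CSV) agency data into XML so that downstream
# applications can parse it; this is the kind of machine-readable
# publishing the Open Government directives call for. Fields are invented.
import csv
import io
import xml.etree.ElementTree as ET

SAMPLE = """state,hs_graduates,households_with_guns
Utah,35000,410000
Vermont,7000,90000
"""

def rows_to_xml(csv_text: str) -> ET.Element:
    """Build a <dataset> element with one <record> per CSV row."""
    root = ET.Element("dataset")
    for row in csv.DictReader(io.StringIO(csv_text)):
        record = ET.SubElement(root, "record")
        for field, value in row.items():
            ET.SubElement(record, field).text = value
    return root

print(ET.tostring(rows_to_xml(SAMPLE), encoding="unicode"))
```

The schema itself matters less than the fact that a downstream application can parse the result, which is exactly what a scanned PDF does not allow.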

All in all, thought-provoking, well-executed events. For details, videos of the sessions are available on the Gov2.0 site.

Component Content Management and the Global Content Value Chain: An Interview with Suzanne Mescan of Vasont

First in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

Recently we had an opportunity to catch up with Suzanne Mescan, Vice President of Marketing for Vasont Systems. Vasont is a leading provider of component content management systems built upon XML standards. Suzanne spoke with us about the global content value chain (GCVC) and important findings from the research.

Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?)
 
Mescan: We are the “manage” phase of the GCVC, providing component content management solutions that include multiple automatic and user-defined content reuse capabilities, project management, built-in workflow, integrated collaborative review, translation management, support for any DTD, and much more.
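To make the reuse idea concrete for readers new to component content management, here is a toy sketch in Python of a DITA-style conref: a document references a shared component by id, and a resolver inlines the current version wherever it is referenced. The element and attribute names are illustrative only and say nothing about Vasont’s internals.

```python
# Resolve component references: wherever a document element carries a
# conref attribute, replace it with a copy of the matching component
# from the shared library. Markup below is a toy vocabulary.
import copy
import xml.etree.ElementTree as ET

LIBRARY = ET.fromstring(
    "<library>"
    "<warning id='esd'>Discharge static electricity before opening the case.</warning>"
    "</library>"
)

DOC = ET.fromstring(
    "<topic><p conref='esd'/><p>Remove the four screws.</p></topic>"
)

def resolve(doc: ET.Element, library: ET.Element) -> ET.Element:
    components = {c.get("id"): c for c in library}
    for parent in doc.iter():
        for i, child in enumerate(list(parent)):
            ref = child.get("conref")
            if ref is not None:
                # Inline the shared component in place of the reference.
                parent[i] = copy.deepcopy(components[ref])
    return doc

print(ET.tostring(resolve(DOC, LIBRARY), encoding="unicode"))
```

Because every document pulls the component from a single source, an update to the shared warning propagates everywhere it is reused, and only the component itself needs retranslation.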
 
Gilbane: Why did you choose to sponsor the Gilbane research? 
 
Mescan: As part of the GCVC, we felt it was important, for us and for organizations looking to change and enhance their product content strategies, to understand the positive trends and direction of the industry from beginning to end. Being a sponsor enabled this research to take place through The Gilbane Group, a team that has its finger on the pulse of this space.
 
Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?
 
Mescan: The most interesting result in the report was that terminology management ranked highest among approaches to standardizing content creation, yet for half of the respondents it is still a manual process based on a spreadsheet. As the report notes, “paper-based style guidelines and glossaries did little to encourage real adoption.” Given that 80% of respondents see terminology management as key to global customer experience, brand management, and quality and consistency, it is surprising that it, like other content creation standardization practices, remains such a manual process.
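As an illustration of what moving beyond the manual spreadsheet might look like, here is a minimal sketch in Python, assuming a glossary exported to CSV with hypothetical deprecated/preferred columns, that flags deprecated terms in draft content.

```python
# Load a terminology glossary (exported from a spreadsheet as CSV) and
# flag deprecated terms in a draft. Column names and terms are invented
# for illustration.
import csv
import io
import re

GLOSSARY_CSV = """deprecated,preferred
hard drive,hard disk drive
e-mail,email
"""

def load_glossary(csv_text: str) -> dict:
    """Map each deprecated term (lowercased) to its preferred form."""
    return {
        row["deprecated"].lower(): row["preferred"]
        for row in csv.DictReader(io.StringIO(csv_text))
    }

def check_terms(text: str, glossary: dict) -> list:
    """Return (offset, deprecated, preferred) for each violation found."""
    findings = []
    for deprecated, preferred in glossary.items():
        pattern = r"\b" + re.escape(deprecated) + r"\b"
        for match in re.finditer(pattern, text, re.IGNORECASE):
            findings.append((match.start(), deprecated, preferred))
    return sorted(findings)

draft = "Replace the hard drive and send an e-mail to support."
for offset, bad, good in check_terms(draft, load_glossary(GLOSSARY_CSV)):
    print(f"offset {offset}: use '{good}' instead of '{bad}'")
```

Even this small step turns a passive glossary into an enforceable check at the point of authoring.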
 
For more about current terminology management practices, see “Achieving Quality at the Source” on page 28 of the Gilbane report. You can also read about how Vasont customer Mercury Marine is deploying content management as part of its global content value chain. Download the study for free.  