Curated for content, computing, data, information, and digital experience professionals

Category: Content management & strategy

This category includes editorial and news blog posts related to content management and content strategy. For older, long-form reports, papers, and research on these topics, see our Resources page.

Content management is a broad topic that refers to the management of unstructured or semi-structured content, whether as a standalone system or as a component of another system. Varieties of content management systems (CMS) include: web content management (WCM), enterprise content management (ECM), component content management (CCM), and digital asset management (DAM) systems. Content management systems are also now widely marketed as digital experience management (DEM, DXM, or DXP) and customer experience management (CEM or CXM) systems or platforms, and may include additional marketing technology functions.

Content strategy topics include information architecture, content and information models, content globalization, and localization.

For some historical perspective see:

https://gilbane.com/gilbane-report-vol-8-num-8-what-is-content-management/

Unifying the Global Content Value Chain: An Interview with Lasselle Ramsay

Second in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

We spoke with Joan Lasselle, President of Lasselle Ramsay. Lasselle Ramsay is a service provider that designs solutions for content and learning that align how users work with the information needed to achieve business results. We talked with Joan about her company, why they supported the research, and what surprised her about the results.

Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?) 

Lasselle: Lasselle Ramsay is a professional service provider, not a reseller or technology integrator. We focus on helping companies develop new product content. Our work spans the value chain, ranging from engineering (at the point of origin), to technical marketing and technical documentation, to learning organizations and support teams. We also look at the extended value chain, which includes partners, suppliers (like translation service providers), and customers.

We encourage our clients to operate in both the strategic and tactical domains, providing them with a strategic vision, and helping implement an infrastructure that can deliver structured and unstructured multilingual content.

Gilbane: Why did you choose to sponsor the Gilbane research?

Lasselle: One of our goals as a service provider is to add value at each stage across the chain. This research study enables us to discover and share the experience and perspective of industry leaders with Lasselle Ramsay clients. We chose this particular study because of the in-depth research, as well as Gilbane’s domain expertise and independence.

Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?

Lasselle: Gilbane’s report sheds light on two key issues that our clients face: the need to address content within the context of larger business trends [referred to as megatrends in the study], and the importance of process improvements. First, companies today are challenged repeatedly to address adverse economic pressures at the same time they respond to the megatrends, such as the evolving basis of competitive advantage. The report makes clear that companies must take measures to address these megatrends in their content practices, or risk being left behind. Even in the face of negative economics and an endless and escalating flood of new data, they cannot sit back and wait. Second, the report illustrates how organizations can benefit from improving cross-functional processes. In many companies, for example, engineering and tech pubs each have their own authoring, content management, translation, and publishing, and neither group shares any processes or tools. What a lost opportunity! Just think of how much they could lower costs and speed time to market if they coordinated processes and collaborated on process improvements.

For insights into the megatrends that are shaping content globalization practices, see “Market Context” on page 9 of the report. You can also read about how Lasselle Ramsay contributed to global content value chain development at Hewlett-Packard. Download the study for free.

Content Management Trends and Topics at Upcoming Conference

We are ramping up for our annual Boston conference, and the program is mostly complete. Our tagline this year is “Content, Collaboration & Customers”, and as usual, we’ll be discussing a wide range of related topics and covering all the important trends. Four areas we are paying extra attention to are:

Managing enterprise social content. This should not be a surprise. The increasing use of social software in business and government environments for both internal and customer communications means more content, of a different kind, to be managed.

Managing enterprise mobile content. Smartphones are replacing notebooks and desktops as clients for many enterprise applications, and complementing them for even more. Mobile is another enterprise channel with unique content requirements.

SharePoint & Office 2010 and web content management. As the SharePoint surge continues with the upcoming release of 2010, early signs point to increased emphasis on web content management and integration between WCM, Office and SharePoint. How will this affect the content management market?

E-government & transparency. We are seeing a lot of activity here among both state and federal agencies, and there are special content management challenges that in many (most?) projects mean integrating new technologies and practices (e.g., social software) with established information management approaches (e.g., XML, XBRL).

Stay tuned for updates, or follow the conference on Twitter at http://twitter.com/gilbaneboston.

OpenLogic and Nuxeo Partner on Open Source Enterprise Content Management Stack

OpenLogic, Inc. and Nuxeo have announced they are partnering to provide top-to-bottom support on an ECM stack, which includes Nuxeo’s Enterprise Platform, the JBoss application server, and the PostgreSQL database. By supporting a specific stack of technologies, Nuxeo and OpenLogic are offering support to businesses of all sizes for a large open source ECM alternative that rivals the depth of functionality provided by proprietary vendors. OpenLogic is adding Nuxeo to the OpenLogic Certified Library, which contains a wide range of open source applications and infrastructure. Both OpenLogic and Nuxeo will sell and provide front-line support for the fully integrated Nuxeo enterprise content management stack, which includes JBoss and PostgreSQL. Support is available at a variety of service levels. http://www.openlogic.com / http://www.nuxeo.com

Reflections on Gov 2.0 Expo and Summit

O’Reilly’s Gov 2.0 events took place last week. We’ve had some time to think about what the current wave of activity means to buyers and adopters of content technologies.

Both the Expo and Summit programs delivered a deluge of examples of exciting new approaches to connecting consumers of government services with the agencies and organizations that provide them.

  • At the Expo on Sept 8, 25 speakers from organizations like NASA, TSA, US EPA, City of Santa Cruz, Utah Department of Public Safety, and the US Coast Guard provided five-minute overviews of their 2.0 applications in a sometimes dizzying, fast-paced format.
  • Sunlight Labs sponsored an Apps for America challenge that featured finalists who combined federal content available on Data.gov and open source software in some intriguing applications, including DataMasher, which enables you to mash up sources such as stats on numbers of high school graduates and guns per household.
  • The Summit on Sept 9 and 10 featured more applications plus star-status speakers including Aneesh Chopra, the US’s first CTO operating under the Federal Office of Science and Technology Policy; Vinton Cerf, currently VP and evangelist at Google; and Mitch Kapor.

A primary program theme was “government as platform,” with speakers suggesting and debating just what that means. There was much thoughtful discussion, if not consensus. Rather than report, interested readers can search Twitter hash tags #gov20e and #gov20s for comments.

From the first speaker on, we were immediately struck by the rapid pace of change in government action and attitude about content and data sharing. Our baseline for comparison is Gilbane’s last conference on content applications within government and non-profit agencies in June 2007. In presentations and casual conversations with attendees, it was clear that most organizations were operating as silos. There was little sharing or collaboration within and among organizations. Many attendees expressed frustration that this was so. When we asked what could be done to fix the problem, we distinctly remember one person saying that connecting with other content managers just within her own agency would be a huge improvement.

Fast forward a little over two years to last week’s Gov 2.0 events. Progress towards internal collaboration, inter-agency data sharing, and two-way interaction between government and citizens is truly remarkable. At least three factors have created a perfect storm of conditions: the current administration’s vision and mandate for open government, broad acceptance of social interaction tools at the personal and organizational level, and technology readiness in the form of open source software that makes it possible to experiment at low cost and risk.

Viewing the events through Gilbane’s content-centric lens, we offer three takeaways:

  • Chopra indicated that the formal Open Government directives to agencies, to be released in several weeks, will include the development of “structured schedules” for making agency data available in machine-readable format. As Tim O’Reilly said while interviewing Chopra, posting “a bunch of PDFs” will not be sufficient for alignment with the directives. As a result, agencies will be accelerating the adoption of XML and the transformation of publishing practices to manage structured content. As a large buyer of content technologies and services, government agencies are market influencers. We will be watching carefully for the impact of Open Government initiatives on the broader landscape for content technologies.
  • There was little mention of the role of content management as a business practice or technology infrastructure. This is not surprising, given that Gov 2.0 wasn’t about content management. And while the programs comprised lots of show-and-tell examples, most were very heavy on show and very light on tell. But it does raise a question about how these applications will be managed, governed, and made sustainable and scalable. Add in the point above: structured content is now poised for wider adoption, creating demand for XML-aware content management solutions. Look for more discussion as agencies begin to acknowledge their content management challenges.
  • We didn’t hear a single mention of language issues in the sessions we attended. This leaves us wondering whether non-native English speakers who are eligible for government services will be disenfranchised in the move to Open Government.

All in all, thought-provoking, well-executed events. For details, videos of the sessions are available on the Gov2.0 site.

Component Content Management and the Global Content Value Chain: An Interview with Suzanne Mescan of Vasont

First in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

Recently we had an opportunity to catch up with Suzanne Mescan, Vice President of Marketing for Vasont Systems. Vasont is a leading provider of component content management systems built upon XML standards. Suzanne spoke with us about the global content value chain (GCVC) and important findings from the research.

 Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?)  
 
Mescan: We are the “manage” phase of the GCVC, providing component content management solutions that include multiple automatic and user-defined content reuse capabilities, project management, built-in workflow, integrated collaborative review, translation management, support for any DTD, and much more.
 
Gilbane: Why did you choose to sponsor the Gilbane research? 
 
Mescan: As part of the GCVC, we felt it was important for us, and for those organizations looking to change and enhance their product content strategies, to understand the positive trends and direction of the industry from beginning to end. Being a sponsor enabled this research to take place through The Gilbane Group, which has its finger on the pulse of this space in the industry.
 
Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?
 
Mescan: The most interesting result in the report was that terminology management ranked highest among approaches to standardizing content creation, and yet terminology management is still a manual, spreadsheet-based process for half of the respondents. As the report notes, “paper-based style guidelines and glossaries did little to encourage real adoption.” Given that 80% of respondents identified terminology management as key to global customer experience, brand management, and quality and consistency, it is surprising that it, along with other content creation standardization practices, is still such a manual process.
 
For more about current terminology management practices, see “Achieving Quality at the Source” on page 28 of the Gilbane report. You can also read about how Vasont customer Mercury Marine is deploying content management as part of its global content value chain. Download the study for free.  

Avantstar Releases Transit Solutions 9

Avantstar, Inc. has announced the release of Transit Solutions 9, file conversion and web publishing software that takes content originally designed for print or offline use, such as policies and procedures information, and publishes it as fully formatted web content. Transit Solutions is designed to help IT departments and web content managers keep up with the high volume of website changes generated every day, not only by automatically converting and publishing hundreds of file types as HTML, but also by recognizing and automatically creating links within documents (not just among them). Transit also publishes entire information folders on a preset schedule. New features in Transit Solutions 9 include: a redesigned Edit Template window that combines preview, publication hierarchy, and properties into one window, making it easier to access commonly used features for editing web page templates; enhanced support for embedded content in Microsoft Word 2007, Excel 2007, and PowerPoint 2007; support for merged cells in Excel and tables within tables; and enhanced conversion support for the StarOffice and OpenOffice suites. http://www.avantstar.com

Conversations with Globalization Solution Providers

The research for Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices Into Global Content Value Chains was supported by seven companies with proven track records in content globalization. Their technologies and services are used by market-leading companies to create competitive advantage with multilingual content.

One of the goals of this blog is to provide buyers and adopters with a variety of perspectives on content globalization strategies, practices, and solutions. The Multilingual Product Content study is authored from our own analyst perspective, drawing on the results of research. The user perspective is captured in the profiles included in the report; they describe the global content value chains deployed at Adobe, BMW Motorrad, Cisco Systems, HP, Mercury Marine, and New York City Department of Education.

To bring the solution supplier perspective into the mix, over the next month or so we’ll publish a series of brief interviews with study sponsors Acrolinx, Jonckers, Lasselle-Ramsay, LinguaLinx, STAR Group, Systran, and Vasont Systems. A representative from each company answers three questions:

  1. What role does your company play in the global content value chain?
  2. Why did you elect to sponsor Gilbane’s research?
  3. What was the most compelling or interesting result to come out of the research?

Readers will be able to comment on the interviews and ask questions of the supplier. We’ll post answers that are appropriate for sharing.

Our first interview with Suzanne Mescan from Vasont will be published next week.

MadCap Lingo 3.0 for Authors and Translators Released

MadCap Software has announced that MadCap Lingo 3.0 is now available. MadCap Lingo, the XML-based, fully integrated translation memory system (TMS) and authoring tool, eliminates the need for file transfers to complete translation, preserving valuable content and formatting to deliver a consistent experience across multiple languages. With version 3.0, MadCap Lingo adds a new Project Packager function that bridges the gap between authors and translators who use other TMS software. Using the Project Packager, authors should be able to work with translators to streamline the translation process, track completion status, and obtain more accurate project cost estimates. MadCap Lingo 3.0 also features a new TermBase Editor for creating databases of reusable translated terms, and enhanced translation memory. Through integration between MadCap Lingo and MadCap’s authoring and multimedia applications, MadCap hopes to offer a powerful integrated authoring and localization workflow.

Project Packager is designed to make it easier for authors who need their documentation translated into another language but work with a translator who relies on a TMS tool other than MadCap Lingo. Using Project Packager, the author can create a MadCap Lingo project with all the files that require translation, bundle it in a ZIP file, and send it to the translator. MadCap Lingo displays a list of all files that need to be translated, going beyond text to include skins, glossaries, search filter sets, and much more. As a result, the author can ensure that the translator receives all of the files requiring translation. This should streamline the process while enabling more accurate translation project estimates, helping translators avoid accidentally underestimating project costs based on an incomplete file count, and protecting authors from unexpected cost overruns.

Once the translation is complete, the translator sends back a ZIP file with the content. The author then simply merges the translated files in MadCap Lingo, which is used to confirm the completeness of the translation. The author can then run statistical reports showing, for each project and file, what has and has not been translated, how many words and segments have been translated or still need to be translated, and much more. The author can then export the MadCap Lingo project to a range of outputs, such as a Flare project file for online and print publishing, a Word document, or even a Darwin Information Typing Architecture (DITA) file, among others.

The key new features of MadCap Lingo 3.0 are:

  • A new TermBase Editor, which enables translators to create and manage concept-oriented, multilingual terminology databases (“termbases”), making it significantly easier to reuse translated terms
  • The ability to import and export Term Base eXchange (TBX) files, an open, XML-based standard used for exchanging structured terminological data
  • A translation memory “Apply Suggestions to Project” function, which makes it possible to view and automatically apply translation memory suggestions to an entire project, rather than just one segment, saving hours of effort
  • A dynamic help window pane lock that lets the translator keep the current help topic frozen in place while moving around in the MadCap Lingo interface, making it easier to follow steps or other information in the Help topic
  • A minimize-to-system-tray option
  • Multiple file support, which allows multiple files (for example HTM, HTML, XML, DITA, or DOC files) to be selected when creating a new MadCap Lingo project

http://www.madcapsoftware.com/
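For readers unfamiliar with the TBX format mentioned above: TBX (ISO 30042) is plain XML, so termbase entries can be inspected with any XML tooling. The sketch below extracts term pairs from a minimal TBX document using Python's standard library; the element names follow the classic TBX/martif layout, and the sample entry is invented for illustration. Termbases exported by real tools such as MadCap Lingo may use a different TBX dialect, so treat this as a rough illustration rather than a parser for any specific product's output.

```python
# Minimal sketch: extracting term pairs from a TBX (TermBase eXchange) document.
# Element names follow the classic TBX/martif layout; sample terms are invented.
import xml.etree.ElementTree as ET

SAMPLE_TBX = """<?xml version="1.0"?>
<martif type="TBX" xml:lang="en">
  <text>
    <body>
      <termEntry id="c1">
        <langSet xml:lang="en"><tig><term>content management</term></tig></langSet>
        <langSet xml:lang="de"><tig><term>Content-Management</term></tig></langSet>
      </termEntry>
    </body>
  </text>
</martif>
"""

XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"  # the xml:lang attribute

def term_pairs(tbx_xml: str) -> dict:
    """Return {concept entry id: {language code: term}} for each termEntry."""
    root = ET.fromstring(tbx_xml)
    entries = {}
    for entry in root.iter("termEntry"):
        terms = {}
        for lang_set in entry.iter("langSet"):
            # Each langSet holds one language's term(s) for the concept.
            terms[lang_set.get(XML_LANG)] = lang_set.findtext(".//term")
        entries[entry.get("id")] = terms
    return entries

print(term_pairs(SAMPLE_TBX))
# → {'c1': {'en': 'content management', 'de': 'Content-Management'}}
```

Because each `termEntry` groups all languages for one concept, a structure like this maps directly onto the concept-oriented termbases the TermBase Editor is described as managing.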


© 2025 The Gilbane Advisor
