Curated for content, computing, data, information, and digital experience professionals

Category: Content management & strategy

This category includes editorial and news blog posts related to content management and content strategy. For older, long-form reports, papers, and research on these topics, see our Resources page.

Content management is a broad topic that refers to the management of unstructured or semi-structured content as a standalone system or a component of another system. Varieties of content management systems (CMS) include: web content management (WCM), enterprise content management (ECM), component content management (CCM), and digital asset management (DAM) systems. Content management systems are also now widely marketed as Digital Experience Management (DEM or DXM, DXP), and Customer Experience Management (CEM or CXM) systems or platforms, and may include additional marketing technology functions.

Content strategy topics include information architecture, content and information models, content globalization, and localization.

For some historical perspective see:

https://gilbane.com/gilbane-report-vol-8-num-8-what-is-content-management/

Asbru Web Content Management v7.0 released

The Asbru Web Content Management system v7.0 for .NET, PHP, and JSP/Java has been released. This version:

  • Adds functionality to check the server settings required or used by the web content management system. By default, the initial system check verifies required configuration files and folders and file create/write permissions. Additional custom system check scripts can be added to check and report on any other server settings of interest, both for users in general and for specific local setups.
  • Adds functionality to automatically analyse your existing website files (HTML files, Dreamweaver templates, images, and other files) and import them into the web content management system, for migrating from an existing “static” HTML file-based website to an Asbru-managed one. Dreamweaver templates are identified and converted to templates in the web content management system, and defined “editable regions” are identified and converted to content classes/elements.
  • Adds functionality to define your own custom website settings and special codes for using these settings on your website pages and templates, as well as in your website style sheets and scripts.
  • Adds “content format” functionality for simple text content items and for exact control over HTML code details for special requirements.
  • Adds support for multiple style sheets per page and template.
  • Adds selective database backup and export of specific types of website data.

http://asbrusoft.com/

Apples and Oranges: The SaaS Dialog

Most buyers of content technologies understand the key differences between acquiring a content solution as a service and licensing software for installation on servers behind their firewall. Less well understood, however, is the impact of those differences on the acquisition process. With SaaS, you’re not "buying" technology, as with licensed software; you’re entering a services agreement to access a business solution that includes software, applications, and infrastructure. The value proposition is very different, as is the basis for evaluating its fit with the organization’s needs.

The current worldwide economic situation is causing many organizations to take a serious look at SaaS offers as a strategy for continuing to move forward with critical business initiatives. Our latest Gilbane Beacon was developed to help companies evaluate SaaS fairly, with the goal of helping our readers find the best solution, regardless of technology delivery model. Communicating SaaS WCM Value: A Guide to Understanding the Business Case for Software-as-a-Service Solutions for Web Content Management explains why SaaS and licensed software are apples and oranges. The paper identifies the issues that matter–and those that don’t–when considering SaaS solutions for web content management. Available for download on our site.

On Stimulating Open Data Initiatives

Yesterday the big stimulus bill cleared the conference committee that resolves the Senate and House versions. If you remember your civics, that means it is likely to pass in both chambers and then be signed into law by the president.

Included in the bill are billions of dollars for digitizing important information such as medical records and government information. Wow! That is a lot of investment! The thinking is that inaccessible information locked in paper or proprietary formats costs us billions each year in productivity. Wow! That’s a lot of waste! Also, access to the information could spawn billions of dollars of new products and services, and therefore income and tax revenue. Wow! That’s a lot of growth!

Many agencies and offices have striven to expose useful official information and reports at the federal and state level. Even so, a lot of data is still locked away, incomplete, or in difficult-to-use forms. A Senate official once told me that they do not maintain a single, complete, accurate, official copy of the US Statutes internally. Even if this is no longer true, the public often relies on the “trusted” versions that are available only through paid online services. Many other data types, like many medical records, exist only on paper.

There are a lot of challenges, such as security and privacy issues, even intellectual property rights issues. But there are a lot of opportunities too. There are thousands of data sources that could be tapped into that are currently locked in paper or proprietary formats.

I don’t think the benefits will come at the expense of commercial services already selling this publicly owned information, as some may fear. These online sites provide a service, often emphasizing timeliness or value-adds like integrating useful data from different sources, in exchange for their fees. I think a combination of free government open data resources and delivery tools, plus innovative commercial products, will emerge. Some easily obtained data may become commoditized, but new ways of accessing and integrating information will emerge. The big information services probably have more to fear from startups than from free government applications and data.

As it happens, I saw a demo yesterday of a tool that took all the activity of a state legislature and unified it under one portal. This allows people to track a bill and all related activity in a single place. For free! (There are other services you can pay for, which fund the site.) The bill working its way through both chambers is connected to related hearing agendas and minutes, which are connected to schedules, with status and other information captured in a concise, dashboard-like screen format. Each information component came from a different office and was originally in its own specialized format. What we were really looking at was a custom data integration application, done with AJAX technology, integrating heterogeneous data in a unified view. Very powerful, and yet scalable. The key to its success was strong integration of data: the connections used to tie the information together. The vendor collected and filtered the data, converted it to a common format, and added the linkage and relationship information to provide an integrated view into the data. All source data is stored separately and maintained by the different offices. Five years ago it would have been much more difficult to create this service. Technology has advanced, and the data are increasingly available in manageable forms.
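The integration pattern described above (sources maintained separately, converted to a common format, tied together by shared keys) can be sketched very roughly. This is a minimal illustration, not the vendor's actual system; the feeds, field names, and bill numbers are all invented:

```python
# Three hypothetical feeds from different legislative offices, each
# already converted to a common dict format and keyed by bill number.
bills = {"HB-101": {"title": "Open Records Act", "status": "In committee"}}
hearings = [{"bill": "HB-101", "date": "2009-02-12", "room": "A3"}]
minutes = [{"bill": "HB-101", "summary": "Testimony from state archivist"}]

def unified_view(bill_id: str) -> dict:
    """Join the separately maintained sources into one dashboard record."""
    view = dict(bills.get(bill_id, {}))
    view["hearings"] = [h for h in hearings if h["bill"] == bill_id]
    view["minutes"] = [m for m in minutes if m["bill"] == bill_id]
    return view

print(unified_view("HB-101")["status"])         # In committee
print(len(unified_view("HB-101")["hearings"]))  # 1
```

The point of the design is that each office keeps maintaining its own data; only the linkage layer is new.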

The government produces a lot of information that affects us daily and that we, as taxpayers and citizens, actually own, but have limited or no access to. This includes statutes and regulations, court cases, census data, scientific data and research, agricultural reports, SEC filings, FDA drug information, taxpayer publications, forms, patent information, health guidelines, and much more. The list is really long; I am not even scratching the surface! It also includes more interactive and real-time data, such as geological and water data, weather information, and the status of regulation and legislation changes (like reporting on the progress of the stimulus bill as it worked its way through both chambers). All of these can be made more current, expanded for more coverage, integrated with related materials, and validated for accuracy. There are also new opportunities to open up the process by using forums and social media tools to collect feedback from constituents and experts (like the demo mentioned above). Social media tools may both give people an avenue to express their ideas to their elected officials and serve as a collection tool to gather raw data that can be analyzed for trends and statistics, which in turn becomes new government data that we can use.

IMHO, this investment in open government data is a powerful catalyst that could create or change many jobs and business models. If done well, it could provide significant positive returns, streamline government, open access to more information, and enable new and interesting products and applications.

WoodWing Releases Enterprise 6 Content Publishing Platform

WoodWing Software has released Enterprise 6, the latest version of the company’s content publishing platform. Equipped with a new editing application called “Content Station”, Enterprise 6 offers article planning tools, direct access to any type of content repository, and integrated Web delivery functionality. Content Station allows users to create articles for delivery to the Web, print, and mobile devices, and offers out-of-the-box integration with the open-source Web content management system Drupal. Content Station works with Enterprise’s new server plug-ins to allow users to search, select, and retrieve content stored in other third-party repositories such as digital asset management systems, archives, and wire systems. Video, audio, and text files can then be collected into “dossiers”, edited, and set for delivery to a variety of outputs, all from a single user interface. A built-in XML editor lets authors create documents intended solely for digital output. The content planning application lets managers assign content to users both inside and outside of the office. Enterprise’s Web publishing capabilities feature a direct integration with Drupal. Content authors click a single button to preview or deliver content directly to Drupal and get information such as page views, ratings, and comments back from the Web CMS. And if something needs to be pulled from the site, editors can simply click “Unpublish”. They don’t have to contact a separate Web editor or navigate through another system’s interface. The server plug-in architecture also allows any other Web content management system to be connected. http://www.woodwing.com/

Should you Migrate from SGML to XML?

An old colleague of mine from more than a dozen years ago found me on LinkedIn today. And within five minutes we got caught up after a gap of several years. I know, reestablishing lost connections happens all the time on social media sites. I just get a kick out of it every time it happens. But this is the XML blog, not the social media one, so…

My colleague works at a company that has been using SGML and XML technology for more than 15 years. Their data is still in SGML. They feel they can always export to XML and do not plan to migrate their content and applications to XML any time soon. The funny thing was that he was slightly embarrassed about still being in SGML.

Wait a minute! There is no reason to think SGML is dead and has to be replaced. Not in general. For specific applications a business case may support the upgrade, but it doesn’t have to every time. Not yet.

I know of several organizations that still manage data in the SGML they developed years ago. Early adopters, like several big publishers, some state and federal government applications, and financial systems, built their systems when there was only one choice. SGML, like XML, is a structured format. They are very, very similar; one format can be used to create the other quite easily. These organizations have already sunk their investment into developing the SGML system and data, as well as training their users in its use. The incremental benefits of moving to XML do not support the costs of the migration. Not yet.

This brings up my main point: structured data can be managed in many forms, including XML, SGML, XHTML, databases, and probably others. The data may be structured, following rules for hierarchy, occurrence, data typing, and so on, yet not be managed as XML, only exported as XML when needed. My personal opinion is that XML stored in databases provides one of the best combinations of structured content management features, but different business needs suggest a variety of approaches may be suitable. Flat files stored in folders and formatted in old-school SGML might still be enough and not warrant migration. Then again, it depends on the environment and the business objectives.

When XML first came out, someone quipped that SGML stood for “Sounds Good, Maybe Later” because it was more expensive and difficult to implement. XML is more Web-aware and somewhat more clearly defined, so tools operate more consistently. Many organizations that felt SGML could not be justified were later able to justify migrating to XML. Others migrated right away to take advantage of the new tools and related standards. XML also eliminates some SGML features that never seemed to work quite right. And it demands well-formed data, which reduces ambiguity and simplifies a few things. Tools have come a long way and are much more numerous, as expected.
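The well-formedness requirement is easy to see with any off-the-shelf XML parser. As a minimal sketch using Python's standard library (the sample markup is invented for illustration), markup that SGML tag minimization might tolerate is a hard error in XML:

```python
import xml.etree.ElementTree as ET

# An SGML DTD can declare end tags optional, so markup like
# "<para>First<para>Second" may be legal SGML. XML forbids it:
# every element must be explicitly closed.
sgml_style = "<doc><para>First<para>Second</doc>"
well_formed = "<doc><para>First</para><para>Second</para></doc>"

def is_well_formed(markup: str) -> bool:
    """Return True if the markup parses as well-formed XML."""
    try:
        ET.fromstring(markup)
        return True
    except ET.ParseError:
        return False

print(is_well_formed(sgml_style))   # False: the first <para> is never closed
print(is_well_formed(well_formed))  # True
```

Because every conforming parser enforces the same rule, tools can rely on it, which is a large part of why XML tooling behaves more consistently than SGML tooling did.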

XML is definitely more successful in terms of number and range of applications and XML adoption is an easier case to make today than SGML was back in the day. But many existing SGML applications still have legs. I would not suggest that a new application start off with SGML today, but I might modify the old saying to “Sounds Good, Migrate Later”.

So, when is it a good idea to migrate from SGML to XML? There are many tools that do things with XML data better than they do with other structured forms. Many XML tools support SGML as well, but DBMSs can now manage content as an XML data type and use XPath expressions in processing. Wikis and other tools can produce XML content and utilize other XML-based standards, but not SGML, as far as I am aware. If you want to take advantage of the features of Web or XML tools, you might want to start planning your migration. But if your system is operational and stable, the benefits might not yet justify the investment and disruption of migrating. Not yet!
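As a small illustration of the kind of XPath-style querying that XML toolchains make routine, here is a sketch using Python's standard library; the document and element names are invented, not drawn from any real statute database:

```python
import xml.etree.ElementTree as ET

# A tiny structured document, similar in spirit to converted SGML content.
doc = ET.fromstring("""
<statutes>
  <statute id="s1"><title>Open Data Act</title><year>2009</year></statute>
  <statute id="s2"><title>Records Act</title><year>1999</year></statute>
</statutes>
""")

# XPath-style selection: every statute title, in document order.
titles = [t.text for t in doc.findall("./statute/title")]
print(titles)  # ['Open Data Act', 'Records Act']

# Filter by a child value using ElementTree's limited XPath predicates.
recent = doc.findall("./statute[year='2009']/title")
print([t.text for t in recent])  # ['Open Data Act']
```

Equivalent queries run directly inside XML-aware databases, which is the kind of leverage that can eventually tip the migration business case.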

“It’s not information overload. It’s filter failure.”

Thanks to Clay Shirky for the title; watch his Web 2.0 Expo NY presentation here.

This is also the mantra for Alltop.com, self-described as a “digital magazine rack” of the Internet. As an analyst I get a lot of stuff and peruse twice the amount I get. Now that I’m twittering (@lciarlone) and face-bookin’ in addition to being a long-time LinkedIn user… well, you know the story. Not enough time in the day.

I’m looking for — and finding — ways to streamline keeping up to date. Alltop’s darn efficient in helping me do that, organizing “stories” by category per hour. And I certainly like that our own Content Globalization blog surfaces within All the Top Linguistics News.

Curious about the method and the genesis? Check it out.

Gilbane San Francisco pre-conference workshops posted

The main conference program for Gilbane San Francisco 2009 will be published in a week or two, but the half-day pre-conference workshop descriptions for June 2nd have been posted:

  • How to Select a Web Content Management System
    Instructor: Seth Gottlieb, Principal, Content Here
  • Making SharePoint Work in the Enterprise
    Instructor: Shawn Shell, Principal, Consejo, Inc.
  • Managing the Web: The Fundamentals of Web Operations Management
    Instructor: Lisa Welchman, Founding Partner, Welchman Pierpoint
  • Getting Started with Business Taxonomy Design
    Instructors: Joseph A. Busch, Founder and Principal, & Ron Daniel, Principal, Taxonomy Strategies LLC
  • Sailing the Open Seas of New Media
    Instructor: Chris Brogan, President, New Marketing Labs, LLC
