Curated for content, computing, and digital experience professionals

Category: Content management & strategy

This category includes editorial and news blog posts related to content management and content strategy. For older, long-form reports, papers, and research on these topics see our Resources page.

Content management is a broad topic that refers to the management of unstructured or semi-structured content, either as a standalone system or as a component of another system. Varieties of content management systems (CMS) include: web content management (WCM), enterprise content management (ECM), component content management (CCM), and digital asset management (DAM) systems. Content management systems are also now widely marketed as Digital Experience Management (DEM, DXM, or DXP) and Customer Experience Management (CEM or CXM) systems or platforms, and may include additional marketing technology functions.

Content strategy topics include information architecture, content and information models, content globalization, and localization.

For some historical perspective see:

https://gilbane.com/gilbane-report-vol-8-num-8-what-is-content-management/

Ingres Launches Open Source Enterprise Content Management Offering

Ingres Corporation announced the availability of the Ingres Icebreaker Enterprise Content Management (ECM) Appliance. Powered by Alfresco’s open source enterprise content management software, the appliance gives businesses a way to manage business content growth. Like other commercial open source solutions, it lets IT purchasers pay only for the software and support they actually need. Ingres provides the open source database for a company’s data repository needs, while Alfresco provides the content management expertise, with its technology running on the Ingres Database. The appliance brings the two open technologies together on the open source Linux operating system, integrating the operating system, the database, and the ECM technology so that the whole stack is installed, managed, and maintained as a unit. It allows users to capture, store, and preserve data, assist in the management of data, and deliver data to customers and partners. To download the Ingres Icebreaker ECM Appliance, go to http://esd.ingres.com or http://www.ingres.com

 

2009 Gilbane San Francisco Conference to Focus on Business Impact of Global Content Management and Social Media

One of the most closely watched technology trends for 2009 is the need for enterprises to leverage new web platforms while holding down the costs of managing their ever-proliferating content. Even in the midst of the current economic downturn, critical areas related to global business content will see double-digit growth over the next 12 months, including spending on search technologies, which is projected to represent almost half of all digital spending by businesses in 2009, along with rising corporate investment in social media tools (Source: Winterberry Group). Reflecting these fundamental shifts in the way enterprises engage with customers and disseminate information, the sixth annual Gilbane San Francisco conference, June 2-4, 2009, at the Westin Hotel in San Francisco, will focus on the timely theme “Where Content Management Meets Social Media.” Produced by The Gilbane Group and Lighthouse Seminars, the 2009 Gilbane San Francisco conference will offer enhanced programs tied to the business issues surrounding a company’s marketing, technical and enterprise content. The conference tracks have been organized around the four major areas of how enterprises use Web and content technologies and where they are most likely to invest: Web Business & Engagement; Managing Collaboration & Social Media; Enterprise Content: Searching, Integrating & Publishing; and Content Infrastructure. Gilbane San Francisco brings together industry experts from leading technology, enterprise IT, analyst, and consulting firms who provide attendees with the latest successful content management and new media strategies, technologies and techniques. In addition to the latest best practices, technology coverage within these four tracks will include enterprise and site search; content globalization; semantic technologies; publishing; XML; and social media tools and platforms from Twitter to business blogs, project wikis and microformats.
The just-published schedule of conference sessions can be viewed at http://gilbanesf.com/conference-schedule.html.

Pre-conference workshops will feature industry thought leaders covering core topics in web content, new media, SharePoint, and more. The full schedule of workshops can be found at http://gilbanesf.com/workshops.html. IT and business professionals involved in content creation, management, delivery or analytics wishing to attend the conference may register at: http://gilbanesf.com/registration_information.html. Technology solution providers wishing to exhibit or sponsor should visit: http://gilbanesf.com/exhibitors_sponsors.html. Follow the conference on Twitter:
http://twitter.com/gilbanesf

 

Asbru Web Content Management v7.0 released

The Asbru Web Content Management system v7.0 for .NET, PHP and JSP/Java has been released. This version adds functionality to check the server settings required or used by the system: by default, the initial check verifies required configuration files and folders and file create/write permissions, and additional custom check scripts can be added to report on any other server settings of interest, either generally or for specific local setups. It also adds functionality to automatically analyse existing website files (HTML files, Dreamweaver templates, images and other files) and import them into the system, for migration from a “static” HTML file-based website to one managed by Asbru; Dreamweaver templates are identified and converted to templates in the system, and defined “editable regions” are identified and converted to content classes/elements. Users can now define their own custom website settings and use special codes to apply those settings on website pages and templates as well as in website style sheets and scripts. The release also adds “content format” functionality for simple text content items and for exact control over HTML code details for special requirements, support for multiple style sheets per page and template, and selective database backup and export of specific types of website data. http://asbrusoft.com/

Apples and Oranges: The SaaS Dialog

Most buyers of content technologies understand the key differences between acquiring a content solution as a service and licensing software for installation on servers behind their firewall. Less well understood, however, is the impact of those differences on the acquisition process. With SaaS, you’re not "buying" technology, as with licensed software; you’re entering a services agreement to access a business solution that includes software, applications, and infrastructure. The value proposition is very different, as is the basis for evaluating its fit with the organization’s needs.

The current worldwide economic situation is causing many organizations to take a serious look at SaaS offerings as a strategy for continuing to move forward with critical business initiatives. Our latest Gilbane Beacon was developed to help companies evaluate SaaS fairly, with the goal of helping our readers find the best solution, regardless of technology delivery model. Communicating SaaS WCM Value: A Guide to Understanding the Business Case for Software-as-a-Service Solutions for Web Content Management explains why SaaS and licensed software are apples and oranges. The paper identifies the issues that matter, and those that don’t, when considering SaaS solutions for web content management. It is available for download on our site.

On Stimulating Open Data Initiatives

Yesterday the big stimulus bill cleared the conference committee that resolves the Senate and House versions. If you remember your civics, that means it is likely to pass in both chambers and then be signed into law by the president.

Included in the bill are billions of dollars for digitizing important information such as medical records or government information. Wow! That is a lot of investment! The thinking is that inaccessible information locked in paper or proprietary formats costs us billions each year in productivity. Wow! That’s a lot of waste! Also, that access to the information could spawn billions of dollars of new products and services, and therefore income and tax revenue. Wow! That’s a lot of growth!

Many agencies and offices have striven to expose useful official information and reports at the federal and state level. Even so, a lot of data is still locked away, incomplete, or in difficult-to-use forms. A Senate official once told me that they do not maintain a single, complete, accurate, official copy of the US Statutes internally. Even if this is no longer true, the public often relies on the “trusted” versions that are available only through paid online services. Many other data types, like many medical records, exist only on paper.

There are a lot of challenges, such as security, privacy, and even intellectual property rights issues. But there are a lot of opportunities too. Thousands of data sources that could be tapped are currently locked in paper or proprietary formats.

I don’t think the benefits will come at the expense of commercial services already selling this publicly owned information, as some may fear. These online sites provide a service, often emphasizing timeliness or value adds like integrating useful data from different sources, in exchange for their fees. I think a combination of free government open data resources and delivery tools, plus innovative commercial products, will emerge. Some easily obtained data may become commoditized, but new ways of accessing and integrating information will appear. The big information services probably have more to fear from startups than from free government applications and data.

As it happens, I saw a demo yesterday of a tool that took all the activity of a state legislature and unified it under one portal. This allows people to track a bill and all related activity in a single place. For free! The bill working its way through both chambers is connected to related hearing agendas and minutes, which are connected to schedules, with status and other information captured in a concise dashboard-like screen format (there are other services you can pay for, which fund the site). Each information component came from a different office and was originally in its own specialized format. What we were really looking at was a custom data integration application done with AJAX technology, integrating heterogeneous data in a unified view. Very powerful, and yet scalable. The key to its success was strong integration of data, the connections that were used to tie the information together. The vendor collected and filtered the data, converted it to a common format, and added the linkage and relationship information to provide an integrated view into the data. All source data is stored separately and maintained by different offices. Five years ago it would have been a lot more difficult to create the service. Technology has advanced, and the data are increasingly available in manageable forms.
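The integration pattern behind that demo, collecting records from separate offices, converting them to a common format, and linking them by a shared key, can be sketched in a few lines. This is a minimal illustration only; the record fields, bill numbers, and function names below are invented, not taken from the actual service.

```python
# Sketch: unify legislative data from separate sources under one view,
# keyed by a shared bill identifier. All names and fields are hypothetical.

bills = [
    {"bill_id": "HB-101", "title": "Open Data Act", "status": "In committee"},
]
hearings = [
    {"bill_id": "HB-101", "date": "2009-02-12", "room": "Hearing Room A"},
    {"bill_id": "HB-101", "date": "2009-03-03", "room": "Hearing Room C"},
]

def unified_view(bills, hearings):
    """Build one dashboard-style record per bill, linking related hearings."""
    by_bill = {}
    for b in bills:
        by_bill[b["bill_id"]] = {**b, "hearings": []}
    for h in hearings:
        if h["bill_id"] in by_bill:
            by_bill[h["bill_id"]]["hearings"].append(
                {"date": h["date"], "room": h["room"]})
    return by_bill

view = unified_view(bills, hearings)
print(view["HB-101"]["status"])          # In committee
print(len(view["HB-101"]["hearings"]))   # 2
```

The real system did the hard part upstream, filtering and normalizing each office's feed into the common format; once the data share a key and a format, the unified view itself is straightforward.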

The government produces a lot of information that affects us daily and that we, as taxpayers and citizens, actually own but have limited or no access to. This includes statutes and regulations, court cases, census data, scientific data and research, agricultural reports, SEC filings, FDA drug information, taxpayer publications, forms, patent information, health guidelines, and much more. I am not even scratching the surface! It also includes more interactive and real-time data, such as geological and water data, weather information, and the status of regulation and legislation changes (like reporting on the progress of the stimulus bill as it worked its way through both chambers). All of these can be made more current, expanded for more coverage, integrated with related materials, and validated for accuracy. There are also new opportunities to open up the process by using forums and social media tools for collecting feedback from constituents and experts (like the demo mentioned above). Social media tools may both give people an avenue to express their ideas to their elected officials and serve as a collection tool to gather raw data that can be analyzed for trends and statistics, which in turn becomes new government data that we can use.

IMHO, this investment in open government data is a powerful catalyst that could actually create or change many jobs and business models. If done well, it could provide significant positive returns, streamline government, open access to more information, and enable new and interesting products and applications.

WoodWing Releases Enterprise 6 Content Publishing Platform

WoodWing Software has released Enterprise 6, the latest version of the company’s content publishing platform. Equipped with a new editing application called “Content Station”, Enterprise 6 offers article planning tools, direct access to any type of content repository, and integrated Web delivery functionality. Content Station allows users to create articles for delivery to the Web, print, and mobile devices, and offers out-of-the-box integration with the open-source Web content management system Drupal. Content Station works with Enterprise’s new server plug-ins to allow users to search, select, and retrieve content stored in other third-party repositories such as digital asset management systems, archives, and wire systems. Video, audio, and text files can then be collected into “dossiers”, edited, and set for delivery to a variety of outputs, all from a single user interface. A built-in XML editor lets authors create documents intended solely for digital output. The content planning application lets managers assign content to users both inside and outside of the office. Enterprise’s Web publishing capabilities feature a direct integration with Drupal. Content authors click a single button to preview or deliver content directly to Drupal and get information such as page views, ratings, and comments back from the Web CMS. And if something needs to be pulled from the site, editors can simply click “Unpublish”. They don’t have to contact a separate Web editor or navigate through another system’s interface. The server plug-in architecture also allows any other Web content management system to be connected. http://www.woodwing.com/

Should you Migrate from SGML to XML?

An old colleague of mine from more than a dozen years ago found me on LinkedIn today. And within five minutes we got caught up after a gap of several years. I know, reestablishing lost connections happens all the time on social media sites. I just get a kick out of it every time it happens. But this is the XML blog, not the social media one, so…

My colleague works at a company that has been using SGML and XML technology for more than 15 years. Their data is still in SGML. They feel they can always export to XML and do not plan to migrate their content and applications to XML any time soon. The funny thing was that he was slightly embarrassed about still being in SGML.

Wait a minute! There is no reason to think SGML is dead and has to be replaced. Not in general. Maybe for specific applications a business case supports the upgrade, but it doesn’t have to every time. Not yet.

I know of several organizations that still manage data in the SGML systems they developed years ago. Early adopters, like several big publishers, some state and federal government applications, and financial systems, got started when there was only one choice. SGML, like XML, is a structured format; they are very, very similar, and one format can be used to create the other very easily. These organizations have already sunk their investment into developing the SGML system and data, as well as training their users in its use. The incremental benefits of moving to XML do not support the costs of the migration. Not yet.

This brings up my main point: structured data can be managed in many forms, including XML, SGML, XHTML, databases, and probably others. The data may be structured, following rules for hierarchy, occurrence, data typing, and so on, yet not be managed as XML, only exported as XML when needed. My personal opinion is that XML stored in databases provides one of the best combinations of structured content management features, but different business needs suggest a variety of approaches may be suitable. Flat files stored in folders and formatted in old-school SGML might still be enough and not warrant migration. Then again, it depends on the environment and the business objectives.

When XML first came out, someone joked that SGML stood for “Sounds Good, Maybe Later” because it was more expensive and difficult to implement. XML is more Web aware and somewhat more clearly defined, so tools operate more consistently. Many organizations that felt SGML could not be justified were later able to justify migrating to XML. Others migrated right away to take advantage of the new tools or related standards. XML also eliminates some features of SGML that never seemed to work right. It demands well-formed data, which reduces ambiguity and simplifies a few things. And tools have come a long way and are much more numerous, as expected.
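The well-formedness requirement is easy to see in practice. SGML allowed shortcuts like omitted end tags when the DTD permitted them; an XML parser rejects such markup outright. A minimal sketch, using Python's standard library parser and invented example documents:

```python
# Sketch: XML parsers require well-formed markup; SGML-style tag
# minimization (omitted end tags) is rejected. Documents are illustrative.
import xml.etree.ElementTree as ET

well_formed = "<doc><p>First paragraph</p><p>Second paragraph</p></doc>"
# SGML could allow omitted </p> end tags via the DTD; XML cannot.
sgml_style = "<doc><p>First paragraph<p>Second paragraph</doc>"

ET.fromstring(well_formed)  # parses without error

try:
    ET.fromstring(sgml_style)
except ET.ParseError as e:
    print("rejected:", e)
```

Because every XML document either parses or fails in the same predictable way, tools do not need the per-DTD inference machinery that made SGML parsers expensive to build, which is much of why the XML tool ecosystem grew so quickly.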

XML is definitely more successful in terms of number and range of applications and XML adoption is an easier case to make today than SGML was back in the day. But many existing SGML applications still have legs. I would not suggest that a new application start off with SGML today, but I might modify the old saying to “Sounds Good, Migrate Later”.

So, when is it a good idea to migrate from SGML to XML? There are many tools available that do things with XML data better than they do with other structured forms. Many XML tools support SGML as well, but DBMS systems can now manage content as an XML data type and use XPath expressions in processing. Wikis and other tools can produce XML content and use other XML-based standards, but not SGML, as far as I am aware. If you want to take advantage of the features of Web or XML tools, you might want to start planning your migration. But if your system is operational and stable, the benefits might not yet justify the investment and disruption of migrating. Not yet!
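The XPath-style processing mentioned above is the kind of capability that tips the business case. A small sketch using Python's standard library, which supports a subset of XPath; the statute document and its element structure are invented for illustration:

```python
# Sketch: querying XML content with XPath-style path expressions, a form
# of processing widely available for XML but not for SGML.
import xml.etree.ElementTree as ET

xml_doc = """
<statutes>
  <title num="1">
    <section num="101">Definitions</section>
    <section num="102">Scope</section>
  </title>
</statutes>
"""

root = ET.fromstring(xml_doc)

# Path expression: every section under any title element.
for s in root.findall("./title/section"):
    print(s.get("num"), s.text)

# Predicate: select the one section whose num attribute is "102".
scope = root.find(".//section[@num='102']")
print(scope.text)  # Scope
```

Databases with a native XML type expose the same idea in SQL, so once content is in XML, queries like "find every section amended this year" become one expression instead of a custom parsing program, which is exactly the sort of incremental benefit that can eventually justify a migration.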


© 2024 The Gilbane Advisor
