Gilbane

Category: Enterprise Publishing

Understanding the Smart Content Technology Landscape

If you have been following recent XML Technologies blog entries, you will have noticed that we have been talking a lot lately about XML smart content: what it is and the benefits it can bring to an organization. These include flexible, dynamic assembly for delivery to different audiences, search optimization to improve customer experience, and improvements for distributed collaboration. These are great targets to aim for, but you may ask: are we ready to pursue these opportunities? It might help to better understand the technology landscape involved in creating and delivering smart content.

The figure below illustrates the technology landscape for smart content. At the center are fundamental XML technologies for creating modular content, managing it as discrete chunks (with or without a formal content management system), and publishing it in an organized fashion. These are the basic technologies for “one source, one output” applications, sometimes referred to as Single Source Publishing (SSP) systems.

Smart Content Landscape

The innermost ring contains capabilities that are needed even when using a dedicated word processor or layout tool, including editing, rendering, and some limited content storage capabilities. In the middle ring are the technologies that enable single-sourcing content components for reuse in multiple outputs. They include a more robust content management environment, often with workflow management tools, as well as multi-channel formatting and delivery capabilities and structured editing tools. The outermost ring includes the technologies for smart content applications, which are described below in more detail.

Note that smart content solutions rely on structured editing, component management, and multi-channel delivery as foundational capabilities, augmented with content enrichment, topic component assembly, and social publishing across a distributed network. Descriptions of the additional capabilities needed for smart content applications follow.

Content Enrichment / Metadata Management: Once a descriptive metadata taxonomy is created or adopted, its use for content enrichment will depend on tools for analyzing and/or applying the metadata. These can be manual dialogs, automated scripts and crawlers, or a combination of approaches. Automated scripts can be created to interrogate the content to determine what it is about and to extract key information for use as metadata. Automated tools are efficient and scalable, but generally do not apply metadata with the same accuracy as manual processes. Manual processes, while ensuring better enrichment, are labor intensive and not scalable for large volumes of content. A combination of manual and automated processes and tools is the most likely approach in a smart content environment. Taxonomies may evolve over time and require administrative tools for editorial control and term management.
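To make the hybrid approach concrete, here is a minimal sketch of the automated half of such an enrichment pipeline. The taxonomy terms, trigger keywords, and the two-hit confidence threshold are all illustrative assumptions, not part of any particular product; the point is only that high-confidence matches are applied automatically while borderline matches are routed to a human editor.

```python
import re

# Illustrative controlled taxonomy: term -> trigger keywords.
# A real taxonomy and its matching rules would be far richer.
TAXONOMY = {
    "installation": ["install", "setup", "mount"],
    "troubleshooting": ["error", "failure", "diagnose"],
    "configuration": ["configure", "settings", "parameter"],
}

def auto_tag(text, min_hits=2):
    """Suggest taxonomy terms for a content component.

    Terms with at least `min_hits` keyword matches are applied
    automatically; terms with exactly one match are routed to a
    human editor (the manual half of the hybrid process).
    """
    words = re.findall(r"[a-z]+", text.lower())
    applied, review = [], []
    for term, keywords in TAXONOMY.items():
        hits = sum(words.count(k) for k in keywords)
        if hits >= min_hits:
            applied.append(term)
        elif hits == 1:
            review.append(term)
    return {"applied": applied, "needs_review": review}

topic = "To install the unit, mount the rack and configure the settings."
print(auto_tag(topic))
# {'applied': ['installation', 'configuration'], 'needs_review': []}
```

In practice the automated pass would use entity extraction or classification rather than keyword counting, but the division of labor — machine suggests, editor confirms — is the same.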

Component Discovery / Assembly: Once data has been enriched, tools for searching and selecting content based on the enrichment criteria will enable more precise discovery and access. Search mechanisms can use metadata to improve search results compared to full text searching. Information architects and organizers of content can use smart searching to discover what content exists, and what still needs to be developed to proactively manage and curate the content. These same discovery and searching capabilities can be used to automatically create delivery maps and dynamically assemble content organized using them.
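The discovery and curation capabilities described above can be sketched in a few lines. The component records, metadata fields, and product names below are hypothetical; the sketch shows faceted metadata lookup (in place of full-text search) and a simple coverage-gap report of the kind an information architect might use to decide what still needs to be written.

```python
# Each component carries descriptive metadata applied during enrichment.
components = [
    {"id": "t1", "type": "task", "product": "X200", "audience": "installer"},
    {"id": "c1", "type": "concept", "product": "X200", "audience": "end-user"},
    {"id": "t2", "type": "task", "product": "X300", "audience": "installer"},
]

def find(components, **criteria):
    """Faceted lookup: match on metadata fields rather than full text."""
    return [c for c in components
            if all(c.get(k) == v for k, v in criteria.items())]

def coverage_gaps(components, products, types):
    """Curation aid: which (product, type) combinations have no content?"""
    have = {(c["product"], c["type"]) for c in components}
    return [(p, t) for p in products for t in types if (p, t) not in have]

print([c["id"] for c in find(components, product="X200", type="task")])
print(coverage_gaps(components, ["X200", "X300"], ["task", "concept"]))
```

The same metadata queries that drive discovery can feed dynamic assembly: an ordered list of `find()` results is, in effect, a machine-generated delivery map.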

Distributed Collaboration / Social Publishing: Componentized information lends itself to a more granular update and maintenance process, enabling several users to simultaneously work on topics that will appear in a single deliverable, which compresses schedules. Subject matter experts, both remote and local, may be included in review and content creation processes at key steps. Users of the information may want to “self-organize” the content of greatest interest to them, and even augment or comment upon specific topics. A distributed social publishing capability will enable a broader range of contributors to participate in the creation, review, and updating of content in new ways.

Federated Content Management / Access: Smart content solutions can integrate content without duplicating it in multiple places, rather accessing it across the network in the original storage repository. This federated content approach requires the repositories to have integration capabilities to access content stored in other systems, platforms, and environments. A federated system architecture will rely on interoperability standards (such as CMIS), system agnostic expressions of data models (such as XML Schemas), and a robust network infrastructure (such as the Internet).
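A minimal sketch of the federated pattern follows: each repository is wrapped in an adapter exposing one common query interface, so content stays in its original store and is fetched across the network on demand rather than being duplicated. The adapter classes, repository names, and document contents are all hypothetical; a production system would speak an interoperability standard such as CMIS rather than this toy interface.

```python
class RepositoryAdapter:
    """Common interface every wrapped repository must expose."""
    def query(self, term):
        raise NotImplementedError

class InMemoryRepo(RepositoryAdapter):
    """Stands in for a remote XML store; holds {id: text}."""
    def __init__(self, docs):
        self.docs = docs

    def query(self, term):
        return [doc_id for doc_id, text in self.docs.items() if term in text]

class FederatedCatalog:
    """Fans a query out to every registered repository; no copies made."""
    def __init__(self):
        self.repos = {}

    def register(self, name, repo):
        self.repos[name] = repo

    def query(self, term):
        return {name: repo.query(term) for name, repo in self.repos.items()}

catalog = FederatedCatalog()
catalog.register("docs", InMemoryRepo({"d1": "install guide", "d2": "spec"}))
catalog.register("support", InMemoryRepo({"kb9": "install error fix"}))
print(catalog.query("install"))
# {'docs': ['d1'], 'support': ['kb9']}
```

The design choice worth noting is that federation lives entirely in the adapters: adding a new repository type means writing one adapter, not changing the catalog or re-ingesting content.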

These capabilities address a broader range of business activity and, therefore, fulfill more business requirements than single-source content solutions. Assessing your ability to implement these capabilities is essential in evaluating your organization's readiness for a smart content solution.

What’s Next with Smart Content?

Over the past few weeks, since publishing Smart Content in the Enterprise, I’ve had several fascinating lunchtime conversations with colleagues concerned about content technologies. Our exchanges wind up with a familiar refrain that goes something like this. “Geoffrey, you have great insights about smart content but what am I supposed to do with all this information?” Ah, it’s the damning with faint praise gambit that often signals an analysis paralysis conundrum for decision-making.

Let me make one thing perfectly clear — I do not have an out-of-the-box prescription for a solution. It’s not simply a matter of focusing on your customer experience, optimizing your content for search, investing in a component content management platform, or adopting DITA – although, depending on the situation, I may recommend some combination of these items as part of a smart content strategy.

For me, smart content remains a work in progress. I expect to develop the prescriptive road map in the months ahead. Here’s a quick take on where I am right now.

  • For publishers, it’s all about transforming the publishing paradigm through content enrichment – defining the appropriate level of granularity and then adding the semantic metadata for automated processing.
  • For application developers, it’s all about getting the information architecture right and ensuring that it’s extensible. There needs to be sensible storage, the right editing and management tools, multiple methods for organizing content, as well as a flexible rendering and production environment.
  • For business leaders and decision makers, there needs to be an upfront investment in the right set of content technologies that will increase profits, reduce operating costs, and mitigate risks. No, I am not talking about rocket science. But you do need a technology strategy and a business plan.

As highlighted by the case studies included in the report, I can point to multiple examples where organizations have done the right things to produce notable results. Dale and I will continue the smart content discussions at the Gilbane Boston conference right after Thanksgiving, both through our preconference workshop, and at a conference session “Smart Content in the Real World: Case Studies and Real Results.”

We are also launching a Smart Content Readiness Service, where we will engage with organizations on a consulting basis to identify:

  • The business drivers where smart content will ensure competitive advantage when distributing business information to customers and stakeholders
  • The technologies, tools, and skills required to componentize content and to target distribution to various audiences using multiple devices
  • The operational roles and governance needed to support smart content development and deployment across an organization
  • The implementation planning strategies and challenges involved in upgrading content creation and delivery environments

Please contact me if you are interested in learning more.

In short, to answer my lunchtime colleagues, I cannot (yet) prescribe a fully baked solution. It’s too early for the recipes and the cookbook. But I do believe that the business opportunities and benefits are readily at hand. At this point, I would invite you to join the discussion by letting me know what you expect, what approaches you’ve tried, where you’ve wound up, what you think needs to come next – and how we might help you.

The Pull of Content Value

Traditionally, publishing is a pushy process. When I have something to say, I write it down. Perhaps I revise it, check with colleagues, and verify my facts with appropriate authorities. Then I publish it, and move on to the next thing – without directly interacting with my audience and stakeholders. Whether I distribute the content electronically or in a hard copy format, I leave it to my readers to determine the value of whatever I publish.

However, as we describe in our recently completed report Smart Content in the Enterprise, XML applications can transform this conventional publishing paradigm. By smart content, we mean content that is granular at the appropriate level, semantically rich, useful across applications, and meaningful for collaborative interaction.

From a business perspective, smart content adds value to published information in new and compelling ways. Let’s consider the experiences of NetApp and Warrior Gateway, two of the organizations featured in our report.

NetApp
As a provider of storage and data management solutions, NetApp has invested a lot of time and effort embracing DITA and restructuring its technical documentation. By systematically tagging and managing content components, and by focusing on the underlying content development processes, writers and editors can keep up with the pace of product releases.

But there is more to this publishing process orientation. Beyond simply producing product information faster and cheaper, NetApp is poised to make publishing better. The company can now easily support its reseller partners by providing them with the DITA tagged content that they can directly incorporate into their own OEM solutions. Resellers’ customers get just the information they need, directly from the source. With its XML application, NetApp incorporates its partners and stakeholders into its information value chain.

Warrior Gateway
As a content aggregator, Warrior Gateway collects, organizes, enriches, and redistributes content about a wide range of health, welfare, and veteran-related services to soldiers, veterans, and their families. Rather than simply compiling an online catalog of service providers’ listings, Warrior Gateway restructures the content that government, military, and local organizations produce, and enriches it by adding veteran-related categories and other information. Furthermore, Warrior Gateway adds a social dimension by encouraging contributions from veterans and family members.

Once stored within the XML application powering Warrior Gateway, the content is easily reorganized and reclassified to provide the veterans’ perspective on areas of interest and importance. Volunteers working with Warrior Gateway can add new categories when necessary. Service providers can claim their profiles and improve their own data details. Even public users can contribute content to the gateway, a crowdsourcing strategy for efficiently collecting feedback from users. With contributions from multiple stakeholders, the published listings can be enriched over time without requiring a large internal staff to add the extra information.

Capturing New Business Value
There’s a lot more detail about how the XML applications work in our case studies – I recommend that you check them out.

What I find intriguing is the range of promising and potentially profitable business models engendered by smart content.  Enterprise publishers have new options and can go beyond simply pushing content through a publishing process. Now they can build on their investments, and capture the pull of content value.

How Smart Content Aids Distributed Collaboration

Authoring in a structured text environment has traditionally been done with dedicated structured editors. These tools provide validation and user-assisted markup features that help the user create complete and valid content. But structured editors are somewhat complicated and unfamiliar, and users need training to become proficient. The learning curve is not very steep, but it does exist.

Many organizations have come to see documentation departments as a process bottleneck and try to engage others throughout the enterprise in the content creation and review processes. Engineers and developers can contribute to documentation and bring a unique technical perspective. Installation and support personnel are on the front lines and have unique insight into how the product and related documentation are used. Telephone operators not only need the information at their fingertips, but can also augment it with comments and ideas that occur to them while supporting users. Third-party partners and reviewers may also have a unique perspective and role to play in a distributed, collaborative content creation, management, review, and delivery ecosystem.

Our recently completed research on XML Smart Content in the Enterprise indicates that as we strive to move content creation and management out of the documentation department silo, we will also need to consider how the data is encoded and whether the data model meets our expanded business requirements. Smart content is multipurpose content designed with several uses in mind. Smart content is modular, so it can be assembled in a variety of forms. And smart content is structured content that has been enriched with semantic information to better identify its topic and role, to aid processing and searching. For these reasons, smart content also improves distributed collaboration. Let me elaborate.

One of the challenges for distributed collaboration is the infrequency of user participation and, therefore, unfamiliarity with structured editing tools. It makes sense to simplify the editing process and tools for infrequent users. They can't always take a refresher course on the editor and its features. They may be working remotely, even on a customer site installing equipment or software. These infrequent users need structured editing tools that are designed for them. Such collaboration tools need to be intuitive and easy to figure out, easily accessible from just about anywhere, and affordable, with flexible licensing that allows a larger number of users to participate in the management of the content. This usually means one of two things: either the editor will be a plug-in to a popular word processor (e.g., MS Word), or it will be accessed through a thin-client browser, like a wiki editor. In some environments, both may be needed in addition to traditional structured editing tools. Smart content's modularity and enrichment allow flexibility in editing tools and process design, expanding who can collaborate from throughout the enterprise.

Infrequent contributors also may not be able to master navigating and operating within a complex repository and workflow environment, for the same familiarity reasons. Serving up information to a remote collaborator can be enhanced with keywords and other metadata designed to optimize searching and access to the content. Even a little metadata can provide a lot of simplicity for an infrequent user. Product codes, version information, and a couple of dates would allow a user to home in on the likely content topics and select content to edit from a well-targeted list of search results. Relationships between content modules that are captured in metadata can alert a user that when one object is updated, other related objects may need to be reviewed for potential updates as well.
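That last point — metadata-declared relationships triggering review alerts — can be sketched very simply. The module identifiers and the `related` field below are illustrative assumptions; the idea is that a link captured once as metadata lets the system, rather than the contributor, remember what else is affected by an edit.

```python
# Each content module declares related modules in its metadata.
modules = {
    "install-task": {"related": ["safety-note", "parts-list"]},
    "safety-note": {"related": []},
    "parts-list": {"related": ["install-task"]},
}

def review_candidates(modules, updated_id):
    """Return modules whose metadata links them to the updated one,
    so they can be flagged for review after an edit."""
    return sorted(
        mid for mid, meta in modules.items()
        if updated_id in meta["related"]
    )

print(review_candidates(modules, "install-task"))
# ['parts-list']
```

For an infrequent contributor, the payoff is that the system surfaces "these two topics may also need a look" automatically, instead of expecting the user to know the content model.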

It is becoming increasingly clear that there is no one model for XML or smart content creation and editing. Just as a carpenter may have several saws, each designed for a particular type of cut, a robust smart content structured content environment may have more than one editor in use. It behooves us to design our systems and tools to meet the desired business processes and user functionality, rather than limit our processes to the features of one tool.

Social Publishing with Drupal — New GG Whitepaper

I just published a new white paper, Social Publishing with Drupal, sponsored by Acquia and also available here. We forget that publishing and blogging (including this post) are stove-piped operations. But what would happen if we could intelligently keep track of all these disparate threads, combining the authoritative content from trusted sources with insights from friends and colleagues, organized contextually around the ways we think about things and make decisions? Social publishing is a new lens for delivering business value.

Here’s the executive summary for the white paper. Click the link above if you’d like to learn more. What’s the future of social publishing? Let’s start a debate. /geoff

EXECUTIVE SUMMARY
Social publishing combines groomed and authoritative content, produced by an organization and emphasizing its core messages, with user-generated content that customers contribute via blogs, wikis, and social media tools. Drupal is an example of a social publishing platform, developed and maintained as an open source project, and delivered at an affordable cost.

Drupal is now deployed in major media companies, high technology firms, universities, magazine publishers, government agencies (including the White House), research groups, and non-profit organizations. Whether it is in a commercial, non-profit, or government setting, organizations rely on Drupal to project their presence over the web and to channel the interactive experiences that foster communities of contributors.

By leveraging Drupal’s capabilities as a social publishing platform, organizations are able to reinforce their branded experiences and deliver relevant content to their customers and stakeholders. By exploiting Drupal as an open source project, developers supporting these organizations can easily enhance and extend Drupal’s capabilities, and introduce innovative modes of interactivity that meet specific business requirements.

Drupal is an attractive investment with substantial business benefits. Organizations can keep their license and support costs modest by building on an open source project. Organizations can leverage the collective expertise of Drupal developers to solve immediate publishing problems. By relying on Drupal, organizations can stay abreast of rapid technology changes when building competitive solutions for the digital age.

XML and Belly Buttons: How to “Sell” XML

Anyone who works with XML has probably had to “sell” the idea of using the standard instead of alternative approaches, whether as an internal evangelist of XML or in a formal sales role. We have developed some pretty convincing arguments, such as automating redundant processes, quality checking and validation of content, reuse of content using a single source publishing approach, and so on. These types of benefits are easily understood by the technical documentation department or developers and administrators in the IT group. And they are easy arguments to make.

Even so, that leaves a lot of people who can benefit from the technology but may never need to know that XML is part of the solution. The rest of the enterprise may not be in tune with the challenges faced by the documentation department, focusing instead on other aspects of running a business, like customer support, manufacturing, fulfillment, or finance. If you tell them the software solution you want to buy has “XML Inside,” they may stare off into space and let their eyes glaze over, or even fall asleep. But if you tell them you have a way to reduce expensive customer support phone calls by improving their public-facing Web content and capabilities, you might get more of their attention.

I have been around the XML community for a very long time, and we tend to gaze into our belly buttons for the meaning of XML. This is often done at the expense of looking around us and seeing what problems are out there before we start talking about solutions to apply to them. Everything looks like a nail because we have this really nifty hammer called XML. But when CD-ROMs were introduced, people didn't run around talking about the benefits of ISO 9660 (the standard that dictates how data is written to a CD). Okay, they did at first, to other technologists and executives in big companies adopting the standard, but the end consumer rarely heard about it. Instead, we talked about the massive increase in data storage and the flexibility of a consistent data storage format across operating systems. So we need to remember that XML is not what we want to accomplish, but rather how we may get things done to meet our goals. Therefore, we need to understand and describe our requirements in terms of these business drivers, not the tools we use to address them.

Part of the problem is that there are several potential audiences for the XML evangelism message, each with their own set of concerns and domain-specific challenges. End users want to get the work out the door in a timely manner, at the right quality level, with tools that are easy to use. Line managers may add sensitivity to pricing, performance, maintenance, and deployment costs. These types of concerns I would classify as tactical departmental concerns focusing on operational efficiency (the bottom line).

Meanwhile, product managers, sales, customer service, fulfillment, and finance are geared more toward enterprise goals and strategies, such as reducing product support costs and increasing revenue, in addition to operational efficiency. Even stated goals like synchronizing releases of software and documentation, or making data more flexible and robust to enable new Web and mobile delivery options, really only support the first two objectives of better customer service and increased sales, which I would classify as strategic enterprise concerns.

The deft XML evangelist, to succeed in the enterprise discussion, needs to know about a lot more than the technology and processes in the documentation department, or he or she will be limited to tactical, incremental improvements. The boss may want, instead, to focus on how the data can be improved to make robust Web content that can be dynamically assembled according to the viewer’s profile. Or how critical updates can be delivered electronically and as fast as possible, while the complete collection of information is prepared for more time consuming, but equally valuable printed delivery in a multi-volume set of books. Or how content can be queried, rearranged, reformatted and delivered in a completely new way to increase revenue. Or how a business system can automatically generate financial reporting information in a form accurate and suitable enough for submission to the government, but without the army of documentation labor used previously.

At Gilbane we often talk about the maturity of XML approaches, not unlike the maturity model for software. We haven’t finalized a spectrum of maturity levels yet, but I think of XML applications as ad hoc, departmental, and enterprise in nature. Ad hoc is where someone decides to use an XML format for a simple process, maybe configuration files driving printers or other applications. Often XML is adopted with no formal training and little knowledge outside of the domain in which it is being applied.

Departmental applications tend to focus on operational efficiency, especially as it relates to creating and distributing textual content. They are governed by a single department head and may interact with other groups and delivery feeds, but can stand alone in their own environment. An enterprise application of XML needs governance from several departments or information partners, and focuses on customer-facing or compliance issues and possibly growth of the business. Enterprise applications tend to have to work within a broader framework of applications and standards.

Each of these three application types requires different planning and justification. For ad hoc use of XML it is usually up to the individual developer to decide if XML is the right format, if a schema will be needed, and what the markup and data model are, etc. Very little “selling” is needed here except as friendly debate between developers, architects and line managers. Usually these applications can be tweaked and changed easily with little impact beyond local considerations.

A departmental application of XML usually requires a team representing all stakeholders involved in the process, from users to consumers of the information. There may be some departmental architectural standards, but exceptions to these are easier to accommodate than with enterprise applications. A careful leader of a departmental application will look upstream and downstream in the information flow and include some of those needs. Leaders also need to realize that the editing process in their department may become more complex and require additional skills and resources, but that these drawbacks are more than offset by savings in other areas, such as page layout or conversion to Web formats, which can be highly automated. Don't forget to explain these benefits to the users whose work just got a little more complicated!

An enterprise solution is by definition tied to the business drivers of the enterprise, even if that means some decisions may seem to come at the expense of one department over another. This is where an evangelist can be useful, but not if they focus only on XML instead of the benefits it provides. Executives need to know how much revenue can be increased, how many problem reports can be avoided in customer service, and whether they can meet regulatory compliance guidelines. This is a much more complicated set of issues, with dependencies on, and agreement with, other departments needed to be successful. If you can't provide these types of answers, you may be stuck in departmental thinking.

XML may be the center of my universe (my belly button, so to speak), but it is usually not the center of my project sponsor's universe. I have to have the right message to convince them to make a significant investment in the way their enterprise operates.

New Workshop on Implementing DITA

As part of our Gilbane Onsite Technology Strategy Workshop Series, we are happy to announce a new workshop, Implementing DITA.

Course Description

DITA, the Darwin Information Typing Architecture, is an emerging standard for content creation, management, and distribution. How does DITA differ from other XML applications? Will it work for my vertical industry's content, from technical documentation to training manuals, from scientific papers to statutory publishing? DITA addresses one of the most challenging aspects of XML implementation: developing a data model that can be used and shared with information partners. Even so, DITA implementation requires effective process, software, and content management strategies to achieve the benefits promised by the DITA business case: cost-effective, reusable content. This seminar will familiarize you with DITA concepts and terminology; describe business benefits, implementation challenges, and best practices for adopting DITA; and explore how DITA enables key business processes, including content management, formatting & publishing, multi-lingual localization, and reusable open content. Attendees will participate in developing an effective DITA content management strategy.
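To give a feel for the core DITA concepts the course covers — typed topics and maps that organize them — here is a minimal sketch. The topic content, ids, and file names are invented for illustration, and real DITA content would be validated against the DITA DTDs or schemas, which this sketch does not do; it just parses a hand-written concept topic and a map with Python's standard library.

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written DITA concept topic (illustrative only).
TOPIC = """<concept id="ssd-overview">
  <title>Solid-State Drive Overview</title>
  <conbody>
    <p>An SSD stores data in flash memory.</p>
  </conbody>
</concept>"""

# A minimal DITA map: an ordered collection of topic references.
DITAMAP = """<map>
  <title>Maintenance Guide</title>
  <topicref href="ssd-overview.dita"/>
  <topicref href="replace-drive.dita"/>
</map>"""

root = ET.fromstring(TOPIC)
print(root.tag, root.get("id"))   # topic type and id
print(root.findtext("title"))     # topic title

mp = ET.fromstring(DITAMAP)
hrefs = [t.get("href") for t in mp.iter("topicref")]
print(hrefs)                      # topics the map assembles, in order
```

The separation shown here — topics carry content, maps carry organization — is what lets the same topic be reused in many deliverables.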

Audience

This is an introductory course suitable for anyone looking to better understand the DITA standard, its terminology, processes, benefits, and best practices. A basic understanding of computer processing applications and production processes is helpful. Familiarity with XML concepts and publishing is helpful, but not required. No programming experience is required.

Topics Covered

  • The Business Drivers for DITA Adoption

  • DITA Concepts and Terminology

  • The DITA Content Model

  • Organizing Content with DITA Maps

  • Processing, Storing & Publishing DITA Content

  • DITA Creation, Management & Processing Tools

  • Multi-lingual Publishing with DITA

  • Extending DITA to work with Other Data Standards

  • Best Practices & Pitfalls for DITA Implementation

For more information and to customize a workshop just for your organization, please contact Ralph Marto by email or at +617.497.9443 x117

DPCI Announces Partnership with Mark Logic to Deliver XML-Based Content Publishing Solutions

DPCI, a provider of integrated technology solutions for organizations that need to publish content to Web, print, and mobile channels, announced that it has partnered with Mark Logic Corporation to deliver XML-based content publishing solutions. The company's product, MarkLogic Server, allows customers to store, manage, search, and dynamically deliver content. Addressing the growing need for XML-based content management systems, DPCI and Mark Logic have been collaborating on several projects, including one that required integration with Amazon's Kindle reading device. Built specifically for content, MarkLogic Server provides a single solution for search and content delivery that allows customers to build digital content products: from task-sensitive online content delivery applications that place content in users' workflows to digital asset distribution systems that automate content delivery; from custom publishing applications that maximize content re-use and repurposing to content assembly solutions that integrate content. http://www.marklogic.com, http://www.databasepublish.com


© 2019 Gilbane
