The Gilbane Advisor

Curated for content, computing, data, information, and digital experience professionals


Gilbane San Francisco

Well, I’d have to say it was a very good conference. Attendance was up, San Francisco was sunny and warm, and I thought the sessions were very good.

I had the advantage/disadvantage of being one of the track chairs of the publishing track, which meant that I had to attend all six publishing sessions. I managed to catch the keynotes as well, which were jam-packed. But I’ll leave it to others to discuss the other tracks and sessions.

This spring we set out to expand the publishing track beyond its previous sole focus on automated publishing (we brought that topic down to one session). The subject remains important, and very relevant to the rest of the Gilbane conference content, but our clear intention was to make the publishing track much broader in scope than it had been before.

Steve Paxhia very kindly allowed me to lead off with what was really a dual session. I introduced my new web site: www.thefutureofpublishing.com, but also spoke more broadly on the subject. I’ve got a ton of material that I’m slowly loading onto the Web site, and nine years of research behind it.

We then moved to “Publishing Automation: What Can You Do Today?” and had three great speakers tackle the topic. OK, they were all book publishers, but each had a markedly different approach, and I decided, in the end, that listening to three approaches to a similar problem might be more interesting than three approaches to completely different challenges.

The topic “How Will Internet Communities and Collaboration Technologies Change Publishing Best Practices?” was a tough one, and Steve and his speakers handled it very well. This subject is so slippery. Do you need to create a community on your Web site? What is the ideal extent of collaboration? I was left with the clear sense that community and collaboration are essential to nearly all publishing Web sites.

We then featured two sessions on cross-media publishing strategies. Bill Rosenblatt is the expert. It struck me that there are still a lot of unanswered questions around cross-media publishing, but absolutely no question about the necessity and reality of this phenomenon — not only is it essential, but the tools are there today.

Bill Trippe wasn’t available to moderate “Publishing for International Audiences: Top Challenges and Best Practices” but he had selected two great speakers: Stéphane Dayras, technical services manager for Quark in Europe, and Ben Martin, senior analyst at Flatirons Solutions in Colorado. They were the perfect pairing. Stéphane introduced the topic broadly, and offered a true international perspective. Ben is “the scientist,” and showed extensive details of planning for this tough challenge. Most of the audience stayed behind for an extra half-hour.

Where do we go from here? I’d love to hear from attendees with suggestions. I think we still face a challenge effectively blending this publishing content into the broader Gilbane conference. But I think we’ve given it a home.

Open Publication Structure 2.0 Elevated to IDPF Member & Public Review

Nick Bogarty from the International Digital Publishing Forum (IDPF) writes that the Open Publication Structure (OPS) 2.0 has been elevated for IDPF Member and Public Review. The review period will begin today and extend for 30 days ending on Wednesday, May 16th, 2007. The IDPF strongly encourages feedback from potential users, developers and others, whether IDPF members or not, for the sake of improving the interoperability and quality of IDPF work.
You can view the draft document here. Feedback on the draft specifications should be made by posting a reply to the forum topic, and sample OPS 2.0 files can be found for download here.

Day Ships Latest Versions of Communiqué and CRX

Day Software (SWX:DAYN)(OTC:DYIHY) announced the availability of Communiqué 4.2 and Content Repository Extreme (CRX) 1.3.1, a native JCR (JSR 170) standard-compliant enterprise content management solution and Java Content Repository. These latest releases contain enhancements directed toward better usability for authors, administrators, and developers, along with improvements in scalability and performance. Specific enhancements include active clustering of author instances, SAP portal integration, and synchronization with Day's connector products, which provide an easy way to connect to third-party repositories using JCR connectors.

Based on the cluster functionality in the CRX 1.3.1 release, Communiqué 4.2 supports active clustering of author instances, making it possible to combine several Communiqué servers into a cluster that provides a single Communiqué Author instance. Japanese is now available as an additional user-interface language. A Portal Adapter for SAP Portal 6.0 allows integration of Communiqué content and functionality into an SAP Portal environment; out-of-the-box portlets can display Communiqué-managed content dynamically associated with the iView. Existing Communiqué projects can be migrated “in place” with the help of an update installer, leaving existing deployments and architecture intact. http://www.day.com

Aging: Web Years Are Worse Than Dog Years

This was one of my favorite quotes from Sun’s April 10th presentation at Gilbane San Francisco, titled Managing Content Globally: What Works, What Doesn’t. Given by Jed Michnowicz, Engineering Lead, and Youngmin Radochonski, Globalization Program Manager, the presentation opened the LISA Forum on Day 1 of the conference.

Jed and Youngmin nailed it when they defined three key components of a global content platform: content management, translation, and delivery. As they outlined the struggles of legacy challenges in all three areas, a pattern of checklist items for the audience quickly surfaced. Lack of metadata. “Siloed” mindsets, workflow, and content repositories. Static Web server content delivery. Inconsistent messaging. Slow time to market. Cost overruns. As moderator, I always find it interesting to scan the faces in the crowd for reactions. During this part of the presentation the response was palpable: “got that, got that, and yes, definitely got that.”

They also nailed it when they moved to the “here’s the good news” part of the presentation. Global awareness throughout the organization. Process alignment and consistency. Separation of content from presentation. Translation memory management and sharing. Integrated content and translation workflows. Automated, Web services-based content distribution. They described what is most definitely a “Level 2+” integration from a technology perspective. At this point, the audience response was equally palpable: “want that, want that, and yes, definitely want that.”

Wrapping up the success story with lessons learned (according to people, process, and technology categories; be still my heart!), Jed and Youngmin also noted that Sun, like most organizations, is still learning. Some of the questions they posed — which we will continue to explore on this blog — included:

  • What takes precedence when solving for people, process and technology?
  • What is the proper globalization strategy and who defines it?
  • Can a single solution work for everyone?

On behalf of The Gilbane Group and LISA, we thank these excellent presenters for a job well done. This presentation will be available here this week; check out one of my other favorite quotes emblazoned on the t-shirt on the last slide.

DITA, DocBook, and ODF Interoperability?

As our readers know, we are long-time advocates of open-document formats. Over the past couple of years, we have written a great deal about DITA and formats like ODF, but we also have a lot of experience and interest in DocBook and vertical DTDs such as J2008 for the automotive industry and the various DOD standards. We are clearly reaching a point where interoperability among these standards has become an issue. Organizations are more diverse, more likely to be sharing content between operating groups and with other organizations, and more likely to be sourcing content from a variety of partners, customers, and suppliers. Needless to say, not all of these sources of content will be using the same XML vocabulary; indeed, even two organizations using DITA, for example, will likely have specialized DITA differently.
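To make the interoperability problem concrete, here is a toy sketch that maps a minimal DITA topic onto a DocBook section using only Python's standard library. The element mapping (topic→section, p→para) is illustrative, not any committee's official crosswalk; a real conversion would have to cope with specialization, attributes, and far richer structure.

```python
import xml.etree.ElementTree as ET

# A minimal DITA topic: a title plus body paragraphs.
dita_topic = """<topic id="t1">
  <title>Installing the widget</title>
  <body><p>Unpack the archive.</p><p>Run the installer.</p></body>
</topic>"""

def dita_topic_to_docbook(src: str) -> str:
    """Map DITA topic/title/body/p onto DocBook section/title/para."""
    topic = ET.fromstring(src)
    section = ET.Element("section", id=topic.get("id", ""))
    ET.SubElement(section, "title").text = topic.findtext("title")
    for p in topic.find("body").findall("p"):
        ET.SubElement(section, "para").text = p.text
    return ET.tostring(section, encoding="unicode")

print(dita_topic_to_docbook(dita_topic))
```

Even this trivial example shows where interoperability efforts earn their keep: the moment either side specializes an element or nests structures the other vocabulary lacks, a simple one-to-one mapping breaks down.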

With this need for interoperability in mind, OASIS has announced a new discussion list, regarding a possible new OASIS Document Standards Interoperability Technical Committee (TC). Details on the list, including how to subscribe, can be found here.

FatWire Releases Content Server 7

FatWire Software announced the general availability of FatWire Content Server 7, the latest release of its Web Content Management platform. Content Server now has three interfaces designed to accommodate the different usage patterns and types of users in an organization. The Insite interface is streamlined for anyone who wants to manage content and page layout from within the actual pages of the site; in this WYSIWYG environment, users can add, remove, and re-sequence content on a page, preview content changes, drag and drop page components, and invoke workflow processes. The Dash interface is designed for the frequent business user who wants streamlined access to everyday content tasks: users can create, compare, and translate content for different locales and languages, and enhanced full-text search across all types of content makes it easier to find content that already exists. The Advanced interface, for power users and developers, should be familiar to anyone who has used the application before.

Content Server 7 also includes support for tags. Tags are user-defined, so each user can choose tags that make sense to them; the same tag can be applied to different types of content, allowing you to group content any way you want, such as by campaign, project, or day of the week. Multilingual support is built directly into the data model, so any content can be translated into multiple languages. FatWire Content Server 7 is available immediately. http://www.fatwire.com

A Kodak Moment

Last month, I had the privilege of being a guest lecturer at MIT for Howard Anderson and Peter Kurzina’s course entitled “Managing in Crisis”. I prepared a case study about the current status of the College Publishing market. It included concerns about price pressures, used-book competition, channel issues, new competitors and new media requirements, pending legislation about pricing practices, and the continued lack of growth in unit sales.

The class was composed of 40 or so very bright students who did an excellent job analyzing the case. One student really captured the essence of the case when she said that it reminded her of the photography industry as the major players struggle with the transition from traditional film and paper products to digital photography.

While Kodak moments have long been associated with joyful celebrations, the aforementioned transition has been anything but a celebration. In fact, it is likely that this Kodak moment will come to exemplify the struggles of a powerful corporation as it strives, and perhaps even fails, due to its inability to recognize the customer benefits, opportunities, and challenges associated with non-traditional types of photographic media.

Many publishers are struggling with their own “Kodak Moments”, and I think the transition issues are similar to those the photography industry faced. First, Kodak seems to have been focused on the products that people had been buying for years, trying to preserve the advantages that its film and paper technologies provided. Like many of us who have had market-leading products, they were arrogant about their technology, processes, market position, and quality. While they clearly were aware of digital photography technology, they dismissed digital products because the image quality was significantly poorer. Then they concentrated on turning digital photos into traditional photos. It seems to me that they missed the potential in offering less expensive digital images that could easily be posted on the Internet or e-mailed to relatives.

For many years, publishers’ offerings have been closely tied to technological developments in the Software and Printing industries. For example, software has enabled improvements in authoring and composition, thereby lowering the costs of elegant or complex page designs. Printing technology has made four-color printing much more affordable and made shorter print runs economical. These changes have been passed along to consumers of information. (While I understand that there are differences among the terms content, information, and intellectual property, I will use the term information to subsume all three for the sake of brevity.) In many cases, they have added value to the customer’s experience, but there are cases where formats were enhanced and color offered because they could be, rather than because they were beneficial. In reality, I believe the net result was that the size, format, and frequency of the traditional economical delivery unit, or EDU (my term), of information (a book, journal, or magazine) were modified by technology advances, but the traditional media form has remained essentially the same for 100+ years.

The Internet has presented publishers with a radical paradigm shift (I don’t like the term either). All types of publishing entities have had to deal with changes in customer expectations that are easily as profound as those experienced by the Photography industry. Customers don’t just want their information to be more timely and less expensive; they also want it to concisely answer their questions and seamlessly integrate with their workflows or learning styles.

Perhaps the most significant change is the redefinition of the EDU. In the purely print era, there needed to be a certain mass of information to build a product that would be economical to print and sufficiently valuable to consumers to generate a profit. In many cases, it was assumed that relatively few information consumers would use all of the information presented in a single EDU. Rather, the scope of the information (or content) had to be broad enough to attract enough customers without being so broad as to make customers feel that they were procuring too much information that wasn’t pertinent to their interests. Hence, we have witnessed a generation of books where authors and market researchers work closely together.

In the digital world, authors and publishers are potentially freed from the strictures of printing economies. Therefore, information currently found in textbooks, references, magazines, and journals can be rendered as short information objects or more comprehensive content modules. Or publishers can produce information objects or content modules that are not anticipated to ever take book form. The objects can be delivered in many ways, including through search engines such as Google. These new EDUs can be purchased or licensed separately, or mixed and matched to create a course of instruction or a personal reference work. One benefit of these nimbler EDUs is that they blend nicely with software to offer increased value in the form of better instruction or more productive workflows.

The availability of these more compact EDUs will likely spawn many debates concerning academic traditions and learning methodologies that we have come to hold dear. It has long been the practice for students to read and master significant quantities of information with the expectation that many of the specific facts will fade from memory, leaving a general understanding of the topic. And many people are considered well read because they have plowed through many traditional EDUs (books). The questions will be: could one become well educated or well read by learning to explore topics of interest through smaller EDUs, and what blend of contextual and specific information delivers the best and most productive intellectual outcome? There will be some interesting face-offs between technology-enabled active exploration and discovery of information, which allows students to pursue topics they find interesting, and the more structured mastery of a set of information presented in book form. Of course, it is not an either/or proposition, as the methods must eventually be blended to enable meaningful knowledge acquisition.

The digital world has also created a demand for information that is developed and delivered as rapidly as possible. Where traditional publishers often justified their value by guaranteeing the accuracy and authority of their published information, many of today’s information consumers are willing to trade authority for velocity of information, and now rely upon other information consumers to tell them what information is the most accurate and useful. Individuals now actively participate in communities that generate and evaluate large quantities of information objects and content modules. The Wikipedia/Wikimedia organization and the MERLOT community are excellent examples of communities that produce quality information modules.

While many information consumers may still prefer to consume their information in print form, they may now wish to print their own copies or to create and purchase custom versions produced by rapidly improving print-on-demand technology. To many publishers, the perfectly formatted page has become almost an art form. They consider those pages to have many of the same aesthetic values that Kodak attributed to images produced via its traditional film technologies. Because customers rarely have the choice of formats, it is difficult to gauge the value they derive from “perfect pages” vs. potentially less expensive, simpler pages. Chip Pettibone of O’Reilly Publishing reported at the recent Gilbane Conference that when readers of e-books were offered the choice between a simple HTML design and a faithful rendition of the original book page, 50% chose the HTML version, and that population seemed to be growing. Because books and computer screens represent quite different form factors, the value of the perfect page can actually limit rather than enhance the effective presentation of information in digital formats. Therefore, rather than trying to maintain the integrity of the printed page, modern publishers are designing their content to be presented equally well in a variety of media forms. Publishers that cling to the page metaphor are putting their futures in jeopardy.

This paradigm shift is replete with challenges and opportunities. Many traditional reference products (including Microsoft’s Encarta) have been decimated by new products created in the Internet era. Newspapers and magazines have had to adapt to the challenges of multiple media environments by creating online products as they have seen their traditional readership dwindle. Journal publishers have had to derive new models to serve their subscriber base. Many categories of trade books now include websites with fancy multimedia elements and discussion groups. I think that some of the most exciting and interesting challenges and opportunities will be found in the world of educational publishing. As witnessed by the decisions of major publishing conglomerates to divest their educational publishing operations, the challenges of mastering the Internet paradigm shift are both daunting and expensive. To succeed, new generations of products will need to be built to take advantage of technology, as opposed to being web versions of existing products. Business models will need to be revised and channel strategies re-engineered. One important outcome will be increased information accessibility for readers and learners with disabilities.

Over the next few years, the publishing industry will witness many Kodak moments. Hopefully, the majority will be the old-fashioned Kodak moments of victory and celebration.

Sitecore Incorporates Clay Tablet Connectivity for Native Translation Workflow

Web content management vendor Sitecore has incorporated support for translation middleware from Clay Tablet, allowing Sitecore’s customers to professionally translate their web sites into any number of languages, and keep them updated as content changes. Clay Tablet’s software manages the flow of content between Sitecore CMS and language service providers, making it quicker to offer Sitecore-based sites in multiple languages. Administrators can manage their site in one language, and content in other languages is kept current automatically. Clay Tablet’s software lets companies automate the flow of text between content management and translation systems. When translated content returns from a translation service provider, it’s routed back to the correct destination. Capable of connecting any content storage system to any globalization system, Clay Tablet helps customers integrate and manage the diverse content storage and authoring systems that may be used across an organization, and simplifies localization. http://www.Clay-Tablet.com, http://www.sitecore.net
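The round trip described above can be pictured with a toy sketch. All names here are hypothetical; this is not Clay Tablet’s or Sitecore’s API, just an illustration of the routing idea: each outbound job carries its source item’s identity and target locale, so the translated copy can be written back to the right place without manual bookkeeping.

```python
# Toy model of CMS-to-translation round-tripping. Each pseudo-"job"
# records the source item id and target locale so the translated copy
# can be routed back to the correct destination automatically.

def send_for_translation(cms, item_id, locales):
    """Create one job per target locale for a CMS item's English source."""
    return [{"item_id": item_id, "locale": loc, "text": cms[item_id]["en"]}
            for loc in locales]

def fake_translate(job):
    """Stand-in for a language service provider."""
    return {**job, "text": f"[{job['locale']}] {job['text']}"}

def route_back(cms, finished_jobs):
    """Write each translated copy back to its source item and locale."""
    for job in finished_jobs:
        cms[job["item_id"]][job["locale"]] = job["text"]

cms = {"home": {"en": "Welcome"}}
jobs = [fake_translate(j) for j in send_for_translation(cms, "home", ["de", "fr"])]
route_back(cms, jobs)
print(cms["home"]["de"])  # → "[de] Welcome"
```

The design point the middleware handles for real is everything this sketch omits: change detection in the source language, batching, vendor handoff formats, and re-translation when the English master is updated.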


© 2025 The Gilbane Advisor
