Are you investigating technology for protecting your company’s high-value documents and other intellectual property? Is better content security on your company’s plate for 2008? Need to know the current state of the art in enterprise rights management?
Gilbane Group is conducting a survey of companies that are investigating, adopting, and using rights management solutions for high-value enterprise content (contracts, HR policies, product strategies, regulatory compliance certifications, and so on). The results will be included in our upcoming study on Enterprise Rights Management: Business Imperatives and Implementation Readiness.
We are seeking input from IT, content management, and IT security professionals across multiple industries (excluding consumer media companies, which are outside the scope of this study). Some familiarity with enterprise rights management (ERM) or information rights management (IRM) is necessary (i.e., respondents need to have at least heard of the term).
The survey is online and takes about fifteen minutes to complete. In exchange for participation, qualified respondents will receive the aggregated survey results and the executive summary of the analysis. Respondents who fill out the survey in full and provide a valid business email address are also entered into a random drawing for a free one-hour phone consultation with the Gilbane ERM analyst team. Take the survey now. Contact us if you have any questions about the research or about whether you qualify to take the survey.
Steve Paxhia noted during a meeting the other day that the Kindle is indeed still on back order, though as far as we know there are no definitive numbers on how many Amazon has actually sold. Still, unless there are extraordinary problems with manufacturing or the supply chain, they have to be producing and selling a healthy number of them. In another meeting last week, someone actually said, “I will read that on my Kindle on the flight back.”
Then today, via Slashdot I learn that science fiction publisher Tor is giving away free eBooks in association with the launch of their new website. Science fiction is another market, along with romance, that has been good for eBooks, and this kind of wide-scale marketing strikes me as a logical next step.
UPDATE: Evan Schnittman of Oxford University Press is making maximum use of his Kindle and thinks it beats the SkyMall catalog any day.
- Over at eWeek, Jim Rapoza looks at the most overhyped technologies of the century, and XML isn’t one of them.
- At IBM developerWorks, Elliotte Rusty Harold speculates on the future of XML. He’s bullish on XQuery and Atom, and he declares the end of markup-centric editors.
- Speaking of being bullish on Atom, check out Mochilla’s Atom-based API for premium content.
- Geoff Bock sends along news that Microsoft’s push to get OOXML as a standard is being scrutinized by the EU.
- Also on the OOXML front, IBM and Microsoft seem ready to go toe to toe. More perspective here and here.
- Have you ever thought you should be able to take DITA-encoded content and pump it through InDesign? You are not alone.
- If you follow the Apache Software Foundation or other technical listservs at any level of interest, you just have to try Mark Logic’s MarkMail application, where you can ask questions like, “Who from Microsoft chimes in on the XML schema list at the W3C?”
- I’m not the only one to think that part of Microsoft’s interest in Yahoo is driven by Yahoo’s impressive efforts in wireless technology, which have XML at their core.
JustSystems, Inc. announced the availability of the “DITA Maturity Model,” which was co-authored with IBM and defines a graduated, step-by-step methodology for implementing Darwin Information Typing Architecture (DITA). One of DITA’s features is its support for incremental adoption. Users can start with DITA using a subset of its capabilities, and then add investment over time as their content strategy evolves and expands to cover more requirements and content areas. However, this continuum of adoption has also resulted in confusion, as communities at different stages of adoption claim radically different numbers for cost of migration and return on investment.
The DITA Maturity Model addresses this confusion by dividing DITA adoption into six levels, each with its own required investment and associated return on investment. Users can assess their own capabilities and goals relative to the model and choose the initial adoption level appropriate for their needs and schedule. The six levels of DITA adoption are:
Level 1: Topics – The most minimal level of DITA adoption requires only the migration of current content sources into DITA topics;
Level 2: Scalable Reuse – The major activity at this level is to break the content down into topics that are stored as individual files and to use DITA maps to collect and organize the content into reusable units for assembly into specific deliverables (a minimal topic-and-map sketch follows this list);
Level 3: Specialization and Customization – Now users expand the information architecture into a full content model, which explicitly defines the different types of content required to meet different author and audience needs and specifies how to meet those needs using structured, typed content;
Level 4: Automation and Integration – Once content is specialized, users can leverage their investments in semantics with automation of key processes and begin tying content together even across different specializations or authoring disciplines;
Level 5: Semantic Bandwidth – As DITA diversifies to occupy more roles within an organization, a cross-application, cross-silo solution that shares DITA as a common semantic currency lets groups use the toolset most appropriate for their content authoring and management needs;
Level 6: Universal Semantic Ecosystem – As DITA provides for scalable semantic bandwidth across content silos and applications, a new kind of semantic ecosystem emerges: Semantics that can move with content across old boundaries, wrap unstructured content, and provide validated integration with semi-structured content and managed data sources. http://www.ibm.com, http://na.justsystems.com
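To make the first two levels concrete, here is a minimal sketch of a DITA concept topic and a DITA map that collects topics into a deliverable. The markup follows the standard OASIS DITA topic and map document types, but the file names, IDs, and content are hypothetical examples rather than anything taken from the JustSystems/IBM model.

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE concept PUBLIC "-//OASIS//DTD DITA Concept//EN" "concept.dtd">
    <!-- widget_overview.dita (hypothetical file): a Level 1 "topic" -
         a small, self-contained unit of content about one subject -->
    <concept id="widget_overview">
      <title>Widget overview</title>
      <conbody>
        <p>A widget is a hypothetical product used here only to illustrate topic structure.</p>
      </conbody>
    </concept>

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE map PUBLIC "-//OASIS//DTD DITA Map//EN" "map.dtd">
    <!-- guide.ditamap (hypothetical file): a Level 2 map that collects
         and orders topics into a specific deliverable -->
    <map>
      <title>Widget user guide</title>
      <topicref href="widget_overview.dita"/>
      <topicref href="widget_setup.dita"/>
    </map>

At Level 1 the investment is simply getting existing content into topics like the first file; at Level 2 the same topics can be reused by pointing different maps at them to assemble different deliverables.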
ADAM Software announced full-breadth XPS functionality. XPS stands for XML Paper Specification. ADAM’s provider model allows third-party developers to co-engineer on emerging opportunities. For XPS, ADAM Software joined forces with the Belgian company NiXPS to build an XPS engine for ADAM. The ‘NiXPS Library v2.0’ widens the scope of data that ADAM can handle. Thumbnails of XPS files are shown in ADAM, and previewing of XPS files starts there. Metadata can be read by ADAM on import, and ADAM handles conversion of XPS files to Adobe PDF. http://www.adam.be, http://www.nixps.com
As Frank noted in our main blog and in the related press release, this blog is part of our launch this week of a new practice focused on the technologies, strategies, and best practices associated with using XML in content management. With this focus on XML, the new practice is broad–XML is fundamental to so many aspects of content management. Yet the focus on XML also compels us to look at content management through a certain lens. This begins with the vendor offerings, where nearly every platform, product, and tool has to meet anywhere from a few to myriad XML-related requirements. As XML and its related standards have evolved and matured, evaluating this support has become a more complex and considered task. The more complex and feature-rich the offering, the more difficult the task of evaluating its support.
And indeed, the offerings are becoming more complex, especially among platform vendors like Microsoft, IBM, and Oracle. Looking at SharePoint means evaluating it as a content management platform, but also looking specifically at how it supports technologies like XML forms interfaces, XML data and content feeds, and integration with the XML schemas underlying Microsoft Word and Excel. It also means looking at SOA interfaces and XML integration of Web Parts, and considering how developers and data analysts might want to utilize XML schema and XSLT in SharePoint application development. Depending on your requirements and applications, there could be a great deal more functionality for you to evaluate and explore. And that is just one platform.
But understanding the vendor–and open source–offerings is only one piece of the XML content management puzzle. Just as important as choosing the right tools are the strategic issues in planning for and later deploying these offerings. Organizations often don’t spend enough time asking and answering the biggest and most important questions. What goals do they have for the technology? Cost savings? Revenue growth? Accelerated time to market? The ability to work globally? These general business requirements then need to be translated into more specific requirements, and only then do those requirements begin to point to specific technologies. If XML is part of the potential solution, organizations need to look at what standards might be a fit. If you produce product support content, perhaps DITA is a fit for you. If you are a publisher, you might look at XML-based metadata standards like XMP or PRISM.
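For publishers weighing that last option, here is a rough sketch of the kind of article-level description PRISM is designed for, expressed as RDF/XML with Dublin Core elements alongside PRISM ones. The element names come from the Dublin Core and PRISM basic vocabularies, but the namespace version, URIs, and values are illustrative assumptions, not a validated, standards-compliant record.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Illustrative only: the PRISM namespace version and all values below are assumptions -->
    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:dc="http://purl.org/dc/elements/1.1/"
             xmlns:prism="http://prismstandard.org/namespaces/basic/2.0/">
      <rdf:Description rdf:about="http://www.example.com/articles/2008/widgets">
        <dc:title>Widgets and the Enterprise</dc:title>
        <dc:creator>A. Author</dc:creator>
        <prism:publicationName>Example Quarterly</prism:publicationName>
        <prism:coverDate>2008-03-01</prism:coverDate>
        <prism:volume>12</prism:volume>
        <prism:startingPage>42</prism:startingPage>
      </rdf:Description>
    </rdf:RDF>

Metadata like this travels with the content, which is exactly why the choice of standard needs to follow from the business requirements rather than precede them.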
Finally, XML doesn’t exist in a content management vacuum, removed from the larger technology infrastructure that organizations have put in place. The platforms and tools must integrate well with technologies inside and outside the firewall; this is especially true as more software development is happening in the cloud and organizations are more readily embracing Software as a Service. One thing we have learned over the years is that XML is fundamental to two critical aspects of content management—for the encoding and management of the content itself (including the related metadata) and for the integration of the many component and related technologies that comprise and are related to content management. Lauren Wood wrote about this in 2002, David Guenette and I revisited it a year later, and the theme recurs in numerous Gilbane writings. The ubiquitous nature of XML makes the need for strategies and best practices more acute, and also points to the need to bring together the various stakeholders–notably the business people who have the content management requirements and the technologists who can help make the technology adoptions successful. Projects have the best chance of succeeding when these stakeholders are brought together to reach consensus first on business and technical requirements, and, later, to reach consensus on technology and approach.
As Frank noted, this is “New/Old” news for all of us involved with the new practice. I first discussed SGML with Frank in 1987 when I was at Mitre and responsible for a project to bring new technology to bear on creating specifications for government projects. Frank had recently launched his technology practice, Publishing Technology Management. Leonor was a client at Factory Mutual when I worked for Xyvision (now XyEnterprise) in the early 1990s. And I probably first met Mary at a GCA (now IDEAlliance) event during my Xyvision days and when she worked for a competitor, Datalogics. We are, in the polite vernacular of the day, seasoned professionals.
So welcome to the new blog. Watch this space for more details as we announce some of the offerings and initiatives. I plan to blog actively here, so please add the RSS feed if you prefer to digest your material that way. If you have ideas or suggestions, don’t hesitate to post here or contact me or any of the other analysts directly. We look forward to the interaction!
MadCap Software announced MadCap Lingo, an XML-based, integrated translation memory system and authoring tool, aimed at eliminating the need for file transfers in order to complete translation. Document components, such as tables of contents, topics, index keywords, concepts, glossaries, and variables, all remain intact throughout the translation and localization process, so there is never a need to recreate them. MadCap Lingo also is integrated with MadCap Flare and MadCap Blaze, and it is Unicode-enabled to help documentation professionals deliver a consistent user experience in print, online, and in any language. MadCap Lingo is being announced in conjunction with the new MadCap Analyzer, software that proactively recommends documentation content and design improvements. MadCap Lingo works with MadCap Flare, the company’s native-XML authoring product, and MadCap Blaze, the native-XML tool for publishing long print documents, which will be generally available in early 2008. A user creates a MadCap Lingo project to access the source content in a Flare or Blaze project via a shared file structure. Working through Lingo’s interface, the user accesses and translates the content. Because the content never actually leaves the structure of the original Flare or Blaze project, all the content and formatting is preserved in the translated version. Once a project is translated, it is opened in either Flare or Blaze, which generates the output and facilitates publishing. At the front end of the process, Flare and Blaze can import a range of document types to create the source content. Following translation, the products provide single-source delivery to multiple formats online and off, including the Internet, intranets, CDs, and print. MadCap Lingo is available now at $2,199 per license, with a limited-time introductory price of $899. MadCap Lingo also is available on a subscription basis for $649 per year. Fees for support start at $449 per year. http://www.madcapsoftware.com/
SiberLogic announced SiberSafe On-Demand, a monthly subscription approach to XML content management for technical documentation teams that are looking for significant efficiency gains in producing long-lived, complex, evolving content. SiberSafe On-Demand delivers full SiberSafe functionality as an ASP service in a secure data center. Each team has full access and administrative rights to its server for system administration and configuration. SiberSafe On-Demand also includes daily content backups and SiberLogic’s technical support service. The SiberSafe On-Demand “out of the box” configuration offers a choice of DTD – DITA, DocBook, or MIL-STD 2361 – with sample templates and stylesheets. Also included are SiberSafe Communicator (SiberLogic’s XML authoring tool) and the company’s integrated publishing tool. Alternatively, customers can continue to use their own editor, such as XMetaL, Epic, or FrameMaker, or their own publishing tools. SiberSafe On-Demand costs $799 per month for the first pair of users (one author and one reviewer) and as little as $275 per user monthly for 10+ users. There are no additional upfront costs. Anyone who signs up for SiberSafe On-Demand before the end of January 2008 will receive access for one additional author free of charge for the first year. http://www.siberlogic.com/