The Gilbane Advisor

Curated for content, computing, and digital experience professionals

Webinar: Making the Business Case for SaaS WCM

Updated April 9, 2009: View the recorded webinar.
January 27, 2009, 2:00 pm ET

As customer experience becomes increasingly important even as budgets tighten, the SaaS value proposition (faster time to results, reduced dependency on IT resources, predictable costs) can be especially compelling. If your organization wants or needs to move ahead with web business initiatives in today’s uncertain economic climate, you’re probably investigating software-as-a-service solutions for web content management.

But SaaS WCM is fundamentally different from licensing software (open source or proprietary) and installing it on your own servers, which means the process of evaluating solutions is different too. It’s not all apples when SaaS is on the short list, but rather apples and oranges. This webinar explores the implications for technology acquisition: how do you make a business case that enables your organization to fairly evaluate all options and make the best decision for the business?

Join us in a lively discussion with Robert Carroll from Clickability. Register today. Presented by Gilbane. Sponsored by Clickability. Based on a new Gilbane Beacon entitled Communicating SaaS WCM Value.

Open Government Initiatives will Boost Standards

Following on Dale’s inauguration day post, Will XML Help this President?, we have today’s invigorating news that President Obama is committed to more Internet-based openness. The CNET article highlights some of the most compelling items from the two memos, but I am especially heartened by this statement from the memo on the Freedom of Information Act (FOIA):

I also direct the Director of the Office of Management and Budget to update guidance to the agencies to increase and improve information dissemination to the public, including through the use of new technologies, and to publish such guidance in the Federal Register.

The key phrases are "increase and improve information dissemination" and "the use of new technologies." This is in keeping with the spirit of the FOIA: the presumption is that information (and content) created by or on behalf of the government is public property and should be accessible to the public. This means that the average person should be able to easily find government content and readily consume it, two challenges that the content technology industry grapples with every day.

The issue of public access is in fact closely related to the issue of long-term archiving of content and information. One of the reasons I have always been comfortable recommending XML and other standards-based technology for content storage is that the content and data will outlast any particular software system or application. As the administration looks to make government more open, it should, and likely will, look at standards-based approaches to information and content access.

Such efforts will include core infrastructure, including servers and storage, but also a wide array of supporting hardware and software falling into three general categories:

  • Hardware and software to support the collection of digital material. This includes hardware and software for digitizing and converting analog materials, software for cataloging digital materials with metadata, hardware and software to support data repositories, and software for indexing the digital text and metadata.
  • Hardware and software to support access to digital material. This includes access tools such as search engines, portals, catalogs, and finding aids, as well as delivery tools that allow users to download and view textual, image-based, multimedia, and cartographic data.
  • Core software for functions such as authentication and authorization, name administration, and name resolution.
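
The cataloging step above hinges on metadata that can outlast any single system. As a rough illustration (not any agency's actual schema), here is a minimal Python sketch of a standards-based catalog record using Dublin Core element names; the record structure and sample values are invented for the example:

```python
# Build a catalog record with Dublin Core-style metadata elements.
# The element names follow the Dublin Core vocabulary; the "record"
# wrapper and sample values are assumptions for this sketch.
import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

def make_record(title, creator, date, fmt):
    """Build a standards-based record whose metadata outlasts any one system."""
    record = ET.Element("record")
    for name, value in [("title", title), ("creator", creator),
                        ("date", date), ("format", fmt)]:
        # Clark notation {namespace}localname keeps the elements unambiguous.
        el = ET.SubElement(record, f"{{{DC}}}{name}")
        el.text = value
    return record

record = make_record("FOIA Guidance Memo", "Office of Management and Budget",
                     "2009-01-21", "application/pdf")
record_xml = ET.tostring(record, encoding="unicode")
print(record_xml)
```

Because the record is plain namespaced XML, any future indexing or search tool can consume it without the software that created it.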

Standards such as PDF/A have emerged to give governments a ready format for long-term archiving of routine government documents. But a collection of PDF/A documents does not in and of itself equal a useful government portal. There are many other issues of navigation, search, metadata, and context left unaddressed. This is true even before you consider the wide range of content produced by the government: pictorial, audio, video, and cartographic data are obvious examples, as is the wide range of primary source material that comes out of areas such as medical research, energy development, public transportation, and natural resource planning.

President Obama’s directives should lead to interesting and exciting work for content technology professionals in the government. We look forward to hearing more.

Taxonomy and Glossaries for Enterprise Search Terminology

Two years ago, when I began blogging for the Gilbane Group on enterprise search, the extent of my vision was reflected in the blog categories I defined and expected to populate with content over time. They represented my personal “top terms,” each of which I expected to fill with meaningful entries to educate and illuminate what readers might want to know about search behind the enterprise firewall.

A recent examination of those early decisions showed me where there are gaps in content, perhaps reflecting that some of those topics were:

  • Not so important
  • Not currently in my thinking about the industry
  • Not well defined

I also know that on several occasions I couldn’t find a good category in my list for a blog post I had just written. As a former indexer and heavy user of controlled vocabularies, on most occasions I resisted the urge to create a new category and found instead the “best fit” for my entry. I know that when the corpus of content or domain is small, too many categories are useless for the reader. But now, as I approach 100 entries, it is time to reconsider where I want to go with blogging about enterprise search.

In the short term, I am going to try to provide entries for scantily covered topics because I still think they are all relevant. I’ll probably add a few more along the way or perhaps make some topics a little more granular.

Taxonomies are never static and require periodic review, even when the amount of content is small. Taxonomists need to keep pace with current use of terminology and target audience interests. New jargon creeps in, although I prefer generic terms broadly understood in the technology and business world.

That gives you an idea of some of my own taxonomy process. To add to the entries on terminology (definitions) and taxonomies, I am posting a glossary I wrote for last year’s report on the enterprise search market and recently updated for the Gilbane Workshop on taxonomies. While the definitions were all crafted by me, they are validated through heavy use of the Google “define” feature. If you aren’t already a user, you will find it highly useful when trying to pin down a definition. At the Google search box, simply type define: xxx xxx (where xxx xxx represents a word or phrase for which you seek a definition), and Google returns all the public definition entries it finds on the Internet. My definitions are then refined based on what I learn from the variety of sources I discover using this technique. It’s a great way to build your knowledge base and discover new meanings.

Glossary Taxonomy and Search-012009

Ephox Brings Online Content Authoring to IBM Lotus Quickr

Ephox announced the release of a new integration of its rich text editor with IBM’s Lotus Quickr. Lotus Quickr is team collaboration software that helps enterprises share content, collaborate, and work faster online. It goes beyond traditional document sharing by adding web-based collaboration with wikis, blogs, and team web pages. EditLive! supports this web-based authoring with a Word-like editing experience, and also offers capabilities for image editing, tracked changes, table editing, and accessibility checking. EditLive! integrations for the WebSphere Portal and Lotus Domino versions of Quickr are available now. The integration for the Quickr Domino platform was co-developed by the PSC Group, a Lotus consulting firm. http://www.ephox.com

Mark Logic Corporation Releases MarkLogic Toolkit for Word

Mark Logic Corporation announced the MarkLogic Toolkit for Word. Distributed under the open-source Apache 2.0 license, the toolkit gives developers a free, simple way to combine native XML-based functionality in MarkLogic Server with the most common content authoring environment, Microsoft Office Word 2007. Developers can build applications for finding and reusing enterprise content, enriching documents for search and analytics, and enhancing documents with custom metadata.

The toolkit includes a pre-built plug-in framework for Word 2007, a sample application, and an extensive library for managing and manipulating Word 2007 documents. For intelligent authoring, it provides the ability to build role- and task-aware applications within Word 2007 that improve the content authoring process, allowing users to easily locate and preview content at any level of granularity, insert it into an active document, and manage custom document metadata.

The toolkit allows developers to build content applications that leverage Office Open XML, the native XML-based format of Word 2007, and includes an add-in for deploying web-based content applications into Word 2007. This enables developers to use web development techniques such as HTML, JavaScript, and .NET to build applications that work in concert with the Word 2007 authoring environment. The toolkit also provides XQuery libraries that simplify working with Office Open XML for granular search, dynamic assembly, transformation, and delivery with MarkLogic Server. By leveraging the underlying XML markup, content applications built with MarkLogic and Word 2007 can “round-trip” documents between various formats.
The MarkLogic Toolkit for Word allows developers to inspect, modify, and even redistribute the source code to meet specific needs. You can download the latest release of MarkLogic Toolkit for Word at the Mark Logic Developer Workshop. http://www.marklogic.com
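
The round-tripping described above rests on the fact that Office Open XML is an open, ZIP-packaged XML format. As a rough illustration outside of MarkLogic's actual toolkit (the document content and variable names below are invented for the sketch), here is minimal Python that builds a .docx-style package in memory and pulls the text back out with standard XML tools:

```python
# A .docx file is a ZIP package whose main part, word/document.xml,
# is plain WordprocessingML that any XML tool can inspect.
import io
import zipfile
import xml.etree.ElementTree as ET

W = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"

# Build a minimal WordprocessingML document: one paragraph, one run, one text node.
doc_xml = (
    f'<w:document xmlns:w="{W}"><w:body>'
    f'<w:p><w:r><w:t>Hello, Open XML.</w:t></w:r></w:p>'
    f'</w:body></w:document>'
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("word/document.xml", doc_xml)

# Round-trip: reopen the package and extract the text from every w:t element.
with zipfile.ZipFile(buf) as zf:
    root = ET.fromstring(zf.read("word/document.xml"))
text = "".join(t.text or "" for t in root.iter(f"{{{W}}}t"))
print(text)
```

Because the markup is open and well specified, the same approach works for enrichment and dynamic assembly: a server-side process can rewrite the XML parts and repackage the ZIP without Word being involved at all.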

Adobe Launches Technical Communication Suite 2

Adobe Systems Incorporated (Nasdaq:ADBE) announced Adobe Technical Communication Suite 2 software, an upgrade of its solution for authoring, reviewing, managing, and publishing rich technical information and training content across multiple channels. Using the suite, technical communicators can create documentation, training materials, and web-enabled user assistance containing both traditional text and 3D designs along with rich media, including Adobe Flash Player compatible video, AVI, MP3, and SWF files.

The enhanced suite includes Adobe FrameMaker 9, the latest version of Adobe’s technical authoring and DITA publishing solution; Adobe RoboHelp 8, a major upgrade to Adobe’s help system and knowledge base authoring tool; Adobe Captivate 4, an upgrade to Adobe’s eLearning authoring tool; and Photoshop CS4, a new addition to the suite. The suite also includes Adobe Acrobat 9 Pro Extended and Adobe Presenter 7.

The suite supports standards-based authoring, including the Darwin Information Typing Architecture (DITA), an XML-based standard for authoring, producing, and delivering technical information. It enables the creation of rich content and publishing through multiple channels, including XML/HTML, print, PDF, SWF, WebHelp, Adobe FlashHelp, Microsoft HTML Help, OracleHelp, JavaHelp, and Adobe AIR.

FrameMaker 9 offers a new user interface, supports hierarchical books and DITA 1.1, and makes it easier to author topic-based content. It also provides the capability to aggregate unstructured, structured, and DITA content in a seamless workflow, and with a PDF-based review workflow, authors can import and incorporate feedback. Adobe RoboHelp 8 allows technical communicators to author XHTML-compliant professional help content, and adds a new CSS editor, pages and templates, improved lists and tables, and new search functionality.
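
For readers unfamiliar with the topic-based model behind DITA, here is a minimal sketch (in Python, with invented content, not Adobe's output) of the kind of small, self-contained unit that DITA-aware tools like FrameMaker 9 work with. DITA 1.1's concept topic uses concept, title, and conbody elements:

```python
# Hand-build a minimal DITA 1.1 concept topic. Each topic is a small,
# self-contained unit that can be assembled into larger deliverables.
# The id and text content are invented examples for this sketch.
import xml.etree.ElementTree as ET

concept = ET.Element("concept", id="saving-a-file")
ET.SubElement(concept, "title").text = "Saving a file"
body = ET.SubElement(concept, "conbody")
ET.SubElement(body, "p").text = (
    "Topics are authored once and assembled into maps for each deliverable."
)
topic_xml = ET.tostring(concept, encoding="unicode")
print(topic_xml)
```

The value of the model is that a topic like this is reusable: the same unit can be referenced from multiple DITA maps and published to any of the output channels listed above.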
The Adobe Technical Communication Suite 2 is immediately available in North America. The estimated street price for the suite is US$1,899. FrameMaker 9, RoboHelp 8, and Captivate 4 are also available as standalone products; estimated street prices are US$999 each for FrameMaker 9 and RoboHelp 8, and US$799 for Captivate 4. http://www.adobe.com

Will XML Help this President?

I’m watching the inauguration activity today all day (not getting much work done) and getting caught up in the optimism and history of it all. And what does this have to do with XML, you ask? It’s a stretch, but I am giddy from the festivities, so bear with me please. I think there is a big role for XML and structured technologies in this paradigm shift, though XML will be quietly doing its thing in the background as always.

In 1986, when SGML, XML’s precursor, was being developed, I worked for the IRS in Washington. I was green, right out of college. My boss, Bill Davis, said I should look into this SGML stuff. I did. I was hooked. It made sense. We could streamline the text applications we were developing. I helped write the first DTD in the executive branch (the first real government one was the ATOS DTD from the US Air Force, but that was developed slightly before the SGML standard was confirmed, so we always felt we were pretty close to creating the actual first official DTD in the federal government). Back then we were sending tax publications and instructions to services like CompuServe and BRS, each with their own data formats. We decided to try to adopt structured text technology and single-source publishing to make data available in SGML to multiple distribution channels. And this was before the Web. That specific system has surely been replaced, but it saved time and enabled us to improve our service to taxpayers. We thought the approach was right for many government applications and should be repeated by other agencies.

So, back to my original point. XML has replaced SGML and is now being used for many government systems, including electronic submission of SEC filings, FDA applications, and the management of many government records. XML has been mentioned as a key technology in the overhaul needed in the way the government operates. Obama also plans to create a cabinet-level CTO position, part of whose mission will be to promote inter-agency cooperation through the interchange of content and data between applications formatted in a common taxonomy. He also intends to preserve the open nature of the Internet and its content, facilitate publishing important government information and activities on the Web in open formats, and enhance the national information system infrastructure. Important records are being considered for standardization, such as health and medical records, as well as many other ways we interact with the government. More info on this administration’s technology plan can be found at . Sounds like a job, at least in part, for XML!

I think it is great and essential that our leaders understand the importance of smartly structured data. There is already a lot of XML expertise throughout the various government offices, as well as a strong spirit of cooperation on which we can build. Anyone who has participated in industry schema development, or other common vocabulary design efforts, knows how hard it is to create a “one-size-fits-all” data model. I was fortunate enough to participate briefly in the development and implementation of SPL, the Structured Product Labeling schema (see http://www.fda.gov/oc/datacouncil/spl.html) for FDA drug labels, which are submitted to the FDA for approval before a drug product can be sold. This is a very well defined document type that has been in use for years, yet it still took many months and masterful consensus building to finalize this one schema. And it is just one small piece in the much larger information architecture. It took a lot of effort from many people within and outside the government, but now it is in place, working, and being used.

So, I am bullish on XML in the government these days. It is a mature, well understood, powerful technology with wide adoption and many established civilian and defense examples across the government. I think there is a very big role for XML and related technology in the aggressive, sweeping change promised by this administration. Even so, these things take time.

New Study on Social Media Adoption by Higher Education

“Social Media and College Admissions: The First Longitudinal Study,” conducted by Dr. Nora Ganim Barnes, Senior Fellow and Research Chair of the Society for New Communications Research and Chancellor Professor of Marketing at the University of Massachusetts Dartmouth, and Eric Mattson, CEO of Financial Insite Inc., has been announced. The study represents one of the first statistically significant, longitudinal studies of the use of social media by college admissions offices. It compares adoption of social media between 2007 and 2008 by the admissions offices of all four-year accredited institutions in the United States, based on 536 interviews with college admissions officers. Key findings include:

  • Significant growth in familiarity with, adoption of, and importance to mission of social media over one year ago. Adoption has grown from 61% in 2007 to 85% in 2008, and usage increased for every social media type studied.
  • Adoption is being driven by admissions departments’ recognition of the increasing importance of social media.
  • Colleges and universities are outpacing U.S. corporate adoption of social media tools and technologies: 13% of the Fortune 500 and 39% of the Inc. 500 currently have a public blog, while 41% of college admissions departments have blogs.
  • Social networking is the tool most familiar to admissions officers, with 55% of respondents claiming to be “very familiar” with it in the first study and 63% in 2008.
  • A growing number of admissions officers use search engines (23%) and social networks (17%) to research prospective students.
  • Usage of YouTube has also increased substantially. Video is now being used to deliver virtual tours of campuses, virtual visits to the dorms, and sample lectures from the faculty.
  • 78% of private schools have blogs, versus 28% of public schools, and 50% of schools with undergraduate populations under 2,000 have blogs.
  • 40% of institutions not currently using social media plan to start a blog.
  • Nearly 90% of admissions departments feel that social media is “somewhat to very important” to their future strategy.

The full executive summary of the study is available for download at:
http://www.umassd.edu/cmr/studiesresearch/mediaandadmissions.cfm, http://www.sncr.org

© 2024 The Gilbane Advisor
