Curated for content, computing, and digital experience professionals

Month: July 2009

Searching Email in the Enterprise

Last week I wrote about “personalized search,” and then a chance encounter at a meeting triggered a new awareness of business behavior that makes my own personalized search quite different from what might work for others. A fellow introduced himself to me as the founder of a start-up with a product for searching email. He explained that countless nuggets of valuable information reside in email and will never be found without a product like the one his company had developed. I asked if it only retrieved emails resident in an email application like Outlook; he looked confused and said “yes.” I commented that I leave very little content in my email application; instead, I save anything with information of value in the appropriate file folders, alongside documents of other formats on the same topic. If an attachment is substantive, I may create a record with more metadata in my content management database so that I can use the application’s search engine to find information germane to projects I work on. He walked away with no comment, so I have no idea what he was thinking.

It did start me thinking about the realities of how individuals dispose of, store, categorize, and manage their work-related documents. My own process goes like this. My work content falls into four broad categories: products and vendors, client organizations and business contacts, topics of interest, and local infrastructure-related materials. When material is not purposed for a particular project or client but may be useful for a future activity, it gets a metadata record in the database and is hyperlinked to the full text. The same goes for useful content out on the Web.

When it comes to email, I discipline myself to dispose of all email into its appropriate folder as soon as I can. Sometimes this involves two emails, the original and my response. When the format is important I save it in the *.mht format (it used to be *.htm until I switched to Office 2007 and realized that doing so created a folder for every file saved); otherwise, I save content in *.txt format. I rename every email to include a meaningful description including topic, sender, and date so that I can identify the appropriate email when viewing a folder. If there is an attachment, it also gets an appropriate title and date, is stored in its native format, and the associated email gets “cover” in the file name; this helps associate the email with the attachment. The only email saved in Outlook personal folders is current activity, where lots of back-and-forth is likely to occur until a project is concluded. Then it gets disposed of by deleting, or filed with the project folders as described above. This is personal governance that takes work. Sometimes I hit a wall and fall behind on the filtering and disposing, but I keep at it because it pays off in the long term.
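A naming discipline like this is easy to automate. Here is a minimal sketch in Python; the topic_sender_date pattern, the underscore separator, and the sample values are my own illustration of the convention, not a prescribed format:

```python
from datetime import date

def email_filename(topic, sender, sent, is_cover=False, ext="txt"):
    """Build a descriptive file name for a saved email from its topic,
    sender, and date. Emails that accompany an attachment get 'cover'
    appended so the pair can be associated when browsing a folder."""
    parts = [topic, sender, sent.isoformat()]
    if is_cover:
        parts.append("cover")
    return "_".join(parts) + "." + ext

# Hypothetical examples: a plain-text save and an .mht cover email.
print(email_filename("FAST-search", "jsmith", date(2009, 7, 15)))
print(email_filename("FAST-search", "jsmith", date(2009, 7, 15),
                     is_cover=True, ext="mht"))
```

Renaming at save time is the key step: the folder listing itself then answers "what did that vendor send me in March?" without any search engine.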

So, why not relax, leave it all in Outlook, and let a search engine do the retrieval? Experience has revealed that most emails are labeled so poorly by senders, and the content is so cryptic, that expecting a search engine to retrieve them in a particular context or with the correct relevance would be impossible. I know this from having to preview dozens of emails stored in folders for active projects. I have decided to give myself the peace of mind that when the crunch is on, and I really need to go to that vendor file and retrieve what they sent me in March of last year, I can get it quickly in a way that no search engine ever could. Do you realize how much correspondence you receive from business contacts using their “gmail” accounts, with no contact information in the body revealing their organization, signed with a nickname like “Bob,” containing messages like “we’re releasing the new version in four weeks” or just a link to an article on the web with “thought this would interest you”?

I did not have a chance to learn whether my new business acquaintance had any sense of the amount of competition he has out there for email search, what his differentiator is that makes a compelling case for a product that only searches email, or what happens to his product when Microsoft finally gets FAST search bundled to work with all Office products. Or perhaps the rest of the world is storing all content in Outlook. Is this true? If so, he may have a winner.

Gilbane Boston Speaking Proposal Update

We are still working on the program for this year’s Boston conference, December 1-3, and Sarah has left us for graduate school. Fortunately, we have a great new Marketing Coordinator, Scott Templeman, who will be communicating with all of you who have submitted proposals. You can reach Scott at 617-497-9443 ext 156 with any questions about the status of your proposals, but official confirmations are still a week or two away.

Gilbane Group Releases New Study on Multilingual Product Content

For Immediate Release

Pioneering Research Describes Transformation of Technical Communications Practices to Align More Closely With Global Business Objectives

Cambridge, MA, July 28 — Gilbane Group, Inc., the analyst and consulting firm focused on content technologies and their application to high-value business solutions, today announced the publication of its latest research, Multilingual Product Content: Transforming Traditional Practices Into Global Content Value Chains.

The report is backed by in-depth qualitative research on how global businesses are creating, managing, and publishing multilingual product content. The study extends Gilbane’s 2008 research on multilingual business communications with a close look at the strategies, practices, and infrastructures specific to product content.

The research clearly shows a pervasive enterprise requirement for product content initiatives to tangibly improve global customer experience. Respondents from a mix of technical documentation, customer support, localization/translation, and training departments indicate that “global-ready technology architectures” are the second most often cited ROI factor for meeting this directive. All respondents view single-sourcing strategies and self-help customer support applications as the two most important initiatives for aligning product content with global business objectives.

“Successful business cases for product content globalization address top-line issues relevant to corporate business goals while tackling bottom-line process improvements that will deliver cost savings,” commented Leonor Ciarlone, Senior Analyst, Gilbane Group, and program lead for Multilingual Product Content. “Our research shows that while multilingual content technologies are clearly ROI enablers, other factors influence sustainable results. Cross-departmental collaboration and overarching business processes, cited as essential improvements by 70% and 82% of respondents respectively, are critical to transforming traditional practices.”

Multilingual Product Content is the first substantive report on the state of end-to-end product content globalization practices from multiple perspectives. “Gilbane’s latest research continues to show both language and content professionals how the well-managed intersection of their domains is becoming best practice,” said Donna Parrish, Editor, MultiLingual magazine. “With practical insights and real experiences in the profiles, this study will serve as a valuable guide for organizations delivering technical documentation, training, and customer support in international markets.”

The report covers business and operational issues, including the evolving role of service providers as strategic partners; trends in authoring for quality at the source, content management and translation management integration, machine translation, and terminology management; and progress towards developing metrics for measuring the business impact of multilingual content. Profiles of leading practitioners in high tech, manufacturing, automotive, and public sector/education are featured in the study.

Multilingual Product Content: Transforming Traditional Practices Into Global Content Value Chains is available as a free download from the Gilbane Group website. The report is also available from study sponsors Acrolinx, Jonckers, Lasselle-Ramsay, LinguaLinx, STAR Group, Systran, and Vasont Systems.

About Gilbane Group

Gilbane Group, Inc., is an analyst and consulting firm that has been writing and consulting about the strategic use of information technologies since 1987. We have helped organizations of all sizes from a wide variety of industries and governments. We work with the entire community of stakeholders including investors, enterprise buyers of IT, technology suppliers, and other analyst firms. We have organized over 70 educational conferences in North America and Europe. Our next event is Gilbane Boston, 1-3 December 2009. Information about our newsletter, reports, white papers, case studies, and blogs is available on our website. Follow Gilbane Group on Twitter.

Gilbane Group, Inc.
Ralph Marto, +1.617.497.0443 ext 117


SharePoint: Without the Headaches – A Discussion of What is Available in the Cloud

There are few people who have not heard of SharePoint, but understanding what SharePoint has to offer is another story. The best way to understand SharePoint is to use it. This series of posts will provide an overview of the product and explain how a non-techie can get started.

SharePoint is currently in its third incarnation (SharePoint 2007), and within nine months Microsoft will be deploying the fourth version, “SharePoint 2010.” There are three distinct SKUs:

  1. WSS (Windows SharePoint Services)
    – Comes with Windows Server and is free.
  2. MOSS (Microsoft Office SharePoint Server) Standard Edition
    – An extension of WSS, and is licensed per server as well as per user.
  3. MOSS (Microsoft Office SharePoint Server) Enterprise Edition
    – An extension of the Standard Edition, and is licensed per server as well as per user.

It is also possible to buy a “Public Connector” for MOSS, which is a license that allows SharePoint to be used as a publicly facing site with no limit on the number of users.

Although Microsoft is trying to showcase SharePoint as an excellent platform for building publicly facing sites, there is general agreement that SharePoint is best used in a closed community where users must log in. Microsoft touts SharePoint as a product that supports six pillars (these pillars are about to be rebranded in SharePoint 2010). The six pillars are:

  1. Collaboration
    – Allowing members of a closed community to share documents, tasks, calendars, contacts, etc.
  2. Portal
    – Providing a single web site that is the gateway to an organization’s web based functions.
  3. Enterprise Search
    – Competing with Google for enterprise search.
  4. Web & Enterprise Content Management
    – A publishing platform that allows for simple workflows among authors and editors.
  5. Forms Driven Business Process
    – Allows for easy development of electronic forms and associated automated workflows.
  6. Business Intelligence
    – Allows organizations to build dashboards summarizing data that reside in disparate electronic repositories.

The original intent behind SharePoint was to empower business users to control their own destiny without being dependent on IT and development staff. In the author’s experience, SharePoint often requires much more planning and maintenance than business users can provide. Thus one often finds that specially trained SharePoint IT and developer personnel are required to stand up and support in-house SharePoint deployments.

Although the options are still quite limited, it is now possible to lease robust versions of SharePoint that reside in the cloud and truly are managed without any hidden costs. This series of articles will summarize three services the author tried:

  1. SharePoint Online – Part of the Microsoft Business Online Productivity Suite.
  2. Apps4rent – A robust SharePoint and Exchange online implementation.
  3. WebHost4Life – Similar to Apps4Rent’s SharePoint implementation, with a non-Exchange email system.

The discussion will focus only on SharePoint. In all cases, the environments are WSS (not MOSS) and are hosted in a joint-tenancy model, meaning that you share computing resources with other SharePoint sites. Although people will tell you there are a number of reasons why this may be problematic, the author never experienced any issues due to joint tenancy. Microsoft does offer an expensive service in a dedicated environment, but it requires leasing a minimum of 5,000 user licenses.

Both Apps4rent and WebHost4Life have a simple model that is easy for an end user to understand. In contrast, the Microsoft environment is quite confusing with poor documentation.  Both Apps4Rent and WebHost4Life offer immediate support with chat sessions, and the customer service staff was knowledgeable and helpful.  Again, in contrast to this, Microsoft’s support was poor.  Microsoft communicated via a secure email channel, responses took 4 to 6 hours, and the support personnel did not understand the product well…

Continue reading

Personalized Search in the Enterprise

This is an interesting topic for two reasons: there is enormous diversity in the ways we all think and go about finding content, and personalizing a search interface without being intrusive is extremely difficult. Any technology that requires us to work according to someone else’s design, bending our natural inclinations, is by definition not going to be personal.

This topic comes to mind because of two unrelated pieces of content I read in the past 24 hours. The first was an email asking me about personal information management and automated tagging, and the second was an interview I read with Mike Moran, a thought leader in search and speaker at one of our Gilbane Conferences. In the interview, Mike talks about personalized search. Then Information Week referenced search personalization in an article about a patent suit against Google.

Here is my take on the many personalized search themes that have recently emerged. Dashboards, customized results, options to focus on particular topics or types of content, socialized search that supports interacting with and sharing results, and retrieval of content we personally created, received (email), used, or were named in might all be referred to as search personalization. Getting each to work well will enhance enterprise search, but…

Knowing how transient and transformative our thoughts and behaviors really are, we should focus realistically on the complexity of producing software tools and services that satisfy and enhance personal findability. We are ambiguous beings, seeking structured equilibrium in many of our activities to create efficiency and reduce anxiety, while desiring new, better, quicker and smarter devices to excite and engage us. Once we achieve a level of comfort with a method or mechanism, whether quickly or over time, we evolve and seek change. But, when change is imposed on an unprepared mind, our emotions probably override any real benefit that might be gained in productivity. Then we tend to self-sabotage the potential for operational usefulness when an uncomfortable process intrudes. Mental lack of preparedness undermines our work when a new design demands a behavioral shift that lacks connection to our current state or past experiences. How often are we just not in a frame of mind to take on something totally alien, especially with deadlines looming?

Look at the single most successful aspect of Google, minimalism in its interface. One did not need to wade through massively dense graphics scrambled with text in disordered layouts to figure out what to do when Google first appeared. The focus was immediately obvious.

I am presenting this challenge to vendors; there is a need to satisfy a huge array of personal preferences while introducing a minimal amount of change in any one release. Easy adoption requires that new products be simple. Usefulness must be quickly obvious to multiple audiences.

I am presenting this challenge to technology users; focus your appetite. Decide before shopping for or adopting new tools what would bring the most immediate productivity gain and personal adoptability for maximum efficiency. Think about how defeated you feel when approaching a new release of an upgraded product that has added so many new “bells and whistles” that you are consumed with trying to rediscover all the old functions and features that gave your workflow a comfortable structure. Think carefully about how much learning and re-adjusting will be needed if you decide on technology that promises to do everything, with unlimited personalization. It may be possible, but does it really feel personally acceptable?

New Content Globalization Case Study: Philips

All businesses are facing serious disruptions from shifting global economies, technical advancements, and the need for strong, consistently branded online multinational presence. Royal Philips Electronics of the Netherlands has found a way to respond to these challenges without jeopardizing its ongoing business.

A world leader in the consumer lifestyle, healthcare, and lighting industries, Philips integrates technologies and design into people-centric solutions, based on fundamental customer insights and the brand promise of “sense and simplicity.” With 50,000 products, 1,800 logos, a website present in 57 countries and translated into 35+ target languages, and 500 consumer marketing managers in the Consumer Lifestyle sector, Philips’ global brand management strategy requires an adaptive system of people, process, and technology to provide a unifying influence.

This case study tells the story of how Philips has met and is keeping pace with changing and often disruptive business environments by evolving operations and communications touchpoints in a just-in-time approach that maximizes global opportunity based on consumer need.

Download the Philips story here:

Borderless Brand Management: The Philips Strategy for Global Expansion

Give Financial Statements an MRI with XBRL

Recent news item:  CLEVELAND (AP) — Indians outfielder Grady Sizemore is feeling better and will have an MRI on his strained left elbow on Monday.

It has become very commonplace for doctors to order an MRI for patients experiencing pain. According to Radiology, magnetic resonance imaging (MRI) is a noninvasive medical test that helps physicians diagnose and treat medical conditions. The website goes on to say, “MR imaging uses a powerful magnetic field, radio frequency pulses and a computer to produce detailed pictures of organs, soft tissues, bone and virtually all other internal body structures. The images can then be examined on a computer monitor, printed or copied to CD.”

In a similar fashion, XBRL puts a company’s financial statements under a transformation that exposes detailed pictures of the underlying accounting backing every line item.  This information can then be analyzed and compared by computer software to help determine a company’s financial health.

For example, let’s look at a sample line item from the Marathon Oil SEC filing covering the quarter ending September 30, 2008. The Form 10-Q has a line on the financial statement that reads:

Loss on early extinguishment of debt 120 (nine months ending September 30, 2007, in millions)

When you give that line item the XBRL MRI treatment, computers can extract the XBRL label:


the definition of the item:

Amount represents the difference between the fair value of the payments made and the carrying amount of the debt at the time of its extinguishment.
And the authoritative literature that backs up the accounting decisions:

the reference:

Presentation Reference

Name                  Accounting Principles Board Opinion (APB)
Number               26
Paragraph          20, 21
Publisher           AICPA

Inquiring minds will take a close look at APB 26 for more detail. This examination should yield a much clearer understanding of the basis for reporting the number, and therefore a better understanding of the financial statement.
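The machine extraction described above can be sketched in a few lines of Python. The instance snippet below is an illustrative simplification: the namespace URI, element name, and context identifier are my own assumptions, not Marathon Oil's actual SEC tagging.

```python
import xml.etree.ElementTree as ET

# A tiny, XBRL-style instance document. The namespace, element name, and
# contextRef are simplified stand-ins for real US GAAP taxonomy tagging.
INSTANCE = """<xbrl xmlns:us-gaap="http://example.com/us-gaap">
  <us-gaap:LossOnEarlyExtinguishmentOfDebt contextRef="FY2007YTD"
      unitRef="USD" decimals="-6">120000000</us-gaap:LossOnEarlyExtinguishmentOfDebt>
</xbrl>"""

NS = {"us-gaap": "http://example.com/us-gaap"}

def extract_fact(instance_xml, local_name):
    """Pull one tagged fact (value plus its context and unit attributes)
    out of an XBRL-style instance document."""
    root = ET.fromstring(instance_xml)
    elem = root.find(f"us-gaap:{local_name}", NS)
    return {
        "value": int(elem.text),
        "context": elem.get("contextRef"),
        "unit": elem.get("unitRef"),
    }

fact = extract_fact(INSTANCE, "LossOnEarlyExtinguishmentOfDebt")
print(fact)  # value, context period, and unit for the line item
```

Because every fact carries a context and a unit, software can line up the same element across companies and periods, which is exactly the comparison work the MRI analogy points at.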

Note:  Mr. Sizemore returned to full duty with the Cleveland Indians shortly after his MRI.

When is a Book Not a Book?

I recently wrote a short Gilbane Spotlight article for the EMC XML community site about the state of Iowa going paperless (article can be found here) with regard to its Administrative Code publication. It got me thinking, “When is a book no longer a book?”

Originally the admin code was produced as a 10,000-page loose-leaf publication service containing all the regulations of the state. For the last 10 years it has also appeared on the Web as PDFs of pages and, more recently, as independent data chunks in HTML. Now the state has discontinued commercial printing of the loose-leaf version and relies only on the electronic versions to inform the public. It still produces PDF pages that resemble the printed volumes, intended for local printing of select sections by public users of the information. But the electronic HTML version is being enhanced to improve reusability of the content, present it in alternative forms, integrate it with related materials, and so on. Think mashups and improved search capabilities. The content is managed in an XML-based single-source publishing (SSP) system that produces all output forms.

I have migrated many, many printed publications to XML SSP platforms. Most follow the same evolutionary path in how the information is delivered to consumers. First they are printed. Then a second, electronic copy is produced simultaneously with the print using separate production processes. Then the data is organized in a single database and reformatted to allow editing that can produce both print and electronic output. Eventually the data gets enhanced, and possibly broken into chunks, to better enable reuse of the content, but print is still a viable output format. Later, the print is discontinued as the subscription list shrinks and the print product is no longer feasible. Or the electronic version is so much better that people stop buying the print version.

So back to the original question: is it no longer a book? Is it when you stop printing pages? Or when you stop producing the content in page-oriented PDFs? Or does it have to do with how you manage and store the information?

Other changes take place in how the information is edited, formatted, and stored that might influence the answer. For instance, if the content is still managed as a series of flat files, like chapters, and assembled for print, it seems to me that it is still a book, especially if it still contains very book-oriented content like tables of contents and other front matter, indexes, and even page numbers. Eventually, the content may be reorganized as logical chunks stored in a database, extracted for one or more output formats, and organized appropriately for each delivery version, as in SSP systems. Print artifacts like TOCs may be completely generated rather than stored as persistent objects, or they can be created and managed as build lists or maps (as with DITA). As long as one version is still book-like, IMHO it is still a book.
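Generating a TOC from a build list or map, rather than storing it as a persistent object, can be sketched as follows. The map below is a simplified, DITA-like illustration with made-up titles and file names, not actual DITA markup:

```python
import xml.etree.ElementTree as ET

# A minimal DITA-like map: nested topicrefs stand in for the book's
# structure. Titles and hrefs here are hypothetical examples.
MAP = """<map>
  <topicref navtitle="Administrative Code" href="code.xml">
    <topicref navtitle="Chapter 1: Definitions" href="ch1.xml"/>
    <topicref navtitle="Chapter 2: Licensing" href="ch2.xml"/>
  </topicref>
</map>"""

def toc(topicref, depth=0):
    """Walk the map recursively and emit an indented table of contents
    as a list of text lines, one per topicref."""
    lines = []
    for ref in topicref.findall("topicref"):
        lines.append("  " * depth + ref.get("navtitle"))
        lines.extend(toc(ref, depth + 1))
    return lines

print("\n".join(toc(ET.fromstring(MAP))))
```

The point of the sketch is that the TOC is a pure by-product of the map: reorder or chunk the content differently and the next build regenerates it, with no page-oriented artifact stored anywhere.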

I would posit that once the printed versions are discontinued, and all electronic versions no longer contain print-specific artifacts, then maybe this is no longer a book, but simply content.


© 2024 The Gilbane Advisor
