The Gilbane Advisor

Curated for content, computing, data, information, and digital experience professionals


March Madness in the Search Industry

In keeping with conventional wisdom, it looks like a number of entrepreneurs are using the economic downturn as opportunity time, judging from the larger than normal number of announcements in the enterprise search sector. The Microsoft acquisition of FAST, Autonomy’s foray into the document/content management market, and Google’s Search Appliance ramping its customer base are old news BUT we have a sweep of changes. Newcomers to the enterprise search marketplace and news of innovative releases of mature products really perked up in March. Here are my favorite announcements and events in chronological order and the reasons why I find them interesting:

Travis, Paul. Digital Reef Comes Out of Stealth Mode. 03/02/2009. Byteandswitch.com.

Startup offers content management platform to index unstructured data for use in e-discovery, risk mitigation, and storage optimization. Here is the first evidence that entrepreneurs see opportunity for filling a niche vacuum. In the legal market the options have been limited and pretty costly, especially for small firms. This will be an interesting one to watch. http://www.digitalreefinc.com/

Banking, Finance, and Investment Taxonomy Now Available from the Taxonomy Experts at WAND. 03/02/2009, PR Web (press release), Ferndale, WA, USA

The taxonomy experts at WAND have made this financial taxonomy available now for integration into any enterprise search software. I have been talking with Ross Lehr, CEO at Wand, for over a year about his suite of vertical market taxonomies and how best to leverage them. I am delighted that Wand is now actively engaged with a number of enterprise search and content management firms, enabling them to better support their customers’ need for navigation. The Wand taxonomies offer a launching point from which organizations can customize and enhance the vocabulary to match their internal or customer interests. http://www.wandinc.com/main/default.aspx

Miller, Mark. Lucid Imagination » Add our Lucene Ecosystem Search Engine to Firefox. 03/02/2009

I predicted back in January that open source search and search appliances were going to spawn a whole new industry of service providers and expert integrators because there are just not enough search experts to staff in-house teams in all the companies that are adopting these two types of search products. Well, it is happening, and these guys at Lucid are some of the smartest search technologists around. Here is an announcement that gives you a taste of what they can do. Check it out and check them out at http://www.lucidimagination.com/

See the full article for commentary about: social search at NASA, QueSearch, MaxxCat, Aardvark on social search, Attivio, Concept Searching, the Google user group, Simplexo, Endeca, Linguamatics, Coveo, dtSearch, and ISYS.

Microsharing has benefits for NASA. 03/04/2009.

It has been about 18 months since I wrote on social search and this report reveals a program that takes the concept to a new level, integrating content management, expertise locators and search in a nifty model. To learn more about NASAsphere, read this report written by Celeste Merryman. Findings from the NASAsphere Pilot. Jet Propulsion Laboratory, California Institute of Technology Knowledge Arciteture (sic) and Technology Task [Force]. 08/20/2008. The success of the pilot project is underscored in this report recommendation: the NASAsphere pilot team recommends that NASAsphere be implemented as an “official” employee social networking and communication tool. This project is not about enterprise search per se, it just reflects how leveraging content and human expertise using social networks requires a “findability” component to have a successful outcome. Conversely, social tools play a huge role in improving findability.

March 16, 2009. QueSearch: Unlocking the Value of Structured Data with Universal Search really caught my eye with their claim to “universal search” (yes, another) for large and mid-size organizations.

This offering, with a starting price of $19,500, is available immediately, with software and appliance deployment options. I tried to find out more about their founders and origins on their Web site without luck, but did track down a Wikipedia article and a neat YouTube interview with the two founders, Steven Yaskin and Paul Tenberg. It explains how they are leveraging Google tools and open source to deliver solutions.

Stronger, Better, Faster — MaxxCat’s New Search Appliance Aspires to Be Google Search Appliance Killer, by Marketwire. 03/11/2009.

This statement explains why the announcement caught my attention: MaxxCat product developers cite “poor performance and intrinsic limitations of Google Mini and Google Search Appliance” as the impetus to develop the device. The enterprise search appliance, the EX-5000, is over seven times faster than the Google Search Appliance (GSA), and the small business search appliance, the XB-250, is 16 times faster than the Google Mini. There is nothing like challenging the leading search appliance company with a statement like that to throw down the gauntlet. OK, I’m watching and will be delighted to read or hear from early users.

Just one more take on “social search” as we learn about Aardvark: Answering the Tough Questions, David Hornik on VentureBlog. 03/12/2009

This week the Aardvark team is launching the fruits of that labor at South By Southwest (SXSW). They have built a “social search engine” that lives inside your IM and email. It allows you to ask questions of Aardvark, which then goes about determining who among your friends and friends of friends is most qualified to answer those questions. As the Aardvark team points out in their blog, social search is particularly well suited to answering subjective questions where “context” is important. I am not going to quibble now, but I think I would have put this under my category of “semantic search” and natural language processing. Until we see it in action, who knows?

A new position at Attivio was announced on March 16th, Attivio Promotes John O’Neil to Chief Scientist, which tells me that they are still expanding at the end of their first official year in business.

Getting to the point, 03/18/2009, KMWorld. http://www.kmworld.com/Articles/ReadArticle.aspx?ArticleID=53070

Several announcements about Concept Searching’s release of version 4 of its flagship product, conceptClassifier for SharePoint, highlight the fact that Microsoft’s acquisition of FAST has not slowed the number of enterprise search companies that continue to partner with or offer independent solutions for SharePoint. In this case the company offers its own standalone concept search applications for other content domains but is continuing to bank on lots of business from the SharePoint user community. This relationship is reflected in these statements: the company says features include a new installer that enables installation in a SharePoint environment in less than 20 minutes and requires no programmatic support, and all functionality can be turned on or off using standard Microsoft SharePoint controls. Full integration with Microsoft Content Types and greater support for multiple taxonomies are also included in this release. Once the FAST search server becomes a staple for Microsoft SharePoint shops, there will undoubtedly be fallout for some of these partners.

Being invited to the Google Enterprise Search Summit in Cambridge, MA on March 19, 2009 was an opportunity for me to visit Google’s local offices and meet a bunch of customers.

They were a pretty enthusiastic crowd and are enjoying a lot of attention as this division of Google works to join the ranks of other enterprise application software companies. I suspect that it is a whole new venture for them to be entertaining customers in their offices in a user-group-like forum, but the Google speakers were energetic and clearly love the entrepreneurial aspects of being a newish runaway success within a runaway-successful company. New customer announcements continue to flow from Google, with SITA (the State Information Technology Agency in South Africa) acquiring the GSA to drive an enterprise-wide research project. The solution will be deployed and implemented by JSE-listed IT solutions and services company Faritec, and RR Donnelley. Several EMC users were represented at the meeting, which made me ask why they aren’t using the search tools being rolled out by the Documentum division…well, don’t ask.

Evans, Steve. Simplexo boosts public sector search options. Computer Business Review – UK. 03/18/2009.

This is interesting as an alternative to the Lucene/Solr scene: UK-based open source enterprise search vendor Simplexo has launched a new search platform aimed at the public sector, which aims to enable central and local government departments to simultaneously search multiple disparate data sources across the organisation on demand. I have wondered when we would see some other open source offerings.

And all of the preceding is about just the startups (plus EMC at Google) and lesser known company activity. This was not a slow month. I don’t want all my contacts in the “established” search market to think that I am not paying attention because I am. I’ve exchanged communications with or been briefed by these known companies with news about new releases, advancing market share, or new executive teams. In no particular order these were the highlights of the month:

Endeca announced three new platforms on Mar 23, 2009: Endeca Announces the Endeca Publishing Suite, Giving Editors Unprecedented Control Over the Online Experience; Endeca Announces the Endeca Commerce Suite, Giving Retailers Continuous Targeted Merchandizing; and Endeca Unveils McKinley Release of the Information Access Platform, Allowing for Faster and Easier Deployment of Search Applications

Linguamatics Agile Text Mining Platform to Be Used by Novo Nordisk. 03/26/2009

I had a fine briefing from Coveo’s CEO Laurent Simoneau and Michel Besmer, their new VP of Global Marketing, and see them making great strides capturing market share across numerous verticals where rapid deployment and implementation are a big selling point. They also just announced: Bell Mobility and Coveo Partner to Create Enterprise Search from Bell, an Exclusive Enterprise-Grade Mobile Search Solution.

A new version 7.6 of dtSearch, a mainstay plug-and-play search solution for SMBs since 1991, was just released. 03/24/2009

And finally, ISYS is on a great growth path with a new technology release, ISYS File Readers, new executives, and a new project … completed in conjunction with ArnoldIT.com. Steve Arnold, industry expert and author of the Beyond Search blog, compiled more than a decade of Google patent documents. To offer a more powerful method for analyzing and mining this content, we produced the Google Patent Search Demonstration Site, powered by our ISYS:web application.

Weatherwise, March, 2009 is out like a lamb but hot, hot, hot when it comes to search.

Quark Teams with IBM Enterprise Content Management to Bring XML and DITA to the Masses

Quark announced that it has teamed with IBM Enterprise Content Management (ECM) to enable the broad adoption of XML across the enterprise by integrating Quark XML Author with IBM FileNet Content Manager. Quark makes it possible for any IBM FileNet Content Manager user working in Microsoft Word to author intelligent content that can be reused and delivered to multiple channels or formats. The ability to author, manage, and reuse structured content enables critical business needs, such as managing intellectual property, complying with regulatory mandates, and automating business processes. A simple and streamlined process for XML authoring also helps organizations to enable enterprise-wide adoption of XML and DITA. Quark XML Author for Microsoft Word is an XML authoring tool that allows users to create XML content in a familiar word processing environment. Quark XML Author enhances Microsoft Word’s native XML support by allowing users to create narrative XML documents directly, without seeing tags, being constrained to boxes, or being aware of the technical complexities associated with XML. http://www.quark.com/
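For readers who haven’t seen what such a tool produces behind the scenes, structured authoring yields topics that downstream systems can pull apart and reassemble. Here is a minimal sketch using a hand-written, DITA-style concept topic and Python’s standard library; the topic content is illustrative only, not actual Quark XML Author output:

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written DITA-style concept topic.
# Illustrative only; not actual Quark XML Author output.
topic_xml = """
<concept id="reuse-intro">
  <title>Reusable Content</title>
  <conbody>
    <p>Structured topics can be reused across channels and formats.</p>
  </conbody>
</concept>
"""

# Because the content is structured, a publishing pipeline can extract
# the pieces it needs (here, the title and body paragraphs) and deliver
# them to multiple channels or formats.
root = ET.fromstring(topic_xml)
title = root.findtext("title")
paragraphs = [p.text for p in root.find("conbody").findall("p")]

print(title)       # Reusable Content
print(paragraphs)  # ['Structured topics can be reused across channels and formats.']
```

The point of hiding the tags from Word authors is exactly this: the markup above is what the machine needs, and none of it is something a writer should have to type.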

Second Life Gets an International Life: An Interview with Danica Brinton of Linden Lab

At the recent Worldware Conference in Santa Clara, California, I was delighted to learn about how a high-tech company was achieving great success in internationalizing its software through crowdsourcing. The story gets more interesting. This was not back-room software plumbing but an innovative application, none other than Second Life, a virtual world and a social-networking MMORPG (massively multiplayer online role-playing game). Launched by Linden Lab in 2003, Second Life enables its users, called residents, to interact with a virtual world through software called a Second Life Viewer. Residents can socialize, participate in group activities, and create and trade virtual property. According to Google, there are over 9 million residents currently on Second Life.

I attended the presentation, “Brave New (Virtual) World,” and had an opportunity to catch up with Danica Brinton, Director of International Strategies and Localization at Linden Lab.  Here’s what she had to say.

Kadie:  When did Linden Lab realize the importance of internationalization?

Brinton: Around the middle of 2008, Linden Lab realized some discrepancies between U.S. and international business.  While 60% of the residents and twice the new registrations were from outside the U.S., revenue and retention numbers, while still healthy, indicated a gap in the localized  user experience.

Kadie: What happened when you entered the scene?

Brinton: I joined the company in June.  When I checked things out, I was stunned.  I discovered that we were paying $40,000 per quarter to LSPs.  What were we getting?  The viewer was translated only partially into 3 languages, and was nearly incomprehensible.  The website was translated partially into 2 key languages.  In both cases there were a lot of localization bugs.  On the flip side, hundreds of wiki-based Help pages were translated quite well into 8 languages, which was pretty darn good.  An interesting trend…

Kadie: So what did you do?

Brinton: Although we were a small company, when I showed my management the opportunity they were very supportive…but with limited funding.  So we had to get creative.  We enlisted the help of power users to translate the application and website.  To ensure quality control, we set up a repeatable localization framework, with translation, editing, testing, and end user review.  We established a tier system of resident translators, drawing on our super-users.   We built and acquired localization tools to manage translation memories and the localization process, and installed a locale-based ROI calculator to manage costs.  Finally, we hired 3 in-house linguists.  So you can see, it was a hybrid of crowdsourcing from the Second Life community on the one hand, and our in-house linguists and contracted translation agencies on the other.

Kadie: How did you divide up the work?

Brinton:  Who did what depended on the language tier.  Let’s look at the viewer, for example.  For tier-1 languages, we developed the glossary, did the translation, and collaborated with the Second Life community on the editing, QA, and some of the glossary.  For tier-2 languages, the Second Life community did nearly everything.

Kadie: What kind of results did you achieve?

Brinton: Less than a year later, I can truthfully say that we achieved some dramatic results.  We now translate the viewer and the website into 10 languages, and expect to reach 16 in May.  The active residents from outside the U.S. grew to 64% of the user base, and new registrations are now more than 2.5 times the U.S.  Even better, international revenues have surpassed U.S. domestic revenues.  Between the Viewer, the website, and the knowledge base, we now regularly localize over 150,000 words per language.

Kadie: What’s next for localization at Linden Lab?

Brinton: Strangely enough, past is prologue.  This new localization program is helping to increase customer satisfaction and bolster an affinity group.  You can even say that community-driven translation is building brand advocacy.  Some of the elite power users are evolving into business partners.  Localization is not only supporting our business, it’s helping to grow it.

The Content Globalization practice at the Gilbane Group closely follows and  blogs on the role of multilingual communication in social networking (see interview with Plaxo).

EMC Announces Solutions for Case Management and Document Management

EMC Corporation (NYSE:EMC) announced two new enterprise content management (ECM) solutions: a new EMC Documentum Case Management Solution Framework, which accelerates the development of case management applications, and EMC Documentum ApplicationXtender 6 with new modules for workflow and retention management. Case management is a pattern of work commonly used in almost all industries, such as financial services, the public sector, and healthcare, that requires a group of people to systematically process and collaborate on a case folder consisting of both content and data. The Case Management Solution Framework enables system integrators and application partners, as well as internal IT developers, to build case management solutions faster. The Case Management Solution Framework includes: pre-integrated technologies utilizing the core capabilities of the EMC Documentum platform, a tutorial that illustrates the creation of a case management solution, a case management sample application and other best practices covering application development and deployment, and an Express Install tool which provides single-click installation of all the components needed for case management on a single server.

ApplicationXtender is a departmental document management solution for organizations with limited IT budgets and supporting resources. The solution provides out-of-the-box capabilities that allow departments to manage content such as images, documents, and reports. Built on a central repository, ApplicationXtender provides capabilities for high-speed image capture and storage and is designed for quick deployment. New features in ApplicationXtender 6 include: a Workflow Manager module, built on Microsoft .NET, for creating departmental workflow solutions designed to improve efficiency in business processes, and a Retention Manager module for applying automatic retention capabilities such as holds, reporting, disposals, and audit trails to documents to ensure regulatory compliance. The Case Management Solution Framework and ApplicationXtender 6 are available now. http://www.emc.com/

Vignette Releases Community Applications 7.1

Vignette Corporation (NASDAQ: VIGN) announced an enhancement to its Web Experience Platform – Vignette Community Applications 7.1. This integrated Social Media Solution enables organizations to build communities, encourage online interaction, boost campaigns and provide analytics. Vignette Community Applications 7.1, a key component of Vignette’s Social Media Solution, provides  tools to increase online interaction. In addition to enabling the creation and support of Web 2.0 capabilities such as blogs, wikis, forums, ratings and reviews, the solution allows companies to create unique social sites. These flexible sites combine microsite features with social-centric benefits such as idea management, calendars and events and the sharing of multimedia-rich assets including videos and podcasts. The adaptable social site templates allow marketers to quickly launch campaigns, communities and product sites. Vignette’s Social Media Solution provides a search engine, video technology, enterprise-grade scalability and a flexible, standards-based presentation technology that allows companies to combine social media elements with content from multiple sources. Vignette Community Applications 7.1 is available immediately. http://www.vignette.com


Tell Us About Your Favorite Web 2.0 Tool

There sure is a lot of news about Web 2.0 these days. It can be hard to take it all in, and there seem to be new tools every day! So how do you make sense of it all?

One way to learn more about these tools is to attend the session I will be hosting at the Gilbane San Francisco Conference (http://gilbanesf.com) in June called “My Favorite Web 2.0 Tool”. It will be organized in the fast-paced “Lightning Round” style, with 10 speakers covering 10 topics in 60 minutes (yes, that is about 5 minutes each). This unique presentation format allows for presentation of many ideas at once, encourages audience participation, and tends to be fairly hilarious.

Got something to say about Web 2.0 tools? I would love to hear from people interested in participating in this lightning round. Send me a one paragraph description of why your favorite Web 2.0 tool should be included in this session (send to dale@gilbane.com). We’re open to a broad definition of Web 2.0 tools too. We are looking for innovative ideas, game changers, or even just entertaining or fun apps!

We would love to hear from you! The slots will fill up fast, so don’t wait if you hope to participate.

See you in San Francisco!

An Information Parable

With apologies to S. I. Hayakawa, whose classic "A Semantic Parable" has been a staple of virtually everyone’s education for more than a half-century.

Not so long ago nor perhaps all that far away, there existed a need for several departments of a huge organization to share information in a rapid and transparent way so that the business of the organization could be improved and its future made more secure.

Now each of these departments understood and agreed with the basic need for sharing, so no one expected there to be problems achieving the desired results. Each department had its own IT group, working diligently using best practices of information technology as they understood them. When the need for information sharing among the departments became evident, the executive managers called a meeting of IT, operating managers, and lead technologists from each department. At this meeting, the executives explained the need for a more transparent and flexible information environment among the departments and with the world outside. Everyone nodded their agreement.

The IT manager of a major department exclaimed, "What we need is an enterprise information architecture, an EIA." Most of the other IT representatives agreed, and an effort to develop such an architecture was begun right there. The initiating department stated that because it had the largest and most mature IT infrastructure, the EIA should be modeled on the technology approaches it was using. Several other departments agreed; they had already adopted similar IT approaches and could easily participate in such an EIA. Some other departments, however, having gone down different paths in their IT planning, took some issue with this suggestion. They feared that changing course to come in line with the suggested architecture could seriously disrupt their existing IT plans, funding, and staffing. Although willing to be good citizens, they were mindful that their first responsibility was to their own departments.

More discussion ensued, suggesting and examining different IT concepts like J2EE, SOA, SQL, Web-centricity, BPR, and so on. Several departments that had software capable of supporting it even mentioned XML. Like a Chinese puzzle, the group always found itself just short of consensus, agreeing on the basic concepts but each bringing variations in implementation level, manufacturer, etc., to the discussion. In the end, tempers frayed by the seemingly endless circular discussions, the group decided to table further action until more detail about the need could be developed.

Actually, nearly everyone in the room knew that they probably were, at that moment, as close to consensus as they were likely to get unless the top managers chose and mandated a solution. Anticipating just such a mandate, nearly every department descended on top management to make the case for its particular IT and EIA approaches or, sensing defeat, for an exemption from whatever the decision turned out to be. The top managers, of course, who knew little about the details of IT, were affected most by the size and clout of the departments beseeching them and by the visibility of the IT vendors they touted. Battle lines were drawn between groups of departments, some of whom even went so far as to turn their vendors loose on top management to help make the case for their approach.

Like molasses in winter, the entire situation began to congeal, making any movement (or communication among the departments, for that matter) unlikely. In the midst of this growing chaos, the original need to share information, and the information itself, was almost completely forgotten.

Then, when things looked terminal, someone from a department operating staff suggested that maybe things would work better if the organization just developed and adopted standards for the information to be exchanged and didn’t try to develop anything so far-reaching as an entire Enterprise Information Architecture. At first, no one listened to this obviously "un-IT" suggestion, but as things got worse and progress seemed out of reach, top management asked why the suggestion shouldn’t be considered.  After much grumbling, a meeting was called in which the staff making the suggestion laid out their ideas:

  • First, they said, we should decide what information must be exchanged among departments. We can do this based on our knowledge of the information content itself so we won’t need a great deal of technical skill beyond an understanding of the information standard selected.
  • Next, we might decide what interchange format will be used to exchange the information. It will be important that this format be capable of easy creation and ingestion by the IT tools in each participating department. XML seems to be a growing interchange format, so maybe we should consider XML.
  • Then we can document what and how we want to exchange, and publish the documentation to every department so that their staffs can attend to the task of exporting and importing the desired information elements. We should take care to avoid asking the departments to use any particular technology to accomplish this exchange, but with the easy availability of XML tools, that shouldn’t be difficult.
  • Then we may want to set some deadlines by which the various departments must be able to exchange information in the format we choose. That will ensure that the entire effort keeps moving and will help flush out problems that need more resources. Maybe if we just tell them the results we need, they won’t be so likely to resist.
  • Finally, we ask the various IT staffs to come up with their own technological approaches to the act of sharing: intranet, Internet, VPN, etc. They’re really good at this and they should have the say as to how it is done for their department.
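The plan above can be sketched in miniature. Assuming a hypothetical agreed record format (the element names here are my own illustration, not anything from the parable), one department exports to the shared XML and another imports it, with neither needing to know the other’s technology:

```python
import xml.etree.ElementTree as ET

# Hypothetical agreed interchange format; element names are illustrative.
# Department A exports its native records to the shared XML format.
def export_record(record: dict) -> str:
    root = ET.Element("sharedRecord")
    for field in ("id", "title", "owner"):
        child = ET.SubElement(root, field)
        child.text = str(record[field])
    return ET.tostring(root, encoding="unicode")

# Department B imports the shared XML into its own native structure.
def import_record(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

# Round trip: the only thing the two sides share is the agreed format.
original = {"id": "42", "title": "Q3 budget", "owner": "Finance"}
assert import_record(export_record(original)) == original
```

This is the parable’s bargain in code: consensus lives in the document format, while each department keeps its own tools on either side of the exchange.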

After the presentation, there was silence in the room followed by some mildly contemptuous grumbling from some of the IT staff members in the back.

How, they whispered, could a complex challenge like integrating the organization’s IT systems into an EIA be dealt with by a few simplistic rules about data formats? Finally, one of these malcontents gave voice to this objection, to which the presenter replied that the entire idea was to avoid impact on the complex ongoing IT activities of the various departments. The goal, he said, was to articulate what the organization needed in terms of information, leaving the approaches for its provision to each department’s IT staff. This, he said, would hopefully provide a level at which consensus could be reached, technologically based consensus having proven elusive for many reasons, some quite serious.

Sometimes, he said, it isn’t as important to create the impetus to force consensus, as it is to develop a rationale on which that consensus can be achieved and accepted voluntarily by the players. In the case of our hypothetical organization, there were reasons why the technological lives of the departments would never fully coincide and why each would resist even the weight of management dictates to do so. There were not, however, the same reasons why these departments could not agree on what the organization needed in shared information, if each department would be allowed to support the sharing in its own way.

The group thought about this radical departure from good systems engineering disciplines and began to realize that perhaps some integration challenges cannot be met by traditional (hard) systems and technology approaches; in fact, it may have taken quite some time and more conversations to reach this point. When this had finally penetrated, the departments agreed to base their collaboration on information itself and began the joint process of building the needed interchange foundations, actually working with the operating staffs who created, used, and understood the information. They chose XML, and each department found that it had significant XML resources in the software it already used. Then they went back to work, confident that they would be asked to give up neither their hard-won IT environments nor their autonomy as professionals.

As for the organization as a whole, over the next year or so it saw its information sharing begin to improve, spent relatively little money doing it… and it was able to continue the practice of having all-hands holiday parties at which the department IT staffers and operating folks spoke to one another.

Webinar: Global Content and Customer Satisfaction

April 21, 11:00 am ET

A solid strategy for weathering any economic storm is to focus on finding and serving your most profitable customers, in any region, in any language, across all interactions. How can global enterprises tune their content practices to support this new laser focus on audience engagement and align their processes with corporate strategic objectives?

Gilbane’s Mary Laplante and Sophie Hurst, Director, Product Marketing at SDL, discuss the issues, challenges and opportunities associated with delivering multilingual content that meets today’s mandate for extraordinary customer experience. Using Gilbane’s research and insights on aligning global content with business value as background, topics include:
 
  • Market factors influencing global content management practices.
  • Real-world approaches to meeting audience demand for multilingual content, based on Gilbane research and SDL customer solutions.
  • Establishing a roadmap for enhancing global content practices to align them more closely with customer experience initiatives.

Registration is open. Sponsored by SDL.

