Near-Time announced the availability of Near-Time Connection, an extension of Near-Time’s collaboration and publishing capabilities for mobile devices and other Web platforms. Near-Time Connection packages the functionality of Near-Time into a Widget, providing a flexible way to access content and interact with users associated with Near-Time spaces from smart phones, blogs or personalized homepage portals like iGoogle. Near-Time Connection is free to Near-Time users. Near-Time Connection gives users an interactive platform that lets them stay plugged into their Near-Time community no matter how they choose to view their content. The authoring environment, similar to that of Near-Time’s desktop offerings, enables users to remain active in their Near-Time communities when on the road or using a homepage portal. Users can embed other Near-Time Widgets, tag content for better search capabilities, and follow comment summaries, threads and Near-Time picks. http://www.near-time.com/
Our Gilbane San Francisco conference from June 18-20 extends our discussion of global content to the West Coast. We’ll be talking about the ability to create, define and manage a Global Content Value Chain within two distinct operational areas: customer service and brand management, both highly dependent on accurate, consistent, and contextual multilingual communications.
We’ll also provide content professionals with a succinct knowledge map of translation process and technology components, increasingly handy as the content and translation management worlds collide. Then, on to an update on system integration opportunities based on enterprise strategy rather than ad hoc processes. Join us!
GCM-1: Optimizing the Global Content Value Chain: Focus on Product Support Content
Wednesday, June 18: 2:00 – 3:30pm
Product support content includes technical documentation as well as the content that lives with a product or service in many formats and contexts, including pre-sales, post-sales, aftermarket, training, and service. The global economy adds languages as yet another output to the traditional multichannel formula, increasing content volume due to the nuances of dialect and culture. Speakers explain how to build global content value chains that apply core content technologies, along with heavy doses of authoring assistance, collaboration, automated workflows, and project management, to documentation and translation processes. Results include multilingual product content that satisfies customers, enables simultaneous shipment of products worldwide, and delivers cost and operational efficiencies.
GCM-2: Optimizing the Global Content Value Chain: Focus on Web Content
Thursday, June 19: 8:30 – 10:00am
Customer-facing Web content must consistently communicate an organization’s core brand regardless of the language through which the message is delivered. The integral role of company Web sites in engaging with customers worldwide means that effective management of multilingual Web content must be central to content and IT strategies. Effectively managing this content presents specialized considerations such as understanding the benefits of machine translation, integration with analytics and search engine optimization tools, and segment-based translation that keeps multiple Web sites in multiple languages in sync with customer expectations. Speakers explain how to build global content value chains that combine brand management techniques with web content creation, management and distribution processes. The result is multilingual Web content that ensures the best brand experience in any language, at any time.
GCM-3: Case Studies in Translation and Localization: Process and Technology Overview for Content Managers
Thursday, June 19: 11:00am – 12:00pm
The worlds of language professionals, content managers, program and product managers, and IT are colliding, driven by the growing demand for integrating content management, translation process management, and other processes and practices comprising the global content value chain. The collision can be managed more effectively if all participants understand what’s in the toolboxes of the other groups and how to put them to good use in the context of a total solution. In a case study format, language professionals explain their tools of the trade and show you how they add value to multilingual content. A session in partnership with Multilingual Magazine and Localization World.
GCM-4 & WCM-6: Case Studies in Integration: WCM & GM
Thursday, June 19: 3:30 – 5:00pm
Content and translation management are core processes in the global content value chain. Integrating the systems that handle them is essential to streamlining processes, increasing the volume of language translations, controlling costs, improving efficiencies and ensuring customer satisfaction. To make the most of investment in people, process, and technology, integration of WCM and GM requires an enterprise strategy, not ad hoc processes that are recreated each time a new website is launched. This session uses real-world scenarios to walk you through different approaches to integration so that you can make an informed decision about strategies and practices that are right for your organization.
IBM announced new services to help organizations use collaboration and social networking to maximize employee talent and performance. The Enterprise Adaptability services include a methodology to determine the return on investment of social networking, use of large scale communications programs to jumpstart adoption, automatic identification of key talent, and social network analysis. With Enterprise Adaptability companies can learn how to embed Web 2.0 technologies into the fabric of business operations, allowing employees, partners and customers to communicate, establish new business relationships and make real-time decisions within the context of their everyday work. Enterprise Adaptability combines software from IBM Lotus Software and FileNet with consulting from IBM Global Business Services’ Human Capital Management practice. http://www.ibm.com
Mzinga announced it has acquired Littleton, MA-based Prospero Technologies. Prospero has a client base in the media, entertainment, and publishing sectors, and owns online community properties Talk City and Delphi Forums. With this acquisition, Mzinga now offers a broad suite of workplace and customer community solutions. The two companies bring together a combination of business social media technology, in-depth moderation services, and domain expertise focused on improving business processes for marketing, human resources, and customer service professionals. Mzinga’s software-as-a-service technology platform provides a suite of social media applications, flexible integration options, and enterprise-grade moderation capabilities to help ensure efficient, secure control of user-generated content. The platform allows flexibility in the way social media technology is deployed. Using “widget” technology, one or more social media elements may be seamlessly embedded into a customer’s web site, while template functionality can be used to create private-label community destination sites. Customers also have the option of crafting their own look and feel, and integrating social media features via API into their web pages or other applications. This social media technology will be integrated into Mzinga’s learning platform. http://www.mzinga.com
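Mzinga’s announcement sketches three integration modes: embedded widgets, private-label templates, and direct API integration. As a purely hypothetical illustration of the third pattern, a customer application might pull moderated community content over HTTP and render it inside its own pages. The endpoint, parameters, and field names below are invented for the sketch and are not Mzinga’s actual API.

```python
import requests

# Hypothetical endpoint and field names -- this illustrates the general
# "integrate via API" pattern described above, not Mzinga's actual API.
COMMUNITY_API = "https://api.example-community.com/v1/forums"

def fetch_recent_threads(forum_id: str, api_key: str, limit: int = 10) -> list[dict]:
    """Pull recent, moderated discussion threads for display on a customer's own site."""
    resp = requests.get(
        f"{COMMUNITY_API}/{forum_id}/threads",
        params={"limit": limit, "status": "approved"},   # only moderated content
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("threads", [])

def render_thread_list(threads: list[dict]) -> str:
    """Build a simple HTML fragment the host page can drop into its own template."""
    items = "".join(
        f"<li><a href='{t['url']}'>{t['title']}</a> ({t['reply_count']} replies)</li>"
        for t in threads
    )
    return f"<ul class='community-threads'>{items}</ul>"
```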
Recent studies describe the negative effect of media, including video, television, and online content, on attention spans and even comprehension. One such study suggests that the piling on of content from multiple sources throughout our work and leisure hours has saturated us to the point of making us information filterers more than information “comprehenders.” Hold that thought while I present a second one.
Last week’s blog entry reflected on intellectual property (IP) and knowledge assets and the value of taxonomies as aids to organizing and finding these valued resources. The idea of making search engines better or more precise at finding relevant content is edging into our enterprises through semantic technologies. These are search tools that are better at finding concepts, synonymous terms, and similar or related topics when we execute a search. You’ll find an in-depth discussion of some of these in the forthcoming publication, Beyond Search, by Steve Arnold. However, semantic search requires a more sophisticated concept map than a taxonomy provides. It requires an ontology: a rich representation of a web of concepts, complete with all types of term relationships.
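As a rough illustration of that distinction, a taxonomy can be modeled as a simple parent/child hierarchy, while an ontology is a web of typed relationships among concepts. The sketch below is a minimal, hypothetical model in Python; the terms and relation types are invented for illustration and do not follow a formal standard such as RDF or OWL.

```python
# A taxonomy is a strict broader/narrower hierarchy: each term has a single parent.
taxonomy = {
    "vehicle": None,        # root term
    "car": "vehicle",
    "truck": "vehicle",
}

# An ontology adds typed relationships among concepts -- synonyms, part-whole
# links, and looser associations -- forming a web rather than a tree.
# (Terms and relation types here are invented for illustration.)
ontology = [
    ("car", "is_a", "vehicle"),
    ("automobile", "synonym_of", "car"),
    ("engine", "part_of", "car"),
    ("roadway", "related_to", "vehicle"),
    ("highway", "is_a", "roadway"),
]
```

Even at this toy scale, the ontology captures relationships (synonymy, part-whole, association) that a single-parent hierarchy cannot express, which is what lets a semantic engine connect a query about “automobiles” to content about “cars.”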
My first point, about a trend toward merely browsing and filtering content for relevance to our work, and the second, about assembling semantically relevant content for better search precision, are two sides of a business problem that hundreds of entrepreneurs are grappling with: semantic technologies.
Two weeks ago, I helped to moderate a meeting on the subject, entitled Semantic Web – Ripe for Commercialization? While the intended audience was a broad business group of VCs, financiers, and legal and business management professionals, it turned out to include a lot of technology types. They had some pretty heavy questions and comments about how a search engine handles inference and the methods it uses to extract meaning from content. Semantic search engines need to understand both the query and the target content in order to retrieve contextually relevant results.
Keynote speakers and some of the panelists introduced ontologies as an essential backbone of semantic search. From that came a lot of discussion about how and where these ontologies originate, who vets them for authoritativeness and how, and how they will be developed in under-funded subject areas. There were no clear answers.
Here I want to give a quick definition of ontology. An ontology is a concept map of terminology which, when richly populated, reflects all the semantic relationships that might be inferred from the different ways terms are assembled in human language. A subject-specific ontology is more easily understood in a graphical representation. Ontologies also help to inform semantic search engines by contributing to the automated deconstruction of a query (making sense of what the searcher wants to know) and the automated deconstruction of the content to be indexed and searched. Good semantic search, therefore, depends on excellent ontologies.
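To make that deconstruction idea concrete, here is a minimal sketch of how a search engine might use an ontology to expand a query with synonymous and related concepts before matching documents. The mini-ontology, documents, and matching logic below are invented for illustration; a production semantic engine is far more sophisticated.

```python
# Hypothetical mini-ontology: (term, relation, term) triples.
ONTOLOGY = [
    ("roadway", "synonym_of", "road"),
    ("highway", "is_a", "roadway"),
    ("street", "is_a", "roadway"),
    ("pavement", "part_of", "roadway"),
]

# A tiny, invented document collection to search against.
DOCUMENTS = {
    "doc1": "Highway maintenance schedules for the northern district.",
    "doc2": "Street lighting upgrades approved by the city council.",
    "doc3": "Quarterly budget review for the finance department.",
}

def expand_query(term: str) -> set[str]:
    """Add every concept the ontology links to the query term (synonyms, narrower terms, parts)."""
    expanded = {term}
    for subject, _relation, obj in ONTOLOGY:
        if subject == term:
            expanded.add(obj)
        if obj == term:
            expanded.add(subject)
    return expanded

def semantic_search(term: str) -> list[str]:
    """Return documents mentioning the query term or any ontology-related concept."""
    concepts = expand_query(term)
    return [
        doc_id for doc_id, text in DOCUMENTS.items()
        if any(concept in text.lower() for concept in concepts)
    ]

# A plain keyword match on "roadway" would return nothing from this collection;
# the expanded query also matches documents about highways and streets.
print(semantic_search("roadway"))  # ['doc1', 'doc2']
```

The point of the sketch is simply that the ontology, not the keyword, is what connects the searcher’s term to contextually relevant content.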
To see a very simple example of an ontology related to “roadway”, check out this image. Keep in mind that before you aspire to implement a semantic search engine in your enterprise, you want to be sure there is a trusted ontology somewhere in the mix of tools to help the search engine retrieve results relevant to your unique audience.