The Gilbane Advisor

Curated for content, computing, and digital experience professionals

Adobe Digital Editions

At O’Reilly’s Tools of Change (TOC) conference, Adobe made a very un-Adobe-like product announcement. Their new Digital Editions is very impressive! I’ve never been a huge PDF fan because it is so stubbornly page-centric in a world where pages are becoming much less important to the display of content. It has often been awkward and painful to read PDFs on computer screens and handheld devices.

This new technology is based on the IDPF EPUB standard, which has been developed as a universal distribution format for reflow-centric content. The dynamic layout capability is amazingly agile, reflowing content from large to small screens with excellent speed and seemingly minimal effort. Adobe is currently mum about whether it will be included in the iPhone launch.
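
To make “reflow-centric” concrete: an EPUB file is simply a ZIP container whose package manifest points to XHTML content documents, which a reading system lays out to fit whatever screen it is rendering to. Here is a minimal sketch in Python, assuming a local file named book.epub (a placeholder, not a real title):

```python
import zipfile
import xml.etree.ElementTree as ET

# Standard namespaces from the OCF container and OPF package specs
CONTAINER_NS = "urn:oasis:names:tc:opendocument:xmlns:container"
OPF_NS = "http://www.idpf.org/2007/opf"

def list_epub_content(path):
    # An EPUB is a ZIP archive; META-INF/container.xml points at the
    # OPF package file, whose manifest lists the content documents.
    with zipfile.ZipFile(path) as epub:
        container = ET.fromstring(epub.read("META-INF/container.xml"))
        rootfile = container.find(f".//{{{CONTAINER_NS}}}rootfile")
        opf = ET.fromstring(epub.read(rootfile.get("full-path")))
        for item in opf.iter(f"{{{OPF_NS}}}item"):
            # The XHTML items are the reflowable text; unlike PDF,
            # no fixed page geometry is baked into the file.
            print(item.get("href"), item.get("media-type"))

list_epub_content("book.epub")  # placeholder path
```

Because the content is ordinary markup plus CSS, a renderer like Digital Editions can reflow the same file from a large monitor to a handheld screen without any per-device conversion.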

Digital Editions has optional DRM capability and will support contextual advertising as well as subscription- and membership-based business models. It features the expected compatibility with PDF and InDesign CS3.

The functionality and openness to industry standards are a radical departure from many of Adobe’s traditional practices. Bill McCoy, General Manager of ePublishing Business, explains that the Macromedia acquisition played a major role in making this strategic transition possible. This is more evidence that the Macromedia deal was one of the better acquisitions in recent memory.

The relatively small download (under 3MB) can be found at: www.adobe.com/products/digitaleditions.

Results: Globalization and Brand Management Poll

The results are in, and for the most part they’re not surprising. Well, actually one is: a mere 35% of respondents indicated that their companies have a formal brand management team. The response to our second question, “Does the team include a localization or translation subject matter expert?” was a resounding 100% “No.” This, unfortunately, is the “not surprising” part. Although our N was smaller than we’d like, we expect the trend would have continued on the same course.

The fact is, most companies have work to do to ensure that the corporate brand flows through multi-geographical market segments in a way that’s both consistent and relevant to customers and prospects in specific cultures and locales. It’s not easy. According to the Economist Intelligence Unit, authors of Guarding the Brand, almost half of their respondents believed that expanding into new territories made brand management all the more difficult. The top two challenges? 63 percent cited cultural differences, and 44 percent cited language barriers and translation issues.

It’s sometimes “easier” to avoid dealing with the presence of some 4,000+ languages worldwide, but it’s not so easy to ignore when one investigates the facts in smaller “chunks,” so to speak. Consider this list of “The 50 Most Widely Spoken Languages” as a more easily digestible example.

If your company aims to expand its footprint and revenue generation in this “flat world,” globalization needs to be part of the brand management discussion. And if you are responsible for leading the charge into a new geographic region, you need to have a voice that’s heard.

Respect for Complexity and Security are Winners

I participated in one search vendor’s user conference this week and in a webinar sponsored by another. Both impressed me because they expressed values I respect in the content software industry, and both provided solid evidence that they have the technology and business delivery infrastructure to back up the rhetoric.

You have probably noted that my blog is slim on comments about specific products, and that will continue to be the norm. However, in addition to the general goodwill from Endeca customers that I experienced at Endeca Discover 2007, I heard clear messages in the sessions I attended that reinforced the company’s focus on helping clients solve complex content retrieval problems. Complexity is inherent in enterprises because of diversity among employees, methods of operating, technologies deployed, and varied approaches to meeting business demands at every level.

In presentations by Paul Sonderegger and Jason Purcell, care was taken to explain how Endeca builds its search technology solutions and why. At the core is a fundamental truth about how organizations and people function: you never know how a huge amount of unpredictably interconnected stuff will be approached. Endeca wants its users to be able to use information as levers, discovering through personalized methods the relationships among content pieces that will pry open new possibilities for understanding the content.

Years ago I was actively involved with a database model called an associative structural model. It was developed explicitly to store and manipulate huge amounts of database text, and it embodied features of hierarchical, networked, and relational data structures.

It worked well for complex, integrated databases because it allowed users to manipulate and mingle data groups in unlimited ways. Unfortunately, it required a user to be able to visualize the possibilities for combining and expressing data from hundreds of fields in numerous tables by using keys. This structural complexity could not easily be taught or learned, and tools for simple visualization were not available in the early 1980s. As I listened to Jason Purcell describe Endeca’s optimized record store and its concept of “intra-query” as a solution to the problems posed by double uncertainty, I thought, “They get it.” They have acknowledged the challenge of making vast knowledge stores simple to update, use, and exploit, and they are working hard to meet that challenge. Good for them! We all want the flexibility to work the way we want, but if it is not easy, we will not adopt.
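
Endeca’s engine is proprietary, so what follows is only a toy sketch of the guided navigation idea it popularized: over a store of attributed records, each refinement recomputes which further refinements still yield results, so users discover relationships instead of guessing at queries. The records and attribute names here are invented for illustration:

```python
from collections import Counter

# Toy record store: each record is a flat dict of attribute -> value.
records = [
    {"type": "report", "region": "EMEA", "year": 2006},
    {"type": "report", "region": "APAC", "year": 2007},
    {"type": "memo",   "region": "EMEA", "year": 2007},
]

def refine(records, selections):
    """Return the records matching every selected facet value, plus
    counts of the remaining values for each unselected facet -- the
    'guided' part: users only ever see refinements that yield results."""
    matches = [r for r in records
               if all(r.get(k) == v for k, v in selections.items())]
    counts = {}
    for r in matches:
        for facet, value in r.items():
            if facet not in selections:
                counts.setdefault(facet, Counter())[value] += 1
    return matches, counts

matches, counts = refine(records, {"year": 2007})
print(len(matches))   # 2
print(dict(counts))   # remaining 'type' and 'region' values with counts
```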

In a KMWorld webinar, Vivisimo’s Jerome Pesenti and customer Arnold Verstraten of NV Organon co-presented with Matt Brown of Forrester Research. The theme was search security models. Besides the reasons to consider security up front when accessing and retrieving from enterprise repositories, three basic control models were described, all based on access control lists (ACLs), along with how and why Vivisimo uses them.

Having worked with defense agencies, defense contractors, and corporations with very serious requirements governing who can access what, I am very familiar with the types of data structures and indexing methods that can be used. I was pleased to hear the speakers address trade-offs, including performance and deployment issues. It reminded me that organizations need to think about this early in the selection process; an inability to handle the most sensitive content appropriately should eliminate any enterprise search vendor that equivocates on security. Also, as Organon did, nothing demonstrates the quality of a solution like a “bake-off” against a sufficient corpus of content: one that shows whether documents and metadata that must not be viewed by some audiences are in fact always excluded from search results for everyone in those restricted audiences. Test it. Really test it!
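
The webinar’s three models are not reproduced here, but the simplest ACL-based control, query-time (“late binding”) result trimming, can be sketched as follows. The document names and groups are invented, and a real deployment would fetch ACLs from the source repositories rather than a hard-coded table:

```python
# Invented ACL table: document -> groups entitled to see it.
ACLS = {
    "doc-public.pdf":  {"everyone"},
    "doc-finance.pdf": {"finance", "executives"},
    "doc-hr.pdf":      {"hr"},
}

def trim_results(raw_hits, user_groups):
    """Drop any hit the user's groups are not entitled to see.
    A document with no ACL entry is excluded (fail closed)."""
    return [doc for doc in raw_hits
            if ACLS.get(doc, set()) & set(user_groups)]

hits = ["doc-public.pdf", "doc-finance.pdf", "doc-hr.pdf"]
print(trim_results(hits, {"everyone", "finance"}))
# -> ['doc-public.pdf', 'doc-finance.pdf']
```

A “bake-off” of the kind Organon ran is essentially a large-scale test that this trimming never leaks a restricted document to any member of a restricted audience.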

After Enterprise 2.0

The Enterprise 2.0 conference is winding down in Boston today, so it’s a good time to reflect on the “two dot oh” phenomenon. As industry watchers and marketers, we’ve come a long way since Tim O’Reilly coined the concept two years ago. With hordes of people crowded into the new Westin Hotel in South Boston and the exhibit hall packed like Grand Central at rush hour (I’ve yet to hear an attendance count), I felt a rush of excitement in the air. There’s a certain headiness in talking with entrepreneurs about their latest products and solutions. I was floored by the breadth of creativity.
But what struck me most is the vision thing: the gulf between the claims about Enterprise 2.0 and the realities of how work gets done. Enterprise 2.0 seems to be about blogging for a living, putting up a wiki, realizing that email is broken, and communicating with customers. Oh yes, then there’s unleashing the power of teams, user-generated content, and building communities. The list goes on . . .
Yet two words are missing: management and process. In our always-on world, we are inundated with information and constrained by the limits of the twenty-four-hour day. We need to take a hard look at how sharing information online creates new sources of value and better modes of organization. David Weinberger said it best in his opening keynote: everything is now metadata. We need to figure out how to harness this incredible openness at our fingertips. Developing a compelling information architecture is going to be even harder than developing an effective technical architecture.
What comes next? I’m now planning the collaboration and social computing track for our fall conference. (Stay tuned; we’ll be announcing the program later this summer.) I think we need to look at the hard issues of designing collaborative business processes. What do you think? I’m open to suggestions; let me know by responding to this post. I look forward to our continuing conversation.

Where is the “L” in Web 2.0?

I was only able to make it to the Enterprise 2.0 conference in Boston yesterday. You can still get a demo pass for today. But I was thrilled to hear analysts, researchers, case study presenters, and yes, even vendors drill down into one of my favorite phrases: “people, processes, and technology make it possible.” I hope the mantra continues today.

The point being, obviously, that 2.0 is not just about technology ;-). It’s about culture, bridging generation gaps, the evolution of *people* networking, and redefining community from the core of where community starts: humans.

What I didn’t hear, however, is the “L” word, specifically language, and that bothered me. We just can’t be naive enough to think that community, collaboration, and networking on a global scale are solely English-driven. We need to get the “L” word into the conversation.

My globalization practice colleague Kaija Poysti weighs in here.

More data on Facebook users and Enterprise 2.0

Here is a chart combining the data from the poll described yesterday, given to 500 Facebook users aged 25-34, with the results from the same poll given to 500 Facebook users aged 18-24. There is certainly a difference. But the most surprising results are the extremely low expectations for the use of blogs and wikis, and even social networking software. These findings, informal as they are, would make me very nervous if I were a start-up hoping to make it by capturing the Facebook generation as they stream into the workforce.
