
Results: Globalization and Brand Management Poll

The results are in — and they’re not surprising. Well, actually, one is. A mere 35% of respondents indicated that their companies have a formal brand management team. The response to our second question, “Does the team include a localization or translation subject matter expert?” was a resounding 100% “no.” This, unfortunately, is the not-surprising part. Although our N was smaller than we’d like, we expect the trend would have continued on the same course.

The fact is, most companies have work to do to ensure that the corporate brand flows through multi-geographical market segments in a way that’s both consistent and relevant to customers and prospects in specific cultures and locales. It’s not easy. According to the Economist Intelligence Unit, author of Guarding the Brand, almost half of its respondents believed expanding into new territories made brand management all the more difficult. The top two challenges? Cultural differences, cited by 63 percent, and language barriers and translation issues, cited by 44 percent.

It’s sometimes “easier” to avoid dealing with the 4,000-plus languages spoken worldwide, but they’re harder to ignore when one examines the facts in smaller “chunks,” so to speak. Consider this list of “The 50 Most Widely Spoken Languages” as a more easily digestible example.

If your company aims to expand its footprint and revenue generation in this “flat world,” globalization needs to be part of the brand management discussion. And if you are responsible for leading the charge into a new geographic region, you need a voice that’s heard.

Respect for Complexity and Security are Winners

I participated in one search vendor’s user conference this week, and in a webinar sponsored by another. Both impressed me because they expressed values that I respect in the content software industry, and both provided solid evidence that they have the technology and business delivery infrastructure to back up the rhetoric.

You have probably noticed that my blog is slim on comments about specific products, and that will continue to be the norm. However, in addition to the general goodwill from Endeca customers that I experienced at Endeca Discover 2007, I heard clear messages in the sessions I attended that reinforced the company’s focus on helping clients solve complex content retrieval problems. Complexity is inherent in enterprises because of diversity among employees, methods of operating, technologies deployed, and varied approaches to meeting business demands at every level.

In presentations by Paul Sonderegger and Jason Purcell, care was taken to explain Endeca’s approach to building its search technology solutions, and why. At the core is a fundamental truth about how organizations and people function: you never know how a huge amount of unpredictably interconnected stuff will be approached. Endeca wants its users to be able to use information as a lever to discover, through personalized methods, relationships among content pieces that will pry open new possibilities for understanding the content.

Years ago I was actively involved with a database model called an associative structural model. It was developed explicitly to store and manipulate huge amounts of database text, and it embodied features of hierarchical, networked, and relational data structures.

It worked well for complex, integrated databases because it allowed users to manipulate and mingle data groups in unlimited ways. Unfortunately, it required a user to visualize the possibilities for combining and expressing data from hundreds of fields in numerous tables by using keys. This structural complexity could not easily be taught or learned, and tools for simple visualization were not available in the early 1980s. As I listened to Jason Purcell describe Endeca’s optimized record store and its concept of “intra-query” as a solution to the problems posed by double uncertainty, I thought, “They get it.” They have acknowledged the challenge of making it simple to update, use, and exploit vast knowledge stores, and they are working hard to meet it. Good for them! We all want the flexibility to work the way we want, but if it is not easy, we will not adopt it.
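To make the guided-navigation idea concrete, here is a minimal sketch in Python. It is not Endeca’s implementation; the records, field names, and values are invented for illustration. It shows only the core loop: each refinement narrows the result set, and the facet counts are recomputed over whatever remains, so a user who doesn’t know in advance how the content is interconnected can still see which paths are open next.

```python
from collections import Counter

# Toy corpus: each record carries metadata fields (facets). All data invented.
RECORDS = [
    {"title": "Q3 sales report",   "region": "EMEA",   "type": "report", "year": 2007},
    {"title": "Berlin launch deck", "region": "EMEA",   "type": "slides", "year": 2007},
    {"title": "Q3 sales report",   "region": "APAC",   "type": "report", "year": 2006},
    {"title": "Brand guidelines",  "region": "Global", "type": "report", "year": 2007},
]

def refine(records, **filters):
    """Narrow the record set by the facet values chosen so far."""
    return [r for r in records if all(r.get(k) == v for k, v in filters.items())]

def facet_counts(records, facet):
    """Count the remaining values of a facet over the CURRENT result set.

    Showing these counts after every refinement is what turns a blind
    query into guided discovery: only still-possible choices appear.
    """
    return Counter(r[facet] for r in records if facet in r)

hits = refine(RECORDS, type="report")
print(facet_counts(hits, "region"))   # Counter({'EMEA': 1, 'APAC': 1, 'Global': 1})
hits = refine(hits, year=2007)
print([r["title"] for r in hits])     # ['Q3 sales report', 'Brand guidelines']
```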

In a KMWorld webinar, Vivisimo’s Jerome Pesenti and customer Arnold Verstraten of NV Organon co-presented with Matt Brown of Forrester Research. The theme was search security models. Besides covering the reasons to consider security up front when accessing and retrieving content from enterprise repositories, the speakers described three basic control models, all based on access control lists (ACLs), and explained how and why Vivisimo uses them.

Having worked with defense agencies, defense contractors, and corporations with very serious requirements about who can access what, I am very familiar with the types of data structures and indexing methods that can be used. I was pleased to hear the speakers address trade-offs, including performance and deployment issues. It reminded me that organizations need to think about security early in the selection process; an inability to handle the most sensitive content appropriately should eliminate any enterprise search vendor that equivocates on security. Also, as Organon did, nothing demonstrates the quality of a solution like a “bake-off” against a sufficiently large corpus of content — one that shows whether every document (and its metadata) that must not be viewed by certain audiences is in fact always excluded from those audiences’ search results. Test it. Really test it!
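For readers who want to see the principle, here is a deliberately simplified sketch of ACL-based result trimming. It is not Vivisimo’s code; the documents, group names, and the one-ACL-per-document model are assumptions for illustration. The point it makes is that the access check runs before a document, or even its title, can appear in anyone’s results, which is precisely what a bake-off should verify.

```python
# Toy documents, each indexed with an access control list (ACL). All invented.
DOCS = [
    {"id": 1, "title": "Public roadmap",      "acl": {"everyone"}},
    {"id": 2, "title": "M&A due diligence",   "acl": {"finance", "legal"}},
    {"id": 3, "title": "Clinical trial data", "acl": {"research"}},
]

def allowed(doc, user_groups):
    """True if the user shares at least one group with the document's ACL."""
    return bool(doc["acl"] & user_groups)

def search(query, user_groups):
    """Match, then trim: a document the user may not see never appears
    in the results, not even as a title or metadata snippet."""
    matches = (d for d in DOCS if query.lower() in d["title"].lower())
    return [d for d in matches if allowed(d, user_groups)]

print(search("data", {"everyone"}))              # [] -- hit exists, but trimmed
print(search("data", {"everyone", "research"}))  # clinical trial doc now visible
```

A real deployment also has to choose when the check happens: filtering at index time is fast but can serve stale permissions, while checking the source system at query time is current but slower. That performance trade-off is exactly the kind the speakers discussed.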

After Enterprise 2.0

The Enterprise 2.0 conference is winding down in Boston today, so it’s a good time to reflect on the “two dot oh” phenomenon. As industry watchers and marketers, we’ve come a long way since Tim O’Reilly coined the concept two years ago. With hordes of people crowded into the new Westin Hotel in South Boston and the exhibit hall packed like Grand Central at rush hour (I’ve yet to hear an attendance count), I felt a rush of excitement in the air. There’s a certain headiness in talking with entrepreneurs about their latest products and solutions. I was floored by the breadth of creativity.
But what struck me most is the vision thing – the gulf between the claims about Enterprise 2.0 and the realities of how work gets done. Enterprise 2.0 seems to be about blogging for a living, putting up a wiki, realizing that email is broken, and communicating with customers. Oh yes, and then there’s unleashing the power of teams, user-generated content, and building communities. The list goes on . . .
Yet two words are missing – management and process. In our always-on world, we are inundated with information and constrained by the limits of the twenty-four-hour day. We need to take a hard look at how sharing information online creates new sources of value and better modes of organization. David Weinberger said it best in his opening keynote: everything is now metadata. We need to figure out how to harness this incredible openness at our fingertips. Developing a compelling information architecture is going to be even harder than developing an effective technical architecture.
What comes next? I’m now planning the collaboration and social computing track for our fall conference. (Stay tuned; we’ll be announcing the program later this summer.) I think we need to take a look at the hard issues of designing collaborative business processes. What do you think? I’m open to suggestions; let me know by responding to this post. I look forward to our continuing conversation.

Where is the “L” in Web 2.0?

I was only able to make it to the Enterprise 2.0 conference in Boston yesterday. (You can still get a demo pass for today.) But I was thrilled to hear analysts, researchers, case study presenters, and yes, even vendors, drill down into one of my favorite phrases, “people, processes, and technology make it possible,” and I hope the mantra continues today.

The point being, obviously, that 2.0 is not just about technology ;-). It’s about culture, bridging generation gaps, the evolution of *people* networking, and redefining community from the core of where community starts: humans.

What I didn’t hear, however, is the “L” word — specifically language, and that bothered me. We just can’t be naive enough to think that community, collaboration, and networking on a global scale are solely English-driven. We need to get the “L” word into the conversation.

My globalization practice colleague Kaija Poysti weighs in here.

More data on Facebook users and Enterprise 2.0

Here is a chart combining the data from the poll described yesterday, from 500 Facebook users aged 25-34, with the results from the same poll given to 500 Facebook users aged 18-24. There is certainly a difference. But the most surprising results are the extremely low expectations for the use of blogs and wikis, and even social networking software. These findings, informal as they are, would make me very nervous if I were a start-up hoping to make it by capturing the Facebook generation as they stream into the workforce.

Facebook Generation on Enterprise 2.0 Collaboration Technologies

I joined Facebook a few days ago to check it out and to get an idea of the approach’s relevance to enterprise applications. I need to use it some more before I reach any conclusions, but since I am at the Enterprise 2.0 Collaborative Technologies conference this week, I decided to use the new Facebook poll feature to see what the Facebook crowd thinks about collaboration as they enter the workplace. The poll feature is limited (one multiple-choice question), but it provides direct access to the tens of millions of Facebook users, and you can choose from a couple of demographic options. You also get the results very quickly — in my first poll I received 500 responses in about nine hours!

I will blog about the results in more detail later, including all the graphs, but in the meantime, since everyone I mentioned the poll to at the conference has wanted the results, here are the basics:

Question: Which collaboration technologies will you use the most in your job in two years?

  • SMS text messaging 6% (30)
  • email will continue to dominate 66% (328)
  • instant messaging 16% (53)
  • facebook-like social networking tools for business 11% (53)
  • blogs and/or wikis 2% (8)

Keep in mind that these 500 responses all came from the 25-34 age group, who are presumably mostly in the workforce. I have just started the same poll with the 18-24 age group and will provide those results for comparison later tonight. UPDATE: The combined poll results are now available.

Between the two age groups we will have some information straight from the generation that Don Tapscott, Andrew McAfee, and others are making predictions about (we refer to some of this here). This is, of course, a very informal poll, but interesting nonetheless. I wish I had had the results in time to provide them to Andrew and Tom Davenport for yesterday’s debate!
