Webinar: Corporate Marketing as a Publishing Business

October 29, 11:00 am ET

Attracting, converting, and retaining customers is the mission of every corporate marketing organization. Content is obviously central to executing the mission. The key to success, though, isn’t just delivering content on websites — it’s leveraging content to wring out its maximum value for the business and the customer.

Leading publishers have deep expertise in solving the knottiest problems associated with leveraging content. How can corporate marketers put a publisher’s knowledge and experience to work in their own domain? We discuss the issues and trends with Diane Burley, Industry Specialist at Nstein, in a lively online conversation. Attend Everyone is a Publisher: No Matter What Industry You’re In, and gain insights into solutions that top media companies have put into practice to survive the digital economy. Topics include:

  • Engaging customers with content, and metrics to gauge performance.
  • Managing corporate marketing and brand content from multiple sources.
  • Streamlining web content workflows.
  • Creating demographic-specific microsites.

Registration is open. Sponsored by Nstein.

Coming soon: a Gilbane Beacon on publishing as every organization’s second business.

Getting started on WCM…

You may have heard that I’m the new guy in town, and I’m happy to say this is my first blog post as a member of the Gilbane Group. I am thrilled to be a part of such a well-respected organization, and I’m ready to roll up my sleeves and get to work on all things WCM!

A little about me: I’ve been a practitioner and a consultant in the WCM space for over ten years, but I’ve worked for an analyst firm for all of two days. The good news? I know firsthand the pains users experience when it comes to web content management. I empathize with the marketer who knows there must be a way to put all this content to work in her next pull-through campaign, and I sympathize with the Intranet Manager who has been directed to deploy more Web 2.0 tools into the enterprise, even in the absence of a business case. [I’m not a Web 2.0-basher, by the way.] I consider myself a passionate user advocate, and if I’m true to myself (and to you), I’ll continue to bring that perspective to all of my work here at Gilbane.

To continue my let-me-tell-you-about-me schtick, here are a few random thoughts that will hopefully provide further insight into my philosophy as it relates to WCM:

  • Usability has become a commodity; it’s time for vendors to stop bragging about it and for users to stop accepting anything less.
  • Technology for the sake of technology leads to dissatisfaction every time.
  • “What problem am I trying to solve?” — If you can’t answer this, stop what you’re doing.
  • Technology won’t change human nature…but it will amplify it!
  • You don’t have to do what everyone else is doing…there’s a good chance they’ll fail anyway.
  • “Grassroots” applications require more planning, not less.
  • User research is never a bad idea… but don’t just ask them, watch them.

And finally,

  • If we spent as much time crafting strategies as writing RFPs and selecting tools, we’d achieve a much higher ROI.

So that’s it for now. I look forward to writing more on these pages and hope you’ll chime in with your thoughts and reactions.

 

Follow me on Twitter

The SharePoint Back End: What Are the Headaches? What Are the Benefits?

As I pointed out in my first post (SharePoint: Without the Headaches – A Discussion of What Is Available in the Cloud), you don’t necessarily need to host SharePoint in your own organization. Although I believe that most businesses should focus on leveraging the front end of SharePoint to its full extent, it is important for non-technical users to understand what hosting SharePoint involves and why one might want to do so. Therefore, this post discusses what it takes to host SharePoint and the driving factors for doing so.

 

Microsoft’s original intent was to build a tool that non-technical users could easily leverage. Microsoft thought of this as the natural extension of Office to the web[1]. That being said, the complexity got away from Microsoft, and leveraging a number of features now requires access to the back end.

Before delving into the SharePoint back end, let me point out that many businesses hire SharePoint development staff, both permanent and on a consulting basis. I think that developing custom SharePoint code should be done only after thoroughly justifying the expense. It is often a mistake. Instead, organizations should clearly define their requirements and then leverage a high-quality third-party add-on. I will mention some of these at the end of the post.

SharePoint is a fragile product, and therefore custom code for SharePoint is very expensive to develop, test, and deploy. Furthermore, custom code often needs to be rewritten when migrating to the next release of SharePoint. Finally, SharePoint is a rapidly evolving product, and chances are good that custom code will soon be made obsolete by new features in the next generation.

In my first post, I pointed out that inexpensive SharePoint hosting options are available in the cloud. These options tend to be limited. For example, the inexpensive rentals do not provide much security, only provide WSS (not MOSS), and do not allow one to add third-party add-ons. It is possible to lease custom environments that don’t suffer from any of these limitations, but they come at a cost (typically starting at $500 per month[2]). I believe that robust MOSS offerings with third-party add-ons will be available at competitive prices within two years.


[1] SharePoint is developed by the Office division.

[2] For example, FPWeb offers a SharePoint hosted environment with the CorasWorks Workplace Suite included starting at $495 per month.


Google Wave Protocols: Clearing the Confusion

Today is the long-awaited day when 100,000 lucky individuals receive access to an early, but working, version of Google Wave. I hope I am among them! Like many people, I have been reading about Wave but have not been able to experience it hands-on.

 

Wave has been a hot topic since it was first shown outside of Google last May. Yet it continues to be quite misunderstood, most likely because it is such an early-stage effort and most interested people have not been able to get their hands on the technology. For that very reason, Gilbane Group is presenting a panel entitled Google Wave: Collaboration Revolution or Confusion? at the Gilbane Boston conference on December 3rd.

The confusion surrounding Wave was highlighted for me yesterday in a Twitter exchange on the topic. It all started innocently enough, when Andy McAfee asked:

[Screenshot of Andy McAfee’s tweet]

To which I replied:

[Screenshot of my reply]

That statement elicited the following comment from Jevon MacDonald of the Dachis Group:

[Screenshot of Jevon MacDonald’s tweet]

I am not a technologist. I seek to understand technology well enough that I can explain it in layman’s terms to business people, so they understand how technology can help them achieve their business goals. So I generally avoid getting into deep technical discussions. This time, however, I was pretty sure that I was on solid ground, so the conversation between me and Jevon continued:

[Screenshots: two replies from me, followed by Jevon’s response]

Now, here we are, at the promised blog post. But, how can Jevon and I both be correct? Simple. Google Wave encompasses not one, but several protocols for communication between system components, as illustrated in the figure below.

Figure 1: Google Wave Protocols (Source: J. Aaron Farr)

The most discussed of these is the Google Wave Federation Protocol, which is an extension of the Extensible Messaging and Presence Protocol (XMPP). However, Wave also requires protocols for client-server communication and for communication between robot servers (Web services) and Wave servers. It is also possible, but probably not desirable, for Wave to use a client-client protocol.

Jevon was absolutely correct about the XMPP protocol enabling server-server communication in the Google Wave Federation Protocol. The Draft Protocol Specification for the Google Wave Federation Protocol lays out the technical details, which I will not explore here. XMPP provides a reliable mechanism for server-server communication and is a logical choice for that function in Google Wave, because XMPP was originally designed to transmit instant messaging and presence data.
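
To make the server-to-server role more concrete, here is a minimal sketch of the kind of XMPP message stanza one Wave server might route to a peer. It is illustrative only: the domains, wavelet identifier, and payload element names are hypothetical and are not taken from the draft federation specification. The appeal of XMPP here is that domain-to-domain routing, authentication, and delivery are already solved problems at the transport layer.

    # Illustrative sketch only: an XMPP-style <message> stanza carrying a
    # hypothetical wavelet update from one Wave server to another. The payload
    # elements are invented and do not reproduce the actual Google Wave
    # Federation Protocol stanza format.
    import xml.etree.ElementTree as ET

    stanza = ET.Element("message", {
        "from": "wave.example.org",  # sending Wave server (hypothetical domain)
        "to": "wave.example.net",    # receiving Wave server (hypothetical domain)
        "type": "normal",
    })
    update = ET.SubElement(stanza, "wavelet-update",               # hypothetical element
                           {"wavelet-name": "example.net/w+abc123/conv+root"})
    delta = ET.SubElement(update, "applied-delta")                 # hypothetical element
    delta.text = "...base64-encoded operations would go here..."

    print(ET.tostring(stanza, encoding="unicode"))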

It turns out that the Google Wave team has not defined a specific protocol to be used in client-server communication. A Google whitepaper entitled Google Wave Data Model and Client-Server Protocol does not mention a specific protocol. The absence of a required or recommended protocol is also confirmed by this blog post. While the Google implementation of Wave does employ HTTP as the client-server protocol, as Jevon stated, it is possible to use XMPP as the basis for client-server communication, as I maintained. ProcessOne demonstrates this use of XMPP in this blog post and demo.
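
Since the client-server protocol is left open, here is a rough sketch, under stated assumptions, of what submitting a change over HTTP could look like from a client. The endpoint URL, identifiers, and JSON fields below are purely hypothetical; they only illustrate the request-response style that an HTTP-based client-server channel implies.

    # Hypothetical sketch of a client submitting a wavelet delta over HTTP.
    # The endpoint, identifiers, and field names are invented for illustration;
    # they do not come from any published Google Wave client-server protocol.
    import json
    import urllib.request

    delta = {
        "waveId": "example.com!w+abc123",        # hypothetical identifiers
        "waveletId": "example.com!conv+root",
        "operations": [{"type": "INSERT_TEXT", "position": 0, "text": "Hello, Wave"}],
    }

    request = urllib.request.Request(
        "https://wave.example.com/client/submit-delta",   # hypothetical endpoint
        data=json.dumps(delta).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # urllib.request.urlopen(request) would send the delta if such an endpoint existed.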

Finally, there is no technical reason that XMPP could not be used to route communications directly from one client to another. However, it would not be desirable to communicate between more than two clients via XMPP. Without a server somewhere in the implementation, Wave would be unable to coordinate message state between multiple clients. In plain English, the Wave clients most likely would not be synchronized, so each would display a different point in the conversation encapsulated in the Wave.

To summarize, Google Wave employs the following protocols:

  • XMPP for server-server communication
  • HTTP for client-server communication in the current Google implementation; XMPP is possible, as demonstrated by ProcessOne
  • HTTP (JSON-RPC) for robot server-to-Wave server communication in the current Google implementation; a sketch of this style of call follows the list
  • A client-client protocol is not defined, as this mode of communication is most likely not useful within a Wave
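
To give a sense of the robot case, here is the sketch promised above: a hypothetical JSON-RPC-style request that a Wave server could POST to a robot’s Web service endpoint. The method name, parameters, and URL are invented to convey the shape of the exchange and are not taken from Google’s robot API.

    # Hypothetical sketch of a JSON-RPC-style call from a Wave server to a robot.
    # Method name, parameters, and endpoint are invented for illustration and are
    # not taken from Google's robot protocol documentation.
    import json

    rpc_request = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "wavelet.appendBlip",          # hypothetical method
        "params": {
            "waveId": "example.com!w+abc123",    # hypothetical identifiers
            "waveletId": "example.com!conv+root",
            "content": "Hello from a robot",
        },
    }

    body = json.dumps(rpc_request).encode("utf-8")
    # The Wave server would POST `body` to the robot's endpoint, e.g.
    # https://robot.example.com/_wave/rpc (hypothetical), and the robot would
    # return a JSON-RPC response describing the operations it wants applied.
    print(body.decode("utf-8"))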

I hope this post clarifies the protocols used in the current architecture of Google Wave for you. More importantly, I hope that it highlights just how much additional architectural definition needs to take place before Wave is ready for use by the masses. If I had a second chance to address Andy McAfee’s question, I would unequivocally state that Google Wave is a "concept car" at this point in time.

Postscript: The aforementioned possibilities around XMPP as a client-client protocol are truly revolutionary. Using XMPP, instead of HTTP, as the primary communication protocol for the Internet would create a next-generation Internet in which centralized servers no longer serve as intermediaries between users. Web application architectures, even business models, would change. See this post for a more detailed explanation of this vision, which requires each user to run a personal server on their computing device.

Delivering a Global Customer Experience: An Interview with Jonckers Translation & Engineering

Third in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

We spoke with Kelli Kohout, global marketing manager for Jonckers Translation & Engineering.  Jonckers is a global provider of localization, translation, and multilingual testing services, with operations across the U.S., Europe, and Asia. Kelli talked with us about Jonckers’ role in the global content value chain, why they supported the research, and what she found compelling about the results.

Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?) 

Kohout: Ultimately, Jonckers is helping clients develop content that earns understanding, adoption and loyalty from global customers.

Sometimes clients come to us with original content that will not localize well, meaning it is not easy to turn into localized versions that achieve the desired response from audiences. We provide best practices for improving the quality of their source content and ask additional questions about their organizations’ goals for their global clients in order to improve the success of global adoption. In doing so, we demonstrate Jonckers’ philosophy that the resulting translations can even improve on the source, thanks to in-country translators who bring longevity, institutional knowledge, up-to-date cultural knowledge, and commitment. We also help clients save time and money by delivering content that is flexible enough to be used for more than one purpose.

Gilbane: Why did you choose to sponsor the Gilbane research? 

Kohout: Our clients no longer compete solely on the basis of a better product or service – it’s about customer experience.  And in today’s economic environment, our clients are struggling with how to generate revenue by increasing innovation and global reach, which means increasing the amount and accessibility of multilingual content.  Simultaneously, they need to decrease expenses, like the costs associated with providing customer service.

This all points to the increasing need to localize effectively and efficiently.  Jonckers sponsored this study for the common good – the more we share trends, best practices and lessons learned, and the more we know what challenges our clients are facing, the more effective and valued localization services will be.

We also hope this study will raise awareness of some important localization best practices that will make companies more successful.  For instance, we see clients beginning to realize the importance of involving localization planning early in the product development lifecycle, but there’s still room for improvement there.  When localization is an afterthought, the outcome is not as good, there are extra costs, and bigger picture timelines can be adversely affected.

Similarly, more clients are recognizing the value of integrating the localization effort more closely with other functions.  As the study points out, there are more cross-functional champions within organizations who understand the big picture and have the mindshare with executives.  These champions can advocate for the needs of the localization function and help demonstrate its value.

Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?

Kohout: We’re seeing an increase in our clients’ global business objectives, but the study confirms that, on the whole, we’re still in the early stages of understanding the global content value chain. For example, one of the top corporate objectives related to localization is customer satisfaction, which is important, but few companies are fully utilizing localization to manage their brand globally. So there’s still room to evolve. In addition, there’s a focus on generating revenues from emerging markets, but very few have yet tapped the full potential of established geographies.

For insights into customer experience as a new basis for competitive advantage, see “Content Utility as the Value Proposition” on page 15 of the report.  You can also learn how Jonckers contributed to Adobe’s effort to build a globalization infrastructure that improves customer satisfaction, raises quality, and saves costs.  Download the study for free.

Gilbane Group Appoints Bill Trippe VP, Content Strategies

For Immediate Release:

Cambridge, MA, September 29, 2009. Gilbane Group Inc. today announced that Bill Trippe has been promoted to Vice President & Lead Analyst, Content Strategies. In his new role at Gilbane Group, Bill will be a core part of the management team and will focus on continuing to grow Gilbane Group’s strategy consulting and advisory business.

Trippe was previously Lead Analyst for Gilbane’s XML Technologies and Content Strategies Consulting Practice, where he led efforts helping businesses, publishers, and government agencies build successful strategies, especially for large and complex content management and publishing requirements. His new role reflects his success and the need to grow the management team to accommodate the growth in consulting business.

"Bill and I have worked together in a variety of capacities for many years, and I’m thrilled that we’ll be working together even more closely," said Frank Gilbane, CEO of the Gilbane Group. "Bill’s expertise and experience, combined with his strong interpersonal skills, keep him in high demand from both customers and colleagues."

"Clearly articulated content strategies are essential to getting business case funding in today’s economic climate," comments Mary Laplante, VP Client Services, Programs and Consulting. "Bill’s new role is a response to growing demand by users and buyers for help with developing sustainable content strategies that deliver measurable value."

"I am excited to be taking on this new role at Gilbane. The content management landscape continues to be dynamic and compelling, and I look forward to helping our clients leverage technology for productivity, new product development, and overall growth and success," said Bill Trippe, VP & Lead Analyst, Content Strategies.

Tweet this: Gilbane Group Appoints Bill Trippe VP, Content Strategies http://bit.ly/1Ju6mM #gilbane

About Gilbane Group, Inc.
Gilbane Group Inc. is an analyst and consulting firm that has been writing and consulting about the strategic use of information technologies since 1987. We have helped organizations of all sizes from a wide variety of industries and governments. We work with the entire community of stakeholders, including investors, enterprise buyers of IT, technology suppliers, and other consulting and analyst firms. We have organized over 60 educational conferences in North America and Europe. Our next event is Gilbane Boston, December 1-3, 2009: http://gilbaneboston.com/. Information about our widely read newsletter, reports, white papers, case studies, and analyst blogs is available at http://gilbane.com.

Follow Gilbane Group on Twitter or Facebook.

Contact:
Gilbane Group, Inc.
Ralph Marto, 617-497-9443 ext 117
ralph@gilbane.com

New Best Practice Profiles published

See our new Best Practice Profile Series. They are free to download. The first three profiles include:

Unifying the Global Content Value Chain: An Interview with Lasselle Ramsay

Second in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

We spoke with Joan Lasselle, President of Lasselle Ramsay. Lasselle Ramsay is a service provider that designs solutions for content and learning that align how users work with the information needed to achieve business results. We talked with Joan about her company, why they supported the research, and what surprised her about the results.

Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?) 

Lasselle: Lasselle Ramsay is a professional service provider, not a reseller or technology integrator. We focus on helping companies develop new product content. Our work spans the value chain, ranging from engineering (at the point of origin), to technical marketing and technical documentation, to learning organizations and support teams. We also look at the extended value chain, which includes partners, suppliers (like translation service providers), and customers.

We encourage our clients to operate in both the strategic and tactical domains, providing them with a strategic vision, and helping implement an infrastructure that can deliver structured and unstructured multilingual content.

Gilbane: Why did you choose to sponsor the Gilbane research?

Lasselle: One of our goals as a service provider is to add value at each stage across the chain. This research study enables us to discover and share the experience and perspective of industry leaders with Lasselle Ramsay clients. We chose this particular study because of the in-depth research, as well as Gilbane’s domain expertise and independence.

Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?

Lasselle: Gilbane’s report sheds light on two key issues that our clients face: the need to address content within the context of larger business trends [referred to as megatrends in the study], and the importance of process improvements. First, companies today are repeatedly challenged to address adverse economic pressures at the same time as they respond to the megatrends, such as the evolving basis of competitive advantage. The report makes clear that companies must take measures to address these megatrends in their content practices or risk being left behind. Even in the face of negative economics and an endless, escalating flood of new data, they cannot sit back and wait. Second, the report illustrates how organizations can benefit from improving cross-functional processes. In many companies, for example, engineering and tech pubs each have their own authoring, content management, translation, and publishing tools, and neither group shares processes or tools with the other. What a lost opportunity! Just think of how much they could lower costs and speed time to market if they coordinated processes and collaborated on process improvements.

For insights into the megatrends that are shaping content globalization practices, see “Market Context” on page 9 of the report. You can also read about how Lasselle Ramsay contributed to global content value chain development at Hewlett-Packard. Download the study for free.