Curated for content, computing, and digital experience professionals

Month: September 2009

The SharePoint Back End: What Are the Headaches, What Are the Benefits?

As I pointed out in my first post (SharePoint: Without the Headaches – A Discussion of What is Available in the Cloud), you don’t necessarily need to host SharePoint in your own organization. Although I believe that most businesses should focus on leveraging the front end of SharePoint to its full extent, it is important for non-technical users to understand what it takes to host SharePoint and why one might want to do so. Therefore, this post discusses what hosting SharePoint involves and the driving factors behind the decision.


Microsoft’s original intent was to build a tool that non-technical users could easily leverage. Microsoft thought of this as the natural extension of Office to the web[1]. That being said, the complexities got away from Microsoft, and in order to leverage a number of features one needs access to the back end.

Before delving into the SharePoint back end, let me point out that many businesses hire SharePoint development staff, both permanent and on a consulting basis. I think that developing custom SharePoint code should be done only after thoroughly justifying the expense.  It is often a mistake.  Instead, organizations should clearly define their requirements and then leverage a high quality third party add-on.  I will mention some of these at the end of the post.

SharePoint is a fragile product, and therefore custom code for SharePoint is very expensive to develop, test, and deploy. Furthermore, custom code often needs to be rewritten when migrating to the next release of SharePoint. Finally, SharePoint is a rapidly growing product, and chances are good that custom code will soon be made obsolete by new features in the next generation.

In my first post, I pointed out that inexpensive SharePoint hosting options are available in the cloud. These options tend to be limited. For example, the inexpensive rentals do not provide much security, only provide WSS (not MOSS), and do not allow one to add third party add-ins. It is possible to lease custom environments that are not subject to these limitations, but they come at a cost (typically starting at $500 per month[2]). I believe that robust MOSS offerings with third party add-ons will be available at competitive prices within two years.

——————————————————————————–

[1] SharePoint is developed by the Office division.

[2] For example, FPWeb offers a SharePoint hosted environment with the CorasWorks Workplace Suite included starting at $495 per month.


Google Wave Protocols: Clearing the Confusion

Today is the long-awaited day when 100,000 lucky individuals receive access to an early, but working, version of Google Wave. I hope I am in those ranks! Like many people, I have been reading about Wave, but have not been able to experience it hands-on.

Wave has been a hot topic since it was first shown outside of Google last May. Yet it continues to be quite misunderstood, most likely because it is such an early stage effort and most interested people have not been able to lay hands on the technology. For that very reason, Gilbane Group is presenting a panel entitled Google Wave: Collaboration Revolution or Confusion? at the Gilbane Boston conference, on December 3rd.

The confusion surrounding Wave was highlighted for me yesterday in a Twitter exchange on the topic. It all started innocently enough, when Andy McAfee asked:

[Tweet: Andy McAfee’s question]

To which I replied:

[Tweet: the author’s reply]

That statement elicited the following comment from Jevon MacDonald of the Dachis Group:

[Tweet: Jevon MacDonald’s comment]

I am not a technologist. I seek to understand technology well enough that I can explain it in layman’s terms to business people, so they understand how technology can help them achieve their business goals. So I generally avoid getting into deep technical discussions. This time, however, I was pretty sure that I was on solid ground, so the conversation between me and Jevon continued:

[Tweet: the author’s follow-up]

[Tweet: the author’s follow-up, continued]

[Tweet: Jevon MacDonald’s reply]

Now, here we are, at the promised blog post. But, how can Jevon and I both be correct? Simple. Google Wave encompasses not one, but several protocols for communication between system components, as illustrated in the figure below.

[Figure: diagram of Google Wave protocols]

Figure 1: Google Wave Protocols (Source: J. Aaron Farr)

The most discussed of these is the Google Wave Federation protocol, which is an extension of the Extensible Messaging and Presence Protocol (XMPP). However, Wave also requires protocols for client-server communication and for robot server (Web service) to Wave server communication. It is also possible, but probably not desirable, for Wave to use a client-client protocol.

Jevon was absolutely correct about the XMPP protocol enabling server-server communication in the Google Wave Federation Protocol. The Draft Protocol Specification for the Google Wave Federation Protocol lays out the technical details, which I will not explore here. XMPP provides a reliable mechanism for server-server communication and is a logical choice for that function in Google Wave, because XMPP was originally designed to transmit instant message and presence data.

It turns out that the Google Wave team has not defined a specific protocol to be used in client-server communication. A Google whitepaper entitled Google Wave Data Model and Client-Server Protocol does not mention a specific protocol. The absence of a required or recommended protocol is also confirmed by this blog post. While the Google implementation of Wave does employ HTTP as the client-server protocol, as Jevon stated, it is possible to use XMPP as the basis for client-server communication, as I maintained. ProcessOne demonstrates this use of XMPP in this blog post and demo.

Finally, there is no technical reason that XMPP could not be used to route communications directly from one client to another. However, it would not be desirable to communicate between more than two clients via XMPP. Without a server somewhere in the implementation, Wave would be unable to coordinate message state between multiple clients. In plain English, the Wave clients most likely would not be synchronized, so each would display a different point in the conversation encapsulated in the Wave.

To summarize, Google Wave employs the following protocols:

  • XMPP for server-server communication
  • HTTP for client-server communication in the current Google implementation; XMPP is possible, as demonstrated by ProcessOne
  • HTTP (JSON RPC) for robot server-Wave server communication in the current Google implementation
  • Client-client protocol is not defined, as this mode of communication is most likely not usable in a Wave
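To make the server-server leg of this summary concrete, here is a minimal Python sketch of what a federation message might look like: an XMPP message stanza carrying a base64-encoded wavelet delta from one Wave server to another. The element names, namespace, server addresses, and wavelet identifier below are illustrative placeholders, not the exact wire format defined in the draft specification.

```python
import base64
import xml.etree.ElementTree as ET

def build_federation_stanza(from_server, to_server, wavelet_id, delta_bytes):
    """Sketch of an XMPP <message> stanza carrying a wavelet delta
    between two Wave servers. Names are illustrative, not the exact
    wire format from the Draft Protocol Specification."""
    message = ET.Element("message", {
        "from": from_server, "to": to_server, "type": "normal"})
    event = ET.SubElement(message, "event", {
        "xmlns": "http://jabber.org/protocol/pubsub#event"})
    items = ET.SubElement(event, "items")
    item = ET.SubElement(items, "item")
    update = ET.SubElement(item, "wavelet-update", {
        "wavelet-name": wavelet_id})
    delta = ET.SubElement(update, "applied-delta")
    # Deltas are binary in the draft spec (protocol buffers), so they
    # must be text-encoded to travel inside an XML stanza.
    delta.text = base64.b64encode(delta_bytes).decode("ascii")
    return ET.tostring(message, encoding="unicode")

stanza = build_federation_stanza(
    "wave.initech.example",            # hypothetical sending server
    "wave.acme.example",               # hypothetical receiving server
    "wave://acme.example/w+abc123/conv+root",
    b"\x08\x01")                       # stand-in for a serialized delta
print(stanza)
```

The point of the sketch is simply that federation rides on ordinary XMPP routing: each Wave server is an XMPP endpoint, and wavelet updates are payloads inside standard stanzas, which is why XMPP’s existing server-to-server reliability features apply.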

I hope this post clarifies the protocols used in the current architecture of Google Wave for you. More importantly, I hope that it highlights just how much additional architectural definition needs to take place before Wave is ready for use by the masses. If I had a second chance to address Andy McAfee’s question, I would unequivocally state that Google Wave is a “concept car” at this point in time.

Postscript: The possibilities mentioned above for XMPP as a client-client protocol are truly revolutionary. The use of XMPP, rather than HTTP, as the primary communication protocol for the Internet would create a next-generation Internet in which centralized servers no longer serve as intermediaries between users. Web application architectures, and even business models, would change. See this post for a more detailed explanation of this vision, which requires each user to run a personal server on their computing device.

Competition among Search Vendors

Is there any real competition when it comes to enterprise search? Articles like this one in ComputerWorld make good points but also foster the idea that this could be a differentiator for buyers: Yahoo deal puts IBM, Microsoft in enterprise search pickle, by Juan Carlos Perez, August 4, 2009.

I wrote about the IBM launch of the OmniFind suite of search products a couple of years ago with positive comments. The reality ended up being quite different as I noted later. Among the negatives were three that stand out in my mind. First, free (as in the IBM OmniFind Yahoo no-charge edition) is rarely attractive to serious enterprises looking for a well-supported product. Second, the substantial computing overhead for the free product was significant enough that some SMBs I know of were turned off; the costs associated with the hardware and support it would require offset “free.” Third, my understanding that the search architecture for the free product would provide seamless upgrades to IBM’s other OmniFind products was wrong. Each subsequent product adoption would require the same “rip and replace” that Steve Arnold describes in his report, Beyond Search. It is hard to believe that IBM got much traction out of this offering from the enterprise search market at large. Does anyone know if there was really any head-to-head competition between IBM and other search vendors over this product?

On the other hand, does the Microsoft Express Search offering appeal to enterprises other than the traditional Microsoft shop? If Microsoft Express Search went away, it would probably be replaced by some other Microsoft search variation, inconveniencing customers, who would need to rip and replace and would be left on their own to grumble and gripe. What else is new? The same thing would happen with IBM Yahoo OmniFind users, and they would adapt.

I’ve noticed that free and cheap products may become heavily entrenched in the marketplace, but not among organizations likely to upgrade any time soon. Once enterprises get immersed in a complex implementation (and search done well does require that), they won’t budge for a long, long time, even if the solution is less than optimal. By the time they are compelled to upgrade, they are usually so wedded to their vendor that they will accept any reasonable upgrade offer the vendor makes. Seeking competitive options is really difficult for most enterprises to pursue without an overwhelmingly compelling reason.

This additional news item indicates that Microsoft is still trying to get their search strategy straightened out with another new acquisition, Applied Discovery Selects Microsoft FAST for Advanced E-Discovery Document Search. E-discovery is a hot market in legal, life sciences and financial verticals but firms like ISYS, Recommind, Temis, and ZyLab are already doing well in that arena. It will take a lot of effort to displace those leaders, even if Microsoft is the contender. Enterprises are looking for point solutions to business problems, not just large vendors with a boatload of poorly differentiated products. There is plenty of opportunity for specialized vendors without going toe-to-toe with the big folks.

Delivering a Global Customer Experience: An Interview with Jonckers Translation & Engineering

Third in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

We spoke with Kelli Kohout, global marketing manager for Jonckers Translation & Engineering.  Jonckers is a global provider of localization, translation, and multilingual testing services, with operations across the U.S., Europe, and Asia. Kelli talked with us about Jonckers’ role in the global content value chain, why they supported the research, and what she found compelling about the results.

Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?) 

Kohout: Ultimately, Jonckers is helping clients develop content that earns understanding, adoption and loyalty from global customers.

Sometimes clients come to us with original content that will not localize well – in other words, that is not easy to turn into localized versions that achieve the desired response from audiences.  We provide best practices for improving the quality of their source content, asking additional questions regarding their organizations’ goals for their global clients, in order to improve the success of global adoption.  In doing so, we prove Jonckers’ philosophy that resulting translations can even improve on the source (in-country translators with longevity, institutional knowledge, up-to-date cultural knowledge, commitment).  We also help clients save time and money by delivering content that is flexible enough to be used for more than one purpose.

Gilbane: Why did you choose to sponsor the Gilbane research? 

Kohout: Our clients no longer compete solely on the basis of a better product or service – it’s about customer experience.  And in today’s economic environment, our clients are struggling with how to generate revenue by increasing innovation and global reach, which means increasing the amount and accessibility of multilingual content.  Simultaneously, they need to decrease expenses, like the costs associated with providing customer service.

This all points to the increasing need to localize effectively and efficiently.  Jonckers sponsored this study for the common good – the more we share trends, best practices and lessons learned, and the more we know what challenges our clients are facing, the more effective and valued localization services will be.

We also hope this study will raise awareness of some important localization best practices that will make companies more successful.  For instance, we see clients beginning to realize the importance of involving localization planning early in the product development lifecycle, but there’s still room for improvement there.  When localization is an afterthought, the outcome is not as good, there are extra costs, and bigger picture timelines can be adversely affected.

Similarly, more clients are recognizing the value of integrating the localization effort more closely with other functions.  As the study points out, there are more cross-functional champions within organizations who understand the big picture and have the mindshare with executives.  These champions can advocate for the needs of the localization function and help demonstrate its value.

Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?

Kohout: We’re seeing an increase in our clients’ global business objectives, but the study confirms that – on the whole – we’re still in the early stages of understanding the global content value chain.  For example, one of the top corporate objectives related to localization is customer satisfaction, which is important, but few are fully utilizing localization to manage their brand globally.  So there’s still room to evolve.  In addition, there’s a focus on generating revenues from emerging markets, but very few have yet tapped the potential from established geographies.

For insights into customer experience as a new basis for competitive advantage, see “Content Utility as the Value Proposition” on page 15 of the report.  You can also learn how Jonckers contributed to Adobe’s effort to build a globalization infrastructure that improves customer satisfaction, raises quality, and saves costs.  Download the study for free.

Gilbane Group Appoints Bill Trippe VP, Content Strategies

For Immediate Release:

Cambridge MA, September 29, 2009. Gilbane Group Inc. today announced that Bill Trippe has been promoted to Vice President & Lead Analyst, Content Strategies. In his new role at Gilbane Group, Bill will be a core part of the management team, and will be focused on continuing to grow Gilbane Group’s strategy consulting and advisory business.

Trippe was previously Lead Analyst for Gilbane’s XML Technologies and Content Strategies Consulting Practice, where he led efforts to help businesses, publishers, and government agencies build successful strategies, especially for large and complex content management and publishing requirements. His new role reflects his success and the need to grow the management team to accommodate the growth in consulting business.

“Bill and I have worked together in a variety of capacities for many years, and I’m thrilled that we’ll be working together even more closely,” said Frank Gilbane, CEO of the Gilbane Group. “Bill’s expertise and experience, combined with his strong interpersonal skills, keep him in high demand from both customers and colleagues.”

“Clearly articulated content strategies are essential to getting business case funding in today’s economic climate,” comments Mary Laplante, VP Client Services, Programs and Consulting. “Bill’s new role is a response to growing demand by users and buyers for help with developing sustainable content strategies that deliver measurable value.”

“I am excited to be taking on this new role at Gilbane. The content management landscape continues to be dynamic and compelling, and I look forward to helping our clients leverage technology for productivity, new product development, and overall growth and success,” said Bill Trippe, VP & Lead Analyst, Content Strategies.

Tweet this: Gilbane Group Appoints Bill Trippe VP, Content Strategies http://bit.ly/1Ju6mM #gilbane

About Gilbane Group, Inc.
Gilbane Group Inc. is an analyst and consulting firm that has been writing and consulting about the strategic use of information technologies since 1987. We have helped organizations of all sizes from a wide variety of industries and governments. We work with the entire community of stakeholders including investors, enterprise buyers of IT, technology suppliers, and other consultant and analyst firms. We have organized over 60 educational conferences in North America and Europe. Our next event is Gilbane Boston, December 1-3, 2009 http://gilbaneboston.com/. Information about our widely read newsletter, reports, white papers, case studies and analyst blogs is available at https://gilbane.com.

Follow Gilbane Group on Twitter, or Facebook.

Contact:
Gilbane Group, Inc.
Ralph Marto, 617-497-9443 ext 117
ralph@gilbane.com


© 2020 The Gilbane Advisor
