
Year: 2009

Cloud Computing: The Recent Sidekick/Microsoft Loss of Data Was Inevitable, But a Good Thing For Cloud Computing

So Microsoft was asleep at the wheel and didn't use good procedures to back up and restore Sidekick data[1][2]. It was just a matter of time until we saw a breakdown in cloud computing. Is this the end of cloud computing? Not at all! I think it is just the beginning. Are we going to see other failures? Absolutely! These failures are good, because they sensitize potential consumers of cloud computing to what can go wrong and to the contractual obligations that service providers must adhere to.

There is so much impetus for centralized computing that I think the risks and downsides will be outweighed by the positives. On the positive side, security, operational excellence, and lower costs will eventually become mainstream in centralized services. Consumers and corporations will grow tired of the inconvenience and high cost of maintaining their own computing facilities in the last mile.

Willie Sutton, a notorious bank robber,  is often misquoted as saying that he robbed banks "because that’s where the money is."[3]   Yet all of us still keep our money with banks of one sort or another. Even though online fraud statistics are sharply increasing [4][5], the trend to use online and mobile banking as well as credit/debit transactions is on a steep ascent. Many banking experts suggest that this trend is due to convenience.

Whether a corporation is maintaining its own application servers and desktops, or consumers are caring for and feeding their Macs and PCs, the cost of doing so, measured in time and money, is steadily growing. The expertise required is ever increasing. Furthermore, the likelihood of a security breach when individuals manage their own security is high.

Critics of cloud computing say that the likelihood of breakdowns in highly concentrated environments such as cloud computing servers is high. The three main factors they point to are:

  1. Security Breaches
  2. Lack of Redundancy
  3. Vulnerability to Network Outages

I believe that in spite of these seemingly large obstacles, we will see a huge increase in the number of cloud services and the number of people using them over the next five years. When we keep data on our local hard drives, the security risks are huge. We are already pretty much dysfunctional when the network goes down, and I have had plenty of occasions where my system administrator had to reinstall a server or I had to reinstall my desktop applications. After all, we all trust the phone company to give us a dial tone.

The savings that can be attained are huge: a cloud computing provider can realize large economies by using specialized resources that are amortized across millions of users.

There is little doubt in my mind that cloud computing will become ubiquitous.  The jury is still out as to what companies will become the service providers.  However, I don’t think Microsoft will be one of them, because their culture just doesn’t allow for solid commitments to the end user. 

—————————————-

[1] The Beauty in Redundancy, http://gadgetwise.blogs.nytimes.com/2009/10/12/the-beauty-in-redundancy/?scp=2&sq=sidekick&st=cse 

[2] Microsoft Project Pink – The reason for sidekick data loss, http://dkgadget.com/microsoft-project-pink-the-reason-for-sidekick-data-loss/

[3] Willie Sutton, http://en.wikipedia.org/wiki/Willie_Sutton.

[4] Online Banking Fraud Soars in Britain,  http://www.businessweek.com/globalbiz/content/oct2009/gb2009108_505426.htm?campaign_id=rss_eu

[5] RSA Online Fraud Report, September 2009,  http://www.rsa.com/solutions/consumer_authentication/intelreport/10428_Online_Fraud_report_0909.pdf

Top 20 List: World’s Largest Publishers

The list of the 20 largest publishers in the world shows a profoundly changing landscape in book publishing. The chart below is provided by Rüdiger Wischenbart from Publishing Perspectives in Germany. He has contributed some good insights into the transformation of the publishing industry. I offer my analysis of the state of the industry and its future.

[Chart: Global Ranking of the Publishing Industry, top 20 publishers]

Some publishers are faring much better economically, while others are steadily sliding downward in revenue and in their global standing. The changing dynamics among the professional information, education, and trade sectors have affected this year's ranking. The good news is that publishers that have reinvented themselves (responded to market demand by listening to the customer) have done much better than most.

Pearson, Thomson Reuters, and Cengage are identified as star performers on the list. Four out of five dollars are generated through the digitally integrated value chain. Digital content and e-books for professional information are the high-growth segment of the publishing industry. As an industry, we are weak in our recognition of the current size and opportunity of the digital marketplace. Education publishers and trade publishers are having trouble evolving. There is broad need for knowledgeable, skilled digital workers; experienced strategic thinkers; scalable and flexible technology infrastructure; and streamlined workflows and processes that allow publishers to execute on updated strategic initiatives.

Asian publishers are becoming a force, as they are in many other market segments. They include companies like Korea's Kyowon and China's Higher Education Press. Their strong suits are "localizing" content (i.e., cultural adaptation) and the power and economics of a huge, growing audience. They are hungry. They want their piece of the pie.

Trade publishers, experiencing a steady decline in revenues, are poorly positioned to compete. However, the strong performances of Penguin and Hachette are current exceptions in this segment. It remains to be seen whether trade publishers can transform themselves into a sustainable business model. Trade's poor performance and outlook is due to several reasons, beginning with the fact that trade publishers have the farthest to go to find and serve today's and tomorrow's readers.

We have seen endless debate in trade on digital pricing and searches for new business models. The best solutions will leverage and be respectful of the stakeholders…all of them! That includes, but is not limited to: authors, agents, publishers, libraries, distributors, wholesalers, physical bookstores, digital bookstores, printers, service providers, the media, reviewers, technology companies, etc. If publishers bury their heads in the sand by refusing to experiment with new content, pricing models, and sales channels, then there will be serious trouble.

On the bright side, if publishers aggressively discuss new ways to sell content with their channel partners, and seek out non-traditional channel partners that have audiences with demand for their products, there is the potential not just to maintain current revenue, but to actually grow the size of the pie. I know that is a radical statement to make, yet the 'book' is being redefined, and publishing is becoming something new.

Several key findings:

  • The majority of Top Ranked Global Publishers are based in Europe.
  • Professional/knowledge and STM publishers have course-corrected and are doing well.
  • The first major Asian publishers are positioning themselves to compete as top global players.
  • The education sector is unstable.
  • Trade Publishers are, and will be, hit the hardest in the rapidly emerging digital marketplace.
  • Publishers that have reinvented themselves…are prospering!

I have high hopes for the publishing industry. However, we have yet to meet Peter Drucker's market-centric definition: "…the aim of marketing is to make selling superfluous. The aim of marketing is to know and understand the customer so well that the product or service fits him and sells itself." Are we there yet? When we achieve this value statement, the industry will once again be healthy. As for me…being part of the solution? I am passionate about helping our clients build a stronger publishing industry that is focused on improving the reading experience.

What are you thinking now?

*The “Global Ranking of the Publishing Industry” is an annual initiative of Livres Hebdo, Paris, researched by Ruediger Wischenbart Content and Consulting, and co-published with buchreport (Germany), The Bookseller (UK) and Publishers Weekly (US).

Follow Ted Treanor on Twitter: twitter.com/ePubDr

Remedies for Language Afterthought Syndrome: Lessons from Best Practices Profiles

Providing education on the business value of global information through our research is an important part of our content globalization practice. As we know, however, the value of research is only as good as the results organizations achieve when they apply it! What really gets us jazzed is when knowledge sharing validates our thinking about what we call "universal truths": the factors that define success for those who champion, implement, and sustain organizational investment in multilingual communications.

Participants in our 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains told us that eliminating the language afterthought syndrome in their companies (a pattern of treating language requirements as secondary considerations within content strategies and solutions) would be a "defining moment" in realizing the impact of their efforts. Of course, we wanted more specifics. What would those defining moments look like? What would be the themes that characterized them? What would make up the "universal truths" about the remedies? Aggregating the answers to these questions led us to identify some key and common ingredients for success:

  • Promotion of “global thinking” within their own departments, across product content domains, and between headquartered and regional resources.
  • Strategies that balance inward-facing operational efficiency and cost reduction goals with outward-facing customer impacts.
  • Business cases and objectives carefully aligned with corporate objectives, creating more value in product content deliverables and more influence for product content teams.
  • Commitment to quality at the source, language requirements as part of status-quo information design, and global customer experience as the “end goal.”
  • Focused and steady progress on removing collaboration barriers within their own departments and across product content domains, effectively creating a product content ecosystem that will grow over time.
  • Technology implementations that enable standardization, automation, and interoperability.

Defining the ingredients naturally turned into sharing the recipes, a.k.a. a series of best practices profiles based on the experiences of individual technical documentation, training, localization/translation, or customer support professionals. Sincere appreciation goes to companies including Adobe, BMW Motorrad, Cisco, Hewlett Packard, Mercury Marine, Microsoft, and the New York City Department of Education, for enabling their product content champions to share their stories. Applause goes to the champions themselves, who continue to achieve ongoing and impressive results.

Want the details? Download the Multilingual Product Content report (updated with additional profiles!).

Attending Localization World, Silicon Valley? Don't miss Mary's presentation on Overcoming the Language Afterthought Syndrome in the Global Business Best Practices track.

Integrated Solutions for the Global Content Value Chain: An Interview with STAR Group

Fourth in a series of interviews with sponsors of Gilbane’s 2009 study on Multilingual Product Content: Transforming Traditional Practices into Global Content Value Chains.

We spoke with Karl Darr, an independent consultant working with STAR Group.  STAR Group is a leader in information management, localization, internationalization, and globalization solutions that address the entire lifecycle of technical communications. Karl talked with us about the importance of addressing the global content value chain (GCVC) in a comprehensive way, STAR Group’s role in delivering such solutions, and what he found compelling about the research.

Gilbane: How does your company support the value chain for global product content? (i.e., what does your company do?)

Darr: STAR Group’s mission has been to enable companies to build a single product that they can sell, ship and support anywhere in the world, along with all of the appropriate technical and end-user support literature in the native tongue for any target market. In every case, we find that the customer’s satisfaction and their perception of a quality purchase are directly related to understanding their new product in their native language. 

Early on, STAR understood that a comprehensive, integrated solution could increase efficiency, while improving data quality and consistency.  So, rather than acquire and integrate third party solutions that were not designed to work together, STAR Group developed a seamlessly integrated, end-to-end solution suite that included tools to accelerate SGML/XML authoring productivity with increased quality, integrated with Terminology Management, workflow, content management, Translation Memory, and publishing – all subject to monitoring and leaving a complete audit trail. 

All of STAR’s technologies can be purchased as stand-alone products. They integrate and interoperate very well with other vendors’ products to provide a complete solution in mixed technology environments.  However, as you might expect, STAR’s complete suite affords uncommon degrees of added efficiency, accuracy, quality and operational cost reductions.

Gilbane: Why did you choose to sponsor the Gilbane research?

Darr: STAR Group co-sponsored this research because the GCVC concept speaks directly to the sweet spot on which STAR has focused for 25 years. STAR Group has provided technologies and services to support every step along the GCVC, from information engineering, creation, and cross-functional synchronization to translation, localization, management, and static and dynamic publication along with dialog management and reporting. 

Gilbane: What, in your opinion, is the most relevant/compelling/interesting result reported in the study?

Darr: The most relevant/compelling/interesting result reported in the study is that 70% of respondents claimed that the process of integrating their GCVC technologies was difficult at best.  What is even more surprising is that, according to the research, only 20% of respondents claimed they had API-level integration between their translation management and CMS tools.

In other words, respondents are suffering from the fact that the people responsible for globalization efforts are working with limited vision and scope and with fragmented tool sets. This causes ambiguities, duplications, and errors that unnecessarily waste time, energy, and resources and erode corporate profitability, while damaging product and corporate images and weakening customers' affiliation with the company.

I believe that this situation can only happen when top corporate management is more focused on getting product out the door than on optimizing the customer experience, which is critical to increasing profits. When customer experience is a top priority, these companies will recognize that globalization (or the GCVC) is a manufacturing process in its own right that needs to be prioritized right along with design, engineering, production, and customer support. The GCVC is not a 'bolt-on' solution because it needs to be intimately involved in all of these processes. As such, GCVC efforts need to start as soon as the product planning process begins, be fully engaged as customer specifications become requirements, and continue in a collaborative manner throughout the process of a project becoming a product. But they don't end there either. Ongoing multilingual product support is critical for delivering an optimal customer experience, one that results in repeat or recurring business. Because all GCVC solutions will require ongoing maintenance and support, end-user companies need to ensure that whoever is providing support can cover the full spectrum of GCVC functions.

Often, our discussions with companies only really begin once the organization understands the depth and breadth of the GCVC. In some cases, they end up relying on us for nearly everything, from technical writing to translation, workflow, content management, and publishing, to spare parts order management with optimized diagnostics delivery and dialog management. Many of these organizations, some among the most successful global companies, have relegated the notion of a "document" to the status of an artifact of a bygone era.

For insights into technology integration across the GCVC, see the section on “Content Management Integration” that begins on page 32 of the report. You can also learn how STAR Group helped BMW Motorrad implement an end-to-end infrastructure for global technical communication. Download the study for free.

Webinar: Corporate Marketing as a Publishing Business

October 29, 11:00 am ET

Attracting, converting, and retaining customers is the mission of every corporate marketing organization. Content is obviously central to executing the mission. The key to success, though, isn't just delivering content on websites; it's leveraging content to extract its maximum value for the business and the customer.

Leading publishers have deep expertise in solving the knottiest problems associated with leveraging content. How can corporate marketers put a publisher's knowledge and experience to work in their own domain? We discuss the issues and trends with Diane Burley, Industry Specialist at Nstein, in a lively online conversation. Attend Everyone is a Publisher: No Matter What Industry You're In, and gain insights into solutions that top media companies have put into practice to survive the digital economy. Topics include:

  • Engaging customers with content, and metrics to gauge performance.
  • Managing corporate marketing and brand content from multiple sources.
  • Streamlining web content workflows.
  • Creating demographic-specific microsites.

Registration is open. Sponsored by Nstein.

Coming soon: a Gilbane Beacon on publishing as every organization’s second business.

Getting started on WCM…

You may have heard that I'm the new guy in town, and I'm happy to say this is my first blog post as a member of the Gilbane Group. I am thrilled to be a part of such a well-respected organization, and I'm ready to roll up my sleeves and get to work on all things WCM!

A little about me: I’ve been a practitioner and a consultant in the WCM space for over ten years, but I’ve worked for an analyst firm for all of two days.  The good news? I know, first hand, the pains users experience when it comes to web content management.  I empathize with the marketer who knows there must be a way to put all this content to work in her next pull-through campaign, and I sympathize with the Intranet Manager who has been directed to deploy more Web 2.0 tools into the enterprise, even in the absence of a business case. [I’m not a Web 2.0-basher, by the way.] I consider myself a passionate user advocate, and if I’m true to myself (and to you) I’ll continue to bring that perspective to all of my work here at Gilbane.

To continue my let-me-tell-you-about-me schtick, here are a few random thoughts that come to mind which will hopefully provide further insight into my philosophy as it relates to WCM:

  • Usability has become a commodity; it's time for vendors to stop bragging about it and for users to stop accepting anything less.
  • Technology for the sake of technology leads to dissatisfaction every time.
  • “What problem am I trying to solve?” — If you can’t answer this, stop what you’re doing.
  • Technology won’t change human nature…but it will amplify it!
  • You don’t have to do what everyone else is doing…there’s a good chance they’ll fail anyway.
  • “Grassroots” applications require more planning, not less.
  • User research is never a bad idea… but don’t just ask them, watch them.

And finally,

  • If we spent as much time crafting strategies as writing RFPs and selecting tools, we’d achieve a much higher ROI.

So that’s it for now. I look forward to writing more on these pages and hope you’ll chime in with your thoughts and reactions.

 

Follow me on Twitter

The SharePoint Backend: What Are the Headaches? What Are the Benefits?

As I pointed out in my first post (SharePoint: Without the Headaches – A Discussion of What is Available in the Cloud), you don't necessarily need to host SharePoint in your own organization. Although I believe that most businesses should focus on leveraging the front end of SharePoint to its full extent, it is important for non-technical users to understand what it takes to host SharePoint and why one might want to do so. Therefore, this post discusses what it takes to host SharePoint and the driving factors for doing so.

 

Microsoft's original intent was to build a tool that non-technical users could easily leverage. Microsoft thought of this as the natural extension of Office to the web[1]. That being said, the complexities got away from Microsoft, and in order to leverage a number of features, one needs access to the back end.

Before delving into the SharePoint back end, let me point out that many businesses hire SharePoint development staff, both permanent and on a consulting basis. I think that developing custom SharePoint code should be done only after thoroughly justifying the expense.  It is often a mistake.  Instead, organizations should clearly define their requirements and then leverage a high quality third party add-on.  I will mention some of these at the end of the post.

SharePoint is a fragile product, and therefore custom code for SharePoint is very expensive to develop, test, and deploy. Furthermore, custom code often needs to be rewritten when migrating to the next release of SharePoint. Finally, SharePoint is a rapidly growing product, and chances are good that custom code will soon be made obsolete by new features in the next generation.

In my first post, I pointed out that inexpensive SharePoint hosting options are available in the cloud. These options tend to be limited. For example, the inexpensive rentals do not provide much security, only provide WSS (not MOSS), and do not allow one to add third-party add-ins. It is possible to lease custom environments that don't suffer from any of these limitations, but they come at a cost (typically starting at $500 per month[2]). I believe that robust MOSS offerings with third-party add-ons will be available at competitive prices within two years.

——————————————————————————–

[1] SharePoint is developed by the Office division.

[2] For example, FPWeb offers a SharePoint hosted environment with the CorasWorks Workplace Suite included starting at $495 per month.


Google Wave Protocols: Clearing the Confusion

Today is the long-awaited day when 100,000 lucky individuals receive access to an early, but working, version of Google Wave. I hope I am in those ranks! Like many people, I have been reading about Wave, but have not been able to experience it hands-on.

Wave has been a hot topic since it was first shown outside of Google last May. Yet it continues to be quite misunderstood, most likely because it is such an early stage effort and most interested people have not been able to lay hands on the technology. For that very reason, Gilbane Group is presenting a panel entitled Google Wave: Collaboration Revolution or Confusion? at the Gilbane Boston conference, on December 3rd.

The confusion surrounding Wave was highlighted for me yesterday in a Twitter exchange on the topic. It all started innocently enough, when Andy McAfee asked:

[Tweet from Andy McAfee]

To which I replied:

[My reply on Twitter]

That statement elicited the following comment from Jevon MacDonald of the Dachis Group:

[Tweet from Jevon MacDonald]

I am not a technologist. I seek to understand technology well enough that I can explain it in layman’s terms to business people, so they understand how technology can help them achieve their business goals. So I generally avoid getting into deep technical discussions. This time, however, I was pretty sure that I was on solid ground, so the conversation between me and Jevon continued:

[The continued Twitter exchange between me and Jevon]

Now, here we are, at the promised blog post. But, how can Jevon and I both be correct? Simple. Google Wave encompasses not one, but several protocols for communication between system components, as illustrated in the figure below.

[Figure 1: Google Wave Protocols (Source: J. Aaron Farr)]

The most discussed of these is the Google Wave Federation protocol, which is an extension of the Extensible Messaging and Presence Protocol (XMPP). However, Wave also requires protocols for client-server communication and for communication between robot (Web service) servers and Wave servers. It is also possible, but probably not desirable, for Wave to utilize a client-client protocol.

Jevon was absolutely correct about the XMPP protocol enabling server-server communication in the Google Wave Federation Protocol. The Draft Protocol Specification for the Google Wave Federation Protocol lays out the technical details, which I will not explore here. XMPP provides a reliable mechanism for server-server communication and is a logical choice for that function in Google Wave, because XMPP was originally designed to transmit instant message and presence data.

It turns out that the Google Wave team has not defined a specific protocol to be used in client-server communication. A Google whitepaper entitled Google Wave Data Model and Client-Server Protocol does not mention a specific protocol. The absence of a required or recommended protocol is also confirmed by this blog post. While the Google implementation of Wave does employ HTTP as the client-server protocol, as Jevon stated, it is possible to use XMPP as the basis for client-server communication, as I maintained. ProcessOne demonstrates this use of XMPP in this blog post and demo.

Finally, there is no technical reason that XMPP could not be used to route communications directly from one client to another. However, it would not be desirable to communicate between more than two clients via XMPP. Without a server somewhere in the implementation, Wave would be unable to coordinate message state between multiple clients. In plain English, the Wave clients most likely would not be synchronized, so each would display a different point in the conversation encapsulated in the Wave.

To summarize, Google Wave employs the following protocols:

  • XMPP for server-server communication
  • HTTP for client-server communication in the current Google implementation; XMPP is possible, as demonstrated by ProcessOne
  • HTTP (JSON-RPC) for robot server to Wave server communication in the current Google implementation
  • A client-client protocol is not defined, as this mode of communication is most likely not useful in a Wave
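
To make the division of labor among these protocols concrete, here is a minimal, purely illustrative sketch in Python. It models a wave provider that accepts an operation from a client (the HTTP hop), forwards it to a remote provider (the XMPP federation hop), and notifies a robot with a JSON payload (the JSON-RPC hop). All class names, method names, and payload shapes are hypothetical; they are not the actual Google Wave APIs or wire formats.

# Purely illustrative sketch: a toy model of how the Wave protocols divide work.
# Class names, methods, and payloads are hypothetical, not real Google Wave APIs.

import json


class WaveServer:
    """A wave provider: accepts client operations (HTTP hop), federates them to
    remote providers (XMPP hop), and notifies robots (JSON-RPC hop)."""

    def __init__(self, domain):
        self.domain = domain
        self.remote_servers = {}   # domain -> WaveServer, stands in for XMPP federation links
        self.robots = []           # callables, stand in for robot web-service endpoints
        self.wavelets = {}         # wavelet id -> list of applied operations

    def handle_client_op(self, wavelet_id, op):
        """Client-server hop (HTTP in Google's implementation)."""
        self.wavelets.setdefault(wavelet_id, []).append(op)
        # Server-server hop: forward to the provider that hosts the wavelet, if remote.
        owner = wavelet_id.split("!", 1)[0]
        if owner != self.domain and owner in self.remote_servers:
            self.remote_servers[owner].receive_federated_op(wavelet_id, op)
        # Robot hop: notify registered robots with a JSON payload.
        for robot in self.robots:
            robot(json.dumps({"wavelet": wavelet_id, "op": op}))

    def receive_federated_op(self, wavelet_id, op):
        """Server-server hop (XMPP in the federation protocol)."""
        self.wavelets.setdefault(wavelet_id, []).append(op)


def echo_robot(payload):
    """Stands in for a robot web service reached over JSON-RPC."""
    print("robot saw:", json.loads(payload)["op"])


if __name__ == "__main__":
    acme = WaveServer("acme.example")
    initech = WaveServer("initech.example")
    acme.remote_servers["initech.example"] = initech
    acme.robots.append(echo_robot)

    # A client submits an operation to its own provider (over HTTP in practice);
    # the provider fans it out to the remote provider and to the robot.
    acme.handle_client_op("initech.example!conv+root", "insert 'hello'")
    print(initech.wavelets)

The toy makes the same point as the discussion above: the client never talks to the remote provider directly; a server always sits in the middle to coordinate wavelet state, whatever protocol each hop uses.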

I hope this post clarifies the protocols used in the current architecture of Google Wave for you. More importantly, I hope that it highlights just how much additional architectural definition needs to take place before Wave is ready for use by the masses. If I had a second chance to address Andy McAfee’s question, I would unequivocally state that Google Wave is a “concept car” at this point in time.

Postscript: The possibilities mentioned above for XMPP as a client-client protocol are truly revolutionary. Using XMPP as the primary communication protocol for the Internet, instead of the currently dominant HTTP, would create a next-generation Internet in which centralized servers no longer serve as intermediaries between users. Web application architectures, and even business models, would change. See this post for a more detailed explanation of this vision, which requires each user to run a personal server on their computing device.

