The Gilbane Advisor

Curated for content, computing, and digital experience professionals


Tell Us About Your Favorite Web 2.0 Tool

There sure is a lot of news about Web 2.0 these days. It can be hard to take it all in, and there seem to be new tools every day! So how do you make sense of it all?

One way to learn more about these tools is to attend the session I will be hosting at the Gilbane San Francisco Conference (http://gilbanesf.com) in June called “My Favorite Web 2.0 Tool”. It will be organized in the fast-paced “Lightning Round” style, with 10 speakers covering 10 topics in 60 minutes (yes, that is about 5 minutes each). This unique presentation format allows many ideas to be presented at once, encourages audience participation, and tends to be fairly hilarious.

Got something to say about Web 2.0 tools? I would love to hear from people interested in participating in this lightning round. Send me a one paragraph description of why your favorite Web 2.0 tool should be included in this session (send to dale@gilbane.com). We’re open to a broad definition of Web 2.0 tools too. We are looking for innovative ideas, game changers, or even just entertaining or fun apps!

We would love to hear from you! The slots will fill up fast, so don’t wait if you hope to participate.

See you in San Francisco!

An Information Parable

With apologies to S. I. Hayakawa, whose classic "A Semantic Parable" has been a staple of virtually everyone’s education for more than a half-century.

Not so long ago nor perhaps all that far away, there existed a need for several departments of a huge organization to share information in a rapid and transparent way so that the business of the organization could be improved and its future made more secure.

Now each of these departments understood and agreed with the basic need for sharing, so no one expected there to be problems achieving the desired results. Each department had its own IT group, working diligently using best practices of information technology as they understood them. When the need for information sharing among the departments became evident, the executive managers called a meeting of IT, operating managers, and lead technologists from each department. At this meeting, the executives explained the need for a more transparent and flexible information environment among the departments and with the world outside. Everyone nodded their agreement.

The IT manager of a major department exclaimed: “What we need is an enterprise information architecture; an EIA.” Most of the other IT representatives agreed, and an effort to develop such an architecture was begun right there. The initiating department stated that because it had the largest and most mature IT infrastructure, the EIA should be modeled on the technology approaches it was using. Several other departments agreed; they had already adopted similar IT approaches and could easily participate in such an EIA. Some other departments, however, having gone down different paths in their IT planning, took issue with this suggestion. They feared that changing course to come in line with the suggested architecture could seriously disrupt their existing IT plans, funding, and staffing. Although willing to be good citizens, they were mindful that their first responsibility was to their own department.

More discussion ensued, suggesting and examining different IT concepts like J2EE, SOA, SQL, Web-centricity, BPR, and so on. Several departments that had software capable of supporting it even mentioned XML. Like a Chinese puzzle, the group always found itself just short of consensus, agreeing on the basic concepts but each bringing variations in implementation level, manufacturer, etc., to the discussion. In the end, tempers frayed by the seemingly endless circular discussions, the group decided to table further action until more detail about the need could be developed.

Actually, nearly everyone in the room knew that they were probably, at that moment, as close to consensus as they were likely to get unless the top managers chose and mandated a solution. Anticipating just such a mandate, nearly every department descended on top management to make the case for its particular IT and EIA approaches, or, sensing defeat, for an exemption from whatever the decision turned out to be. The top managers, of course, who knew little about the details of IT, were affected most by the size and clout of the departments beseeching them and by the visibility of the IT vendors they touted. Battle lines were drawn between groups of departments, some of whom even went so far as to turn their vendors loose on top management to help make the case for their approach. Like molasses in winter, the entire situation began to congeal, making any movement, or communication among the departments for that matter, unlikely. In the midst of this growing chaos, the original need to share information, and the information itself, was almost completely forgotten.

Then, when things looked terminal, someone from a department operating staff suggested that maybe things would work better if the organization just developed and adopted standards for the information to be exchanged and didn’t try to develop anything so far-reaching as an entire Enterprise Information Architecture. At first, no one listened to this obviously "un-IT" suggestion, but as things got worse and progress seemed out of reach, top management asked why the suggestion shouldn’t be considered.  After much grumbling, a meeting was called in which the staff making the suggestion laid out their ideas:

  • First, they said, we should decide what information must be exchanged among departments. We can do this based on our knowledge of the information content itself so we won’t need a great deal of technical skill beyond an understanding of the information standard selected.
  • Next, we might decide what interchange format will be used to exchange the information. It will be important that this format be capable of easy creation and ingestion by the IT tools in each participating department. XML seems to be a growing interchange format, so maybe we should consider XML.
  • Then we can document what and how we want to exchange, and publish the documentation to every department so that their staffs can attend to the task of exporting and importing the desired information elements. We should take care to avoid asking the departments to use any particular technology to accomplish this exchange, but with the easy availability of XML tools, that shouldn’t be difficult.
  • Then we may want to set some deadlines by which the various departments must be able to exchange information in the format we choose. That will ensure that the entire effort keeps moving and will help flush out problems that need more resources. Maybe if we just tell them the results we need, they won’t be so likely to resist.
  • Finally, we ask the various IT staffs to come up with their own technological approaches to the act of sharing: intranet, Internet, VPN, etc. They’re really good at this and they should have the say as to how it is done for their department.
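The staff’s proposal hinges on a simple round trip: each department exports to one agreed format and imports from it with whatever tools it already owns. A minimal sketch of that idea in Python follows; the record structure and element names (“interchange”, “customer-record”, and so on) are hypothetical illustrations, not an actual schema.

```python
# Sketch of a department-neutral XML interchange. All field and element
# names here are invented for illustration, not taken from a real schema.
import xml.etree.ElementTree as ET

def export_records(records):
    """One department exports its data to the agreed XML format,
    using whatever internal technology it likes."""
    root = ET.Element("interchange", {"version": "1.0"})
    for rec in records:
        item = ET.SubElement(root, "customer-record")
        for field, value in rec.items():
            ET.SubElement(item, field).text = str(value)
    return ET.tostring(root, encoding="unicode")

def import_records(xml_text):
    """Any other department ingests the same format with its own tools."""
    root = ET.fromstring(xml_text)
    return [{child.tag: child.text for child in item}
            for item in root.findall("customer-record")]

payload = export_records([{"account-id": "A-1001", "region": "West"}])
print(import_records(payload))
```

The point of the sketch is that neither side needs to know how the other produced or consumed the document; only the format is shared, which is exactly the consensus the staff is proposing.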

After the presentation, there was silence in the room followed by some mildly contemptuous grumbling from some of the IT staff members in the back.

How, they whispered, could a complex challenge like integrating the organization’s IT systems into an EIA be dealt with by a few simplistic rules about data formats?   Finally, one of these malcontents gave voice to this objection, to which the presenter replied that the entire idea was to avoid impact on the complex ongoing IT activities of the various departments. The goal, he said, was to articulate what the organization needed in terms of information, leaving the approaches for its provision to each department’s IT staff. This, he said, would hopefully provide a level at which consensus could be reached, technologically based consensus having proven elusive for many reasons, some quite serious.

Sometimes, he said, it isn’t as important to create the impetus to force consensus, as it is to develop a rationale on which that consensus can be achieved and accepted voluntarily by the players. In the case of our hypothetical organization, there were reasons why the technological lives of the departments would never fully coincide and why each would resist even the weight of management dictates to do so. There were not, however, the same reasons why these departments could not agree on what the organization needed in shared information, if each department would be allowed to support the sharing in its own way.

The group thought about this radical departure from good systems engineering disciplines and began to realize that perhaps some integration challenges cannot be met by traditional (hard) systems and technology approaches; in fact, it may have taken quite some time and more conversations to reach this point. When this had finally penetrated, the departments agreed to base their collaboration on information itself and began the joint process of building the needed interchange foundations, actually working with the operating staffs who created, used, and understood the information. They chose XML, and each department found that it had significant XML resources in the software it already used. They went back to work confident that they would be asked to give up neither their hard-won IT environment nor their autonomy as professionals.

As for the organization as a whole, over the next year or so it saw its information sharing begin to improve, spent relatively little money doing it… and it was able to continue the practice of having all-hands holiday parties at which the department IT staffers and operating folks spoke to one another.

Webinar: Global Content and Customer Satisfaction

April 21, 11:00 am ET

A solid strategy for weathering any economic storm is to focus on finding and serving your most profitable customers. In any region, in any language, across all interactions. How can global enterprises tune their content practices to support this new laser focus on audience engagement and align their processes with corporate strategic objectives?

Gilbane’s Mary Laplante and Sophie Hurst, Director, Product Marketing at SDL, discuss the issues, challenges and opportunities associated with delivering multilingual content that meets today’s mandate for extraordinary customer experience. Using Gilbane’s research and insights on aligning global content with business value as background, topics include:
 
  • Market factors influencing global content management practices.
  • Real-world approaches to meeting audience demand for multilingual content, based on Gilbane research and SDL customer solutions.
  • Establishing a roadmap for enhancing global content practices to align them more closely with customer experience initiatives.

Registration is open. Sponsored by SDL.

The Accountant Who Knew Too Much

It was a dark and rainy night. She toiled way past normal quitting time for all but accountants with Securities and Exchange Commission (SEC) filing deadlines looming. Cranking away on her first XBRL SEC filing, Debbie became quite frustrated. “I know what is supposed to go into all of these ‘other’ accounts, but XBRL just doesn’t care,” she lamented. You see, Debbie is the accountant who knew too much.

Debbie is not alone. In a recent conversation with Louis Matherne, former XBRL International President and Director, XBRL Services for Clarity Systems, the situation described above is actually a common occurrence for accountants tagging XBRL filings for the first time. He suggested a simple example:

The company balance sheet says “prepaid expenses and other”. “Other” is there because it represents several accounts that, aggregated on the balance sheet, become immaterial as separate items. The registrant, however, knows what it is and thinks it should create a new taxonomy concept that better captures the details of “other”. Nowhere in the financial statements or the footnotes to the financial statements do they describe what that “other” is.

The object of using XBRL for compliance with the SEC mandate is to present the company’s required financial statements and footnote disclosures, not to expose the preliminary accounts and internal decisions that led to the final, top-level reports. XBRL is not meant to extend, expand, or further explain legal filings. It simply puts your disclosures into a machine-readable form. The last word comes from Matherne: “XBRL for the SEC is primarily about the disclosure of the accounting.”
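To make Matherne’s point concrete, here is a rough sketch of how a single line item like “prepaid expenses and other” ends up as a machine-readable XBRL fact. The element name is modeled on the US GAAP taxonomy, but the namespace date, context identifier, and figure below are illustrative assumptions, not taken from a real filing.

```python
# Sketch of a single tagged fact in an XBRL-style instance document.
# The us-gaap namespace date, contextRef, unitRef, and amount are
# invented for illustration and do not come from an actual filing.
import xml.etree.ElementTree as ET

INSTANCE = """\
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:us-gaap="http://xbrl.us/us-gaap/2009-01-31">
  <us-gaap:PrepaidExpenseAndOtherAssetsCurrent
      contextRef="AsOf-2009-03-31" unitRef="USD" decimals="-3">
    4500000
  </us-gaap:PrepaidExpenseAndOtherAssetsCurrent>
</xbrl>"""

root = ET.fromstring(INSTANCE)
ns = {"us-gaap": "http://xbrl.us/us-gaap/2009-01-31"}
# The aggregated balance-sheet line is tagged as one standard concept;
# the internal accounts rolled into "other" are not tagged at all.
fact = root.find("us-gaap:PrepaidExpenseAndOtherAssetsCurrent", ns)
print(fact.get("contextRef"), fact.text.strip())
```

The filer reports the aggregated concept exactly as it appears on the balance sheet; the component accounts that were rolled into “other” stay internal and never enter the instance document.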

The US GAAP XBRL taxonomies can be found here.

Social action in the Arctic

In addition to contributing to the Globalization blog, I will also be blogging a bit about what is happening in social media in Europe. I will start from my own region here in the north, but will also move southwards!

Although it is still cold here close to the Arctic regions, the social media scene is humming. One site to visit is www.arcticstartup.com, where Ville Vesterinen and his friends blog about the social media business in Scandinavia and the Baltic region. In addition to the blog, which covers the latest news about internet and mobile startups in the Nordic and Baltic region, they are also helping to create a buzzing ecosystem of sharing ideas and growing companies together. Great job!

Another place to get acquainted with Finnish social media companies is www.sombiz.net.

There are plenty of interesting social media companies in Scandinavia and the Baltics – some of them even surprising. Muxlim, www.muxlim.com, a global Muslim community, has its roots in Finland. Games are another strong area here; www.playfinland.fi is a great site to follow news about the Finnish game development scene. Max Payne came from Finland, and there are several interesting new companies, such as Frosmo (www.frosmo.com).

In addition to consumer social media, things are happening in the Enterprise 2.0 area, but more on that in next entries!

One of the areas I will be interested to follow is how European social media companies will address the question of handling languages in social media. Although many communities will be monolingual, I think there are enormous needs in corporations to handle multiple languages in the various Enterprise 2.0 applications. And by multiple languages, I mean much more than the user interface: all the user-generated content, communicating with customers, open innovation, etc. Thoughts or comments on this?

Happy Birthday to the Wiki!

The first wiki, WikiWikiWeb, was created 14 years ago today by Ward Cunningham. Since then, the wiki has become one of the most widely deployed collaboration tools available. One might even call the wiki the catalyst of the Social Software movement.

Why is the wiki so popular? There are several reasons, including ease of use, structured navigation, and the ability to track changes to wiki pages and roll back to previous versions. The democratic nature of the format, in which anyone who has access can edit the wiki, is undoubtedly a major contributor to its success as well.

The primary reason for the wiki’s success is its flexibility. Wikis have been used for everything from collaboratively authoring a document, to managing a project, to establishing a corporate knowledge base. We are seeing the same phenomenon today in Twitter, which is being used in ways that its creators never imagined.

So, at age 14, what has the wiki taught us? That collaboration tools should be designed for flexible, yet intuitive, use. Complexity is kryptonite to collaboration. Let’s remember that before we build and deploy enterprise collaboration software.

Tweeting XBRL

Over the last few months, I have become acquainted with the wonders of the 140-character “tweet”. For those of you who are not “tweeters”, I am referring to the combination of instant message and social networking that has converged at www.twitter.com. In essence, Twitter asks “peeps” a simple question: What are you doing right now? In 140 spaces, you can communicate what you are thinking or what you have just read on the web. Long URLs are easily shortened, leaving enough space to communicate simple messages. If you find people who are doing or reading about interesting things, you can follow their “tweets”.

In my efforts to keep up on XBRL, I use Yahoo and Google keyword alerts as well as selected RSS feeds from the SEC and others. I have found, however, that this system falls short of the daily updates the people using www.twitter.com are providing for me. It’s amazing how effective a 140-character message can be in sending you directly to fresh web content relevant to your interests. To improve my “hit” rate, I’ve added a few favorite Twitter tools such as TweetDeck (www.tweetdeck.com), TwitScoop, WeFollow, and MrTweet. Join up and send me a note. I’d love to follow you!

In the End, It’s Mostly About Content

As the world of technology makes breathtaking strides, the world of automation finds itself increasingly focused on the technology. Indeed, in many areas of popular culture, the technology becomes an end in itself, conferring the patina of success on projects that are techno-heavy, never mind that they may not meet their objectives particularly well. This despite the pronouncements of virtually every management authority since the 60’s that technology and automation are different, and that the latter is the more important to the success of the organization.

Nowhere is the tendency to focus on technology itself, to the detriment of meeting functional goals, more pronounced than in the general area referred to as “content management” or CM, and in no part of CM has this tendency more clouded the picture than in the relationship of its semantic components: “Content” and “Management.” In today’s CM world, the focus on Management means that software and technology take center stage, with an implicit assumption that if one just adheres to the proper technological dictums and acquires the most powerful CM software, the effort will be successful. When efforts so constructed fail to meet their objectives, the further implicit assumption is that the technology… or the technology selection… or the technology governance… or the technology infrastructure has failed. In many cases, while some of these may be true, they are not the reason for the failure.

Often, the cause of failure (or marginal performance) is the other side of the CM terminology; the content being created, managed and delivered.  Look closely at many automation environments and you will see a high-performance management and delivery environment being fed by virtually uncontrolled content raw material. If “you are what you eat”, so too is a content management and information delivery environment. In fact, failure at the delivery end is more often than not a failure to develop usable content instead of a failure of management and delivery technology. So why, with all the tools at our command, do we not address the content creation portions of our information life cycles?

I don’t claim to know all the answers, but have formed some impressions over the years:

FIRST: Content creation, with its unwashed masses of authors, providers, and editors, has traditionally been viewed as outside the confines of automation planning and development; indeed, often as a detriment to automation rather than an integral part of the overall process. With that mentality, system developers often stay completely away from content creation.

SECOND: The world of software products and vendors, especially those in the management and delivery space, would rather spend more money on their systems in an attempt to make them resistant to the vagaries of uncontrolled content, of course at higher fees for their products. The world of content creation, if it can be called that, is still controlled by the folks in Redmond, to their own corporate and marketing ends.

THIRD: In most automation projects involving content, the primary resource is the IT group which, while highly capable in many cases, does not understand the world of content over which it does not itself have control. The result is usually a focus on the IT itself while the content creation groups in the organization find themselves outside with their noses pressed against the glass… until they are called in to be told what will be expected of them. The resulting fight often virtually dooms the project as it had originally been conceived.

So what should we do differently?

While every project is unique, here are some thoughts that might help:

FIRST: Understand that technology cannot fully make up for the absence of content designed and structured to meet the functional needs on the table. Indeed, if it came to a choice between good content and high-performance management resources, good content can be delivered with a surprisingly low level of technology, while no amount of technology can make up for AWOL content.

SECOND: Accept the premise that well-designed content, fully capable of supporting the functional objectives of the project, should be the first order of business in any major project. With this, acknowledge that the content creators, while they may be less controlled and sometimes not easy to work with, are a critical component in the success of any project based on their output. In many content creation environments, negotiation between what would work best and what can be provided will result in a set of compromises that gets the best possible content within the constraints in place. That done, technology can be applied to optimize the life cycle flow of the content. Note that in this construction, the technology is a secondary factor, supporting but not defining the strategic direction of the project.

THIRD: Despite what you may hear from the software industry and its sales force, understand that in the term “content management”, content is the most important component and just buying more technology will not make up for its lack. From this understanding, you will be able to create a balance that accords both content and technology their rightfully important places in the overall effort.

Regards, Barry


© 2024 The Gilbane Advisor
