
Tag: Web 2.0

Reflections on Gov 2.0 Expo and Summit

O’Reilly’s Gov 2.0 events took place last week. We’ve had some time to think about what the current wave of activity means to buyers and adopters of content technologies.

Both the Expo and Summit programs delivered a deluge of examples of exciting new approaches to connecting consumers of government services with the agencies and organizations that provide them.

  • At the Expo on Sept 8, 25 speakers from organizations like NASA, TSA, the US EPA, the City of Santa Cruz, the Utah Department of Public Safety, and the US Coast Guard provided five-minute overviews of their 2.0 applications in a sometimes dizzyingly fast-paced format.
  • Sunlight Labs sponsored an Apps for America challenge that featured finalists who combined federal content available on Data.gov with open source software in some intriguing applications, including DataMasher, which enables you to mash up sources such as stats on numbers of high school graduates and guns per household (a sketch of the idea follows this list).
  • The Summit on Sept 9 and 10 featured more applications plus star-status speakers, including Aneesh Chopra, the US’s first CTO, operating within the White House Office of Science and Technology Policy; Vinton Cerf, currently VP and Chief Internet Evangelist at Google; and Mitch Kapor.
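
To make the mashup mechanics concrete, here is a minimal sketch of the DataMasher idea: joining two per-state datasets on a shared key and emitting a combined record. The figures, field names, and TypeScript shapes are all invented for illustration; DataMasher’s actual sources and formulas may differ.

    // Two hypothetical per-state datasets, in the spirit of Data.gov content.
    // All figures are invented for illustration.
    interface StateStat { state: string; value: number; }

    const hsGraduates: StateStat[] = [
      { state: "Massachusetts", value: 70000 },
      { state: "Texas", value: 280000 },
    ];

    const gunsPerHousehold: StateStat[] = [
      { state: "Massachusetts", value: 0.2 },
      { state: "Texas", value: 0.9 },
    ];

    // Join the two sources on "state" and emit a combined record,
    // which a front end could then chart or map.
    function mashUp(a: StateStat[], b: StateStat[]) {
      const index = new Map(
        b.map((row): [string, number] => [row.state, row.value])
      );
      return a
        .filter((row) => index.has(row.state))
        .map((row) => ({
          state: row.state,
          graduates: row.value,
          gunsPerHousehold: index.get(row.state)!,
        }));
    }

    console.log(mashUp(hsGraduates, gunsPerHousehold));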

A primary program theme was “government as platform,” with speakers suggesting and debating just what that means. There was much thoughtful discussion, if not consensus. Rather than recap it here, we suggest interested readers search the Twitter hashtags #gov20e and #gov20s for comments.

From the first speaker on, we were struck by the rapid pace of change in government action and attitude toward content and data sharing. Our baseline for comparison is Gilbane’s last conference on content applications within government and non-profit agencies in June 2007. In presentations and casual conversations with attendees, it was clear that most organizations were operating as silos. There was little sharing or collaboration within or among organizations. Many attendees expressed frustration that this was so. When we asked what could be done to fix the problem, we distinctly remember one person saying that connecting with other content managers just within her own agency would be a huge improvement.

Fast forward a little over two years to last week’s Gov 2.0 events. Progress toward internal collaboration, inter-agency data sharing, and two-way interaction between government and citizens is truly remarkable. At least three factors have created a perfect storm of conditions: the current administration’s vision and mandate for open government, broad acceptance of social interaction tools at the personal and organizational level, and technology readiness in the form of open source software that makes it possible to experiment at low cost and risk.

Viewing the events through Gilbane’s content-centric lens, we offer three takeaways:

  • Chopra indicated that the formal Open Government directives to agencies, to be released in several weeks, will include the development of “structured schedules” for making agency data available in machine-readable format. As Tim O’Reilly said while interviewing Chopra, posting “a bunch of PDFs” will not be sufficient for alignment with the directives. As a result, agencies will accelerate the adoption of XML and the transformation of publishing practices to manage structured content (a sketch of the contrast follows this list). As large buyers of content technologies and services, government agencies are market influencers. We will be watching carefully for the impact of Open Government initiatives on the broader landscape for content technologies.
  • There was little mention of the role of content management as a business practice or technology infrastructure. This is not surprising, given that Gov 2.0 wasn’t about content management. And while the programs comprised lots of show-and-tell examples, most were very heavy on show and very light on tell. But it does raise a question about how these applications will be managed, governed, and made sustainable and scalable. Add in the point above: structured content is now poised for wider adoption, creating demand for XML-aware content management solutions. Look for more discussion as agencies begin to acknowledge their content management challenges.
  • We didn’t hear a single mention of language issues in the sessions we attended, leaving us to wonder whether non-native English speakers who are eligible for government services will be disenfranchised in the move to Open Government.
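
On the first takeaway, a minimal sketch may make the contrast concrete: the same agency record that might today be published as a PDF, emitted instead as structured, machine-readable XML. The record shape and element names here are ours, invented for illustration, not any official schema.

    // A hypothetical agency record serialized as machine-readable XML.
    // The shape and element names are invented, not an official schema.
    interface InspectionRecord {
      facilityId: string;
      date: string; // ISO 8601
      result: "pass" | "fail";
    }

    function toXml(r: InspectionRecord): string {
      return [
        "<inspection>",
        `  <facilityId>${r.facilityId}</facilityId>`,
        `  <date>${r.date}</date>`,
        `  <result>${r.result}</result>`,
        "</inspection>",
      ].join("\n");
    }

    // A program can parse, validate, and mash up this output;
    // a scanned PDF it can only display.
    console.log(toXml({ facilityId: "MA-0042", date: "2009-09-08", result: "pass" }));

The point of the directive language is exactly this difference: structured markup carries the record’s semantics with it, so downstream applications like the Apps for America finalists can consume it without scraping.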

All in all, these were thought-provoking, well-executed events. For details, videos of the sessions are available on the Gov 2.0 site.

When is a Wiki a Whiteboard?

A: When it’s a huddle.

Q: When is a huddle an environment for multilingual communication?
A: When a huddler can dynamically change the user interface to work in her native language (a sketch follows this Q&A).

Q: Why is this interesting?
A: Because we’ve yet to see a concentrated focus on globalization requirements in the social computing and collaboration space. In fact, we’ve been wondering where the “L” is in Web 2.0.

Q: What if you don’t speak German?
A: The company that built and manages the huddle concept (Ninian Solutions Ltd) provides a French user interface as well, and according to our interview with the company, Spanish, Chinese, and Japanese will follow.

Q: So how will content created by huddlers get translated?
A: Machine translation may very well prove its usefulness within a Web 2.0 environment. Stay tuned.
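
For readers curious what “dynamically change the user interface” might look like under the hood, here is a minimal sketch of a string-bundle design for switching UI language at runtime. Huddle’s actual implementation is not public; the locales, keys, and strings below are invented for illustration.

    // A minimal runtime locale switcher. Bundles, keys, and strings are invented.
    type Locale = "en" | "fr";

    const bundles: Record<Locale, Record<string, string>> = {
      en: { "whiteboard.save": "Save whiteboard", "invite": "Invite a colleague" },
      fr: { "whiteboard.save": "Enregistrer le tableau", "invite": "Inviter un collègue" },
    };

    let current: Locale = "en";

    function setLocale(locale: Locale): void {
      current = locale; // a real UI would also trigger a re-render here
    }

    function t(key: string): string {
      return bundles[current][key] ?? key; // fall back to the key if untranslated
    }

    setLocale("fr");
    console.log(t("whiteboard.save")); // "Enregistrer le tableau"

Note what this does and does not solve: it localizes the chrome of the application, while the content huddlers create remains in whatever language they wrote it in, which is where machine translation comes in.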

Huddle

Wither Web 2.0? Come to Boston

Perhaps it’s cyclical — like the long Indian summer we’ve been having here in the Northeast. The Web/Enterprise/stuff “2.0” buzz has died down (for now) and we seem to be into the hard business of real application development. Perhaps this is a good thing — running on hype does little to transform businesses or pay the bills.

Certainly there’s been a lot of excitement around Facebook as a collaborative platform for digital natives (and fellow travelers). Yet the long-lasting innovation, I think, is around the APIs and the notion of “open platforms.” Of course Google was first to open the kimono with its wildly popular Web services API for Google Maps. Now we’re trying to make mashups of social networks.

I’m curious but not convinced. Facebook is building out its community — Google is not far behind, pursuing the notion of social graphing. So far we can do all kinds of useful things in the consumer space. My favorite this week is friend finding — which also leverages GPS technology. But business applications? I haven’t heard of anything really compelling, yet. I’m still looking.

Which brings me to a preview of coming attractions. My colleagues Steve Paxhia, Nora Barnes, and I expect to cut through the Web 2.0 hype next month and shed some light on industry trends. We’ll be reporting the results of our industry survey at our Boston conference. We’ll have a statistically significant profile of which collaboration and social computing tools are being used in American businesses — beginning with email and Web sites and assessing many popular forms of social media. We’ll snapshot how companies rate these tools’ effectiveness and also report on what each tool is best suited for. And I expect that before we’re done, we’ll have a few indicators of next generation collaborative business applications.

So join us, November 27th – November 29th in Boston.

Where is the “L” in Web 2.0?

I was only able to make it to the Enterprise 2.0 conference in Boston yesterday. You can still get a demo pass for today. But I was thrilled to hear analysts, researchers, case study presenters, and yes, even vendors drill down into one of my favorite phrases: “people, processes, and technology make it possible.” I hope the mantra continues today.

The point being, obviously, that 2.0 is not just about technology ;-). It’s about culture, bridging generation gaps, the evolution of *people* networking, and redefining community from the core of where community starts. Humans.

What I didn’t hear, however, is the “L” word — specifically language, and that bothered me. We just can’t be naive enough to think that community, collaboration, and networking on a global scale are solely English-driven. We need to get the “L” word into the conversation.

My globalization practice colleague Kaija Poysti weighs in here.

The “2.0” Qualifier and A Reality Check

Last week’s FASTforward ’07 event, sponsored by FAST Search, was a great opportunity to immerse ourselves in search and the state of our collective efforts to solve the knotty problems associated with finding information. (The escape to San Diego during an East Coast winter freeze was an added bonus.)

Much of the official program covered topics “2.0” — Web 2.0, Enterprise 2.0, Search 2.0, Transformation 2.0, Back Office 2.0. Regular readers know that the Gilbane team generally approaches most things “2.0” with skepticism. In the case of its use as a qualifier for the Web, it’s not that we question the potential value of bringing greater participation to Web-based interactions. Rather, it’s that use of the term causes the needle on our hype-o-meter to zip into the red alert zone. This reaction is further aggravated by the trend towards appending “2.0” to other words, sometimes just to make what’s old seem new again. We note, without comment, that O’Reilly Media’s conference in May has been dubbed Where 2.0.

We listened carefully to the 2.0s being tossed out like Mardi Gras coins at FASTforward last week. One voice that stood out as a great reality check was that of Andrew McAfee, associate professor at Harvard Business School. In his keynote talk, “Enterprise 2.0: The Next Disrupter,” he presented a definition of Enterprise 2.0:

Enterprise 2.0 is the use of emergent social software platforms within companies, or between companies and their partners or customers.

The important word in McAfee’s definition is emergent, which is not the same as emerging. McAfee also outlined the ground rules for an enterprise that can legitimately lay claim to the use of the 2.0 qualifier. Read the FASTforward entries on his blog for his own eloquent summary.

In addition to his talk on Enterprise 2.0, McAfee also participated in a lunch presentation on research conducted by the Economist Intelligence Unit on executive awareness of Web 2.0 and in a limited-seating roundtable on 2.0 topics. Both are briefly described on his blog.

In short, McAfee’s work is recommended reading for anyone interested in separating 2.0 market hype from potential business value.

Another highlight of FASTforward for us was keynoter Chris Anderson on “The Long Tail” and the application of Long Tail theories to search and content life cycles. By pure happenstance, the Gilbane team shared a limo to the airport with Anderson. In his day job as editor-in-chief of Wired magazine, he and his staff are experiencing significant levels of frustration with the publishing process — specifically, getting content out of a leading professional publishing tool and into the web content management system. While we found his Long Tail talk interesting, the conversation in the limo reminded us that solving some basic business communication problems is still a challenge. It was a thought-provoking way to end the week.

For more on FASTforward ’07, check out our enterprise search blog.

What’s Wrong with Web 2.0

In a word, “expectations”. There is nothing wrong with the moniker itself, but when it is used as if it were a thing-in-itself, as something concrete, it inevitably becomes misleading. This is not something to blame solely on marketing hype – people crave simple labels, and marketers are just accommodating us. We need to take a little responsibility for asking what such labels really mean. When forced to reduce Web 2.0 to something real, you end up with AJAX. There is also nothing wrong with AJAX or its components. The problem is overestimating what it can do for us.
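
Since the argument reduces Web 2.0 to AJAX, a minimal sketch of the AJAX pattern itself may be useful: asynchronous JavaScript (written here as TypeScript) plus XML, fetched via XMLHttpRequest and used to update part of a page without a full reload. The URL and element id are placeholders, not any particular site’s API.

    // The core AJAX pattern: fetch XML asynchronously, update the page in place.
    // The URL and element id are placeholders.
    function refreshHeadlines(): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", "/headlines.xml", true); // true = asynchronous

      xhr.onreadystatechange = () => {
        if (xhr.readyState !== 4 || xhr.status !== 200) return;
        const titles = Array.from(
          xhr.responseXML?.getElementsByTagName("title") ?? []
        ).map((node) => node.textContent ?? "");

        // Rewrite one region of the page; no navigation occurs.
        const target = document.getElementById("headlines");
        if (target) target.innerHTML = titles.map((s) => `<li>${s}</li>`).join("");
      };

      xhr.send();
    }

Useful, certainly, but note that everything in the sketch is presentation-layer plumbing, which is exactly the overestimation problem.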

Bill Thompson’s post yesterday on The Register’s Developer site, “Web 2.0 and Tim O’Reilly as Marshal Tito,” is perhaps a little overstated, but it is useful reading for VCs and IT strategists. Here’s a sample:

Web 2.0 marks the dictatorship of the presentation layer, a triumph of appearance over architecture that any good computer scientist should immediately dismiss as unsustainable. … Ajax is touted as the answer for developers who want to offer users a richer client experience without having to go the trouble of writing a real application, but if the long term goal is to turn the network from a series of tubes connecting clients and servers into a distributed computing environment then we cannot rely on Javascript and XML since they do not offer the stability, scalability or effective resource discovery that we need.

Web 2.0, 3.0 and so on

The recent Web 2.0 conference predictably accelerated some prognostication on Web 3.0. I don’t think these labels are very interesting in themselves, but I do admit that the conversations about what they might be, if they had a meaningful existence, expose some interesting ideas. Unfortunately, they (both the labels and the conversations) also tend to generate a lot of over-excitement and unrealistic expectations, both in terms of financial investment and doomed IT strategies. Dan Farber does his usual great job of collecting some of the thoughts on the recent discussion in “Web 2.0 isn’t dead, but Web 3.0 is bubbling up”.

One of the articles Dan links to is a New York Times article by John Markoff, where John basically equates Web 3.0 with the Semantic Web. Maybe that’s his way of saying very subtly that there will never be a Web 3.0? No, he is more optimistic. Dan also links to Nick Carr’s post welcoming Web 3.0, but even Carr is gentler than he should be.

But here’s the basic problem with the Semantic Web – it involves semantics. Semantics are not static, language is not static, science is not static. Even rules are not static, though at least in some cases syntax and logical systems have longer shelf lives.

Now, you can force a set of semantics to be static and enforce their use – you can invent little worlds and knowledge domains where you control everything, but there will always be competition. That’s how humans work, and that is how science works, as far as we can tell. Humans will break both rules and meanings. And although the Semantic Web is as much (or more) about computers as about humans, the more human-like we make computers, the more they will break rules, change meanings, and invent their own little worlds.

This is not to say that the goal of a Semantic Web hasn’t generated, and won’t continue to generate, good ideas and useful applications and technologies – RDF itself is pretty neat. Vision is a good thing, but vision and near-term reality require different behavior and belief systems.
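
Since RDF gets a nod here, a minimal sketch of its subject-predicate-object triple model, with invented URIs, shows both what is neat about it and where the fragility lies: the syntax is stable and trivially queryable, while the meanings of the predicates live in vocabularies that people define and can redefine.

    // RDF's triple model in miniature. The "ex:" URIs are invented.
    type Triple = [subject: string, predicate: string, object: string];

    const graph: Triple[] = [
      ["ex:WebContentMgmt", "ex:isSubfieldOf", "ex:ContentManagement"],
      ["ex:ContentManagement", "ex:usesFormat", "ex:XML"],
    ];

    // Querying the syntax is easy and stable...
    function objectsOf(subject: string, predicate: string): string[] {
      return graph
        .filter(([s, p]) => s === subject && p === predicate)
        .map(([, , o]) => o);
    }

    console.log(objectsOf("ex:ContentManagement", "ex:usesFormat")); // ["ex:XML"]

    // ...but what "ex:isSubfieldOf" means is fixed only by a vocabulary that
    // humans maintain, and can change, which is the point about semantics
    // not being static.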
