Curated for content, computing, and digital experience professionals

Author: Barry Schaeffer

Focusing on the “Content” in Content Management

The growth in web-centric communication has created a major focus on content management: web content management, component content management, and so on. This interest is driven primarily by increasing demand for rich, interactive, accessible information products delivered via the Web. The focus is not misplaced, but it may be missing part of the point. To be specific, in our focus on the “management” part of CM, we may be missing the first word in the phrase: “Content.”

It’s true that applying increasing amounts of computer and brain power to the processes of preparing and delivering the kind of information today’s users demand can improve those products. But it does so within limits set, and at costs generated, by the content “raw material” it receives from content providers. In many cases, the content available to web product development is so structurally crude that it requires major clean-up and enhancement before it can adequately participate in classification and delivery. As the focus on elegant Web delivery increases, barring real changes in the condition of this raw content, the cost of enhancement is likely to grow proportionally, straining the involved organizations’ ability to support it.

The answer may lie in an increased focus on the processes and tools used to create the original content. We know that the original creator of most content knows the most about how it should be logically structured and about the best way to classify it for search and retrieval. The trouble is that, in most cases, we provide no means of capturing what the creator knows about his or her intellectual product. Moreover, because many creators have never been able to fully populate the metadata needed to classify and deliver their content, in past eras professional catalogers were employed to complete this final step. In today’s world, however, we have virtually eliminated the cataloger, assuming instead that the prodigious computer power available to us can develop the needed classification and structure from the content itself. That approach can and does work, but it will require better raw material if it is to achieve the level of effectiveness needed to keep the Web from becoming a virtual haystack in which finding the needle is more good luck than good measure. Native XML editors in place of today’s visually oriented word processors; spreadsheets, graphics and other media forms with content-specific XML under them; increased use of native XML databases; and a host of other rich, content-centric resources are all part of this content evolution.
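To make the idea concrete, here is a minimal sketch of creator-supplied classification traveling with the content itself in content-specific XML, and of downstream software reading that metadata rather than inferring it. The element names (`article`, `metadata`, `subject`, `keyword`) are illustrative only, not any published schema:

```python
# A minimal sketch: creator-supplied metadata embedded in content-specific XML.
# Element names ("article", "metadata", "subject", etc.) are hypothetical,
# not a real standard.
import xml.etree.ElementTree as ET

DOC = """<article>
  <metadata>
    <title>Focusing on the Content in Content Management</title>
    <subject>content management</subject>
    <keyword>XML</keyword>
    <keyword>metadata</keyword>
  </metadata>
  <body>
    <section><p>Structured content carries its own finding aids.</p></section>
  </body>
</article>"""

root = ET.fromstring(DOC)

# Downstream classification and search can read what the creator recorded,
# instead of guessing structure from visual formatting.
title = root.findtext("metadata/title")
keywords = [k.text for k in root.findall("metadata/keyword")]
print(title)     # Focusing on the Content in Content Management
print(keywords)  # ['XML', 'metadata']
```

The point of the sketch is the division of labor: the creator records structure and classification once, at the source, and every later stage of the life cycle can consume it mechanically.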

Most important, however, may be the promulgation across society of the realization that creating content involves more than making it look good on the screen, and that the creator shares in that responsibility. This won’t be an easy or quick process, more likely requiring generations than years, but if we don’t begin soon we may end up with a Web 3 or 4 or 5.0 trying to deliver content that isn’t even yet 1.0.

In the end, good search may depend on good source.

As the world of search becomes more and more sophisticated (and that process has been underway for decades), we may be approaching the limits of software’s ability to find what a searcher wants. If that is true, and I suspect it is, we will finally be forced to follow the trail of crumbs up the content life cycle… to its source.

Indeed, most of the challenges inherent in today’s search strategies and products appear to grow from the fact that while we continually increase our demands for intelligence on the back end, we have done little if anything to address the chaos on the front end. Different word processing formats, spreadsheets, HTML-tagged text, database-delimited files, and so on are all dumped into what we think of as a coherent, easily searchable body of intellectual property. It isn’t, and it isn’t likely to become so any time soon unless we address the source.

Having spent some time in the library automation world, I can remember the sometimes bitter controversies over having just two major foundations for cataloging source material (Dewey and LC; add a third if you include the NICEM A/V scheme). Had we known back then that the process of finding intellectual property would devolve into the chaos we now confront, with every search engine and database product essentially rolling its own approach to rational search, we would have considered ourselves blessed. In the end, it seems, we must begin to see the source material, its physical formats, its logical organization and its inclusion of rational cataloging and taxonomy elements as the conceptual raw material for its own location.

As long as the word processing world teaches that anyone creating anything can make it look like it should in a dozen different ways, ignoring any semblance of finding-aid inclusion, we probably won’t have a truly workable ability to find what we want without reworking the content or wading through a haystack of misses to find our desired hits.

Unfortunately, the solutions of yesteryear, including after-creation cataloging by a professional cataloger, probably won’t work now either, for cost if no other reason. We will be forced to approach the creators of valuable content, asking them for a minimum of preparation for searching their product, and providing the necessary software tools to make that possible.

We can’t act too soon because, despite the growth of software elegance and raw computer power, this situation will likely get worse as the sheer volume of valuable content grows.

Regards, Barry

Nuts and Bolts Tutorials at The Gilbane Conference

In a world that seems increasingly about technology itself, it has become tempting to assume that the questions and challenges of new and better information products are about the technology. While it is true that technology is the key enabler of the new information world we are building, it is also true that the decision making and judgment involved in how that technology is organized and deployed is of equal, and not decreasing, importance. Indeed, as the products move toward increasing sophistication and flexibility (smart content, you might say), the human and organizational parts of the information life cycle become even more important.

It is a truism that you cannot deliver information products you can’t create and manage, and with the circle of participants in that creation and management ever widening, we must be sensitive to the limits of the creators. Moreover, while just "getting it up on the web" used to be sufficient to justify deployment of information products, today’s information consumer has a much more extensive and demanding list of features required before he will accept web-based information. The publisher who forgets or ignores that list is headed for trouble.

In a half-day session preceding the Gilbane conference next week, the Gilbane consulting team will tackle some of the real-world challenges inherent in this rapidly changing information world, providing both signposts for issues likely to come up and "in the trenches" suggestions for how to deal with them. The goal of the session, scheduled for the afternoon of December 1, is that attendees leave with a better handle on how to proceed in the quest for better information products and the role "smart content" should play.

The presenters, in addition to their expertise in the technology and tools of information, bring a unique resource to their efforts: years of design, implementation and evaluation of real organizations facing real challenges.

Confronting the “Technology Imperative”

Technology is literally exploding: that’s a good thing, isn’t it? PDAs, Twitter, iPods that do everything but cook, social networking and constant connectedness: all of it making our lives more in-touch, immediate, visual and interactive. There is, however, another side to this amazing progress. I like to call it the "technology imperative," and it grows from the fact that as technology and its use grow, they usually follow paths driven by consumers’ desires and willingness to spend money; whims, if you will. Once unleashed, these technology-triggered, consumer-driven appetites tend to return the favor, pointing the way to where and how their technology providers will go next. Sometimes the process literally becomes circular, taking the technology and its uses into a spiral no one would ever have predicted and for which no one is fully prepared. If you’re designing chips, selling gadgets or trolling Best Buy for the next version of the iPhone, this looks like the best of all possible worlds. The problem comes when non-consumer sectors of the culture begin to feel the impact of this race to connect.

Technology is Neutral, but its Uses are Often a Poor Guide

In effect, consumer technology becomes the de facto guide for areas of our culture far from the environments for which it was designed and the modes in which consumers use it. For example, with the rise of the Blackberry and instant email and messaging, we eventually saw workers, even in meetings, with their eyes and attention spans glued to their devices, scarcely aware that they were supposed to be a contributing part of the meeting and its decision making. The situation became so widespread and vexing that many firms have banned PDAs from company meetings, and by 2006 a condition dubbed Continuous Partial Attention Syndrome had been identified, in which the individual becomes so distracted by the overload of available information that any attempt to focus on a thought or subject is seriously degraded if not lost. In its extreme form, this syndrome sees the individual succumbing to a virtual addiction to instant information gratification, leading to a mind wandering in a sea of tidbits with no logical relationship to the subject at hand, even if that subject involves controlling a 4,000-pound automobile.

Should Government Use Technology, or Technology Drive Government?

Today, technology has progressed far beyond those comparatively rudimentary days into a world of constant connectedness that can deliver not only the linkage but an intense, and seductive, visual, auditory and activity experience. With it, we are seeing an entirely new impact, especially pronounced in government sectors. Should government agencies, for example, put their important decisions out on Twitter and other social media to inform and elicit feedback from citizens? It sounds like a good way to improve the governing process, but in practice it has all manner of problems, not the least of which are mass responses that can overwhelm the agency’s ability to make sense of them, egalitarian leveling that makes everyone’s opinion on every subject of equal weight if not value, group-influenced or group-generated responses that masquerade as individual opinions, and so on. In the intersection of government and technology, the technology is likely to come out on top, driving the governing process in directions it should not take but becomes powerless to avoid.

So what are we to do? Like Ulysses stuffing his crew’s ears with wax to avoid the clarion call of the Sirens, we must ignore how technology is taken up by the consumer world, no matter how enticing the outcome, concentrating instead on how the governing process may be improved by increased transparency and responsiveness. This concentration should be based on a healthy respect for the unintended consequences of any fundamental changes in the governing process, coupled with an even healthier skepticism toward the brave-new-world claims of the technological community. As we better understand what is broken in our governing process and what can be accomplished more effectively, we will have a foundation on which to consider, evaluate and adopt technology in a way that improves government as it was envisioned by our founders, always remaining mindful that government as we conceive it is not supposed to be slick or interactive but solid, fair and resistant to both individual whim and mob rule.

An Information Parable

With apologies to S. I. Hayakawa, whose classic "A Semantic Parable" has been a staple of virtually everyone’s education for more than a half-century.

Not so long ago nor perhaps all that far away, there existed a need for several departments of a huge organization to share information in a rapid and transparent way so that the business of the organization could be improved and its future made more secure.

Now each of these departments understood and agreed with the basic need for sharing, so no one expected there to be problems achieving the desired results. Each department had its own IT group, working diligently using the best practices of information technology as they understood them. When the need for information sharing among the departments became evident, the executive managers called a meeting of IT, operating managers and lead technologists from each department. At this meeting, the executives explained the need for a more transparent and flexible information environment among the departments and with the world outside. Everyone nodded their agreement.

The IT manager of a major department exclaimed: "What we need is an enterprise information architecture; an EIA." Most of the other IT representatives agreed, and an effort to develop such an architecture was begun right there. The initiating department stated that because it had the largest and most mature IT infrastructure, the EIA should be modeled on the technology approaches it was using. Several other departments agreed; they had already adopted similar IT approaches and could easily participate in such an EIA. Some other departments, however, having gone down different paths in their IT planning, took issue with this suggestion. They feared that changing course to come in line with the suggested architecture could seriously disrupt their existing IT plans, funding and staffing. Although willing to be good citizens, they were mindful that their first responsibility was to their own department.

More discussion ensued, suggesting and examining different IT concepts like J2EE, SOA, SQL, Web-centricity, BPR, and so on. Several departments that had software capable of supporting it even mentioned XML. Like a Chinese puzzle, the group always found itself just short of consensus, agreeing on the basic concepts but each bringing variations in implementation level, manufacturer, etc., to the discussion. In the end, tempers frayed by the seemingly endless circular discussions, the group decided to table further action until more detail about the need could be developed. Actually, nearly everyone in the room knew that they were probably, at that moment, as close to consensus as they were likely to get unless the top managers chose and mandated a solution.

Anticipating just such a mandate, nearly every department descended on top management to make the case for its particular IT and EIA approaches or, sensing defeat, for an exemption from whatever the decision turned out to be. The top managers, of course, who knew little about the details of IT, were affected most by the size and clout of the departments beseeching them and by the visibility of the IT vendors they touted. Battle lines were drawn between groups of departments, some of whom even went so far as to turn their vendors loose on top management to help make the case for their approach. Like molasses in winter, the entire situation began to congeal, making any movement, or communication among the departments for that matter, unlikely. In the midst of this growing chaos, the original need to share information, and the information itself, was almost completely forgotten.

Then, when things looked terminal, someone from a department operating staff suggested that maybe things would work better if the organization just developed and adopted standards for the information to be exchanged and didn’t try to develop anything so far-reaching as an entire Enterprise Information Architecture. At first, no one listened to this obviously "un-IT" suggestion, but as things got worse and progress seemed out of reach, top management asked why the suggestion shouldn’t be considered.  After much grumbling, a meeting was called in which the staff making the suggestion laid out their ideas:

  • First, they said, we should decide what information must be exchanged among departments. We can do this based on our knowledge of the information content itself so we won’t need a great deal of technical skill beyond an understanding of the information standard selected.
  • Next, we might decide what interchange format will be used to exchange the information. It will be important that this format be capable of easy creation and ingestion by the IT tools in each participating department. XML seems to be a growing interchange standard, so maybe we should consider it.
  • Then we can document what and how we want to exchange, and publish the documentation to every department so that their staffs can attend to the task of exporting and importing the desired information elements. We should take care to avoid asking the departments to use any particular technology to accomplish this exchange, but with the easy availability of XML tools, that shouldn’t be difficult.
  • Then we may want to set some deadlines by which the various departments must be able to exchange information in the format we choose. That will ensure that the entire effort keeps moving and will help flush out problems that need more resources. Maybe if we just tell them the results we need, they won’t be so likely to resist.
  • Finally, we ask the various IT staffs to come up with their own technological approaches to the act of sharing: intranet, Internet, VPN, etc. They’re really good at this and they should have the say as to how it is done for their department.
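The approach the operating staff laid out above can be sketched in a few lines. Suppose, hypothetically, the departments agree on a tiny shared XML element set; each department exports to it with whatever tools it already has, and a simple check confirms that a file conforms before it is exchanged. The element names (`record`, `id`, `department`, `summary`) are invented for illustration:

```python
# Sketch of the parable's approach: a small agreed XML interchange format,
# checked for required elements before exchange. The element set
# ("record", "id", "department", "summary") is hypothetical.
import xml.etree.ElementTree as ET

REQUIRED = ["id", "department", "summary"]

def export_record(rec_id, department, summary):
    """One department's export: build a record in the shared format."""
    record = ET.Element("record")
    for tag, value in zip(REQUIRED, [rec_id, department, summary]):
        ET.SubElement(record, tag).text = value
    return ET.tostring(record, encoding="unicode")

def conforms(xml_text):
    """Receiving department's check: are all agreed elements present?"""
    try:
        record = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    return record.tag == "record" and all(
        record.findtext(tag) is not None for tag in REQUIRED
    )

doc = export_record("2024-001", "Finance", "Q3 budget variance report")
print(conforms(doc))                             # a complete record passes
print(conforms("<record><id>x</id></record>"))   # missing elements fail
```

Notice that the agreement lives entirely in the data: nothing here dictates which database, platform or architecture a department uses to produce or consume the records, which is exactly the consensus the parable’s departments could accept.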

After the presentation, there was silence in the room followed by some mildly contemptuous grumbling from some of the IT staff members in the back.

How, they whispered, could a complex challenge like integrating the organization’s IT systems into an EIA be dealt with by a few simplistic rules about data formats? Finally, one of these malcontents gave voice to this objection, to which the presenter replied that the entire idea was to avoid impact on the complex ongoing IT activities of the various departments. The goal, he said, was to articulate what the organization needed in terms of information, leaving the approaches for its provision to each department’s IT staff. This, he said, would hopefully provide a level at which consensus could be reached, technologically based consensus having proven elusive for many reasons, some quite serious.

Sometimes, he said, it isn’t as important to create the impetus to force consensus, as it is to develop a rationale on which that consensus can be achieved and accepted voluntarily by the players. In the case of our hypothetical organization, there were reasons why the technological lives of the departments would never fully coincide and why each would resist even the weight of management dictates to do so. There were not, however, the same reasons why these departments could not agree on what the organization needed in shared information, if each department would be allowed to support the sharing in its own way.

The group thought about this radical departure from good systems engineering discipline and began to realize that perhaps some integration challenges cannot be met by traditional (hard) systems and technology approaches; in fact, it may have taken quite some time and more conversations to reach this point. When this had finally penetrated, the departments agreed to base their collaboration on the information itself and began the joint process of building the needed interchange foundations, actually working with the operating staffs who created, used and understood the information. They chose XML, and each department found that it had significant XML resources in the software it already used. Then everyone went back to work confident that they would be asked to give up neither their hard-won IT environments nor their autonomy as professionals.

As for the organization as a whole, over the next year or so it saw its information sharing begin to improve, spent relatively little money doing it… and it was able to continue the practice of holding all-hands holiday parties at which the department IT staffers and operating folks spoke to one another.

In The End, it’s Mostly About Content.

As the world of technology makes literally breathtaking strides, the world of automation finds itself increasingly focused on the technology. Indeed, in many areas of popular culture, the technology becomes an end in itself, conferring the patina of success on projects that are techno-heavy, never mind that they may not meet their objectives particularly well. This despite the pronouncements of virtually every management authority since the 1960s that technology and automation are different, and that the latter is the more important to the success of the organization.

Nowhere is the tendency to focus on technology itself, to the detriment of meeting functional goals, more pronounced than in the general area referred to as “content management,” or CM, and in no part of CM has this tendency more clouded the picture than in the relationship of its semantic components: “Content” and “Management.” In today’s CM world, the focus on Management means that software and technology take center stage, with an implicit assumption that if one just adheres to the proper technological dictums and acquires the most powerful CM software, the effort will be successful. When efforts so constructed fail to meet their objectives, the further implicit assumption is that the technology… or the technology selection… or the technology governance… or the technology infrastructure has failed. While in many cases some of these may be true, they are not the reason for the failure.

Often, the cause of failure (or marginal performance) is the other side of the CM terminology: the content being created, managed and delivered. Look closely at many automation environments and you will see a high-performance management and delivery environment being fed virtually uncontrolled content raw material. If “you are what you eat,” the same holds for a content management and information delivery environment. In fact, failure at the delivery end is more often than not a failure to develop usable content rather than a failure of management and delivery technology. So why, with all the tools at our command, do we not address the content creation portions of our information life cycles?

I don’t claim to know all the answers, but have formed some impressions over the years:

FIRST: Content creation, with its unwashed masses of authors, providers and editors, has traditionally been viewed as outside the confines of automation planning and development; indeed, often as a detriment to automation rather than an integral part of the overall process. With that mentality, system developers often stay completely away from content creation.

SECOND: The world of software products and vendors, especially those in the management and delivery space, would rather spend more money on their systems in an attempt to make them resistant to the vagaries of uncontrolled content, of course at higher fees for their products. The world of content creation, if it can be called that, is still controlled by the folks in Redmond, to their own corporate and marketing ends.

THIRD: In most automation projects involving content, the primary resource is the IT group which, while highly capable in many cases, does not understand the world of content, over which it has no control. The result is usually a focus on the IT itself, while the content creation groups in the organization find themselves outside with their noses pressed against the glass… until they are called in to be told what will be expected of them. The resulting fight often virtually dooms the project as it was originally conceived.

So what should we do differently?

While every project is unique, here are some thoughts that might help:

FIRST: Understand that technology cannot fully make up for the absence of content designed and structured to meet the functional needs on the table. Indeed, if it comes to a choice between good content and high-performance management resources, content can be delivered with a surprisingly low level of technology, while no amount of technology can make up for AWOL content.

SECOND: Accept the premise that well-designed content, fully capable of supporting the functional objectives of the project, should be the first order of business in any major project. With this, acknowledge that the content creators, while they may be less controlled and sometimes not easy to work with, are a critical component in the success of any project based on their output. In many content creation environments, negotiation between what would work best and what can be provided will result in a set of compromises that gets the best possible content within the constraints in place. That done, technology can be applied to optimize the life cycle flow of the content. Note that in this construction, the technology is a secondary factor, supporting but not defining the strategic direction of the project.

THIRD: Despite what you may hear from the software industry and its sales force, understand that in the term “content management”, content is the most important component and just buying more technology will not make up for its lack. From this understanding, you will be able to create a balance that accords both content and technology their rightfully important places in the overall effort.

Regards, Barry

Government confronts the new information world

With the rise of Web 2.0 and 3.0, growing Internet traffic, social networking and a host of other technologically driven applications and appetites, government at all levels is confronting burgeoning changes in its role and participation in the society around it.

An important part of this process is separating the paths down which technology is taking society at large from the paths government should and should not follow in performing its essential functions. Experience has shown that not every tool, functionality and resource available to and used by citizens should become part of the governance process. The quandary is deciding up front which is which. This quandary can be seen in the very definitions of government being used to describe the future: “connected government,” “open government,” “participatory democracy” and “transparent government” are just some of the terms being used to describe what their users think government should be.

The core challenge, it would seem, is to develop an approach that makes government at once more effective in discharging its myriad day to day duties, more open and responsive to the honestly held beliefs and concerns of its citizens, yet still fully capable of discharging its constitutional responsibilities without infringing on or abrogating the rights of its citizens. History shows that this:

  • Will not be an easy process
  • Will not lend itself to a solution based solely on available technology
  • Is likely to be tried unsuccessfully (or disastrously) more than once before we get it right.

This would seem to dictate that, whatever the technological imperatives, government should be changed carefully, in small steps and with well-considered fallbacks from the paths that turn out to be ineffective or dangerous to our liberties. One way to do this, for instance, would be to focus on those government functions we know are broken and understand how to fix (yes, there are such things). Then we could apply new technology in areas where the target is familiar, the outcome more easily measured and the impact less likely to spin out of control.

© 2024 The Gilbane Advisor
