
Keynote Debate: Microsoft & Sun: What is the Right XML Strategy for Information Interchange?

I am liveblogging the Keynote Debate between Microsoft and Sun on what is the right strategy for information interchange. The panelists are Tim Bray, Director, Web Technologies, Sun Microsystems, and Jean Paoli, Senior Director, XML Architecture, Microsoft. Jon Udell is moderating.

  • Actually Frank Gilbane is moderating, and not Jon, so we will hear some of Jon’s thoughts as well
  • Frank: the session is really about strategies for sharing, preserving, and integrating document content, especially document content with XML.
  • Frank gave some background about the European Union’s attempts to standardize on Microsoft Office or OpenOffice
  • Tim elucidated some requirements for a data format: (1) technically unencumbered and legally unencumbered, (2) high quality (and a notable aspect of quality is allowing a low barrier to entry). Tim: “As Larry Wall (the inventor of Perl) noted, easy things should be easy, and hard things should be possible.”
  • Jean predicted that by 2010, 75% of new documents will be XML.
  • Tim agreed with Jean that 75% of new documents will be XML by 2010, but asked how many of them will be XHTML (as opposed to a more specialized schema, I assume; see the sketch after this list).
  • Some agreement by all that electronic forms are an important aspect of XML authoring, but Tim thinks the area is “a mess.” I’m paraphrasing, but Tim commented on the official XForms release, “Well, it’s official.”
  • Jean commented that XML-based electronic forms are made more difficult because forms themselves require consideration of graphical user interface, interactivity, and even personalization to a degree. This suggests forms are more complex than documents. (And this reminds me of a comment Mark Birbeck made about there being a fine line between an electronic form and an application.)
  • Good question from the audience. So much time has elapsed since SGML got started, and we still only have XSL-FO (which this person was not happy with). What does this suggest about how long it will take to get high-quality, typographically sophisticated output?
  • Tim would suggest we are seeing some improvement, beginning with better resolution on the screen.
  • Another commenter weighed in, suggesting that format is important and format does convey meaning. Would like to hear that the tools are going to get better.
  • Frank: when do you need a customized schema?
  • Jean: best way to safeguard your data and systems is to have an XML strategy. You can gain efficiencies you never had before. Also suggested that the Microsoft schemas will not somehow trap your content into Microsoft’s intellectual property.
  • Jon’s takeaways: (1) software as service (2) XML-aware repositories and (3) pervasive intermediation (the content flows in such a way that you can intermediate it)
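The XHTML-versus-specialized-schema question is easier to see with a concrete example. Here is a toy Python sketch (all element names, class names, and values are invented for illustration, not from the session) showing why a specialized schema is friendlier to downstream processing:

```python
# Toy illustration (all names invented): the same fact marked up as
# generic XHTML versus a specialized, domain-specific schema.
import xml.etree.ElementTree as ET

xhtml = """<div class="invoice">
  <span class="total">42.00</span>
</div>"""

specialized = """<invoice>
  <total currency="USD">42.00</total>
</invoice>"""

# In XHTML the meaning hides in class attributes, so a consumer must
# know this page's private conventions to find the total:
doc = ET.fromstring(xhtml)
print(doc.find(".//span[@class='total']").text)  # 42.00

# In a specialized schema the element names carry the semantics, and
# a schema can validate the required structure and value types:
inv = ET.fromstring(specialized)
print(inv.find("total").text)  # 42.00
```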

A New Reality for XML and Web Services?

When business people want to condemn a new technology to a geeky grave, they often say that the new thing is “a technology in search of a problem.” This suggests–quite correctly–that the best technology solves a pressing business problem.
Web services, specifically, and service-oriented architectures in general, solve a number of pressing business problems. In particular, web services allow organizations to continue operating legacy systems that work well and that, for various reasons, defy replacement or upgrade. If you can at least reach a point where the legacy system can be integrated with other applications via web services, you likely have a moderate-cost, stable, and workable means to integrate the legacy system with web-facing applications going forward.
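To make the legacy-integration pattern concrete, here is a minimal sketch using only Python's standard library. Everything in it (the legacy_lookup function, the port, the XML vocabulary) is a hypothetical stand-in, not any particular product's API: a legacy system reachable only through one internal call gets a thin XML-over-HTTP facade that web-facing applications can consume.

```python
# A minimal sketch of the facade pattern described above. legacy_lookup
# is a hypothetical stand-in for a call into the legacy system.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def legacy_lookup(customer_id: str) -> str:
    # Stand-in for the legacy call (screen scrape, terminal emulation,
    # direct database read, etc.).
    return {"1001": "ACME Corp"}.get(customer_id, "unknown")

class LegacyFacade(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /customer?id=1001 returns a small XML document.
        query = parse_qs(urlparse(self.path).query)
        cid = query.get("id", [""])[0]
        body = f"<customer><id>{cid}</id><name>{legacy_lookup(cid)}</name></customer>"
        self.send_response(200)
        self.send_header("Content-Type", "application/xml")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), LegacyFacade).serve_forever()
```

The point of the pattern is that the legacy system itself is untouched; only the facade needs to change as the web-facing side evolves.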


Longhorn adoption, file systems & content technology

Dan Farber raises the issue of Longhorn adoption and quotes a Jupiter analyst who claims the challenge is that XP is “good enough”. There is actually a more fundamental reason the question of adoption is interesting. What is it, and what does it have to do with content technology?

I’ll start the answer with a little history. In 1994, at our first Documation conference, I moderated a debate between Tony Williams, Chief Architect of COM at Microsoft, and Larry Tesler, Chief Scientist at Apple. Microsoft’s COM and OFS/Cairo efforts and Apple’s OpenDoc effort both recognized the need for operating systems to provide more support for the richness of unstructured information than was possible with the primitive file systems we had then.

Before the debate I preferred the OpenDoc approach because it seemed more consistent with my view that new operating systems needed to be able to manage arbitrary information objects and structures that could be described with a markup language (like SGML at the time). However, Tony convinced me that OpenDoc was too radical a change for both users and developers at the time. Tony agreed with the ultimate need to make such a radical change to file systems to support the growing need for applications to manage more complex content, but he said that Microsoft had decided the world was not ready for such a shock to the system yet, and defended their strategy as the more realistic.

Eleven years later, we are still stuck with the same old-fashioned file system, in spite of the fact that every modern business application needs to understand and process multiple types of information inside files. This means that database platforms and applications have to do a lot more work than they should to handle content. I am no expert on Longhorn, but the file system that will be part of it (although maybe not initially), WinFS, is supposed to go a long way toward fixing this problem. Is the world ready for it yet? I hope so, but it will still be a big change, and Tony’s concerns of 1994 are still relevant.

Binary XML

“Binary XML” sounds like an oxymoron. It is, after all, the plain text encoding of XML that makes it so easy to work with. Heck, I still use the “vi” editor to make quick changes to XML and HTML files.
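To see the tradeoff in miniature, here is a toy Python sketch in which gzip stands in for a binary XML encoding. No actual binary-XML proposal works this way, but the readability-versus-size tension is the same:

```python
# gzip as a crude stand-in for a binary XML encoding (no real
# binary-XML proposal works this way; this only shows the tradeoff).
import gzip

xml_text = b"<order>" + b"<item sku='A1' qty='3'/>" * 50 + b"</order>"
binary_form = gzip.compress(xml_text)

print(len(xml_text), len(binary_form))  # the encoded form is much smaller
print(xml_text[:30])     # readable, and editable in vi as-is
print(binary_form[:30])  # opaque bytes without a decoder
```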
Writing in the Australian edition of Builder.com, Martin LaMonica provides a nice roundup of the pros and cons of some efforts to develop a binary XML. He summarizes some related projects at Sun and the W3C, and has some very lively quotes from XML guru (and Gilbane Report Editor Emeritus) Tim Bray. (And if you want to hear directly from Tim on the issue of binary XML, his blog has plenty of related entries.)
I’ll leave it to people much smarter than me to figure this one out, but the discussion of binary XML is part of the larger question of performance. As XML becomes more and more pervasive, organizations will need to find ways to deal with its performance impact. We talked about XML hardware in this context a few days ago, and ZDNet is reporting today that Cisco may be getting into the XML hardware game. Stay tuned.

DataMirror Unveils Integration Suite 2005

DataMirror Corporation introduced Integration Suite 2005, a solution designed to solve real-time data integration challenges across relational databases and disparate computing platforms. DataMirror Integration Suite 2005 combines integration technology with global services. The data integration functionality and services are interoperable, allowing customers to utilize specific data integration capabilities separately or in unison. The Suite also supports a host of business applications, including active data warehousing, business intelligence, business activity monitoring, customer relationship management, e-Business, data auditing/compliance, data distribution, and mobile computing. DataMirror also introduced a new version of its data integration software, DataMirror Transformation Server, as a core component of Integration Suite 2005. With increased functionality aimed at helping companies share up-to-date data with customers, partners, and employees, Transformation Server 5.2 adds flexibility, interoperability, and performance. www.datamirror.com

SchemaLogic & Meta Integration Technology in Agreement

SchemaLogic and Meta Integration Technology announced they have formed a technology alliance where SchemaLogic customers will be able to utilize over 50 additional adaptors to create a shared, cross-system, “active” metadata repository. Meta Integration Model Bridge connects to databases or information models from IBM, Oracle, Sybase, SAS, Business Objects, IBM Rational, Computer Associates, OMG and W3C. SchemaLogic provides a framework for shared metadata based on an active repository, a unified information model, collaborative change management with impact analysis, notification and approval, plus the synchronization of approved changes to subscribing systems using XML, SOAP and Web Services. This provides a holistic view of information assets including content, data and XML: who is responsible for each asset, how they’re organized (structure and semantics) and the relationships among them. Information architects, database analysts, content system managers and developers can see and control metadata definitions, taxonomies, hierarchical lists and vocabularies in one repository, available throughout the enterprise. www.metaintegration.com, www.schemalogic.com


Ipedo Announces New Multi-Source Enterprise Information Integration (EII) Capabilities

Ipedo announced that its software platform now handles concurrent assimilation of complex information sets from Oracle 9i, DB2, MySQL, SQL Server, and Web Services. The information integration was done using Ipedo’s XML-based views and query technologies. With this new capability, Ipedo allows enterprises to create custom information views on demand. The XML-driven EII capabilities of the Ipedo XML Information Hub go beyond integration and allow custom assembly and persistence of information in a variety of industry-standard formats. The testing was done using the Ipedo XML Information Hub 3.3 with Oracle 9iR2, DB2 Universal Database v3.1, MySQL Pro v4.0, SQL Server 2000 Enterprise Edition, and a public Web Service from Google. Information was integrated using XQuery to quickly aggregate real-time business information across multiple sources, leveraging Ipedo’s XML Views, Web Services Views, Content Conversion, and Universal XQuery Engine capabilities. www.Ipedo.com
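Ipedo’s XQuery-based views are their own technology, but the general EII pattern is easy to sketch. The toy Python below (all data and names invented, and Python standing in for XQuery) normalizes records from two different kinds of sources into one XML view and runs a single query across the combined result:

```python
# A toy sketch of the general EII pattern (not Ipedo's actual API):
# results from two different sources are normalized into one XML view
# and queried together. All data here is invented.
import xml.etree.ElementTree as ET

relational_rows = [("1001", "ACME Corp")]  # e.g., rows from Oracle or DB2
service_results = [("1002", "Globex")]     # e.g., results from a Web Service

view = ET.Element("customers")
for cid, name in relational_rows + service_results:
    cust = ET.SubElement(view, "customer", id=cid)
    cust.text = name

# One query now spans both sources:
for cust in view.findall("customer"):
    print(cust.get("id"), cust.text)
```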
