Curated for content, computing, and digital experience professionals

Category: Content management & strategy

This category includes editorial and news blog posts related to content management and content strategy. For older, long-form reports, papers, and research on these topics, see our Resources page.

Content management is a broad topic that refers to the management of unstructured or semi-structured content as a standalone system or a component of another system. Varieties of content management systems (CMS) include: web content management (WCM), enterprise content management (ECM), component content management (CCM), and digital asset management (DAM) systems. Content management systems are also now widely marketed as Digital Experience Management (DEM or DXM, DXP), and Customer Experience Management (CEM or CXM) systems or platforms, and may include additional marketing technology functions.

Content strategy topics include information architecture, content and information models, content globalization, and localization.

For some historical perspective see:

https://gilbane.com/gilbane-report-vol-8-num-8-what-is-content-management/

Elastic’s Cloud Serverless now on Google Cloud Marketplace

From the Elastic blog…

Today, we are excited to announce the general availability of Elastic Cloud Serverless on Google Cloud — now available in the Iowa (us-central1) region. Elastic Cloud Serverless provides a fast way to start and scale observability, security, and search solutions without managing infrastructure. Built on the Search AI Lake architecture, which leverages Google Cloud Storage, it combines boundless object storage, separation of storage and compute, low-latency querying, and advanced AI capabilities.

  • No compromise on speed or scale: Elasticsearch Serverless dynamically scales to accommodate your workload, handling unpredictable traffic spikes automatically — all while delivering low-latency search on boundless object storage.
  • Hassle-free operations: Say goodbye to managing clusters, provisioning nodes, or fine-tuning performance. Free your team from operational tasks — no need to manage infrastructure, do capacity planning, upgrade, or scale data. 
  • Purpose-built product experience: Elastic Cloud Serverless offers a streamlined workflow to help you create projects tailored to your unique use cases in observability, security, and search. With guided onboarding, you can use in-product resources and tools that guide you every step of the way, accelerating time to value.
  • Flexible usage-based pricing model: Elastic Cloud Serverless offers a usage-based pricing model that scales with your needs.
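
As a rough illustration of what the serverless model means for developers, below is a minimal sketch of indexing and searching against a serverless project with the official Elasticsearch Python client; the endpoint URL, API key, and index name are placeholders rather than details from the announcement.

    # Minimal sketch: index and search against an Elastic Cloud Serverless project.
    # Endpoint URL, API key, and index name are placeholders (assumptions).
    from elasticsearch import Elasticsearch

    client = Elasticsearch(
        "https://my-project.es.us-central1.gcp.elastic.cloud",  # placeholder endpoint
        api_key="YOUR_API_KEY",
    )

    # Index a document; sharding, scaling, and storage are handled by the service.
    # refresh=True makes the document immediately searchable for this demo.
    client.index(
        index="articles",
        document={"title": "Serverless search", "views": 42},
        refresh=True,
    )

    # Run a match query; no cluster sizing or capacity planning is involved.
    response = client.search(index="articles", query={"match": {"title": "serverless"}})
    for hit in response["hits"]["hits"]:
        print(hit["_source"])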

https://www.elastic.co/blog/elastic-cloud-serverless-google-cloud-general-availability

Sitecore launches AI Innovation Lab

Sitecore, a provider of digital experience and content management software, launched Sitecore AI Innovation Lab, a program created in collaboration with Microsoft that provides a guided environment for marketing professionals to rapidly explore AI-driven solutions for optimizing content operations. It helps marketers define their AI journey and fast-track the development of solutions best suited to their specific use cases.

A Sitecore AI Innovation Lab guided environment explores AI-driven solutions with Microsoft Azure and Azure OpenAI Services. Participants work alongside Sitecore and Microsoft experts to prototype solutions for their unique challenges using an agile, low-risk approach. The result is either a validated AI solution or learnings that help marketers achieve their business objectives. AI innovations developed with customers will be integrated into Sitecore’s DXP to further enhance, improve, and future-proof the platform.

Sitecore recently released more than 250 enhancements for its composable DXP to help marketers create, build, and optimize digital experiences and content that align with their brand identity and drive engagement and conversions. Sitecore Stream helps marketers meet evolving customer expectations by speeding the creation and delivery of personalized digital experiences across all touchpoints.

The Sitecore AI Innovation Lab is available to current Sitecore customers.

https://www.sitecore.com/products/ai-innovation-lab

Altair partners with Databricks

Altair, a global provider of computational intelligence software, has partnered with Databricks, a data and AI company, to provide joint customers with capabilities for data unification, graph-powered intelligence, and enterprise-grade artificial intelligence (AI).

Customers can use Altair RapidMiner to access, prepare, and analyze data in Databricks without data duplication. The platform’s full-stack AI capabilities (from low-code AutoML to MLOps, agent frameworks, and high-speed visualization) allow organizations to prototype, deploy, and scale AI applications using data stored in Databricks.

A key differentiator is Altair RapidMiner’s massively parallel processing (MPP) knowledge graph technology, purpose-built to support knowledge graph creation, data fabrics, and ontology modeling at enterprise scale. By integrating with Databricks, customers can use the Altair RapidMiner knowledge graph engine to connect, contextualize, and activate all types of data—structured, unstructured, and streaming. These graph-powered fabrics form the foundation for a new generation of intelligent systems, enabling generative AI models and autonomous agents to navigate the full complexity of an organization’s digital operations.

Altair RapidMiner also offers native support for SAS language execution, allowing customers to preserve and extend the value of their existing analytics investments while modernizing their workflows.

https://altair.com/altair-rapidminer ■ https://www.databricks.com

Introducing Aura-2: Enterprise-grade text-to-speech

Deepgram, a voice AI platform for enterprise use cases, announced Aura‑2, its next-generation text-to-speech (TTS) model purpose-built for real-time voice applications in mission-critical business environments. Engineered for clarity, consistency, low-latency performance, and deployable via cloud or on-premises APIs, Aura‑2 enables developers to build scalable, human-like voice experiences for automated interactions across the enterprise, including customer support, virtual agents, and AI-powered assistants. Aura-2 is built on Deepgram Enterprise Runtime—the infrastructure that powers the company’s speech-to-text (STT) and speech-to-speech (STS) capabilities, providing enterprises with the control, adaptability, and performance required to deploy and scale production-grade voice AI.

A significant gap exists between entertainment-focused models and the operational demands of enterprise-grade voice systems. Entertainment-focused TTS platforms are trained on and optimized for storytelling, character voices, and emotionally expressive delivery. Enterprise applications require more than natural-sounding voices: they demand domain-specific pronunciation, a professional tone, consistent contextual handling, and the ability to perform reliably, cost-effectively, and securely, often in environments that require full deployment control.

Aura-2 is powered by Deepgram Enterprise Runtime (DER)—a custom-built infrastructure layer that runs all of Deepgram’s speech models. Designed specifically for enterprise-grade performance, DER orchestrates voice AI in real time with the speed, reliability, and adaptability required for production-scale deployments.
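
For a sense of how developers would call the model, here is a minimal sketch against Deepgram's text-to-speech REST API; the voice/model identifier shown is an assumption and should be checked against Deepgram's documentation.

    # Minimal sketch: synthesize speech with Deepgram's TTS REST API.
    # The model name "aura-2-thalia-en" is an assumption; confirm the actual
    # Aura-2 voice identifiers in Deepgram's documentation.
    import requests

    DEEPGRAM_API_KEY = "YOUR_API_KEY"  # placeholder

    response = requests.post(
        "https://api.deepgram.com/v1/speak",
        params={"model": "aura-2-thalia-en"},
        headers={
            "Authorization": f"Token {DEEPGRAM_API_KEY}",
            "Content-Type": "application/json",
        },
        json={"text": "Your appointment is confirmed for Tuesday at 3 PM."},
    )
    response.raise_for_status()

    # The response body is encoded audio (assumed to default to MP3);
    # write it to a file for playback.
    with open("confirmation.mp3", "wb") as f:
        f.write(response.content)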


InfluxData releases InfluxDB 3 Core and InfluxDB 3 Enterprise

InfluxData, the company behind the InfluxDB time series database, announced the general availability of InfluxDB 3 Core and InfluxDB 3 Enterprise. Built for rapid development and large-scale production, Core and Enterprise provide a high-performance, easily scalable database for managing time series data. InfluxDB 3 Core is an open source, high-speed, recent-data engine for real-time applications. InfluxDB 3 Enterprise adds high availability, enhanced security, and scalability for production environments. Both bring data transformation, enrichment, and alerting directly into the database with a built-in Python Processing Engine, elevating InfluxDB to an active intelligence engine for real-time data.

InfluxDB 3 Core is open source under the permissive MIT/Apache 2 license. InfluxDB 3 Enterprise extends Core’s capabilities with enterprise-grade features for production workloads, including multi-region durability, read replicas, automatic failover, and enhanced security. Both products run in a lightweight, single-node setup for fast, easy deployment.

Powered by the new InfluxDB 3 engine written in Rust and built with Apache Arrow, DataFusion, Parquet, and Flight, Core and Enterprise deliver performance gains and architectural flexibility compared to previous open source versions of InfluxDB.
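
As a rough sketch of the developer workflow this enables, the snippet below writes a point and queries it back with SQL using the InfluxDB 3 Python client; the host, token, database, and client package and parameter names are assumptions based on the current client libraries, not details from the announcement.

    # Minimal sketch: write and query InfluxDB 3 with the influxdb3-python client.
    # Host, token, and database are placeholders; the package and call names are
    # assumptions based on the InfluxDB 3 Python client, not this announcement.
    from influxdb_client_3 import InfluxDBClient3

    client = InfluxDBClient3(
        host="http://localhost:8181",  # local InfluxDB 3 Core instance (placeholder)
        token="my-token",
        database="sensors",
    )

    # Write a point in line protocol: measurement,tags fields
    client.write(record="cpu,host=server01 usage=87.2")

    # Query it back with SQL (InfluxDB 3 queries are executed by Apache DataFusion).
    table = client.query(
        "SELECT * FROM cpu WHERE time >= now() - INTERVAL '1 hour'",
        language="sql",
    )
    print(table.to_pandas())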

InfluxDB 3 Core is now available as a free and open source download. InfluxDB 3 Enterprise is available for production deployments with flexible licensing options.

https://www.influxdata.com/blog/influxdata-announces-influxdb-3-OSS-GA

MindsDB brings federated data access to Model Context Protocol

MindsDB announced comprehensive support for the Model Context Protocol (MCP) across both its open source and enterprise platforms. This integration positions MindsDB as a unified AI data hub that standardizes and optimizes how AI models access enterprise data, simplifying artificial intelligence deployment in complex data environments.

The Model Context Protocol, an emerging open standard developed by Anthropic, creates a universal way for AI applications to connect with data sources and tools. By implementing MCP, MindsDB enables AI applications and agents to run federated queries over data stored in different databases and business applications as if they were a single database. MindsDB’s implementation extends beyond basic MCP compatibility to:

  • Provide one-step querying across multiple sources with comprehensive audit capabilities
  • Integrate with existing enforcement mechanisms to maintain data governance
  • Enable complex workflows including multi-source joins, automated data transformations, and natural language query conversion
  • Use native integrations when MCP support is insufficient and optimize queries for efficient execution at the data source
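
To make the "single database" idea concrete, here is a minimal sketch of a federated query submitted through MindsDB's SQL interface with the open source mindsdb_sdk Python package; the connection details, data source names, and tables are placeholders, and the SDK call names are assumptions rather than part of the MCP announcement.

    # Minimal sketch: register a data source in MindsDB and run a federated join.
    # Connection details, source names, and tables are placeholders; the
    # mindsdb_sdk calls are assumptions based on the open source SDK.
    import mindsdb_sdk

    server = mindsdb_sdk.connect("http://127.0.0.1:47334")  # local MindsDB instance

    # Register a Postgres source; a MySQL or SaaS source is registered the same way
    # (the join below assumes a second source named warehouse_mysql already exists).
    server.query("""
        CREATE DATABASE crm_postgres
        WITH ENGINE = 'postgres',
        PARAMETERS = {"host": "pg.internal", "port": 5432,
                      "database": "crm", "user": "reader", "password": "secret"};
    """).fetch()

    # Federated join across two registered sources, queried as if they were one database.
    result = server.query("""
        SELECT o.order_id, c.name, o.total
        FROM warehouse_mysql.orders AS o
        JOIN crm_postgres.customers AS c ON c.id = o.customer_id
        LIMIT 10;
    """).fetch()
    print(result)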

MCP support is available in both MindsDB’s open source and enterprise editions. The open source version provides core MCP functionality for developers and small teams, while the enterprise edition adds advanced security, governance features, and premium support for organizational deployments.

https://mindsdb.com

Contentful launches new CMS capabilities and Shopify partnership

Contentful launched new features and partnerships that, together with Contentful’s content management system (CMS) capabilities, deliver a modern solution for creating digital experiences: one anchored in core content operations, enhanced by AI, and featuring data-driven, AI-enabled personalization at the core of the digital experience platform (DXP).

Contentful’s new AI and personalization capabilities build on the company’s composable CMS, a core platform tailored to technical users, and Contentful Studio, a low-code digital experience-building product for marketing teams.

AI Actions is Contentful’s framework for AI-powered content operations, enabling marketers to embed generative AI models into any stage of the content lifecycle, from creation to adaptation to publishing. AI Actions removes the friction of repetitive marketing tasks, such as translation and localization for global audiences, one-click SEO, and alt-text generation.

Contentful Personalization will live directly within the Contentful web app. Customers can easily find and manage all their audiences, experiments, and personalized experiences in one unified workspace. By centralizing everything in a single command center, Contentful streamlines personalization, making it faster and more efficient for the marketing team to optimize customer experiences.

Contentful and Shopify also announced a new strategic partnership designed to connect and synchronize content and commerce systems.

https://www.contentful.com/newsroom/contentful-reveals-next-phase-of-growth-with-modern-digital-experience

Access Innovations releases Data Harmony 3.17

Access Innovations, Inc. released the Version 3.17 update for its Data Harmony software suite. The newest version of the software includes knowledge maps that illustrate the linkages between categories and terms in a taxonomy. The knowledge maps have color-coded legends that tie to main categories of terms within each taxonomy to make them easier to decipher at a glance.

Other improvements in Data Harmony Version 3.17 include a Google Gemini integration, so that taxonomies can be linked to a generative AI model for conversational search. A new online dictionary link also allows anyone building a custom taxonomy to look up terms without leaving the platform they are working in.

The software focuses on improving the quality of the data behind large language models (LLMs). When someone queries a generative AI system, that system draws on an underlying LLM, so the results are only as good as the quality of the data the model works from. By focusing on that data, and by categorizing knowledge assets to improve search precision, Access Innovations helps AI systems deliver more accurate and efficient results.

https://www.accessinn.com/data-harmony-products/


© 2025 The Gilbane Advisor
