Curated for content, computing, data, information, and digital experience professionals

Category: Enterprise software & integration

Algolia unveils AI Agents for Salesforce and Adobe

Algolia, an AI-native search and discovery platform, announced a new AI agent experience that illustrates how autonomous systems can act on real-time data across enterprise platforms. The experience, demonstrated using Salesforce's Agentforce and integrated with Adobe Experience Manager (AEM), Adobe Experience Platform (AEP), and Salesforce Commerce Cloud (SFCC), shows what's possible when agents are equipped with live, structured context, without latency or hallucination.

Algolia semantically interprets user intent, retrieves structured content from its index containing data from multiple customer datastores, and assembles context-aware responses in real time—bridging the gap between front-end agent platforms like Agentforce and the backend systems that hold critical content and customer signals.

Algolia handles this orchestration through its AI-native search engine, which retrieves the most relevant information from its index: content that has been ingested and structured from platforms like AEM, SFCC, and AEP. Whether surfacing personalized media, live product availability, or behavioral attributes, Algolia assembles responses that reflect the full customer context and returns them in milliseconds. The result is an agent experience that is intuitive, precise, and responsive to user needs.
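The announcement is light on implementation detail, but the retrieval step can be pictured as a plain index query. The sketch below is a minimal, hypothetical illustration using the Algolia Python client (v3-style API); the index name, attribute names, and segment filter are assumptions for illustration, not details from Algolia's announcement.

```python
# Hypothetical sketch: an agent platform (e.g., Agentforce) calls out to an
# Algolia index that has been populated from AEM, AEP, and SFCC.
# Assumes the algoliasearch Python client (v3-style API); index and attribute
# names below are illustrative, not from the announcement.
from algoliasearch.search_client import SearchClient

client = SearchClient.create("YOUR_APP_ID", "YOUR_SEARCH_API_KEY")
index = client.init_index("commerce_content")  # content ingested from AEM/AEP/SFCC

def build_agent_context(user_query: str, segment: str) -> list[dict]:
    """Retrieve structured records the agent can ground its answer on."""
    results = index.search(user_query, {
        "filters": f'audienceSegment:"{segment}"',    # assumed attribute
        "attributesToRetrieve": ["title", "price", "availability", "url"],
        "hitsPerPage": 5,
    })
    return results["hits"]

# The agent then assembles these hits into its response context.
hits = build_agent_context("waterproof trail shoes", "loyalty-member")
```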

https://www.algolia.com/about/news/algolia-unveils-new-real-time-context-aware-ai-agents-across-salesforce-and-adobe

Bloomreach adds marketing and e-commerce features

Bloomreach, an agentic platform for personalization, announced features intended to transform AI-driven personalization and customer engagement across marketing and product discovery. From campaign workflows to conversational search, the features center on agentic AI.

Marketing:

  • Autonomous Marketing Agents: AI-powered agents to help marketers automate and execute the entire campaign creation process.
  • Recommendations+: Analyzes each customer’s journey and behavior in real time to recommend products that align with their preferences.
  • Contextual Personalization: Automates personalization through the delivery of individualized emails, mobile messages and onsite experiences to each customer.

Conversational Shopping:

  • Search Triggered Conversations: Launches a personalized conversation directly from the search bar and acts as an integrated shopping assistant.
  • Embedded Conversations: Brings Clarity’s conversational capabilities directly to Product Detail Pages (PDPs) and Product Listing Pages (PLPs).

Search:

  • Personalization Studio: Learns from live customer signals and optimizes in real time to reflect current shopper intent.
  • Ranking Studio: Gives practitioners control to integrate critical business signals like margins or offline sales into search algorithms.
  • Multi-language search: Expands global reach by extending autonomous search across 33 supported languages.
  • Conditional Slot Merchandising: Elevates product placement by allowing the merchandiser to define their own business-driven conditions so AI can autonomously populate product grids and placements.

https://www.bloomreach.com/en/news/2025/bloomreach-unveils-the-features-ushering-in-the-agentic-era-of-marketing-and-ecommerce/

OpenSearch releases OpenSearch 3.0

The OpenSearch Software Foundation, the vendor-neutral home for the OpenSearch Project, announced the general availability of OpenSearch 3.0, which is designed to help users increase efficiency, deliver better performance, and accelerate AI application development through new data management, AI agent, and vector search capabilities.

Vector engine features:

  • GPU Acceleration for OpenSearch Vector Engine: Delivers superior performance for large-scale vector workloads while significantly lowering operational spend by reducing index building time.
  • Model Context Protocol (MCP) support.
  • Derived Source: Reduces storage consumption by removing redundant vector data sources and utilizing primary data to recreate source documents.

Data management features:

  • Support for gRPC: Enables faster and more efficient data transport and data processing for OpenSearch deployments.
  • Pull-based Ingestion: Enhances ingestion efficiency and gives OpenSearch more control over the flow of data and when it’s retrieved by decoupling data sources and data consumers.
  • Reader and Writer Separation: Ensures consistent, high-quality performance for indexing and search workloads by configuring each in isolation.
  • Apache Calcite Integration.
  • Index Type Detection: Automatically determines whether an OpenSearch index contains log-related data, speeding up log analysis feature selection.

Other updates include Lucene 10, Java 21 as the minimum supported runtime, and Java Platform Module System support.
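For readers who want to see the vector engine in context, here is a minimal k-NN query sketch using the opensearch-py client. The cluster address, index name, field name, and toy query vector are illustrative assumptions; the GPU-accelerated index building in 3.0 applies to index construction, while this query-side syntax is the standard k-NN search.

```python
# Minimal k-NN (vector) search sketch using the opensearch-py client.
# Cluster address, index name, field name, and the query vector are
# illustrative assumptions for this sketch.
from opensearchpy import OpenSearch

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

query = {
    "size": 5,
    "query": {
        "knn": {
            "embedding": {                            # a knn_vector field in the index mapping
                "vector": [0.12, -0.03, 0.87, 0.45],  # query embedding (toy, 4-dim)
                "k": 5,
            }
        }
    },
}

response = client.search(index="products-vectors", body=query)
for hit in response["hits"]["hits"]:
    print(hit["_score"], hit["_source"].get("title"))
```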

https://opensearch.org/blog/unveiling-opensearch-3-0

Adobe announces updated Firefly

Adobe updated Firefly, its all-in-one app for AI-assisted content ideation, creation, and production. Firefly empowers creators to generate images, video, audio, and vectors from a single place with creative control, iterate on their creations across Adobe’s creative apps, and seamlessly deliver them into production. Partner model choices include Google Cloud and OpenAI, and Firefly’s AI-powered tools are deeply integrated into Creative Cloud apps.

Firefly includes all of Adobe’s commercially safe creative AI models, including the new Firefly Image Model 4 for lifelike images, the new Firefly Image Model 4 Ultra for detail and complexity, and the Firefly Video Model, which generates footage from text prompts and images with creative control.

Firefly also gives creative professionals the choice to explore different aesthetic styles using partner models, with Google Cloud and OpenAI models available today and models from partners including fal.ai, Ideogram, Luma, Pika, and Runway available in the coming months. The all-new Firefly Boards, now in public beta in Firefly, gives creators an AI-first surface for moodboarding, exploring creative concepts, iterating on hundreds of variations at once, and collaborating on ideation.

Creators can use Firefly on the web today, with the mobile app coming soon.

https://www.adobe.com/ai/overview/firefly/gen-ai-approach.html

Elastic’s Cloud Serverless now on Google Cloud Marketplace

From the Elastic blog…

Today, we are excited to announce the general availability of Elastic Cloud Serverless on Google Cloud — now available in the Iowa (us-central1) region. Elastic Cloud Serverless provides a fast way to start and scale observability, security, and search solutions without managing infrastructure. Built on the Search AI Lake architecture, which leverages Google Cloud Storage, it combines expansive storage, separation of storage and compute, low-latency querying, and advanced AI capabilities.

  • No compromise on speed or scale: Elasticsearch Serverless dynamically scales to accommodate your workload, handling unpredictable traffic spikes automatically — all while delivering low-latency search on boundless object storage.
  • Hassle-free operations: Say goodbye to managing clusters, provisioning nodes, or fine-tuning performance. Free your team from operational tasks — no need to manage infrastructure, do capacity planning, upgrade, or scale data. 
  • Purpose-built product experience: Elastic Cloud Serverless offers a streamlined workflow to help you create projects tailored to your unique use cases in observability, security, and search. With guided onboarding, you can use in-product resources and tools that guide you every step of the way, accelerating time to value.
  • Flexible usage-based pricing model: Elastic Cloud Serverless offers a usage-based pricing model that scales with your needs.
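As a minimal sketch of what "no infrastructure to manage" looks like in practice, the snippet below connects to a hypothetical serverless project endpoint with the standard Elasticsearch Python client, indexes a document, and runs a search. The endpoint URL, API key, and index name are placeholders, not values from the announcement.

```python
# Minimal sketch: using an Elastic Cloud Serverless project like any
# Elasticsearch endpoint -- no cluster sizing or node management involved.
# The endpoint URL, API key, and index name are placeholders/assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch(
    "https://my-project.es.us-central1.gcp.elastic.cloud",  # serverless project endpoint (placeholder)
    api_key="YOUR_API_KEY",
)

es.index(index="app-logs", document={"service": "checkout", "message": "payment timeout"})
es.indices.refresh(index="app-logs")

resp = es.search(index="app-logs", query={"match": {"message": "timeout"}})
print(resp["hits"]["total"]["value"], "matching documents")
```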

https://www.elastic.co/blog/elastic-cloud-serverless-google-cloud-general-availability

Sitecore launches AI Innovation Lab

Sitecore, a provider of digital experience and content management software, launched the Sitecore AI Innovation Lab, a program created in collaboration with Microsoft that provides a guided environment for marketing professionals to rapidly explore AI-driven solutions for optimizing content operations. It helps marketers define their AI journey and fast-track the development of solutions best suited to their specific use cases.

The lab's guided environment explores AI-driven solutions built with Microsoft Azure and Azure OpenAI Service. Participants work alongside Sitecore and Microsoft experts to prototype solutions for their unique challenges using an agile, low-risk approach. The result is either a validated AI solution or learnings that help marketers achieve their business objectives. AI innovations developed with customers will be integrated into Sitecore’s DXP to further enhance, improve, and future-proof the platform.

Sitecore recently released more than 250 enhancements for its composable DXP to help marketers create, build, and optimize digital experiences and content that align with their brand identity and drive engagement and conversions. Sitecore Stream helps marketers meet evolving customer expectations by speeding the creation and delivery of personalized digital experiences across all touchpoints.

The Sitecore AI Innovation Lab is available to current Sitecore customers.

https://www.sitecore.com/products/ai-innovation-lab

Altair partners with Databricks

Altair, a global provider of computational intelligence software, has partnered with Databricks, a data and AI company, to provide joint customers with capabilities for data unification, graph-powered intelligence, and enterprise-grade artificial intelligence (AI).

Customers can use Altair RapidMiner to access, prepare, and analyze data in Databricks without data duplication. The platform’s full-stack AI capabilities, from low-code AutoML to MLOps, agent frameworks, and high-speed visualization, allow organizations to prototype, deploy, and scale AI applications using data stored in Databricks.

A key differentiator is Altair RapidMiner’s massively parallel processing (MPP) knowledge graph technology, purpose-built to support knowledge graph creation, data fabrics, and ontology modeling at enterprise scale. By integrating with Databricks, customers can use the Altair RapidMiner knowledge graph engine to connect, contextualize, and activate all types of data—structured, unstructured, and streaming. These graph-powered fabrics form the foundation for a new generation of intelligent systems, enabling generative AI models and autonomous agents to navigate the full complexity of an organization’s digital operations.

Altair RapidMiner also offers native support for SAS language execution, allowing customers to preserve and extend the value of their existing analytics investments while modernizing their workflows.
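Altair has not published integration code here, so purely as a generic, hedged illustration of querying Databricks data in place (rather than copying it out), the sketch below uses the Databricks SQL connector for Python. The hostname, HTTP path, access token, and table name are placeholders, and this is not necessarily how Altair RapidMiner itself connects.

```python
# Generic sketch of querying Databricks data in place via the
# databricks-sql-connector package; NOT Altair RapidMiner's own mechanism.
# Hostname, HTTP path, token, and table name are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890.12.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abc123",
    access_token="YOUR_PERSONAL_ACCESS_TOKEN",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT customer_id, segment FROM main.sales.customers LIMIT 10")
        for row in cursor.fetchall():
            print(row.customer_id, row.segment)
```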

https://altair.com/altair-rapidminer ■ https://www.databricks.com

Deepgram introduces Aura-2, enterprise-grade text-to-speech

Deepgram, a voice AI platform for enterprise use cases, announced Aura-2, its next-generation text-to-speech (TTS) model purpose-built for real-time voice applications in mission-critical business environments. Engineered for clarity, consistency, and low-latency performance, and deployable via cloud or on-premises APIs, Aura-2 enables developers to build scalable, human-like voice experiences for automated interactions across the enterprise, including customer support, virtual agents, and AI-powered assistants. Aura-2 is built on the Deepgram Enterprise Runtime, the infrastructure that powers the company’s speech-to-text (STT) and speech-to-speech (STS) capabilities, providing enterprises with the control, adaptability, and performance required to deploy and scale production-grade voice AI.

A significant gap exists between entertainment-focused models and the operational demands of enterprise-grade voice systems. Entertainment-focused TTS platforms are trained on and optimized for storytelling, character voices, and emotionally expressive delivery. Enterprise applications require more than natural-sounding voices: they demand domain-specific pronunciation, a professional tone, consistent contextual handling, and the ability to perform reliably, cost-effectively, and securely, often in environments that require full deployment control.

Aura-2 is powered by Deepgram Enterprise Runtime (DER)—a custom-built infrastructure layer that runs all of Deepgram’s speech models. Designed specifically for enterprise-grade performance, DER orchestrates voice AI in real time with the speed, reliability, and adaptability required for production-scale deployments.
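Deepgram's announcement does not include code, so here is a minimal sketch of calling its hosted text-to-speech REST endpoint (/v1/speak) with the requests library. The specific Aura-2 model name below is an assumption for illustration; consult Deepgram's documentation for the actual Aura-2 voice identifiers.

```python
# Minimal sketch: synthesizing speech via Deepgram's /v1/speak REST endpoint.
# The model name below is an assumed Aura-2 voice identifier; check
# Deepgram's documentation for the real list of Aura-2 voices.
import requests

DEEPGRAM_API_KEY = "YOUR_API_KEY"

response = requests.post(
    "https://api.deepgram.com/v1/speak",
    params={"model": "aura-2-example-en"},  # assumed/illustrative model name
    headers={
        "Authorization": f"Token {DEEPGRAM_API_KEY}",
        "Content-Type": "application/json",
    },
    json={"text": "Your order has shipped and should arrive on Thursday."},
    timeout=30,
)
response.raise_for_status()

with open("reply.mp3", "wb") as f:
    f.write(response.content)  # audio bytes (MP3 by default)
```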



© 2025 The Gilbane Advisor
