Atlan, an active metadata platform, has launched Tag Management, accelerating the shift-left of data governance.
As the modern data stack continues to evolve, data teams are faced with the challenge of ensuring the right people have the right access to the right data. Data teams need to ensure that they can confidently identify sensitive data across their data stack and protect it with the right access controls, while serving trusted data to data consumers.
Atlan’s Tag Management is a new way for data teams to manage data access across the modern data stack. Tags are important metadata that can be assigned to data assets to monitor sensitive data for compliance, discovery, and protection use cases.
For data teams that have tag-based access control built into their Snowflake Data Cloud, Atlan can now become the control plane for access control management. Once a data producer tags a data asset in Atlan or Snowflake, data teams can rest assured that the data asset is protected across the data ecosystem.
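On the Snowflake side, tag-based access control typically pairs object tags with masking policies, so that any column carrying a given tag is automatically masked. A minimal sketch of the SQL such a setup might involve, composed as strings in Python; all object names (`governance.pii`, `hr.employees`, `mask_pii`, the `PII_ADMIN` role) are hypothetical placeholders, not names from the announcement:

```python
# Sketch of the Snowflake SQL behind tag-based access control.
# All object names and roles below are hypothetical examples.

def tag_based_masking_statements(tag: str, column: str, table: str) -> list[str]:
    """Compose DDL that tags a sensitive column and binds a masking
    policy to the tag, so every tagged column inherits the policy."""
    return [
        # Create the tag that marks sensitive data.
        f"CREATE TAG IF NOT EXISTS {tag};",
        # Assign the tag to a sensitive column.
        f"ALTER TABLE {table} MODIFY COLUMN {column} SET TAG {tag} = 'sensitive';",
        # A masking policy that reveals values only to a privileged role.
        (
            "CREATE MASKING POLICY IF NOT EXISTS mask_pii AS (val STRING) "
            "RETURNS STRING -> CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' "
            "THEN val ELSE '***MASKED***' END;"
        ),
        # Bind the policy to the tag: every column with the tag is masked.
        f"ALTER TAG {tag} SET MASKING POLICY mask_pii;",
    ]

statements = tag_based_masking_statements("governance.pii", "email", "hr.employees")
for stmt in statements:
    print(stmt)
```

In this model, Atlan (or any external control plane) only needs to manage tag assignments; the masking behavior follows from the tag-to-policy binding inside Snowflake.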
Expert.ai unveiled its “Enterprise Language Model for Insurance” – ELMI, a domain-trained language model, to help insurers reach their process automation and digital transformation goals with the highest accuracy. By simplifying and powering the interaction with language data within the expert.ai Platform for Insurance, insurers can access solutions that scale and take advantage of deep insurance domain expertise combined with the best and most cost-effective attributes of Large Language Models (LLMs) to automate core processes.
Through the expert.ai Platform for Insurance, ELMI supports key capabilities, including:
- Generative Summarization: generate accurate summaries, condensing vast amounts of claim or policy information into concise insights, saving time and accelerating straight-through processing or human review activities.
- Zero-Shot Extraction: extract crucial insurance data from structured or unstructured, handwritten or typed, high- or low-quality sources with accuracy, automatically normalize output formats, and add medical annotations such as ICD-9/10 medical codes.
- Generative Q&A: answer questions quickly so underwriters and claims handlers can extract meaningful insights from proprietary case files using natural language queries.
- Cloud-agnostic: ELMI can be deployed on any cloud infrastructure or on-premises, easily meeting insurers’ varying requirements.
Acquia, a digital experience technology provider, announced the launch of an integration hub, Acquia Exchange, to enhance the flexibility and extensibility of its digital experience platform. Acquia Exchange offers customers a single destination to discover integrations, connectors, and modules that enhance Acquia solutions and connect them to those of other technology providers. This intuitive hub makes it easier for organizations to extend their digital experience platform (DXP) using technology from Acquia’s ecosystem of SaaS partners.
As an open and composable DXP, Acquia allows for integrations with a wide range of third-party marketing, sales, and digital technologies. This enables customers to build productive digital experiences that meet the precise requirements of their own audiences. Now, with Acquia Exchange, customers can explore integrations by company name, technology category, or Acquia product.
Acquia Exchange helps customers discover three different types of integrations:
- Native – Developed and supported by Acquia.
- Partner – Developed and supported by an Acquia technology partner or a solution partner.
- Community – Developed and supported by a third-party vendor.
MongoDB announced new capabilities, performance improvements, and a data-streaming integration for MongoDB Atlas Vector Search.
Developers can more easily aggregate and filter data, improving semantic information retrieval and reducing hallucinations in AI-powered applications. With new performance improvements for MongoDB Atlas Vector Search, the time it takes to build indexes is reduced to help accelerate application development. Additionally, MongoDB Atlas Vector Search is now integrated with fully managed data streams from Confluent Cloud to make it easier to use real-time data from a variety of sources to power AI applications.
MongoDB Atlas Vector Search provides the functionality of a vector database integrated as part of a unified developer data platform, allowing teams to store and process vector embeddings alongside virtually any type of data to more quickly and easily build generative AI applications.
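The aggregate-and-filter workflow described above is expressed as a standard MongoDB aggregation pipeline with a `$vectorSearch` stage. A minimal sketch in Python that builds such a pipeline as plain dictionaries; the index name (`embedding_index`), field names, and query vector are hypothetical, and on a real Atlas cluster the pipeline would be passed to `collection.aggregate()`:

```python
# Sketch of an Atlas Vector Search pipeline that combines semantic
# (vector) retrieval with a metadata pre-filter. Index and field
# names, and the query vector, are hypothetical examples.

def build_vector_search_pipeline(query_vector, category):
    return [
        {
            "$vectorSearch": {
                "index": "embedding_index",   # hypothetical vector index name
                "path": "embedding",          # field holding the embeddings
                "queryVector": query_vector,
                "numCandidates": 100,         # candidates considered before ranking
                "limit": 5,                   # results returned
                # Pre-filter narrows the candidate set before similarity scoring,
                # which helps keep retrieved context relevant.
                "filter": {"category": {"$eq": category}},
            }
        },
        # Project only the fields the application needs, plus the score.
        {
            "$project": {
                "title": 1,
                "score": {"$meta": "vectorSearchScore"},
            }
        },
    ]

pipeline = build_vector_search_pipeline([0.1, 0.2, 0.3], "support-articles")
# On an Atlas cluster: results = db.articles.aggregate(pipeline)
```

Filtering before scoring, as sketched here, is one way teams ground retrieval in trusted subsets of their data when building retrieval-augmented applications.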
Amazon and Anthropic announced a strategic collaboration that will bring together their respective technology and expertise in safer generative artificial intelligence (AI) to accelerate the development of Anthropic’s future foundation models and make them widely accessible to AWS customers. As part of the expanded collaboration:
- Anthropic will use AWS Trainium and Inferentia chips to build, train, and deploy its future foundation models, benefitting from the price, performance, scale, and security of AWS.
- AWS will become Anthropic’s primary cloud provider for mission-critical workloads, including safety research and future foundation model development. Anthropic plans to run the majority of its workloads on AWS.
- Anthropic has made a long-term commitment to provide AWS customers around the world with access to future generations of its foundation models via Amazon Bedrock. In addition, Anthropic will provide AWS customers with early access to unique model customization and fine-tuning capabilities.
- Amazon will invest up to $4 billion in Anthropic and have a minority ownership position in the company.
- Amazon developers and engineers will be able to build with Anthropic models via Amazon Bedrock so they can incorporate generative AI capabilities into their work.
https://press.aboutamazon.com/2023/9/amazon-and-anthropic-announce-strategic-collaboration-to-advance-generative-ai ■ https://www.anthropic.com
Contentful, a composable content platform for digital-first business, today announced that it has expanded its relationship with Amazon Web Services (AWS) to launch the Contentful Composable Content Platform in AWS Marketplace. AWS Marketplace is a digital catalog with thousands of software listings from independent software vendors that make it easy to find, test, buy, and deploy software and services that run on AWS. With Contentful, powered by AWS, customers can build and orchestrate content experiences across all of their digital channels and platforms through AWS Marketplace.
Now, AWS customers can license Contentful through AWS Marketplace for seamless integration of Contentful with their existing cloud infrastructure, in addition to streamlined procurement, consolidated billing and cost-saving opportunities. This enables streamlined management and enhanced performance of digital content, empowering customers to deliver omni-channel digital experiences with increased speed and agility, and ensuring reliability and consistency to meet evolving consumer demands. Contentful, powered by AWS, allows organizations to build digital experiences by creating, managing, and delivering content across websites, mobile apps, and digital platforms.
StreamText, an enterprise captioning platform, announced the latest release of its Automatic Speech Recognition (ASR) technology powered by artificial intelligence (AI). With the ability to create captions directly from an audio source, StreamText ASR features term glossaries to help fine-tune captioning AI for specific events and increase overall accuracy. The platform offers direct integrations with meeting software such as Zoom and Adobe Connect. It also supports over 50 source languages, including variants of English, French, and Spanish. While human captioning is often more accurate than its AI counterpart, it may not be practical for every captioning need. In those cases, StreamText ASR offers a solution. ASR is useful in university settings, classrooms, government administration, and broadcast media.
Writer, a full-stack generative AI platform for enterprises, announced its Series B funding round of $100 million today. The round is led by ICONIQ Growth with participation from WndrCo, Balderton Capital, and Insight Partners, which led the Series A, and Aspect Ventures, which led the Seed. In addition, this round includes participation from several Writer customers, such as Accenture and Vanguard.
The Series B funding will be used to further invest in the company’s own industry-specific large language models (LLMs), and to add agent and multimodal capabilities to its LLMs. Writer is built from the ground up for the enterprise. It empowers the entire organization, including support, operations, product, sales, HR, and marketing.
The platform includes Writer-built LLMs, a Knowledge Graph to integrate with business data sources, and an application layer of chat interfaces, prebuilt templates, and composable UI options. Writer models can be self-hosted, which allows customers to get the security benefits of building their own model with the speed-to-value benefits of an end-to-end solution. Writer takes a full-stack approach that enables diverse use cases across the entire organization, rather than relying solely on foundation models or an out-of-the-box app that only generates content.