Zeta Alpha, a neural search and discovery platform, announced that it has integrated OpenAI’s GPT with its semantic neural search engine to provide more reliable and explainable AI-generated answers to enterprise search queries. This capability gives workers the ability to leverage GPT to access knowledge hidden in troves of internal company data.
Generative AI models like GPT tend to ‘hallucinate,’ or give answers that seem plausible but are not factually correct. This prevents organizations from adopting AI tools for enterprise search and knowledge management. The combination of Zeta Alpha’s intelligent neural search engine and advances in GPT-3 reduces this problem by applying natural language understanding. Other enhancements include:
- InPars v2, a GPT-powered neural search model that enables fast tuning on synthetic in-domain data without the cost of creating terminology lists and taxonomies.
- Zeta Alpha enables users to ask a question and get contextually relevant results, automatically save text to a spreadsheet or note for further analysis, and map back to the location where each document is stored for future access.
- Visualizing the information landscape in a semantic map and interpreting it with summaries by GPT can guide knowledge workers in the right direction to answer important strategic questions.
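The grounding approach described above, in which generated answers are tied back to retrieved documents, can be sketched as follows. The retrieval scoring (a toy word-overlap measure standing in for a neural retriever) and the prompt wording are illustrative assumptions, not Zeta Alpha's implementation.

```python
# Illustrative sketch of retrieval-grounded answering: retrieve the most
# relevant passages first, then instruct the generative model to answer
# ONLY from those passages and cite them, so every claim is traceable.

def retrieve(query: str, passages: list[str], k: int = 2) -> list[str]:
    """Rank passages by word overlap with the query and return the top k.

    A production system would use a neural (dense-vector) retriever here.
    """
    q_words = set(query.lower().split())
    scored = sorted(passages,
                    key=lambda p: len(q_words & set(p.lower().split())),
                    reverse=True)
    return scored[:k]

def grounded_prompt(query: str, passages: list[str]) -> str:
    """Build a prompt that constrains the model to the retrieved sources."""
    sources = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (f"Answer the question using ONLY the sources below, and cite "
            f"them by number. If the sources are insufficient, say so.\n\n"
            f"{sources}\n\nQuestion: {query}\nAnswer:")

docs = ["Quarterly revenue grew 12% year over year.",
        "The office cafeteria menu changes weekly.",
        "Revenue growth was driven by the enterprise segment."]
top = retrieve("what drove revenue growth?", docs)
prompt = grounded_prompt("What drove revenue growth?", top)
```

Because the model is told to cite numbered sources and to admit when the sources are insufficient, unsupported (hallucinated) claims become easier to detect and discard.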
Uniform announced a partnership with Algolia. With the new integration, marketers and merchandisers can query Algolia within Uniform Canvas to search either for specific results, such as blog entries or products, or for a dynamic range based on a search query. This process uses Algolia to deliver results quickly and can be used with any connected index within Algolia. These results can also be used for testing and personalization using Uniform Context.
In practice, this means that a Canvas component can be connected to an Algolia index built from a commerce engine, then set to search for either alternative products (e.g., showing alternative brown leather shoes when viewing a product detail page) or complementary products to drive cross-sales (e.g., brown belts and bags to match the shoes being viewed). As new products are added to the index, the component automatically adds the most relevant options to boost conversions without manual work. This can also be used to recommend content or tutorials to support marketing use cases.
With Algolia’s prebuilt front-end component, business users can control the display of search results (pagination, columns, other page content) as well as the display of images in the Algolia index, all without developer support.
https://uniform.dev/ ■ https://algolia.com
Sinequa announced the 11.9 release of its Search Cloud Platform. The new update improves on Sinequa’s Neural Search by expanding integration options with new pre-built connectors, and enhancing its tools for delivering intelligent search applications. New features and enhancements include:
- Better relevance: Enhanced document splitting and automatic content management for higher-quality passages and augmented answer extraction capabilities
- Better experience: UI/UX enhancements, including built-in components for Top Passages and Answers
- More languages: Initially available in English, Neural Search now supports French, German, and Spanish as well
- PTC Windchill: The connector for PLM content integration has been upgraded to support v12, including all object types and all security protocols.
- Office 365 OneNote: The addition of OneNote means Sinequa now has off-the-shelf connectors for the full suite of Office 365 applications.
- Audit Log: A built-in connector replaces the previous one with more capabilities, such as event filtering.
- Azure Blob Storage: Full indexing of Azure content and simplified role management.
- Sinequa Themes: A modernized UI with modular themes that can be easily added, removed, and customized.
- Search Starter App Builder now includes additional components, options, and a more intuitive UI for instantly delivering a best-in-class search experience.
Pinecone Systems Inc., a machine learning (ML) search infrastructure company, announced the release of a keyword-aware semantic search solution that enables an accessible, advanced combination of semantic and keyword search results. “Vector search” allows companies to provide relevant results based on semantics, or similar meanings, as opposed to simple keyword matches. At the same time, keywords still matter in searches involving uncommon words like names or industry-specific terms. With few exceptions, companies have had to choose between semantic search and keyword search, or run both systems in parallel.
Neither of these options is ideal. When companies choose one or the other, the results are not as complete as they could be, and when they run both systems in parallel and try to combine the results, cost and complexity go up significantly. This technology can search across two data types — “dense vectors” generated by ML models to represent meaning, and “sparse vectors” generated by traditional keyword-ranking models such as BM25 — before automatically fusing everything into one ranked list of the most relevant results. The Pinecone hybrid search feature is available in beta.
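The dense-plus-sparse fusion described above is often expressed as a convex combination of the two scores. The sketch below shows that general technique; the weighting scheme and toy vectors are illustrative assumptions, not Pinecone's exact method.

```python
# Sketch of hybrid retrieval fusion: each document carries both a dense
# vector (semantic meaning) and a sparse vector (keyword weights, e.g.
# from BM25), and a single parameter alpha blends the two score types.

def sparse_dot(u: dict, v: dict) -> float:
    """Dot product of sparse vectors stored as {index: weight} dicts."""
    return sum(w * v.get(i, 0.0) for i, w in u.items())

def cosine(u: list[float], v: list[float]) -> float:
    num = sum(a * b for a, b in zip(u, v))
    den = (sum(a * a for a in u) ** 0.5) * (sum(b * b for b in v) ** 0.5)
    return num / den if den else 0.0

def hybrid_rank(query_dense, query_sparse, docs, alpha=0.5):
    """Score each doc by alpha * dense score + (1 - alpha) * sparse score.

    alpha=1.0 is pure semantic search, alpha=0.0 is pure keyword search.
    Each doc is a (doc_id, dense_vec, sparse_vec) tuple.
    """
    scored = [(doc_id,
               alpha * cosine(query_dense, dv) +
               (1 - alpha) * sparse_dot(query_sparse, sv))
              for doc_id, dv, sv in docs]
    return sorted(scored, key=lambda x: x[1], reverse=True)

ranking = hybrid_rank([1.0, 0.0], {0: 1.0},
                      [("a", [1.0, 0.0], {0: 2.0}),
                       ("b", [0.0, 1.0], {1: 1.0})])
```

Fusing at scoring time, rather than merging result lists from two parallel systems, is what avoids the cost and complexity the announcement mentions.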
Elastic announced updates across the Elastic Search Platform, a data analytics platform for search-powered solutions, including:
- Simplifying the Elastic Cloud on AWS experience: enabling customers to ingest data from any AWS service into Elastic Cloud on AWS directly from the AWS Marketplace with just three clicks
- Improving search relevance with machine learning-based hybrid scoring
- Combining traditional keyword scoring with vector search scoring capabilities.
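The hybrid scoring described above can be expressed in a single Elasticsearch 8.x search request, where a `knn` clause runs alongside a standard keyword `query` and the boosted scores are summed. The field names, query vector, and boost values below are illustrative assumptions about a particular deployment's mapping, not part of the announcement.

```python
# Shape of an Elasticsearch 8.x hybrid search request body combining
# keyword (BM25) scoring with approximate kNN dense-vector scoring.

hybrid_body = {
    "query": {                       # traditional keyword scoring (BM25)
        "match": {"title": {"query": "trail running shoes", "boost": 0.4}}
    },
    "knn": {                         # dense-vector scoring
        "field": "title_embedding",  # a dense_vector field in the mapping
        "query_vector": [0.12, -0.53, 0.97],  # embedding of the query text
        "k": 10,
        "num_candidates": 100,
        "boost": 0.6,
    },
    "size": 10,
}
# Elasticsearch adds the boosted keyword score and the boosted vector
# score for each document, producing a single hybrid-ranked result list.
```

The body would be sent to the `_search` endpoint; adjusting the two `boost` values shifts the ranking between keyword-dominated and semantics-dominated behavior.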
They also announced plans to develop stateless Elasticsearch, a new, fully cloud-native architecture. The stateless architecture will fully decouple compute and storage services, enabling customers to store and search all of their data in stateless object storage services such as Amazon S3, Azure Blob Store, and Google Cloud Storage.
Elastic also introduced a private beta of a new Universal Profiling capability and additional synthetic monitoring capabilities, providing visibility into how application code and infrastructure are performing at all times, in production, across a wide range of languages, and in both containerized and non-containerized environments. In addition, a new managed testing infrastructure within Elastic Uptime, also in beta, enables customers to schedule tests from a global network of testing agents for greater visibility into regional variances.
Algolia, an API-First Search & Discovery Platform, announced the acquisition of Search.io, whose flagship product is Neuralsearch – a vector search engine that uses hashing technology on top of vectors to provide price performance at scale. Algolia will combine its keyword search and Search.io’s Neuralsearch into a single API-First Search and Discovery platform with a hybrid search engine, which comprises both keyword and semantic search in a single API.
The combination of Algolia (with its keyword search) and Search.io (with its vector-based semantic search) enables Algolia to more effectively surface the most accurate and relevant results for users, whether they use specific keywords or natural human expressions. Many companies claim to offer some form of semantic search; however, they may not offer keyword search and vector-based semantic search in a single API cost-effectively, or the ability to scale. In essence, Algolia provides users with the ability to search as they think. With Search.io, Algolia aims to empower business users with a better way to manage the automation of unique and engaging end-user experiences.
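Search.io's exact hashing scheme is proprietary, but "hashing technology on top of vectors" generally refers to techniques like random-hyperplane locality-sensitive hashing, sketched below: dense vectors are mapped to short binary codes so that similar vectors tend to collide, which is what makes vector search cheap at scale. All parameters here are illustrative.

```python
import random

# Random-hyperplane hashing: each of n_bits random hyperplanes contributes
# one bit, set by which side of the plane the vector falls on. Vectors
# pointing in similar directions (high cosine similarity) get similar codes.

def make_hasher(dim: int, n_bits: int, seed: int = 0):
    rng = random.Random(seed)
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]

    def hash_vec(v: list[float]) -> int:
        code = 0
        for plane in planes:
            side = sum(a * b for a, b in zip(plane, v)) >= 0
            code = (code << 1) | int(side)
        return code

    return hash_vec

h = make_hasher(dim=4, n_bits=16)
a = [1.0, 0.2, 0.0, 0.3]
b = [2.0, 0.4, 0.0, 0.6]   # same direction as a, different length
c = [-x for x in a]        # opposite direction
```

Because the hash depends only on a vector's direction, `a` and `b` collide exactly, while `c` lands in the opposite bucket on every bit; at query time only documents in nearby buckets need full scoring.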
Access Innovations, Inc., provider of Data Harmony software solutions, announced the release of their new Recommender as part of the Data Harmony Suite. Recommender is now available to all Data Harmony clients using versions 3.16 or higher.
Recommender uses the semantic fingerprint of an article (its subject metadata tagging) to match it to other articles and content within the database. When searchers find an article they like, Recommender automatically displays other items with the same semantic fingerprint nearby on the search interface. This allows immediate display of highly relevant content without the scrolling and frustration of trying to find similar items. It also allows for display of other relevant content such as conference papers, ads, books, meetings, expert profiles, and so forth.
This is not based on personalization profiles or purchasing history. By using metadata weighting and other algorithms, it surfaces only items relevant to the current query, resulting in faster search and more related information for the user.
For those interested in using Recommender there are two prerequisites: 1) the content needs to be indexed or tagged using a controlled vocabulary like a thesaurus or taxonomy, and 2) the search interface needs to be able to accommodate the API call to the tagged data and subsequent display of the results.
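Given those two prerequisites, the matching step can be sketched as a weighted overlap of controlled-vocabulary tags. The scoring below is an illustrative assumption, since Data Harmony's actual weighting algorithms are not public.

```python
# Sketch of fingerprint-style recommendation: every item is tagged with
# weighted terms from a controlled vocabulary (a taxonomy or thesaurus),
# and related items are ranked by the weighted overlap of shared tags.
# No personalization profile or purchase history is involved.

def recommend(current: dict, catalog: list[dict], k: int = 3) -> list:
    """Rank catalog items by weighted tag overlap with the current article.

    Each item is {"id": ..., "tags": {term: weight, ...}}.
    """
    def score(item):
        shared = current["tags"].keys() & item["tags"].keys()
        return sum(current["tags"][t] * item["tags"][t] for t in shared)

    ranked = sorted((i for i in catalog if i["id"] != current["id"]),
                    key=score, reverse=True)
    return [i["id"] for i in ranked[:k]]

article = {"id": "a1", "tags": {"taxonomy": 2.0, "metadata": 1.0}}
catalog = [article,
           {"id": "a2", "tags": {"taxonomy": 1.5, "metadata": 0.5}},
           {"id": "a3", "tags": {"cooking": 1.0}},
           {"id": "a4", "tags": {"metadata": 2.0}}]
related = recommend(article, catalog)
```

In practice the ranked IDs would come back from an API call and the search interface would render them beside the current article, which is why the second prerequisite matters.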
Elastic, the company behind Elasticsearch, announced enhancements to its cross-cluster search and cross-cluster replication capabilities with interoperability between self-managed deployments and Elastic Cloud, now generally available. Customers can seamlessly search data across multiple Elasticsearch clusters deployed on-premises, on Kubernetes, and in the cloud.
Cross-cluster search enables users to search across multiple clusters and visualize data in one coherent view for deeper insights. Cross-cluster replication allows customers to replicate data between clusters regardless of their physical location (cloud or on-premises) and deployment model. Features include:
- Streamlining workflows with a single user interface to search and replicate data between Elasticsearch clusters regardless of environment—on-premises, public cloud, hybrid, and multi-cloud.
- Enabling customers to minimize risk and increase operational efficiency while retaining complete visibility of their data throughout the gradual migration of on-premises workloads to the cloud.
- Optimizing customers’ ability to troubleshoot production applications, analyze security events, and manage where their sensitive data resides.
- Improving disaster recovery scenarios where data redundancy and business continuity are critical, while increasing service resilience and lowering latency.
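As a concrete illustration of the single-interface search described above, Elasticsearch addresses remote clusters with a `cluster_alias:index` pattern in the search target, so one request can span local and remote data. The cluster aliases and index names below are illustrative assumptions, not from the announcement.

```python
# Build a cross-cluster search target: local indices are named plainly,
# while indices on remote clusters are prefixed with the cluster alias
# configured in the local cluster's remote-cluster settings.

def ccs_target(local_indices: list[str], remote_indices: dict) -> str:
    """Return a comma-separated search target spanning several clusters.

    remote_indices maps a configured cluster alias to an index pattern,
    e.g. {"onprem": "logs-*"}.
    """
    parts = list(local_indices)
    parts += [f"{alias}:{pattern}"
              for alias, pattern in sorted(remote_indices.items())]
    return ",".join(parts)

target = ccs_target(["logs-*"], {"onprem": "logs-*", "k8s": "logs-*"})
# The target string is then used as the index portion of a search request,
# e.g. GET /<target>/_search against the Elasticsearch REST API, returning
# one merged result set regardless of where each cluster runs.
```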