Curated for content, computing, and digital experience professionals

Category: Enterprise search & search technology

Research, analysis, and news about enterprise search and search markets, technologies, practices, and strategies, including semantic search, intranet, collaboration and workplace search, ecommerce, and other applications.

Before we consolidated our blogs, industry veteran Lynda Moulton authored our popular enterprise search blog. This category includes all her posts and other enterprise search news and analysis. Lynda’s loyal readers can find all of Lynda’s posts collected here.

For older, long form reports, papers, and research on these topics see our Resources page.

Algolia launches AI-powered Algolia NeuralSearch

Algolia launched Algolia NeuralSearch, its next-generation vector and keyword search in a single API. Algolia NeuralSearch understands natural language and delivers results in milliseconds. It uses large language models (LLMs), goes further with Algolia’s Neural Hashing for hyper-scale, and constantly learns from user interactions.

Algolia NeuralSearch analyzes the relationships between words and concepts, generating vector representations that capture their meaning. Because vector-based understanding and retrieval is combined with Algolia’s full-text keyword engine, it works for exact matching too. Algolia NeuralSearch addresses the difficulty neural search has in scaling through Neural Hashing, which compresses search vectors.
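
Algolia has not published the internals of Neural Hashing, but the general idea of compressing vectors into compact binary codes can be sketched with sign-based hashing, where similarity is approximated by comparing bits. Everything below is an illustrative toy, not Algolia's implementation:

```python
import random

def binary_hash(vec):
    """Compress a float vector to one bit per dimension (its sign)."""
    return [1 if x > 0 else 0 for x in vec]

def hamming_similarity(a, b):
    """Fraction of matching bits; a cheap proxy for full vector similarity."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

random.seed(0)
doc = [random.gauss(0, 1) for _ in range(256)]
near = [x + random.gauss(0, 0.1) for x in doc]   # slightly perturbed copy
far = [random.gauss(0, 1) for _ in range(256)]   # unrelated vector

sim_near = hamming_similarity(binary_hash(doc), binary_hash(near))
sim_far = hamming_similarity(binary_hash(doc), binary_hash(far))
# the perturbed copy shares far more bits with doc than the unrelated vector
```

The compressed codes are tiny (one bit per dimension) and comparable with fast bit operations, which is how binary hashing schemes in general trade a little accuracy for large gains in scale.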

Algolia incorporates AI across three primary functions: query understanding, query retrieval, and ranking of results.

  • Query understanding – Algolia’s advanced natural language understanding (NLU) and AI-driven vector search interpret free-form natural language queries, and AI-powered query categorization prepares and structures a query for analysis. Adaptive learning based on user feedback fine-tunes intent understanding.
  • Retrieval – The retrieval process merges the Neural Hashing results in parallel with keywords using the same index for easy retrieval and ranking.
  • Ranking – The best results are pushed to the top by Algolia’s AI-powered Re-ranking, which takes into account the many signals attached to the search query.
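
The three stages above can be sketched as a toy pipeline. The concept dictionary stands in for a learned embedding model, and the scoring is purely illustrative, not Algolia's actual implementation:

```python
# Toy concept dictionary standing in for a learned embedding model;
# all words and mappings here are invented for illustration.
CONCEPTS = {"sneakers": "shoe", "trainers": "shoe", "shoes": "shoe",
            "laptop": "computer", "notebook": "computer"}

def embed(text):
    """Map words to coarse concept ids (a stand-in for dense vectors)."""
    return {CONCEPTS.get(w, w) for w in text.lower().split()}

def semantic_score(q_vec, d_vec):
    """Jaccard overlap of concept sets (a stand-in for cosine similarity)."""
    return len(q_vec & d_vec) / len(q_vec | d_vec) if q_vec | d_vec else 0.0

def search(query, docs):
    """Understand the query, retrieve keyword and semantic hits in
    parallel over the same index, then rank by blended signals."""
    tokens = query.lower().split()                      # query understanding
    q_vec = embed(query)
    keyword = {i for i, d in enumerate(docs)            # exact keyword hits
               if any(t in d.lower().split() for t in tokens)}
    semantic = {i for i, d in enumerate(docs)           # semantic hits
                if semantic_score(q_vec, embed(d)) > 0}
    return sorted(keyword | semantic, reverse=True,     # blended ranking
                  key=lambda i: (i in keyword) + semantic_score(q_vec, embed(docs[i])))

docs = ["red trainers on sale", "gaming laptop deals", "leather shoes"]
print(search("sneakers", docs))  # -> [2, 0]
```

Note that "sneakers" matches both "trainers" and "shoes" documents even though the exact word appears in neither, which is the point of adding a semantic stage alongside keyword retrieval.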

https://www.algolia.com/products/neuralsearch/

Algolia introduces developer-friendly plan

Algolia, an AI Search and Discovery platform, evolved its pricing and packaging to be more developer-friendly with the introduction of two new developer-oriented plans: a “Build” plan that is free and a “Grow” plan that offers easy scalability at affordable prices. The new Build plan increases the number of free records a developer can store in Algolia from 10,000 to 1 million. Additionally, Algolia cut the cost of search requests in its Grow plan by 50% and of records by 60%.

Algolia’s “Build” pricing plan provides developers with free access to the entire set of capabilities in its AI-powered Search and Discovery platform. The “Grow” plan, for when a developer is ready to scale their application, offers usage-based pricing for live production settings.

A designer, creator, or builder, whether a casual or fully committed software engineer, can access all the tools, documentation, sample code, educational content, and cross-platform integration capabilities needed to get started with managing their data, building a search front end, and configuring analytics, for free. They will also have access to a developer community of more than 5 million builders. The updated pricing and packaging are available immediately.

https://www.algolia.com

Slang Labs launches CONVA

Slang Labs, a Google-backed startup from Bengaluru, announced the launch of CONVA, a full-stack solution that provides smart and highly accurate multilingual voice search capabilities inside e-commerce apps. CONVA is available as a simple SDK (software development kit) that can be integrated into existing e-commerce apps in less than 30 minutes, without developers needing any knowledge of automatic speech recognition (ASR), natural language processing (NLP), text-to-speech (TTS), or other advanced voice tech stack concepts.

CONVA-powered voice search comprehends mixed-code utterances (multiple languages in one sentence), enabling consumers to speak naturally in their own language to search for products and information inside e-commerce mobile and web apps, while allowing the brand to maintain its app backend in a single language (English). For instance, when people mix English and a vernacular language in the same search, CONVA understands both languages and provides a seamless search experience.

With voice search enabled by CONVA, customers can search inside applications using their typical colloquial terms for well-known products, and the apps will still recognize the correct product being searched for.
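
The mixed-code idea, accepting vernacular terms while keeping an English-only backend, can be illustrated with a toy normalization step. The translation table and vernacular words below are invented for illustration only; the real SDK additionally handles ASR, NLP, and TTS:

```python
# Toy translation table standing in for CONVA's multilingual understanding;
# the vernacular words and mappings here are invented for illustration.
HINDI_TO_ENGLISH = {"joota": "shoe", "kapda": "cloth", "sasta": "cheap"}

def normalize_mixed_code(utterance):
    """Map vernacular tokens to English so the backend sees one language."""
    return " ".join(HINDI_TO_ENGLISH.get(w, w) for w in utterance.lower().split())

query = normalize_mixed_code("sasta running joota")
print(query)  # -> cheap running shoe
```

The English backend index never has to know the query arrived in two languages, which is what lets the brand keep a single-language catalog.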

https://www.slanglabs.in/media

Weaviate releases generative search module

Weaviate announced the release of a generative search module for OpenAI’s GPT-3, with support for other generative AI models (Cohere, LaMDA) to follow. The module allows Weaviate users and customers to integrate with those models and eliminates hurdles that currently limit the utility of such models in business use cases.

Generative models have so far been limited by a centralized and generic knowledge base that leaves them unable to answer business-specific questions. Weaviate’s generative module removes this limitation by allowing users to specify that the model work from users’ own Weaviate vector database. The solution combines language abilities like those of ChatGPT with a vector database that is relevant, secure, updated in real time, and less prone to hallucination.
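
The underlying pattern, retrieving from a private vector database and then grounding the generative model's prompt on the result, can be sketched in a few lines. The embeddings, documents, and prompt template below are toy stand-ins, not Weaviate's API:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# A private "vector database": (embedding, text) pairs. The embeddings are
# hand-made toy vectors; a real deployment would generate them with a model.
DB = [
    ([1.0, 0.0, 0.0], "Refunds are issued within 14 days of purchase."),
    ([0.0, 1.0, 0.0], "Shipping takes 3-5 business days."),
]

def generative_search(query_vec, prompt_template):
    """Retrieve the closest private document, then ground the prompt on it
    before handing it to a generative model (the model call is omitted)."""
    best = max(DB, key=lambda rec: cosine(rec[0], query_vec))
    return prompt_template.format(context=best[1])

prompt = generative_search([0.9, 0.1, 0.0],
                           "Answer using only this context: {context}")
print(prompt)
```

Because the prompt is constrained to retrieved, up-to-date company data rather than the model's generic training knowledge, answers are business-specific and less prone to hallucination.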

Weaviate’s open-source generative AI module is now available to download. The new module also integrates with the company’s SaaS and hybrid SaaS products for use by clients with service-level agreements.

The Weaviate vector-search engine is a “third wave” database technology: data is processed by a machine learning model first, and AI models help store and search through it. As a result, Weaviate is not limited to natural language; it can also search images, audio, video, or even genetic information.

https://weaviate.io

Zeta Alpha integrates GPT with its semantic neural engine

Zeta Alpha, a neural search and discovery platform, announced it has integrated OpenAI’s GPT with its semantic neural search engine to provide more reliable and explainable AI-generated answers to enterprise search queries. This capability gives workers the ability to leverage GPT to access knowledge hidden in troves of internal company data.

Generative AI models like GPT tend to ‘hallucinate,’ or give answers that seem plausible but are not factually correct. This prevents organizations from adopting AI tools for enterprise search and knowledge management. The combination of Zeta Alpha’s intelligent neural search engine and advances in GPT-3 reduces this problem by applying natural language understanding. Other enhancements include:

  • InPars v2, a GPT-powered neural search model that enables fast tuning on synthetic in-domain data without the cost of creating terminology lists and taxonomies.
  • Zeta Alpha enables users to ask a question and get contextually relevant results, automatically saving text to a spreadsheet or note for further analysis, and mapping back to the location where the document is saved for future access.
  • Visualizing the information landscape in a semantic map and interpreting it with summaries by GPT can guide knowledge workers in the right direction to answer important strategic questions.
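
The mapping-back behavior described above, returning results together with where the source document lives, can be illustrated with a toy index that carries provenance alongside each passage. The passages, paths, and matching logic are invented for illustration:

```python
# Toy index of passages with their source locations; results that carry
# provenance let a knowledge worker jump back to the original document.
PASSAGES = [
    {"text": "Q3 revenue grew 12% year over year.",
     "source": "reports/q3.pdf", "page": 4},
    {"text": "Hiring freeze extended through Q4.",
     "source": "memos/hr-update.docx", "page": 1},
]

def search_with_provenance(query):
    """Return matching passages together with where they were found."""
    words = set(query.lower().split())
    hits = [p for p in PASSAGES if words & set(p["text"].lower().split())]
    return [(p["text"], f'{p["source"]}#page={p["page"]}') for p in hits]

for text, location in search_with_provenance("revenue growth"):
    print(text, "->", location)
```

Attaching provenance is also what makes generated answers explainable: each claim can be traced to a specific passage in a specific file.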

https://www.zeta-alpha.com/

Uniform and Algolia partner

Uniform announced a partnership with Algolia. With the new integration, marketers and merchandisers can query Algolia within Uniform Canvas to show either specific results, such as blog entries or products, or a dynamic range of results based on a search query. This process uses Algolia to deliver results quickly and can be used with any connected index within Algolia. These results can also be used for testing and personalization using Uniform Context.

In practice, this means that a Canvas component can be connected to an Algolia index built from a commerce engine, then set to search for either alternative products (e.g., showing alternative brown leather shoes when viewing a product detail page) or complementary products to drive cross-sales (e.g., brown belts and bags to match the shoes being viewed). As new products are added to the index, the component automatically adds the most relevant options to boost conversions without manual work. This can also be used to recommend content or tutorials to support marketing use cases.

With Algolia’s prebuilt front-end component, business users can control the display of search results (pagination, columns, other page content), as well as the display of the images in the Algolia index, without developer support.

https://uniform.dev/
https://algolia.com

Sinequa updates search cloud platform

Sinequa announced the 11.9 release of its Search Cloud Platform. The new update improves on Sinequa’s Neural Search by expanding integration options with new pre-built connectors, and enhancing its tools for delivering intelligent search applications. New features and enhancements include:

  • Better relevance: enhanced document splitting and automatic content management for higher-quality passages, plus augmented answer extraction capabilities.
  • Better experience: UI/UX enhancements, including built-in components for Top Passages and Answers.
  • More languages: initially available in English, Neural Search now also supports French, German, and Spanish.
  • PTC Windchill connector: the connector for PLM content integration has been upgraded to support v12, including all object types and all security protocols.
  • Office 365 OneNote: with the addition of OneNote, Sinequa now has off-the-shelf connectors for the full suite of Office 365 applications.
  • Audit Log: a built-in connector replaces the previous one, with more capabilities such as event filtering.
  • Azure Blob Storage: full indexing of Azure content and simplified role management.
  • Sinequa Themes: a modernized UI with modular themes that can be easily added, removed, and customized.
  • Search Starter App Builder: additional components, options, and a more intuitive UI for quickly delivering a best-in-class search experience.
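
Sinequa has not detailed its splitting algorithm, but the general idea of cutting documents into overlapping word windows to produce retrievable passages can be sketched as follows (the window size and overlap are arbitrary choices):

```python
def split_passages(text, size=8, overlap=2):
    """Split a document into overlapping word windows (toy passage splitter)."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size])
            for i in range(0, max(len(words) - overlap, 1), step)]

doc = "one two three four five six seven eight nine ten eleven twelve"
for passage in split_passages(doc):
    print(passage)
```

The overlap ensures an answer that straddles a window boundary still appears whole in at least one passage, which is what higher-quality splitting buys for answer extraction.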

https://www.sinequa.com

Pinecone launches hybrid search functionality

Pinecone Systems Inc., a machine learning (ML) search infrastructure company, announced the release of a keyword-aware semantic search solution that enables accessible and advanced combination of semantic and keyword search results. “Vector search” allows companies to provide relevant results based on semantic, or similar meanings, as opposed to simple keyword-based searches. At the same time, keywords still matter in searches involving uncommon words like names or industry-specific terms. With few exceptions, companies have to choose between semantic search and keyword search, or running both systems in parallel.

Neither of these options is ideal. When companies choose one or the other, the results are not as complete as they could be, and when they run both systems in parallel and try to combine the results, cost and complexity go up significantly. This technology can search across two data types — “dense vectors” generated by ML models to represent meaning, and “sparse vectors” generated by traditional keyword-ranking models such as BM25 — before automatically fusing everything into one ranked list of the most relevant results. The Pinecone hybrid search feature is available in beta.
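
One common fusion scheme (not necessarily Pinecone's exact method) is a convex combination of a dense cosine score and a sparse dot-product score, weighted by a parameter alpha. A minimal sketch with hand-made vectors:

```python
import math

def dense_score(q, d):
    """Cosine similarity between dense (model-generated) vectors."""
    dot = sum(x * y for x, y in zip(q, d))
    return dot / (math.sqrt(sum(x * x for x in q)) * math.sqrt(sum(x * x for x in d)))

def sparse_score(q, d):
    """Dot product of sparse term-weight dicts (BM25-style weights)."""
    return sum(w * d.get(t, 0.0) for t, w in q.items())

def hybrid_rank(query, docs, alpha=0.5):
    """Fuse dense and sparse scores into one ranked list of doc indices.
    alpha weights semantic (dense) vs. keyword (sparse) evidence."""
    scored = [(alpha * dense_score(query["dense"], d["dense"])
               + (1 - alpha) * sparse_score(query["sparse"], d["sparse"]), i)
              for i, d in enumerate(docs)]
    return [i for _, i in sorted(scored, reverse=True)]

docs = [
    {"dense": [1.0, 0.0], "sparse": {"gadget": 1.2}},
    {"dense": [0.0, 1.0], "sparse": {"acme": 2.5}},   # rare brand-name term
]
query = {"dense": [0.9, 0.4], "sparse": {"acme": 1.0}}
print(hybrid_rank(query, docs))  # -> [1, 0]
```

Here the rare brand term "acme" lifts the second document above the semantically closer first one, showing why keywords still matter for uncommon words; setting alpha to 1.0 would rank on dense similarity alone.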

https://www.pinecone.io/hybrid-search-early-access


© 2024 The Gilbane Advisor
