Algolia launched Algolia NeuralSearch, its next-generation offering that combines vector and keyword search in a single API. Algolia NeuralSearch understands natural language and delivers results in milliseconds. It uses Large Language Models (LLMs), goes further with Algolia’s Neural Hashing for hyper-scale, and constantly learns from user interactions.
Algolia NeuralSearch analyzes the relationships between words and concepts, generating vector representations that capture their meaning. Because this vector-based understanding and retrieval is combined with Algolia’s full-text keyword engine, it works for exact matching too. Algolia NeuralSearch addresses the scaling limitations of neural search with its Neural Hashing, which compresses search vectors.
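Algolia has not published the internals of Neural Hashing, but the general family of techniques it belongs to compresses floating-point embeddings into compact binary codes that can be compared very cheaply. A minimal, purely illustrative sketch of that idea (sign-threshold hashing plus Hamming similarity; none of these function names come from Algolia):

```python
def neural_hash(vector):
    """Compress a float embedding into a binary code by sign thresholding.
    Illustrative only; Algolia's actual Neural Hashing scheme is proprietary."""
    return [1 if x > 0 else 0 for x in vector]

def hamming_similarity(a, b):
    """Fraction of matching bits: 1.0 means the two codes are identical."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

# Two nearby embeddings produce the same hash; an unrelated one does not.
doc   = [0.8, -0.2, 0.5, -0.9]
query = [0.7, -0.1, 0.4, -0.8]
other = [-0.6, 0.3, -0.5, 0.9]

print(hamming_similarity(neural_hash(query), neural_hash(doc)))    # 1.0
print(hamming_similarity(neural_hash(query), neural_hash(other)))  # 0.0
```

The appeal of this kind of compression is storage and speed: a binary code needs one bit per dimension instead of 32 for a float, and bit comparisons are far cheaper than full floating-point distance computations.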
Algolia incorporates AI across three primary functions: query understanding, query retrieval, and ranking of results.
- Query understanding – Algolia’s advanced natural language understanding (NLU) and AI-driven vector search interpret free-form natural language queries, while AI-powered query categorization prepares and structures each query for analysis. Adaptive learning based on user feedback fine-tunes intent understanding.
- Retrieval – The retrieval process runs Neural Hashing and keyword matching in parallel against the same index, making retrieval and ranking straightforward.
- Ranking – Algolia’s AI-powered re-ranking pushes the best results to the top, taking into account the many signals attached to the search query.
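The retrieval and ranking steps above can be sketched as a hybrid scoring loop. This is a simplified toy, not Algolia’s pipeline: the scoring functions, the `alpha` blending weight, and the two-document index are all assumptions made for illustration, and a real engine would blend many more signals during re-ranking.

```python
def keyword_score(query_terms, doc_terms):
    """Toy keyword relevance: fraction of query terms found in the document."""
    hits = sum(1 for t in query_terms if t in doc_terms)
    return hits / len(query_terms)

def vector_score(query_vec, doc_vec):
    """Toy semantic relevance: cosine similarity between embeddings."""
    dot = sum(q * d for q, d in zip(query_vec, doc_vec))
    nq = sum(q * q for q in query_vec) ** 0.5
    nd = sum(d * d for d in doc_vec) ** 0.5
    return dot / (nq * nd) if nq and nd else 0.0

def hybrid_rank(query_terms, query_vec, index, alpha=0.5):
    """Blend keyword and vector signals over the same index, then sort.
    `alpha` is an assumed knob weighting keywords against vectors."""
    scored = []
    for doc_id, (terms, vec) in index.items():
        score = (alpha * keyword_score(query_terms, terms)
                 + (1 - alpha) * vector_score(query_vec, vec))
        scored.append((doc_id, score))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical two-document index: (term set, embedding) per document.
index = {
    "running-shoes": ({"running", "shoes", "trail"}, [0.9, 0.1]),
    "dress-shoes":   ({"dress", "shoes", "leather"}, [0.1, 0.9]),
}
results = hybrid_rank(["trail", "sneakers"], [0.8, 0.2], index)
print(results[0][0])  # "running-shoes" ranks first
```

Here the query matches "running-shoes" on both signals: the keyword "trail" gives it a lexical hit, and its embedding sits closer to the query vector, so the blended score places it above "dress-shoes".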