Enterprise search provider Sinequa has announced the addition of advanced neural search capabilities to its Search Cloud platform, aimed at giving enterprises better relevance and accuracy. Offered as an optional capability of the platform, Neural Search uses four deep learning language models that come pre-trained and ready to use in combination with Sinequa’s Natural Language Processing (NLP) and semantic search.
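
Sinequa has not published the internals of these models, but the general pattern behind this kind of semantic search is well established: documents and queries are encoded into dense vectors by a pre-trained language model and ranked by vector similarity rather than exact word overlap. The sketch below illustrates that pattern only; the open-source sentence-transformers library and the all-MiniLM-L6-v2 model are assumptions chosen for the example and are not part of Sinequa’s product.

```python
# Illustrative sketch of embedding-based semantic search (not Sinequa's code).
from sentence_transformers import SentenceTransformer, util

# Any pre-trained sentence-embedding model works here; this particular model
# is an assumed, publicly available choice for the example.
model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "Employees may carry over up to five days of unused vacation.",
    "The VPN client must be updated before connecting remotely.",
    "Expense reports are reimbursed within two weeks of approval.",
]
query = "How many vacation days can I roll over to next year?"

# Encode documents and the query into dense vectors.
doc_embeddings = model.encode(documents, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Cosine similarity captures semantic closeness even when the query and the
# best-matching document share few exact keywords.
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
best = scores.argmax().item()
print(documents[best], float(scores[best]))
```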

Sinequa optimized the models and collaborated with the Microsoft Azure and NVIDIA AI/ML teams to deliver high-performance, cost-efficient infrastructure that supports intensive Neural Search workloads without a large carbon footprint. Neural Search is tuned for Microsoft Azure and the latest NVIDIA A10 or A100 Tensor Core GPUs so it can efficiently process large volumes of unstructured data as well as user queries.

Sinequa’s Neural Search improves relevance and can often answer natural language questions directly. It does this with deep neural networks that go beyond word-based search to capture meaning and context. The Search Cloud platform combines neural search with Sinequa’s extensive NLP and statistical search, and this unified approach delivers more accurate and comprehensive results across a broader range of content and use cases.
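
Sinequa does not disclose its ranking formula, but one common way to make neural and word-based (statistical) signals complementary is to normalize both and blend them into a single score. The sketch below is a generic hybrid-ranking illustration under that assumption: the keyword_score helper, the min-max normalization, the alpha weight, and the placeholder neural similarities in the demo are all hypothetical choices, not Sinequa’s method.

```python
# Generic hybrid ranking sketch: blend a lexical signal with a neural signal.

def keyword_score(query: str, doc: str) -> float:
    """Toy lexical signal: fraction of query terms that appear in the document."""
    q_terms = set(query.lower().split())
    d_terms = set(doc.lower().split())
    return len(q_terms & d_terms) / max(len(q_terms), 1)

def normalize(scores):
    """Min-max normalize so signals on different scales can be combined."""
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]

def hybrid_rank(query, docs, neural_scores, alpha=0.5):
    """Blend lexical and neural signals; alpha is an assumed mixing weight."""
    lexical = normalize([keyword_score(query, d) for d in docs])
    neural = normalize(neural_scores)
    combined = [alpha * n + (1 - alpha) * l for n, l in zip(neural, lexical)]
    return sorted(zip(docs, combined), key=lambda pair: pair[1], reverse=True)

docs = [
    "Employees may carry over up to five days of unused vacation.",
    "The VPN client must be updated before connecting remotely.",
]
query = "how many vacation days roll over"
# Placeholder similarities standing in for real embedding-model output
# (e.g. from the earlier sketch); they are not measured values.
neural = [0.82, 0.10]
print(hybrid_rank(query, docs, neural))
```

In practice, the neural similarities would come from an embedding model like the one sketched earlier, and the mixing weight would be tuned per corpus and use case.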

https://www.sinequa.com/product-enterprise-search/neural-search/