Sinequa announced the availability of Sinequa Assistants: enterprise generative AI assistants that integrate with enterprise content and applications to augment and transform knowledge work. Sinequa's Neural Search complements GenAI and provides the foundation for Sinequa's Assistants. The Assistants go beyond RAG's conventional search-and-summarize paradigm to intelligently execute complex, multi-step activities, all grounded in facts, to augment the way employees work.

Sinequa's Assistants leverage all company content and knowledge to generate contextually relevant insights and recommendations. Optimized for scale with three custom-trained small language models (SLMs), Sinequa Assistants help ensure accurate conversational responses on any internal topic, complete with citations and traceability to the original source.
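Sinequa does not publish its implementation, but the retrieve-then-ground pattern with source citations described above can be illustrated with a minimal, self-contained Python sketch. Everything here (the toy passage store, the keyword scoring, the prompt layout) is a hypothetical placeholder, not a Sinequa API.

```python
# Illustrative retrieval-augmented generation (RAG) sketch with source citations.
# All names (Passage, DOCS, retrieve, build_prompt) are hypothetical placeholders,
# not Sinequa APIs.

from dataclasses import dataclass


@dataclass
class Passage:
    doc_id: str  # identifier of the original source document
    text: str    # passage content used as grounding context


# Toy in-memory "index"; a real deployment would query an enterprise search engine.
DOCS = [
    Passage("hr-policy.pdf", "Employees accrue 25 vacation days per year."),
    Passage("it-handbook.docx", "VPN access requires multi-factor authentication."),
]


def retrieve(query: str, k: int = 2) -> list[Passage]:
    """Rank passages by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = sorted(DOCS, key=lambda p: -len(terms & set(p.text.lower().split())))
    return scored[:k]


def build_prompt(query: str, passages: list[Passage]) -> str:
    """Number each retrieved passage so the model can cite it as [1], [2], ..."""
    context = "\n".join(f"[{i + 1}] ({p.doc_id}) {p.text}" for i, p in enumerate(passages))
    return (
        "Answer using only the sources below and cite them by number.\n"
        f"Sources:\n{context}\n\nQuestion: {query}\nAnswer:"
    )


if __name__ == "__main__":
    q = "How many vacation days do employees get?"
    sources = retrieve(q)
    # A real assistant would send this prompt to an LLM; printing it keeps the
    # sketch runnable without any external service.
    print(build_prompt(q, sources))
    print("Citable sources:", [p.doc_id for p in sources])
```

Because every answer is composed from numbered passages, the citation markers in the generated text can be traced back to the original documents, which is the traceability property the announcement emphasizes.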

Sinequa Assistants work with any public or private generative LLM, including Cohere, OpenAI, Google Gemini, Microsoft Azure OpenAI, and Mistral. The Sinequa Assistant framework includes ready-to-go Assistants along with tools to define custom Assistant workflows, so customers can use an Assistant out of the box or tailor and manage multiple Assistants from a single platform. These Assistants can be adapted to specific business scenarios and deployed and updated quickly without code or additional infrastructure. Domain-specific Assistants for scientists, engineers, lawyers, financial asset managers, and others are available.
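The press release does not say how the framework swaps between LLM providers; one common way to support pluggable backends is a thin common interface that each provider implements, sketched below with hypothetical stub clients (no real vendor SDK calls are made).

```python
# Hypothetical sketch of a provider-agnostic LLM interface, illustrating how an
# assistant framework could swap backends (e.g. OpenAI, Cohere, Mistral) without
# changing assistant logic. The stub clients below do not call real SDKs.

from typing import Protocol


class LLMClient(Protocol):
    def complete(self, prompt: str) -> str:
        """Return a completion for the given prompt."""
        ...


class EchoClient:
    """Stand-in backend for local testing; a real one would wrap a vendor SDK."""

    def complete(self, prompt: str) -> str:
        return f"[echo] {prompt[:60]}..."


class CannedClient:
    """Stand-in backend returning a fixed response, useful for workflow tests."""

    def __init__(self, response: str):
        self.response = response

    def complete(self, prompt: str) -> str:
        return self.response


def run_assistant_step(client: LLMClient, instruction: str, context: str) -> str:
    """One step of an assistant workflow: combine an instruction with grounded context."""
    return client.complete(f"{instruction}\n\nContext:\n{context}")


if __name__ == "__main__":
    backends = [EchoClient(), CannedClient("Summary: 25 vacation days per year [1].")]
    for backend in backends:
        print(run_assistant_step(backend, "Summarize with citations.", "[1] hr-policy.pdf ..."))
```

Keeping assistant logic behind an interface like this is what lets the same workflow run against whichever public or private LLM a customer chooses.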

https://www.sinequa.com/company/press/sinequa-augments-companies-with-release-of-new-generative-ai-assistants