Docugami, a document engineering company that transforms how businesses create and execute critical business documents, announced an initial integration with LlamaIndex, available via Llama Hub.
The LlamaIndex framework provides a flexible interface between a user’s information and Large Language Models (LLMs). Coupling LlamaIndex with Docugami’s ability to generate a Document XML Knowledge Graph representation of long-form Business Documents opens opportunities for LlamaIndex developers to build LLM applications that connect users to their own Business Documents, without being limited by document size or context window restrictions.
General purpose LLMs alone cannot deliver the accuracy needed for business, financial, legal, and scientific settings because they are trained on the public internet, which introduces a wide range of irrelevant and low-quality source materials. By contrast, Docugami is trained exclusively for business scenarios, for greater accuracy and reliability.
Systems that aim to understand the content of documents, such as retrieval and question-answering systems, will benefit from Docugami's semantic Document XML Knowledge Graph representation. Docugami's approach to document chunking allows for better understanding and processing of documents.
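The structure-aware chunking idea can be sketched in a few lines. This toy uses a simplified, invented XML document and is only an illustration; Docugami's actual Document XML Knowledge Graph format is far richer.

```python
# Minimal sketch of hierarchical, structure-aware chunking over XML.
# The document and tag names below are invented for illustration.
import xml.etree.ElementTree as ET

doc = """<contract>
  <section title="Term">
    <clause>This agreement lasts 24 months.</clause>
  </section>
  <section title="Payment">
    <clause>Invoices are due within 30 days.</clause>
    <clause>Late payments accrue 1.5% monthly interest.</clause>
  </section>
</contract>"""

def chunk(elem, path=""):
    """Walk the XML tree, emitting one chunk per leaf element,
    tagged with its structural path for retrieval context."""
    here = f"{path}/{elem.tag}"
    if elem.get("title"):
        here += f"[{elem.get('title')}]"
    children = list(elem)
    if not children:
        yield {"path": here, "text": (elem.text or "").strip()}
    for child in children:
        yield from chunk(child, here)

chunks = list(chunk(ET.fromstring(doc)))
for c in chunks:
    print(c["path"], "->", c["text"])
```

Because each chunk carries its structural path, a retriever can hand an LLM a semantically coherent clause plus its context, rather than an arbitrary fixed-size window, which is how long documents can be queried without hitting context-window limits.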
AI-powered search provider Sinequa has announced domain-specific enhancements to its intelligent search platform for Scientific Search and Clinical Trial Data. Its search platform now utilizes new Neural Search and ChatGPT capabilities for faster, more effective discovery and decisions in drug development and clinical research. Sinequa will present these capabilities at the 2023 Bio-IT World Conference, May 16-18, at the Boston Convention and Exhibition Center, during conference sessions and at booth #803 in Auditorium Hall C.
Combining the capabilities of Sinequa Neural Search – multiple deep learning and large language models for natural language understanding (NLU) – with the latest ChatGPT models through Azure OpenAI Service, Sinequa enables accurate, fast, traceable semantic search, insight generation, and summarization. Users can query and converse with a secure corpus of data, including proprietary life science systems, enterprise collaboration systems, and external data sources, to answer complex and nuanced questions. Comprehensive search results with high relevance and the ability to generate concise summaries enhance R&D intelligence, optimize clinical trials, and streamline regulatory workflows.
Algolia launched Algolia NeuralSearch, its next-generation vector and keyword search offered through a single API. Algolia NeuralSearch understands natural language and delivers results in milliseconds. It uses Large Language Models (LLMs), applies Algolia's Neural Hashing to operate at hyper-scale, and constantly learns from user interactions.
Algolia NeuralSearch analyzes the relationships between words and concepts, generating vector representations that capture their meaning. Because vector-based understanding and retrieval are combined with Algolia's full-text keyword engine, it works for exact matching too. Algolia NeuralSearch addresses neural search's difficulty scaling with Neural Hashing, which compresses search vectors.
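The intuition behind compressing dense vectors into compact hashes can be illustrated with sign-based binarization and Hamming similarity; Algolia's actual Neural Hashing algorithm is proprietary, so this is only a conceptual sketch.

```python
# Conceptual sketch: compress float embeddings into binary codes and
# compare them cheaply. Not Algolia's actual (proprietary) algorithm.
def binarize(vec):
    """Compress a float vector to bits: 1 where positive, else 0."""
    return tuple(1 if x > 0 else 0 for x in vec)

def hamming_similarity(a, b):
    """Fraction of bit positions that agree; a cheap proxy for
    similarity between the original float vectors."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

query = binarize([0.8, -0.1, 0.4, -0.7])
doc_a = binarize([0.9, -0.2, 0.1, -0.6])   # similar direction to query
doc_b = binarize([-0.5, 0.3, -0.4, 0.8])   # roughly opposite direction
print(hamming_similarity(query, doc_a))  # high
print(hamming_similarity(query, doc_b))  # low
```

Bit codes are dramatically smaller than float vectors and compare with bitwise operations, which is the general reason hashed vectors scale better than raw embeddings.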
Algolia incorporates AI across three primary functions: query understanding, query retrieval, and ranking of results.
- Query understanding – Algolia’s advanced natural language understanding (NLU) and AI-driven vector search interpret free-form natural language expressions, and AI-powered query categorization prepares and structures a query for analysis. Adaptive learning based on user feedback fine-tunes intent understanding.
- Retrieval – The retrieval process merges the Neural Hashing results in parallel with keywords using the same index for easy retrieval and ranking.
- Ranking – The best results are pushed to the top by Algolia’s AI-powered Re-ranking, which takes into account the many signals attached to the search query.
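The parallel keyword-plus-vector retrieval described above can be sketched with reciprocal rank fusion (RRF), a common list-merging technique used here as a stand-in; Algolia's actual merge and re-ranking logic is not public.

```python
# Hedged sketch of merging two ranked result lists with reciprocal
# rank fusion (RRF). A stand-in, not Algolia's actual merge logic.
def rrf_merge(keyword_hits, vector_hits, k=60):
    """Score each document by its summed reciprocal ranks across
    both lists; documents ranking well in either list rise."""
    scores = {}
    for hits in (keyword_hits, vector_hits):
        for rank, doc_id in enumerate(hits, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

keyword_hits = ["shoe-123", "belt-007", "shoe-456"]  # exact-match results
vector_hits = ["shoe-456", "shoe-123", "bag-042"]    # semantic results
merged = rrf_merge(keyword_hits, vector_hits)
print(merged)
```

Note that "shoe-123" and "shoe-456", which appear in both lists, outrank documents found by only one retriever; a production system would then apply signal-based re-ranking on top of a fused list like this.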
Algolia, an AI Search and Discovery platform, evolved its pricing and packaging to be more developer-friendly with the introduction of two new developer-oriented plans: a free “Build” plan and a “Grow” plan that offers easy scalability at affordable prices. The new Build plan increases the number of free records a developer can store in Algolia from 10,000 to 1 million. Additionally, Algolia cut the cost of search requests in its Grow plan by 50% and the cost of records by 60%.
Algolia’s “Build” plan provides developers with free access to the entire set of capabilities in its AI-powered Search and Discovery platform. The “Grow” plan, for when a developer is ready to scale their application, offers usage-based pricing for live production settings.
A designer, creator, or builder, whether a casual or fully committed software engineer, can access all the tools, documentation, sample code, educational content, and cross-platform integration capabilities needed to get started with managing their data, building a search front end, and configuring analytics, for free. They will also have access to a developer community of more than 5 million builders. The new pricing and packaging is available immediately.
Slang Labs, a Google-backed startup from Bengaluru, announced the launch of CONVA, a full-stack solution that provides smart and highly accurate multilingual voice search capabilities inside e-commerce apps. CONVA is available as a simple SDK (Software Development Kit) that can be integrated into existing e-commerce apps in less than 30 minutes without developers needing any knowledge of Automatic Speech Recognition (ASR), Natural language processing (NLP), Text-to-Speech (TTS) and other advanced voice tech stack concepts.
CONVA-powered voice search comprehends code-mixed utterances (multiple languages in one sentence), enabling consumers to speak naturally in their own language to search for products and information inside e-commerce mobile and web apps – while allowing the brand to maintain its app backend in only one language, i.e., English. For instance, when people mix English and a vernacular language within the same sentence while searching for something, CONVA understands both languages and provides a seamless search experience to the consumer.
With CONVA-enabled voice search, customers can search for products inside the applications using their typical colloquial terms for well-known products, and the apps will still recognize the correct product being searched for.
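A toy illustration of resolving code-mixed, colloquial queries against an English-only backend; the vernacular terms and mappings below are invented for the example, and CONVA's real ASR/NLP pipeline is far more sophisticated than a lookup table.

```python
# Toy sketch: normalize a code-mixed query to English catalog terms.
# The Hindi terms and their mappings are illustrative assumptions,
# not part of CONVA's actual vocabulary or pipeline.
VERNACULAR_TO_ENGLISH = {
    "chawal": "rice",   # assumed colloquial term
    "doodh": "milk",
    "sasta": "cheap",
}

def normalize_query(utterance):
    """Replace recognized vernacular tokens with English catalog
    equivalents, leaving English tokens untouched."""
    return " ".join(
        VERNACULAR_TO_ENGLISH.get(tok.lower(), tok.lower())
        for tok in utterance.split()
    )

print(normalize_query("sasta basmati chawal"))  # -> "cheap basmati rice"
```

The point of the sketch: the backend only ever sees English query terms, which is how the brand can keep a single-language index while consumers speak in mixed languages.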
Weaviate announced the release of a generative search module for OpenAI’s GPT-3, with support for other generative AI models (Cohere, LaMDA) to follow. The module allows Weaviate users and customers to integrate with those models and eliminates hurdles that currently limit the utility of such models in business use cases.
Generative models have so far been limited by a centralized and generic knowledge base that leaves them unable to answer business-specific questions. Weaviate’s generative module removes this limitation by allowing users to specify that the model work from users’ own Weaviate vector database. The solution combines language abilities like those of ChatGPT with a vector database that is relevant, secure, updated in real time, and less prone to hallucination.
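The retrieve-then-generate pattern the module enables can be sketched as follows. The toy three-dimensional embeddings and in-memory store stand in for a real embedding model and Weaviate's API; only the shape of the pattern is the point.

```python
# Sketch of retrieval-grounded generation: fetch the most relevant
# private documents, then splice them into the model's prompt so it
# answers from the user's data rather than its generic training set.
# Embeddings and documents here are invented toy values.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

store = [  # (embedding, text) pairs standing in for a vector database
    ([0.9, 0.1, 0.0], "Q3 revenue grew 12% year over year."),
    ([0.1, 0.9, 0.0], "The cafeteria menu changes on Mondays."),
    ([0.8, 0.2, 0.1], "Q3 operating margin improved to 18%."),
]

def build_prompt(query_vec, question, top_k=2):
    """Select the top_k most similar chunks and ground the prompt
    in them, which is what reduces hallucination."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[0]),
                    reverse=True)
    context = "\n".join(text for _, text in ranked[:top_k])
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt([0.85, 0.15, 0.05], "How did Q3 go?")
print(prompt)
```

Because the context is retrieved live from the user's own database, answers track current, business-specific data instead of the model's frozen, generic knowledge.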
Weaviate’s open-source generative AI module is now available to download. The new module also integrates with the company’s SaaS and hybrid SaaS products for use by clients with service-level agreements.
The Weaviate vector-search engine is a “third wave” database technology. Data is processed by a machine learning model first, and AI models help process, store, and search through it. As a result, Weaviate is not limited to natural language; Weaviate can also search images, audio, video, or even genetic information.
Zeta Alpha, a neural search and discovery platform, announced it has integrated OpenAI’s GPT with its semantic neural search engine to provide more reliable and explainable AI-generated answers to enterprise search queries. This capability gives workers the ability to leverage GPT to access knowledge hidden in troves of internal company data.
Generative AI models like GPT tend to ‘hallucinate,’ or give answers that seem plausible but are not factually correct. This prevents organizations from adopting AI tools for enterprise search and knowledge management. The combination of Zeta Alpha’s intelligent neural search engine and advances in GPT-3 reduces this problem by applying natural language understanding. Other enhancements include:
- InPars v2, a GPT-powered neural search model that enables fast tuning on synthetic in-domain data without the cost of creating terminology lists and taxonomies.
- Zeta Alpha enables users to ask a question and get contextually relevant results, automatically saving text to a spreadsheet or note for further analysis, and mapping back to the location where the document is saved for future access.
- Visualizing the information landscape in a semantic map and interpreting it with summaries by GPT can guide knowledge workers in the right direction to answer important strategic questions.
Uniform announced a partnership with Algolia. With the new integration, marketers and merchandisers can query Algolia within Uniform Canvas to show either specific results, such as blog entries or products, or a dynamic range of results based on a search query. This process uses Algolia to deliver results quickly and can be used with any connected index within Algolia. These results can also be used for testing and personalization with Uniform Context.
In practice, this means that a Canvas component can be connected to an Algolia index built from a commerce engine, then set to search for either alternative products (e.g., showing alternative brown leather shoes when viewing a product detail page) or complementary products to drive cross-sales (e.g., brown belts and bags to match the shoes being viewed). As new products are added to the index, the component automatically surfaces the most relevant options, boosting conversions without manual work. The same mechanism can recommend content or tutorials to support marketing use cases.
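What the Canvas-to-Algolia wiring boils down to can be illustrated with a toy in-memory index; the field names and products are invented, and real use goes through Algolia's client libraries and query filters rather than Python predicates.

```python
# Toy illustration of alternative vs. complementary product queries
# against an index. All products and fields are invented examples.
index = [
    {"id": 1, "type": "shoes", "color": "brown", "name": "Leather oxford"},
    {"id": 2, "type": "shoes", "color": "brown", "name": "Suede loafer"},
    {"id": 3, "type": "belt",  "color": "brown", "name": "Leather belt"},
    {"id": 4, "type": "bag",   "color": "black", "name": "Canvas tote"},
]

def search(predicate, limit=10):
    """Stand-in for an index query: filter, then truncate to page size."""
    return [item for item in index if predicate(item)][:limit]

viewing = index[0]  # the product on the detail page
alternatives = search(lambda p: p["type"] == viewing["type"]
                      and p["id"] != viewing["id"])
complements = search(lambda p: p["type"] != viewing["type"]
                     and p["color"] == viewing["color"])
print([p["name"] for p in alternatives])  # other brown shoes
print([p["name"] for p in complements])   # matching belts and bags
```

Because both queries run against the live index, newly added products appear in the component's results as soon as they are indexed, with no manual curation.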
With Algolia’s prebuilt front-end component, business users can control the display of search results (pagination, columns, other page content), as well as the display of images from the Algolia index, without developer support.
https://uniform.dev/ ■ https://algolia.com