Curated for content, computing, and digital experience professionals

Tag: Natural language processing (NLP) (Page 1 of 2)

Serviceaide introduces Luma Knowledge

Serviceaide, Inc., a provider of intelligent enterprise service management solutions, announced the launch of Luma Knowledge, a self-learning, knowledge-centered product that optimizes the access, creation and reuse of enterprise knowledge to serve the support needs of users and customers. The maker of the AI-powered Luma Virtual Agent, Serviceaide leverages AI technologies such as natural language processing and machine learning across digital interactions, knowledge, and automation to bring advanced capabilities and business value to service and support functions across the enterprise. Features and capabilities:

  • The Luma Knowledge hub provides a common tool to actively correlate and access information federated across the enterprise.
  • Luma Knowledge offers a common semantic pathway to all enterprise knowledge.
  • Natural language processing automatically extracts topics and pulls text from complex documents to auto-create FAQs.
  • A dynamic guided search capability, based on available knowledge, helps users access the right information even when they don’t know exactly what to ask for, and don’t know what is in the knowledge base.
  • Automated learning leverages machine learning to auto-tune retrievals and identify missing content or other related issues.
  • Knowledge Sharing – Federating across multiple knowledge bases, semantic search, and guided requests deliver accurate knowledge.
  • Knowledge Discovery – Proactively discovering knowledge both inside an organization and from external sources.
  • Knowledge Improvement – Continuously monitoring knowledge and feedback to recommend needed knowledge, correct knowledge and searches, and retire unused knowledge.

Semantic Web Company and Ontotext partner to advance enterprise knowledge graphs

Ontotext (OT) and Semantic Web Company (SWC) announced a strategic partnership to meet the requirements of enterprise architects, such as deployment, monitoring, resilience, security, and interoperability with other enterprise IT systems. Users will be able to work with a feature-rich toolset to manage a graph composed of billions of edges that is hosted in data centers around the world. The companies have implemented an integration of the PoolParty Semantic Suite™ v.8 with GraphDB and the Ontotext Platform, which offers benefits for numerous use cases:

  • GraphDB powering PoolParty: Most of the knowledge graph management tools out there bundle open-source solutions that are good at managing thousands of concepts, whereas PoolParty bundled with GraphDB manages millions of concepts and entities—without extra deployment overheads.
  • PoolParty linked to high-availability GraphDB cluster: GraphDB can now be used as an external store for PoolParty, which offers a combination of performance, scalability and resilience. This is particularly relevant for organizations intent on developing tailor-made knowledge graph platforms integrated into their existing data and content management infrastructure.
  • Dynamic text analysis using big knowledge graphs: PoolParty can be used to edit big knowledge graphs in order to tune the behavior of Ontotext’s text analysis pipelines, which employ vast amounts of domain knowledge to boost precision. This way the power and comprehensiveness of generic off-the-shelf natural language processing (NLP) pipelines can be custom-tailored to an enterprise.
  • GraphQL benefits for PoolParty: Application developers can now access the knowledge graph via GraphQL to build end-user applications or integrate knowledge graph services with the functionality of existing systems. The Ontotext Platform uses semantic business objects, defined by subject matter experts and business analysts, to generate GraphQL interfaces and transform them into SPARQL.
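The GraphQL-to-SPARQL generation described in the last bullet can be illustrated with a toy sketch. Everything here is hypothetical — the `ex:` namespace, the flat field-to-property mapping, and the helper function are illustrative stand-ins, not the Ontotext Platform's actual transformation, which is driven by semantic business object definitions.

```python
# Toy illustration (not Ontotext's implementation): turn a flat
# GraphQL-style field selection into a SPARQL SELECT query, showing
# the kind of mapping the platform automates.

def graphql_fields_to_sparql(entity, fields):
    """Map a flat field selection on `entity` to a SPARQL query.

    Assumes each field corresponds to a same-named RDF property in a
    hypothetical `ex:` namespace.
    """
    select_vars = " ".join(f"?{f}" for f in fields)
    patterns = " . ".join(f"?s ex:{f} ?{f}" for f in fields)
    return f"SELECT {select_vars} WHERE {{ ?s a ex:{entity} . {patterns} }}"

# A GraphQL query selecting { name, price } on Product would map to:
query = graphql_fields_to_sparql("Product", ["name", "price"])
```

A real translator also has to handle nested selections, filters, and pagination; this sketch only conveys the basic field-to-triple-pattern idea.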

DRE’s DOC Analytics generates network meta-analysis with natural language question search

Doctor Evidence (DRE) has updated its newly launched DOC Analytics (“Digital Outcome Conversion”) platform with network meta-analysis (NMA) capabilities. DOC Analytics provides immediate quantitative insights into the universe of medical information using artificial intelligence/machine learning (AI/ML) and natural language processing (NLP). With the addition of indirect treatment comparison and landscape analysis using NMA, DOC Analytics is a critical, daily-use tool for strategic functions in life sciences companies. DOC Analytics allows users to conduct analyses composed of real-time results from clinical trials, real-world evidence (RWE), published literature, and any custom imported data to yield insightful direct meta-analysis, network meta-analysis, cohort analysis, or bespoke statistical outputs. Analyses are informed by AI/ML and can be made fit-for-purpose with filters for demographics, comorbidities, sub-populations, inclusion/exclusion selections, and other relevant parameters.

OpenAI releases API for a general-purpose “text in, text out” interface

OpenAI announced the release of an API for accessing new AI models developed by OpenAI. Unlike most AI systems, which are designed for one use case, the API provides a general-purpose “text in, text out” interface, allowing users to try it on virtually any English-language task. You can now request access to integrate the API into your product, develop an entirely new application, or help us explore the strengths and limits of this technology. Given any text prompt, the API will return a text completion, attempting to match the pattern you gave it. You can “program” it by showing it just a few examples of what you’d like it to do; its success generally varies with the complexity of the task. The API also allows you to hone performance on specific tasks by training on a dataset (small or large) of examples you provide, or by learning from human feedback provided by users or labelers. The API is designed to be simple for anyone to use yet flexible enough to make machine learning teams more productive. In fact, many OpenAI teams are now using the API so that they can focus on machine learning research rather than distributed systems problems. Today the API runs models with weights from the GPT-3 family, with many speed and throughput improvements.
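The few-shot “programming by example” pattern described above can be sketched as prompt construction: concatenate a task description with a handful of input/output pairs, then let the model complete the final output. The helper below only builds the prompt text — the task wording and the `Input:`/`Output:` field labels are illustrative assumptions, and sending the prompt to the API is a separate HTTP call not shown here.

```python
# Minimal sketch of few-shot prompting: the "program" is just a text
# prompt whose examples establish the pattern the model should continue.

def build_few_shot_prompt(examples, new_input, task="Translate English to French"):
    """Assemble a few-shot prompt from (input, output) example pairs.

    The final line ends with an empty Output: field, which the model
    is expected to complete.
    """
    lines = [task + ":"]
    for inp, out in examples:
        lines.append(f"Input: {inp}\nOutput: {out}")
    lines.append(f"Input: {new_input}\nOutput:")
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    [("cheese", "fromage"), ("dog", "chien")],
    "house",
)
# `prompt` would be sent as the text of a completion request; the API
# returns a continuation that, ideally, follows the established pattern.
```

Note the trade-off the announcement hints at: more or better-chosen examples generally improve results, but complex tasks may still need fine-tuning on a provided dataset.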

The field’s pace of progress means that there are frequently surprising new applications of AI, both positive and negative. We will terminate API access for obviously harmful use-cases, such as harassment, spam, radicalization, or astroturfing. But we also know we can’t anticipate all of the possible consequences of this technology, so we are launching today in a private beta rather than general availability, building tools to help users better control the content our API returns, and researching safety-relevant aspects of language technology (such as analyzing, mitigating, and intervening on harmful bias). We’ll share what we learn so that our users and the broader community can build more human-positive AI systems.

SDL partners with DRUID for multilingual chatbot conversations

SDL announced a technical partnership with DRUID, specialists in conversational AI, to launch multilingual virtual assistants for enterprise organizations that enable real-time communication through chatbots. By integrating SDL Machine Translation with DRUID virtual assistants, companies will be able to conduct chatbot conversations in different languages with employees, customers, partners and suppliers. The solution offers a real-time “interpreter mode” function that can translate conversations, along with a “live chat” feature that can translate into multiple languages.

Chatbots are commonly configured to undergo complicated question-and-answering activities in different languages, but language-specific customization can be complex, time-consuming and costly. The issue becomes even more complex when a chatbot is connected to various data sources (ERP, CRM, BI, HRIS, or other types of business applications). With SDL Machine Translation, chatbots can converse in multiple languages without the need to translate data sources or conversational flows.

SDL Machine Translation provides the neural machine translation (NMT 2.0) foundation, and the combined solution includes the ability to control brand voice with a brand-specific terminology dictionary that contains company-specific product names and unique terminology. The machine learning solution uses anonymized chat logs for continuous language model improvement.


© 2020 The Gilbane Advisor
