Doctor Evidence (DRE) has updated their newly launched DOC Analytics (“Digital Outcome Conversion”) platform with network meta-analysis (NMA) capabilities. DOC Analytics provides immediate quantitative insights into the universe of medical information using artificial intelligence/machine learning (AI/ML) and natural language processing (NLP). With the addition of indirect treatment comparison and landscape analysis using NMA, DOC Analytics is a critical, daily-use tool for strategic functions in life sciences companies. DOC Analytics allows users to conduct analyses composed of real-time results from clinical trials, real-world evidence (RWE), published literature, and any custom imported data to yield insightful direct meta-analysis, network meta-analysis, cohort analysis, or bespoke statistical outputs. Analyses are informed by AI/ML and can be made fit-for-purpose with filters for demographics, comorbidities, sub-populations, inclusion/exclusion selections, and other relevant parameters.
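DRE has not published the internals of its NMA engine, but the indirect treatment comparison it refers to is a standard meta-analytic technique. As an illustration only, here is a minimal sketch of the classic Bucher method, which compares treatments A and C that were never trialled head-to-head by routing through a common comparator B (all effect sizes and standard errors below are invented):

```python
import math

def bucher_indirect(d_ab, se_ab, d_cb, se_cb):
    """Indirect comparison of A vs C via a common comparator B.

    d_ab and d_cb are direct effect estimates (e.g. log odds ratios)
    of A vs B and C vs B from separate trial sets; because those sets
    are independent, their variances add.
    """
    d_ac = d_ab - d_cb                      # (A vs B) - (C vs B) = A vs C
    se_ac = math.sqrt(se_ab**2 + se_cb**2)  # variances combine in quadrature
    ci95 = (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac)
    return d_ac, se_ac, ci95

# Hypothetical log odds ratios from two trial sets sharing comparator B
effect, se, ci = bucher_indirect(d_ab=-0.5, se_ab=0.2, d_cb=-0.1, se_cb=0.15)
```

Note that the widened standard error is the price of the indirect route: the uncertainty of both direct comparisons is inherited.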
Category: Computing & data
Computing and data is a broad category. Our coverage of computing is largely limited to software, and we are mostly focused on unstructured data, semi-structured data, or mixed data that includes structured data.
Topics include computing platforms, analytics, data science, data modeling, database technologies, machine learning / AI, Internet of Things (IoT), blockchain, augmented reality, bots, programming languages, natural language processing applications such as machine translation, and knowledge graphs.
Related categories: Semantic technologies, Web technologies & information standards, and Internet and platforms.
Cloudera announced the premiere of Cloudera Data Platform Private Cloud (CDP Private Cloud). CDP Private Cloud is built for hybrid cloud, seamlessly connecting on-premises environments to public clouds with consistent, built-in security and governance. CDP Private Cloud, built on Red Hat OpenShift, is an enterprise data cloud that separates compute and storage for greater agility, ease of use, and more efficient use of private and public cloud infrastructure. Together, Red Hat OpenShift and CDP Private Cloud help create an essential hybrid, multi-cloud data architecture, enabling teams to rapidly onboard mission-critical applications and run them anywhere, without disrupting existing ones. Companies can now collect, enrich, report, serve and model enterprise data for any business use case in any cloud. CDP Private Cloud is in tech preview for select customers and is expected to be generally available later this summer.
OpenAI announced the release of an API for accessing new AI models developed by OpenAI. Unlike most AI systems, which are designed for one use-case, the API today provides a general-purpose “text in, text out” interface, allowing users to try it on virtually any English language task. You can now request access in order to integrate the API into your product, develop an entirely new application, or help us explore the strengths and limits of this technology. Given any text prompt, the API will return a text completion, attempting to match the pattern you gave it. You can “program” it by showing it just a few examples of what you’d like it to do; its success generally varies depending on how complex the task is. The API also allows you to hone performance on specific tasks by training on a dataset (small or large) of examples you provide, or by learning from human feedback provided by users or labelers. The API is designed to be both simple for anyone to use and flexible enough to make machine learning teams more productive. In fact, many OpenAI teams are now using the API so that they can focus on machine learning research rather than distributed systems problems. Today the API runs models with weights from the GPT-3 family with many speed and throughput improvements.
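The few-shot “programming” described above amounts to assembling a prompt out of demonstrations and letting the model continue the pattern. A minimal sketch, with the prompt format and the translation task invented for illustration (not OpenAI’s own format):

```python
def build_few_shot_prompt(examples, query):
    """Assemble a 'text in, text out' prompt: a few input/output
    demonstrations followed by the new input to be completed."""
    blocks = [f"Input: {inp}\nOutput: {out}" for inp, out in examples]
    # The trailing "Output:" invites the model to complete the pattern.
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

# "Program" an English-to-French task purely by example; the completion
# endpoint would be asked to continue this text.
prompt = build_few_shot_prompt(
    [("sea otter", "loutre de mer"), ("cheese", "fromage")],
    "hello",
)
```

The string returned here would be sent as the prompt; the API’s completion is the text the model appends after the final “Output:”.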
The field’s pace of progress means that there are frequently surprising new applications of AI, both positive and negative. We will terminate API access for obviously harmful use-cases, such as harassment, spam, radicalization, or astroturfing. But we also know we can’t anticipate all of the possible consequences of this technology, so we are launching today in a private beta rather than general availability, building tools to help users better control the content our API returns, and researching safety-relevant aspects of language technology (such as analyzing, mitigating, and intervening on harmful bias). We’ll share what we learn so that our users and the broader community can build more human-positive AI systems.
A knowledge graph is a knowledge base that uses a graph-structured data model or topology to integrate knowledge and data. Knowledge graphs are often used to store interlinked descriptions of entities — real-world objects, events, situations or abstract concepts — with free-form semantics, not fitting into a single traditional ontology.
Since the development of the Semantic Web, knowledge graphs are often associated with linked open data projects, focusing on the connections between concepts and entities. They are also prominently associated with and used by search engines such as Google, Bing, and Yahoo; knowledge-engines and question-answering services such as WolframAlpha, Apple’s Siri, and Amazon Alexa; and social networks such as LinkedIn and Facebook.
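At its simplest, the graph-structured data model described above is a set of subject–predicate–object triples, which can be traversed without committing to a single ontology. A toy sketch (the entities and relations are invented for illustration):

```python
# A knowledge graph as a set of (subject, predicate, object) triples.
triples = {
    ("Leonardo_da_Vinci", "painted", "Mona_Lisa"),
    ("Mona_Lisa", "exhibited_at", "Louvre"),
    ("Louvre", "located_in", "Paris"),
}

def objects(graph, subject, predicate):
    """All objects linked from `subject` via `predicate`."""
    return {o for s, p, o in graph if s == subject and p == predicate}

def neighbors(graph, entity):
    """Entities directly connected to `entity` in either direction."""
    return ({o for s, _, o in graph if s == entity}
            | {s for s, _, o in graph if o == entity})
```

Production knowledge graphs use the same idea at scale, typically with standardized vocabularies (e.g. RDF) layered on top of the raw triples.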
MongoDB, Inc. announced a series of products that comprise the MongoDB Cloud platform and give developers a better way to work with data, wherever it resides. The launch of MongoDB 4.4, the general availability of Atlas Data Lake and Atlas Search, and the general availability of MongoDB Realm offer organizations an escape from data silos and fragmented APIs as MongoDB Cloud delivers a developer-optimized, cloud-to-mobile platform. With MongoDB’s document data model, developers can structure data any way the application requires – from rich, hierarchical objects to simple key-value pairs and tables to connected graphs – and then query it with a single API. This gives developers a consistent and productive experience across the broadest set of workloads.
The addition of Atlas Data Lake and Atlas Search to the MongoDB Cloud platform simplifies modern data infrastructure, extends applications with rich search experiences and unlocks analytics for data archived in a data lake. Using the same MongoDB Query Language (MQL) and data model, with Atlas Data Lake a user can run a query and have the data brought back to them: whether it is real-time transactional data in the Atlas global cloud database, a relevance-based search query with Atlas Search, or a long-running analytical query on data in object storage. Using MongoDB Cloud, developers no longer need to deal with the cognitive burden of flipping back and forth between multiple technologies, query languages and data models. MongoDB’s Realm Sync enables bi-directional data synchronization between Realm’s mobile client on the front end and Atlas on the backend. This allows data to be seamlessly shared between devices and with the backing database without complex conflict resolution and integration code.
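The appeal of the document model is that one query shape covers nested objects and flat key-value records alike. As a rough illustration of how MQL-style dotted-path matching works against nested documents (a toy matcher written for this article, not MongoDB’s implementation, with invented sample data):

```python
def get_path(doc, path):
    """Resolve a dotted path like 'ship.city' inside a nested document."""
    cur = doc
    for key in path.split("."):
        if not isinstance(cur, dict) or key not in cur:
            return None
        cur = cur[key]
    return cur

def matches(doc, query):
    """True if the document satisfies every {path: value} condition,
    mimicking the shape of an MQL equality filter."""
    return all(get_path(doc, path) == value for path, value in query.items())

# Hypothetical documents: nested objects and flat fields queried the same way.
orders = [
    {"item": "notebook", "qty": 3, "ship": {"city": "Berlin"}},
    {"item": "pen", "qty": 10, "ship": {"city": "Lyon"}},
]
berlin = [d for d in orders if matches(d, {"ship.city": "Berlin"})]
```

In MongoDB itself the equivalent filter, `{"ship.city": "Berlin"}`, would run unchanged against the live database, the search index, or archived data, which is the single-API point the announcement makes.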
SDL announced its machine translation (MT) technology is now available through the Reynen Court LLC platform, enabling legal firms and departments to provision and deploy SDL Machine Translation to securely translate any type of legal document or file. The Reynen Court platform combines a content-rich solution store and a control panel that simplifies the process for law firms and legal departments to source, evaluate, deploy, monitor and manage legal technology applications. Platform users can employ a multicloud technology strategy that includes on-premises data centers and virtual private clouds under the platform user’s control without compromising informational security, environmental stability, and infrastructure control. The platform also allows users to manage software subscriptions and evaluate usage and consumption metrics from across its tech stack in one place, permitting them to optimize technological investment through the deployment of legal technology. Reynen Court was established with support from a consortium of 19 leading global law firms. The latest version of SDL Machine Translation goes beyond automatic translation and integrates with multiple platforms to power digital customer experience, eDiscovery, due diligence, contract review, analytics, internal communications and collaboration.
Webiny announced the availability of Webiny Serverless Headless CMS (beta). When you look at the headless CMS market there are several options you can choose from, but none of the options are both serverless and open-source at the same time. Half of them run on “traditional” infrastructures, like virtual machines, and the other half are standard SaaS products. The goal was to build something that scales to handle huge amounts of traffic out of the box, no matter how spiky, with a solution that is customizable and has zero overhead when it comes to managing infrastructure. Today, this is only achievable with serverless infrastructure.
Included in the package:
- Content modeling interface — You can not only model your content, but also design how the input forms will look to your content editors. Place form inputs inside a grid layout and split it into multiple columns and rows.
- Content localization — A simple and intuitive way to input and serve content in multiple languages. We keep the interface for editors clean and easy to use, no matter if you have 1 language or 20 languages.
- GraphQL API — The API is at the core of every headless CMS. If you get the API wrong, the whole product has a poor experience. This is why we spent a significant part of our effort on ensuring developers have a great experience using our API. The API also comes with a built-in GraphQL playground, so it’s super simple to inspect your schema and test your queries.
- Environments and aliases — With a single click, copy your existing data into a new environment. Modify it and update it without affecting your production site. Finally, remap the alias to switch the new environment into production. With this approach you get an instant rollback feature, and there is no need to update and redeploy your code when you make changes.
- Customizable and extendable platform with a microservices architecture — Having a headless CMS is great, but what if it only gets you halfway? What if you need to build custom code, or add logic with specific rules that are outside the scope of the headless CMS? Using the Serverless Web Development Framework you can build any type of logic your project requires and deploy it alongside the headless CMS as a separate microservice.
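A headless CMS serves content through typed GraphQL requests like the ones the playground above is for. As a sketch of what a client sends, here is a request payload built in Python; the schema fields (`listArticles`, `title`, `body`) are invented for illustration, so consult the schema Webiny generates from your content model for real names:

```python
import json

# A GraphQL request is a JSON body with a `query` string and optional
# `variables`; the CMS resolves it against the content model's schema.
query = """
query ListArticles($limit: Int) {
  listArticles(limit: $limit) {
    data { title body }
  }
}
"""
payload = json.dumps({"query": query, "variables": {"limit": 10}})
```

The resulting `payload` would be POSTed to the CMS’s GraphQL endpoint; the response mirrors the shape of the query, returning only the requested fields.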
Xero, the global small business platform, has announced the release of new search functionality on Xero’s app marketplace. With more than 800 third-party apps that connect to the platform, Xero’s app marketplace now serves up suggestions based on a small business’s profile when they are logged into Xero, and an improved search toolbar presents popular apps and quick links, providing a more personalized, intuitive, and efficient experience. The new search functionality is powered by Coveo’s recommendations engine, which uses machine learning to generate the app suggestions.