Curated for content, computing, and digital experience professionals

Category: Computing & data

Computing and data is a broad category. Our coverage of computing is largely limited to software, and we are mostly focused on unstructured data, semi-structured data, or mixed data that includes structured data.

Topics include computing platforms, analytics, data science, data modeling, database technologies, machine learning / AI, Internet of Things (IoT), blockchain, augmented reality, bots, programming languages, natural language processing applications such as machine translation, and knowledge graphs.

Related categories: Semantic technologies, Web technologies & information standards, and Internet and platforms.

Google introduces table-to-text generation dataset

Google introduced “ToTTo: A Controlled Table-To-Text Generation Dataset”, an open-domain table-to-text generation dataset created using a novel annotation process (via sentence revision), along with a controlled text generation task that can be used to assess model hallucination. ToTTo (shorthand for “Table-To-Text”) consists of 121,000 training examples, plus 7,500 examples each for development and test. Because of the accuracy of its annotations, the dataset is suitable as a challenging benchmark for research in high-precision text generation. The dataset and code are open sourced on Google’s GitHub repo.

In the last few years, research in natural language generation, used for tasks like text summarization, has made tremendous progress. Yet, despite achieving high levels of fluency, neural systems can still be prone to hallucination (i.e., generating text that is understandable but not faithful to the source), which can prevent these systems from being used in many applications that require high degrees of accuracy.

While the process of assessing the faithfulness of generated text to the source content can be challenging, it is often easier when the source content is structured (e.g., in tabular format). Moreover, structured data can also test a model’s ability for reasoning and numerical inference. However, existing large scale structured datasets are often noisy (i.e., the reference sentence cannot be fully inferred from the tabular data), making them unreliable for the measurement of hallucination in model development.
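To make the task concrete, here is a minimal Python sketch of what a ToTTo-style record looks like and how its highlighted cells might be linearized into a source string for a sequence-to-sequence model. The field names, example values, and linearization scheme are simplified paraphrases for illustration, not the exact schema of the released dataset; the GitHub repository has the real format.

  # Simplified, illustrative ToTTo-style record: a table, its page/section context,
  # the annotator-highlighted cells, and the revised target sentence.
  # Field names and values are paraphrased; consult the official repo for the exact schema.
  example = {
      "page_title": "Jane Doe (sprinter)",
      "section_title": "International competitions",
      "table": [
          ["Year", "Competition", "Position", "Event"],
          ["1991", "Continental Junior Championships", "1st", "100 m"],
      ],
      "highlighted_cells": [(1, 0), (1, 1), (1, 2)],  # (row, column) indices chosen by annotators
      "target": "Jane Doe won the 1991 Continental Junior Championships.",
  }

  def linearize(ex):
      """Flatten the page context and highlighted cells into a source string for a seq2seq model."""
      header = ex["table"][0]
      parts = ["<page> " + ex["page_title"], "<section> " + ex["section_title"]]
      for row, col in ex["highlighted_cells"]:
          parts.append("<cell> " + ex["table"][row][col] + " <header> " + header[col])
      return " ".join(parts)

  print(linearize(example))
  # A generation model is trained to produce ex["target"] from this controlled input,
  # so any content not grounded in the highlighted cells is easy to flag as hallucination.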

https://ai.googleblog.com/2021/01/totto-controlled-table-to-text.html

Oracle releases Oracle Database 21c

Oracle announced that Oracle Database 21c is available on Oracle Cloud, including the Always Free tier of Oracle Autonomous Database. Oracle Database 21c contains more than 200 new capabilities, including immutable blockchain tables, In-Database JavaScript, native JSON binary data type, AutoML for in-database machine learning (ML), and persistent memory store, as well as enhancements for in-memory, graph processing performance, sharding, multitenant, and security. Oracle Database 21c provides support for multi-model, multi-workload, and multi-tenant requirements within a single, converged database engine.

In addition, Oracle announced the availability of Oracle APEX (Application Express) Application Development, a new low-code service for developing and deploying data-driven enterprise applications quickly and easily. The browser-based, low-code cloud service enables developers to create modern web apps for desktops and mobile devices using an intuitive graphical interface.

Oracle Database 21c is the database engine that powers Oracle database services in the cloud and on-premises, including Oracle Autonomous Database, Oracle Exadata Database Service, Oracle Exadata Database Cloud@Customer, and Oracle Exadata Database Machine.
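As a small illustration of one of the headline features, the native JSON data type, here is a minimal sketch assuming the python-oracledb driver and a reachable 21c instance. The credentials, DSN, table, and document are placeholders invented for the example; JSON_VALUE is standard Oracle SQL/JSON syntax.

  import json
  import oracledb  # python-oracledb driver

  # Placeholder credentials and DSN; substitute your own database and schema.
  conn = oracledb.connect(user="demo", password="demo_pw", dsn="localhost/ORCLPDB1")
  cur = conn.cursor()

  # 21c adds a native binary JSON column type (earlier releases stored JSON in VARCHAR2/CLOB).
  cur.execute("CREATE TABLE orders (id NUMBER PRIMARY KEY, doc JSON)")

  order = {"customer": "Acme", "items": [{"sku": "X1", "qty": 2}]}
  cur.execute("INSERT INTO orders (id, doc) VALUES (:1, :2)", [1, json.dumps(order)])

  # Standard SQL/JSON functions work against the new type.
  for (customer,) in cur.execute("SELECT JSON_VALUE(doc, '$.customer') FROM orders"):
      print(customer)

  conn.commit()
  conn.close()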

https://www.oracle.com/news/announcement/oracle-database-21c-011321.html

Expert.ai adds commercial options for NL API

Expert.ai announced the availability of commercial options for its NL API. Powered by expert.ai technology, which mimics the human ability to understand language and complex textual information, the expert.ai NL API provides advanced linguistic analysis out of the box, so developers and data scientists can reduce development costs and increase productivity while optimizing their natural language processing (NLP) applications. The commercial offering builds on the previously released freemium model by supporting larger volumes of data, giving users the ability to expand their usage of the expert.ai NL API and scale their apps, whether standalone or embedded in an enterprise’s existing AI processes.

Expert.ai NL API features include linguistic analysis tools, embedded taxonomies for classification, and sentiment analysis. By resolving the complexity associated with language, it streamlines the development of apps that rely on and process natural language content and unstructured data (contracts, internal databases, news articles, academic papers, customer service emails, insurance policies, medical reports, etc.) that is typically either left unanalyzed, because doing so is inefficient, or analyzed manually, which is expensive.
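For a sense of how developers consume this kind of service, below is a minimal sketch of calling a token-authenticated NLP REST API with Python’s requests library. The endpoint path, language code, and authentication header are assumptions about the general shape of such an API, not expert.ai’s documented contract; the product pages linked below have the actual reference and an official client library.

  import requests

  # Placeholder endpoint and token: the real URL structure, language codes, and auth
  # flow are documented on the expert.ai developer portal linked below.
  ENDPOINT = "https://nlapi.example.com/v2/analyze/standard/en/sentiment"
  TOKEN = "YOUR_ACCESS_TOKEN"

  payload = {"document": {"text": "The claims process was slow, but the agent was very helpful."}}
  resp = requests.post(ENDPOINT, json=payload,
                       headers={"Authorization": "Bearer " + TOKEN}, timeout=30)
  resp.raise_for_status()
  print(resp.json())  # sentiment scores, entities, or categories, depending on the resource requested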

https://www.expert.ai/products/nl-api/, https://policies.expert.ai/nlapi/pricing/

Nuance launches patient engagement virtual assistant platform

Nuance Communications, Inc. launched an AI-powered patient engagement virtual assistant platform to transform voice and digital experiences across the patient journey. Combining healthcare expertise with intelligent engagement technology, the platform integrates and extends the capabilities of electronic health record (EHR), customer relationship management (CRM), and Patient Access Center systems to enable healthcare provider organizations to modernize their ‘digital front door’ and improve clinical care. Leveraging the same conversational AI technology that consumer brands use to power their customer-facing virtual assistant solutions, Nuance’s patient engagement platform enables healthcare provider organizations to deliver improved patient experiences. The new platform provides an array of capabilities and business outcomes, including:

  • Seamless, consistent and unified omnichannel experiences – No longer do healthcare organizations need separate siloed virtual assistant/bot systems for their voice (IVR), web, mobile/SMS and smart speaker/IoT devices.
  • Integrates and extends capabilities of core systems infrastructure – including the EHR, Patient Financial systems, CRM and patient access center (call center).
  • Provides “out-of-the-box” solutions and an advanced Do-It-Yourself (DIY) development tool.
  • Advanced and unified data analytics.
  • Runs on the Microsoft Azure HITRUST CSF-certified cloud platform.

https://www.nuance.com

Pega acquires Qurious.io for speech analytics

Pegasystems Inc. announced its acquisition of Qurious.io, Inc., a cloud-based real-time speech analytics solution powered by artificial intelligence (AI) for customer service teams. Terms of the deal are not being disclosed. Qurious.io’s software-as-a-service (SaaS) offering uses speech-to-text, natural language processing (NLP), and emotion detection capabilities to analyze the dialog within each customer service call as it happens. The software then provides agents with real-time insights and coaching so they can improve customer interactions, make better recommendations, and boost customer loyalty and sales. Pega plans to add Qurious.io’s capabilities to its software portfolio with an initial focus on Pega Customer Service use cases.
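The pipeline described above (speech-to-text, NLP and emotion detection, then real-time agent guidance) can be illustrated with a deliberately simple, self-contained Python sketch. The transcript source and the keyword-based scoring below are hypothetical stand-ins, not Qurious.io’s models or Pega’s APIs.

  # Hypothetical stand-in for the pipeline described above: transcribed utterances
  # stream in, each is scored, and the agent gets a prompt when the caller sounds unhappy.
  NEGATIVE_CUES = {"cancel", "frustrated", "unacceptable", "refund", "complaint"}

  def score_utterance(text):
      """Toy negativity score: fraction of words that are negative cues (real systems use ML models)."""
      words = [w.strip(".,!?").lower() for w in text.split()]
      return sum(w in NEGATIVE_CUES for w in words) / max(len(words), 1)

  def coach(transcript_stream):
      """Yield a coaching hint for each utterance that looks negative."""
      for utterance in transcript_stream:  # in production this would be a live speech-to-text feed
          if score_utterance(utterance) > 0.1:
              yield "Caller sounds unhappy: acknowledge the issue and offer a concrete next step."

  demo_call = ["Hi, I'm calling about my bill.", "This is unacceptable, I want a refund."]
  for hint in coach(demo_call):
      print(hint)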

https://www.qurious.io/home, https://www.pega.com/about/news/press-releases/pega-acquires-quriousio-ai-powered-speech-analytics

MarkLogic announces new Tableau connector

MarkLogic Corporation announced the availability of its new connector in the Tableau Extension Gallery, enabling fast and easy connections to the Tableau analytics platform. In the past few years, the explosion of data has created a disconnect between data aggregation and data consumption: data silos have proliferated, yet business analysts have access to only a small subset of the data in the enterprise. MarkLogic addresses the problem by integrating multi-structured data from silos, curating that data, and making it fit for purpose for downstream consumption. Now, with just a few clicks, users can connect Tableau to MarkLogic, opening up access to that curated data for a more complete view of their business.

https://extensiongallery.tableau.com/connectors/202

Microsoft announces DeBERTa surpasses human performance on SuperGLUE

Microsoft announced that DeBERTa now surpasses humans on the SuperGLUE benchmark. SuperGLUE is a challenging benchmark for evaluating natural language understanding (NLU) models, consisting of a wide range of tasks including question answering, natural language inference, coreference resolution, word sense disambiguation, and others. Top research teams around the world have been developing large-scale pretrained language models (PLMs) that have driven performance improvements on SuperGLUE. Microsoft recently updated the DeBERTa model by training a larger version consisting of 48 Transformer layers with 1.5 billion parameters. The performance boost makes the single DeBERTa model surpass human performance on SuperGLUE for the first time in terms of macro-average score (89.9 versus 89.8), and the ensemble DeBERTa model sits atop the SuperGLUE rankings, outperforming the human baseline by a decent margin (90.3 versus 89.8). The model also sits at the top of the GLUE benchmark rankings with a macro-average score of 90.8.

Microsoft will release the 1.5-billion-parameter DeBERTa model and the source code to the public. In addition, DeBERTa is being integrated into the next version of the Microsoft Turing natural language representation model (Turing NLRv4). Microsoft’s Turing models bring together language innovation from across the company and are trained at large scale to support products like Bing, Office, Dynamics, and Azure Cognitive Services.
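For readers who want to try the released model, here is a minimal sketch using the Hugging Face transformers library, assuming the MNLI-fine-tuned DeBERTa checkpoint published under the microsoft organization on the Hub; the exact checkpoint name and label mapping should be verified against the model card.

  import torch
  from transformers import AutoTokenizer, AutoModelForSequenceClassification

  # Assumed checkpoint name for the MNLI-fine-tuned DeBERTa release; verify on the Hugging Face Hub.
  name = "microsoft/deberta-v2-xxlarge-mnli"
  tokenizer = AutoTokenizer.from_pretrained(name)
  model = AutoModelForSequenceClassification.from_pretrained(name)

  premise = "The ensemble DeBERTa model sits atop the SuperGLUE leaderboard."
  hypothesis = "A language model leads a benchmark."

  inputs = tokenizer(premise, hypothesis, return_tensors="pt")
  with torch.no_grad():
      probs = model(**inputs).logits.softmax(dim=-1)[0]

  for label_id, p in enumerate(probs):
      print(model.config.id2label[label_id], round(float(p), 3))  # e.g., contradiction / neutral / entailment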

https://www.microsoft.com/en-us/research/blog/microsoft-deberta-surpasses-human-performance-on-the-superglue-benchmark/

Blue Prism accelerates intelligent automation on Microsoft Azure

Blue Prism announced a new offering of its intelligent automation software on Microsoft AppSource and Azure Marketplace. The move enhances access for both Blue Prism and Microsoft customers. Blue Prism customers already have access to a scalable, enterprise-ready platform that combines robotic automation and smart workflows with technologies like machine learning, advanced analytics, natural language processing, process mining, and cognitive capabilities. This offering also gives Blue Prism robots greater access to Microsoft Azure apps, backed by more than 175 accelerators for Microsoft products within Blue Prism’s Digital Exchange.

The new Bring Your Own License (BYOL) offering for Azure Marketplace and AppSource is pre-loaded with select Azure Cognitive Services – including Azure Text Analytics, Azure Form Recognizer, and Azure Computer Vision – all of which customers can license directly through Microsoft. It combines with the Blue Prism Digital Exchange, where users can access the accelerators for Microsoft products to enhance their enterprise automations. Blue Prism accelerators now exist for Microsoft Power Platform, Microsoft’s Power Automate gallery, and Microsoft’s Healthcare Cloud, with support for Form Recognizer, Text Analytics, and Azure Computer Vision.
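To show what one of those bundled services does on its own, here is a minimal sketch of calling Azure Text Analytics directly with Microsoft’s Python SDK (azure-ai-textanalytics); inside Blue Prism these capabilities surface as accelerators rather than hand-written code, and the endpoint and key below are placeholders.

  from azure.core.credentials import AzureKeyCredential
  from azure.ai.textanalytics import TextAnalyticsClient

  # Placeholder endpoint and key for a Cognitive Services resource licensed through Microsoft.
  client = TextAnalyticsClient(
      endpoint="https://<your-resource>.cognitiveservices.azure.com/",
      credential=AzureKeyCredential("<your-key>"),
  )

  documents = ["The invoice was processed quickly, but the portal kept timing out."]
  for result in client.analyze_sentiment(documents):
      if not result.is_error:
          print(result.sentiment, result.confidence_scores)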

http://blueprism.com/


© 2025 The Gilbane Advisor
