Curated for content, computing, data, information, and digital experience professionals

Category: Computing & data

Computing and data is a broad category. Our coverage of computing is largely limited to software, and we focus mostly on unstructured data, semi-structured data, and mixed data that includes structured data.

Topics include computing platforms, analytics, data science, data modeling, database technologies, machine learning / AI, Internet of Things (IoT), blockchain, augmented reality, bots, programming languages, natural language processing applications such as machine translation, and knowledge graphs.

Related categories: Semantic technologies, Web technologies & information standards, and Internet and platforms.

MarkLogic announces new Tableau connector

MarkLogic Corporation announced the availability of its new connector in the Tableau Extension Gallery, enabling fast and easy connections to the Tableau analytics platform. In the past few years, the explosion of data has created a disconnect between data aggregation and data consumption: data silos have proliferated, yet business analysts have access to only a small subset of the data in the enterprise. MarkLogic addresses this by integrating multi-structured data from silos, curating that data, and making it fit for purpose for downstream consumption. Now, with just a few clicks, users can connect Tableau to MarkLogic, open up access to that curated data, and get a more complete view of their business.
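
The connector itself is point-and-click, but the curated data it exposes can also be reached programmatically. Below is a minimal Python sketch against MarkLogic's documented client REST API (/v1/search); the host, port, credentials, and query string are assumptions for illustration, not part of the announcement.

    # Query a MarkLogic instance over its client REST API.
    # Host, port, and credentials below are assumed placeholders.
    import requests
    from requests.auth import HTTPDigestAuth

    MARKLOGIC = "http://localhost:8000"  # assumed app-server address

    resp = requests.get(
        f"{MARKLOGIC}/v1/search",
        params={"q": "customer", "format": "json", "pageLength": 10},
        auth=HTTPDigestAuth("ml-user", "ml-password"),  # MarkLogic defaults to digest auth
    )
    resp.raise_for_status()
    for result in resp.json().get("results", []):
        print(result["uri"])  # URIs of matching curated documents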

https://extensiongallery.tableau.com/connectors/202

Microsoft announces DeBERTa surpasses human performance on SuperGLUE

Microsoft announced that DeBERTa now surpasses humans on the SuperGLUE benchmark. SuperGLUE is a challenging benchmark for evaluating NLU models. It consists of a wide range of NLU tasks, including question answering, natural language inference, coreference resolution, word sense disambiguation, and others. Top research teams around the world have been developing large-scale pretrained language models (PLMs) that have driven performance improvements on SuperGLUE. Microsoft recently updated the DeBERTa model by training a larger version that consists of 48 Transformer layers with 1.5 billion parameters. The performance boost makes the single DeBERTa model surpass human performance on SuperGLUE for the first time in terms of macro-average score (89.9 versus 89.8), and the ensemble DeBERTa model sits atop the SuperGLUE rankings, outperforming the human baseline by a decent margin (90.3 versus 89.8). The model also sits at the top of the GLUE benchmark rankings with a macro-average score of 90.8.

Microsoft will release the 1.5-billion-parameter DeBERTa model and the source code to the public. In addition, DeBERTa is being integrated into the next version of the Microsoft Turing natural language representation model (Turing NLRv4). Microsoft's Turing models converge language innovation from across the company and are then trained at large scale to support products like Bing, Office, Dynamics, and Azure Cognitive Services.
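
Once the weights are public, trying the model should take only a few lines. Here is a minimal sketch using the Hugging Face transformers library; the checkpoint id below (microsoft/deberta-v2-xxlarge, the 48-layer, 1.5-billion-parameter variant) is an assumption about where the release lands.

    # Load a released DeBERTa checkpoint and run one forward pass.
    # The hub id is assumed; substitute whatever Microsoft publishes.
    import torch
    from transformers import AutoModel, AutoTokenizer

    name = "microsoft/deberta-v2-xxlarge"  # assumed id of the 1.5B model
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)

    inputs = tokenizer("SuperGLUE bundles several NLU tasks.", return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (batch, tokens, hidden dim)
    print(hidden.shape)

Task-specific heads for the individual SuperGLUE tasks (classification, span selection) would sit on top of these representations.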

https://www.microsoft.com/en-us/research/blog/microsoft-deberta-surpasses-human-performance-on-the-superglue-benchmark/

Blue Prism accelerates intelligent automation on Microsoft Azure

Blue Prism announced a new offering of Blue Prism intelligent automation software on Microsoft AppSource and Azure Marketplace. The move enhances access for both Blue Prism and Microsoft customers. Blue Prism customers already have access to a scalable, enterprise-ready platform that combines robotic automation and smart workflows with technologies like machine learning, advanced analytics, natural language processing, process mining, and cognitive capabilities. This offering also gives Blue Prism robots greater access to Microsoft Azure apps, with access to over 175 accelerators for Microsoft products within Blue Prism's Digital Exchange.

The new Bring Your Own License (BYOL) offering for Azure Marketplace and AppSource is pre-loaded with select Azure Cognitive Services, including Azure Text Analytics, Azure Form Recognizer, and Azure Computer Vision, all of which customers can license directly through Microsoft. This combines with the Blue Prism Digital Exchange, where users can access those accelerators to enhance their enterprise automations. Blue Prism accelerators now exist for Microsoft Power Platform, Microsoft's Power Automate gallery, and Microsoft's Healthcare Cloud, with Form Recognizer, Text Analytics, and Azure Computer Vision.
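
For a sense of what the bundled services do, here is a minimal sketch calling Azure Text Analytics directly through Microsoft's Python SDK (azure-ai-textanalytics); the endpoint and key are placeholders, and inside Blue Prism the same call would be wrapped by a pre-built accelerator rather than hand-written code.

    # Sentiment analysis with Azure Text Analytics.
    # Endpoint and key below are placeholders for your Azure resource.
    from azure.ai.textanalytics import TextAnalyticsClient
    from azure.core.credentials import AzureKeyCredential

    client = TextAnalyticsClient(
        endpoint="https://<your-resource>.cognitiveservices.azure.com/",
        credential=AzureKeyCredential("<your-key>"),
    )

    docs = ["The onboarding process was fast and painless."]
    for result in client.analyze_sentiment(docs):
        print(result.sentiment, result.confidence_scores)  # e.g. positive, per-label scores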

http://blueprism.com/

AI-OCR “DX Suite” to support multiple languages

AI inside Inc. announced multilingual support for its AI-OCR service “DX Suite” with the release of a new AI engine that can recognize English, Traditional Chinese, Thai, and Vietnamese characters. With this, AI inside will begin its global expansion, starting with the Asian market. Through DX Suite, AI inside has helped improve the operational efficiency and productivity of companies and municipalities in Japan with highly accurate recognition of both printed and handwritten characters. The company has also been developing a foreign-language recognition engine to expand the availability of DX Suite beyond Japan to other countries. That engine has now achieved commercially viable accuracy for English, Traditional Chinese, Thai, and Vietnamese, and the multilingual service is available on the cloud version of DX Suite. Current users of the DX Suite cloud version can use the multilingual service without any additional registration.
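
The announcement does not document the API itself, so the sketch below is hypothetical: it shows the usual shape of a cloud OCR request (upload an image, name a language, get text back). The endpoint, field names, and auth header are illustrative placeholders, not DX Suite's real interface.

    # Hypothetical cloud-OCR request; endpoint and field names are
    # placeholders, not DX Suite's actual API.
    import requests

    API = "https://example.dx-suite.com/v1/ocr"  # hypothetical endpoint

    with open("invoice_th.png", "rb") as f:
        resp = requests.post(
            API,
            headers={"apikey": "<your-key>"},  # hypothetical auth header
            files={"image": f},
            data={"language": "tha"},  # e.g. Thai, one of the new languages
        )
    resp.raise_for_status()
    print(resp.json().get("text", ""))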

https://dx-suite.com/global/lp/

Hitachi Solutions and Allganize partner on NLP, AI, RPA

Hitachi Solutions, Ltd. announced the release of “Business Efficiency Solution by Natural Language Processing (NLP) AI.” It combines Hitachi's and Allganize's expertise in Artificial Intelligence (AI) to provide business automation solutions such as Robotic Process Automation (RPA), NLP, and cognitive agents. Allganize's AI technology can extract structured knowledge from millions of documents and other content sources. Hitachi's “Katsubun Intellectual Information Mining” provides a set of tools and techniques to identify specific ways to improve business operations. The solution features:

  • Katsubun Intellectual Information Mining analyzes clients’ operational processes and enables them to identify bottlenecks for process improvement.
  • Alli AnswerBot streamlines customer service and automates business processes.
  • Cognitive Search enriches and extracts structured knowledge from unstructured data sources to make content more searchable.
  • Named Entity Recognition (NER) locates and classifies named entities such as brand names, people, product names, purchase order and account information from unstructured data sources.
  • Review Analysis automatically converts opinions such as user feedback and reviews into measurable data for analysis.
  • Text Classification assigns custom predefined categories to free text for managing content.
  • Sentiment Analysis categorizes text by emotional state, such as happy, angry, or sad (see the sketch after this list).
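
Allganize's own APIs are not shown in the announcement, so as a generic illustration of the NER and sentiment capabilities listed above, here is a sketch using open-source Hugging Face pipelines; the models and labels are stand-ins, not Allganize's.

    # Generic NER + sentiment illustration with open-source pipelines;
    # the default models stand in for Allganize's proprietary ones.
    from transformers import pipeline

    ner = pipeline("ner", aggregation_strategy="simple")
    sentiment = pipeline("sentiment-analysis")

    text = "Acme Corp. shipped purchase order 4711 to Jane Doe in Osaka."
    for ent in ner(text):
        print(ent["entity_group"], ent["word"])  # e.g. ORG Acme Corp.

    print(sentiment("The new dashboard is wonderful.")[0])  # label and score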

https://www.hitachi-solutions.co.jp/allganize/, https://allganize.ai

ReadSpeaker and SoundHound partner

ReadSpeaker, an independent digital voice partner for global businesses, announced it has partnered with SoundHound, Inc., a provider of voice AI and conversational intelligence technologies, to include ReadSpeaker's text-to-speech (TTS) technology on the Houndify Voice AI platform. Developers using Houndify will have the ability to add ReadSpeaker's hyper-personalized and lifelike voices to their custom voice assistants. This capability is critical because personalized text-to-speech voices give brands complete control over every aspect of their conversational UI. Houndify is an independent voice AI platform that provides a full stack of tools and technologies needed for brands to create custom voice assistants with speed and accuracy. By leveraging its proprietary Speech-to-Meaning and Deep Meaning Understanding technologies, Houndify enables voice assistants to understand even the most complex and compound queries using natural language understanding.
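
As a rough sketch of what a Houndify integration looks like from the developer side, here is a text query using the Houndify Python SDK; the class and response field names follow the public SDK as we understand it and should be treated as assumptions, and the client id and key are placeholders. A ReadSpeaker TTS voice would then speak the written response.

    # Text query against Houndify; client id/key are placeholders, and the
    # SDK class and response keys are assumptions based on the public SDK.
    import houndify

    client = houndify.TextHoundClient(
        "<client-id>", "<client-key>", "demo-user",
        {"Latitude": 52.09, "Longitude": 5.12},  # optional request info
    )
    response = client.query("What is the weather in Utrecht?")
    print(response["AllResults"][0]["WrittenResponse"])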

https://www.readspeaker.com, https://www.houndify.com/

Cambridge Quantum Computing advances ‘meaning-aware’ quantum natural language processing

Cambridge Quantum Computing (CQC) announced that it has built on earlier advances in “meaning-aware” Quantum Natural Language Processing (QNLP), establishing that QNLP is quantum-native, with expected near-term advantages over classical computers. Natural language processing (NLP) is at the forefront of advances in contemporary artificial intelligence, and it is arguably one of the most challenging areas of the field. “Meaning-aware” NLP remains a distant aspiration on classical computers. The steady growth of quantum hardware and notable improvements in the implementation of quantum algorithms mean we are approaching an era when quantum computers might perform tasks that classical computers cannot do repeatably with a reasonable amount of resources, and that are important and suitable for everyday use. In papers posted on arXiv, the scientific e-print repository, CQC's scientists provide conceptual and mathematical foundations for near-term QNLP in terms friendly to quantum computer scientists. The paper is written in an expository style with tools that provide mathematical generality.

Aiming to canonically combine linguistic meanings with rich linguistic structure, most notably grammar, Professor Bob Coecke (Oxford University) and his team have proven that a quantum computer can achieve “meaning-aware” NLP, thus establishing QNLP as quantum-native, on par with the simulation of quantum systems. Moreover, the leading Noisy Intermediate-Scale Quantum (NISQ) paradigm for encoding classical data on quantum hardware, variational quantum circuits, makes NISQ exceptionally QNLP-friendly.
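
To make that NISQ building block concrete, here is a toy numpy sketch of a variational quantum circuit: classical parameters (stand-ins for word meanings) become rotation angles, and a classical optimizer would tune them against a measured expectation value. This illustrates the idea only; it is not CQC's implementation.

    # Toy two-qubit variational circuit simulated with numpy.
    import numpy as np

    def ry(theta):
        # Single-qubit Y-rotation gate.
        c, s = np.cos(theta / 2), np.sin(theta / 2)
        return np.array([[c, -s], [s, c]])

    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)

    def circuit(theta0, theta1):
        # Apply RY(theta0) x RY(theta1) to |00>, then entangle with CNOT.
        state = np.zeros(4)
        state[0] = 1.0
        state = np.kron(ry(theta0), ry(theta1)) @ state
        return CNOT @ state

    # Expectation of Z on qubit 0: the scalar a variational loop would optimize.
    psi = circuit(0.3, 1.2)
    Z0 = np.kron(np.diag([1.0, -1.0]), np.eye(2))
    print(psi @ Z0 @ psi)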

https://cambridgequantum.com, https://arxiv.org/pdf/2012.03755.pdf

DataStax delivers new API stack

DataStax announced a new API stack for modern data apps. Stargate, an open-source API framework for data, first unveiled this summer, is now generally available in DataStax's Astra cloud database and as a free download on GitHub. The integration of Stargate into Astra enables developers to use any data store for modern data apps by adding support for new APIs, data types, and access methods. Developers no longer need to work with different databases and different APIs to power modern data apps. Developers can build data apps with:

  • Apache Cassandra: the open-source NoSQL database for managing data at global scale.
  • K8ssandra: an open-source distribution that enables elastic scale for data on Kubernetes.
  • Stargate: an open-source API framework that enables developers to use their choice of schemaless JSON, GraphQL, and REST APIs.

And benefit from:

  • Choice of APIs — Developers can use their choice of the REST API, GraphQL API or schemaless Document API to access data.
  • No modeling — By using the Document API, developers can store JSON objects in Astra without doing up-front modeling. Developers can easily prototype without having to pre-define schemas and queries (a sketch follows this list).
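
Here is a minimal sketch of that schemaless Document API: one HTTP call stores a JSON object in a collection, with no prior data modeling. The host and auth token are assumptions; the path follows Stargate's documented /v2/namespaces/.../collections/... scheme.

    # Store a JSON document through Stargate's Document API.
    # Host and token are placeholders for a running Stargate/Astra instance.
    import requests

    STARGATE = "http://localhost:8082"  # assumed Stargate endpoint
    TOKEN = "<auth-token>"              # obtained from Stargate's auth API

    resp = requests.post(
        f"{STARGATE}/v2/namespaces/myapp/collections/orders",
        headers={"X-Cassandra-Token": TOKEN, "Content-Type": "application/json"},
        json={"customer": "Ada", "items": [{"sku": "N-1", "qty": 2}], "paid": True},
    )
    resp.raise_for_status()
    print(resp.json()["documentId"])    # server-assigned id of the new document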

https://www.datastax.com/blog/2020/12/announcing-stargate-10-astra-rest-graphql-schemaless-json-your-cassandra-development


© 2025 The Gilbane Advisor
