Curated for content, computing, and digital experience professionals

Category: Content technology news

Curated information technology news for content technology, computing, and digital experience professionals. News items are edited to remove hype, unhelpful jargon, iffy statements, and quotes, to create a short summary — mostly limited to 200 words — of the important facts with a link back to a useful source for more information. News items are published using the date of the original source here and in our weekly email newsletter.

We focus on product news, but also include selected company news such as mergers and acquisitions and meaningful partnerships. All news items are edited by one of our analysts under the NewsShark byline.  See our Editorial Policy.

Note that we also publish news on X/Twitter. Follow us @gilbane

Northern Light adds content from consultants and researchers to knowledge management platform

Northern Light announced its SinglePoint knowledge management platform now contains a search index of reports published by “thought leaders” from more than 15 business strategy consulting firms, “think tanks” and non-governmental research organizations. Northern Light’s new Thought Leaders content collection features insights across a range of industries and strategy topics from firms such as Accenture, API, Bain, BCG, Capgemini, Cognizant, Deloitte, Ernst & Young, IBM Institute for Business Value, KPMG, MITRE, Pew Research Center, PwC, Tata Consultancy Services, World Economic Forum (WEF), and the World Health Organization (WHO). Initially, the collection contains approximately 20,000 market and technology reports; Northern Light expects to add an additional 1,000 new reports per month to the search index. The reports in the collection contain analysis, commentary, and forecasts of the trends in industries such as life sciences, healthcare, information technology, financial services, and consumer products.

Key topics, such as marketing to Millennials and post-Millennials, are covered in depth. The search index includes links to the reports on the thought leaders’ websites, and all of the reports in the collection are available to users. In addition to the new thought leaders content set, other content collections that can be accessed within SinglePoint include an organization’s own primary research, licensed secondary research, industry news, technology vendor white papers, conference abstracts, and various government and industry databases.

https://northernlight.com/

Vectorspace AI & CERN create Natural Language Processing datasets

Vectorspace AI and CERN, the European Organization for Nuclear Research and the largest particle physics laboratory in the world, are creating datasets used to detect hidden relationships between particles, with broad implications across multiple industries. These datasets can provide a significant increase in precision, accuracy, signal, or alpha for any company in any industry. Datasets are algorithmically generated based on formal Natural Language Processing/Understanding (NLP/NLU) models, including OpenAI’s GPT-3 and Google’s BERT, along with word2vec and other models built on top of vector space applications at Lawrence Berkeley National Laboratory and the US Dept. of Energy (DOE). Over 100 billion different datasets are available based on customized data sources, rows, columns, or language models.

For commercial use, datasets are priced at $0.99 per minute/update and $0.99 per data source, row, column, and context, with additional configurations and options available on a case-by-case SaaS/DaaS monthly subscription.

Vectorspace AI describes raw data as unrefined crude oil and its datasets as the refined “gasoline” powering Artificial Intelligence (AI) and Machine Learning (ML) systems. Datasets are real-time and designed to augment or append to existing proprietary datasets, such as gene expression datasets in life sciences or time-series datasets in the financial markets. Example customer and industry use cases include:

Particle Physics: Rows are particles. Columns are properties. Used to predict hidden relationships between particles.

Life Sciences: Rows are infectious diseases. Columns are approved drug compounds. Used to predict which approved drug compounds might be repurposed to fight an infectious disease such as COVID-19. Applications include processing 1,500 peer-reviewed scientific papers every 24 hours for real-time dataset production.

Financial Markets: Rows are equities. Columns are themes or global events. Used to predict hidden relationships between equities and global events. Applications include thematic investing and smart basket generation and visualization.
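The rows-and-columns structure behind these use cases can be sketched as a small matrix of similarity scores computed from vector embeddings. The sketch below is a hypothetical illustration only: the toy three-dimensional vectors, entity names, and theme names are assumptions standing in for real word2vec/BERT-style embeddings, not Vectorspace AI's actual pipeline or output format.

```python
from math import sqrt

# Hypothetical sketch: rows are equities, columns are themes, and each
# cell scores how strongly an entity's embedding relates to a theme.
# Toy 3-dimensional vectors stand in for real word2vec/BERT embeddings.
embeddings = {
    "EquityA": [0.9, 0.1, 0.3],
    "EquityB": [0.2, 0.8, 0.5],
}
themes = {
    "renewable_energy": [1.0, 0.0, 0.2],
    "supply_chain":     [0.1, 0.9, 0.4],
}

def cosine(u, v):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm

# Build the dataset: one row per equity, one column per theme.
dataset = {
    row: {col: round(cosine(vec, tvec), 3) for col, tvec in themes.items()}
    for row, vec in embeddings.items()
}

for row, cols in dataset.items():
    print(row, cols)
```

A high score in an unexpected cell is the kind of “hidden relationship” signal the datasets are meant to surface; real deployments would derive the vectors from large language models rather than hand-written numbers.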

Data provenance, governance, and security are addressed via the Dataset Pipeline Processing (DPP) hash blockchain and VXV utility token integration. Datasets are accessed via the VXV wallet-enabled API, where VXV is acquired and used as a utility token credit that trades on a cryptocurrency exchange.

https://vectorspace.ai

Serviceaide announces Luma 2.5

Serviceaide announced Luma Virtual Agent 2.5, in which Luma Knowledge is now integral. Luma 2.5 unifies Serviceaide’s AI-powered virtual agent with an enterprise knowledge repository, the Luma Enterprise Knowledge Hub, a combination that accelerates and elevates the self-service experience.

The Luma Virtual Agent leverages natural language processing and machine learning to create a conversational interface via voice, email, chat and other channels that understands and proactively guides users to the answers they seek or fulfills their requests through automated services. Luma’s automation and workflow engine can automate a wide range of IT and Enterprise Service Management tasks as diverse as provisioning a virtual machine, onboarding new employees, and handling facilities requests and HR changes. Among the knowledge-centered capabilities of the Luma Virtual Agent 2.5:

  • Serves up knowledge by forging an understanding between the data and the end requester.
  • Differentiates between user needs by disambiguating requests for service and knowledge, delivering the information or services that best fit the request. This can be done by exploring knowledge articles or reviewing actionable skills.
  • Closes the loop by leveraging machine learning to continuously improve knowledge delivery, feeding feedback on its usefulness back into the knowledge base.
  • Provides better answers by offering contextual suggestions for new content and more effective, timely responses.
  • Builds knowledge by gathering existing knowledge, highlighting gaps, and spotlighting where knowledge needs to be created or improved.

Microsoft announces SharePoint Syntex

From the Microsoft Project Cortex blog:

Microsoft announced SharePoint Syntex, the first product from Project Cortex. SharePoint Syntex uses advanced AI and machine teaching to amplify human expertise, automate content processing, and transform content into knowledge, and will be available to purchase for all Microsoft 365 commercial customers on October 1, 2020.

Machine teaching accelerates the creation of AI models by acquiring knowledge from people rather than from large datasets alone. Any information processing skill that an expert can teach a human should be easily teachable to a machine. SharePoint Syntex mainstreams machine teaching, enabling your experts to capture their knowledge about content in AI models they can build with no code. Your experts train SharePoint Syntex to understand content like they do, to recognize key information, and to tag content automatically. For example, a contract processing expert can teach SharePoint Syntex to extract the contract’s value, along with the expiration date and key terms and conditions.

SharePoint Syntex then uses your models to automate the capture, ingestion, and categorization of content, extracting valuable information as metadata. Metadata is critical to managing content, and seamless integration with Microsoft Search, Power Automate, and Microsoft Information Protection enables you to improve knowledge discovery and reuse, accelerate processes, and dynamically apply information protection and compliance policies.

SharePoint Syntex content center
Syntex introduces a new experience for managing content at scale, integrating metadata and workflow, and delivering compliance automation: the content center. Content centers supply capabilities to teach the cloud how to read and process documents the same way you would manually. SharePoint Syntex uses those insights to automatically recognize content, extract important information, and apply metadata tags, which accelerates processes, improves compliance, and facilitates knowledge discovery and reuse. SharePoint Syntex mainstreams AI to process three major types of content: digital images, structured or semi-structured forms, and unstructured documents.

Digital image processing
SharePoint Syntex can automatically tag images using a new visual dictionary with thousands of commonly recognized objects. In addition, SharePoint Syntex can recognize and convert extracted handwritten text into tags for search and further processing.

Document understanding
Most organizations generate vast amounts of unstructured documents such as manuals, contracts, or resumes. You can teach SharePoint Syntex to read your content the way you would using machine teaching to build AI models with no code. SharePoint Syntex can automatically suggest or create metadata, invoke custom Power Automate workflows, and attach compliance labels to enforce retention or record management policies. Document understanding models are based on Language Understanding models in Azure Cognitive Services.

Form processing
SharePoint Syntex includes a powerful form processing engine, based on AI Builder, that lets you automatically recognize and extract common values from semi-structured or structured documents, such as dates, figures, names, or addresses. These models are built with no code and only require a small number of documents for reliable results.
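Form processing of this kind, pulling dates, figures, and names out of semi-structured text as metadata, can be approximated with simple pattern extraction. The sketch below is a generic illustration under assumed field patterns and a made-up invoice; it is not how SharePoint Syntex or AI Builder actually model documents, which use trained AI models rather than hand-written rules.

```python
import re

# Hypothetical invoice text; the field patterns below are assumptions
# for illustration, not AI Builder's trained extraction models.
document = """
Invoice Number: INV-2020-0042
Invoice Date: 2020-09-22
Total Amount: $1,250.00
Billed To: Contoso Ltd.
"""

patterns = {
    "invoice_number": r"Invoice Number:\s*(\S+)",
    "invoice_date":   r"Invoice Date:\s*(\d{4}-\d{2}-\d{2})",
    "total_amount":   r"Total Amount:\s*\$([\d,]+\.\d{2})",
    "billed_to":      r"Billed To:\s*(.+)",
}

def extract_fields(text, patterns):
    """Return each field's first match as metadata, or None if absent."""
    fields = {}
    for name, pattern in patterns.items():
        match = re.search(pattern, text)
        fields[name] = match.group(1).strip() if match else None
    return fields

metadata = extract_fields(document, patterns)
print(metadata)
```

The appeal of the model-based approach the announcement describes is precisely that these brittle per-field rules are replaced by examples: a few labeled documents teach the model where the values live.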

https://techcommunity.microsoft.com/t5/project-cortex-blog/announcing-sharepoint-syntex/ba-p/1681139

Microsoft teams up with OpenAI to exclusively license GPT-3 language model

An edited version of the announcement from the Microsoft Blog:

Microsoft is teaming up with OpenAI to exclusively license GPT-3 – an autoregressive language model that outputs human-like text. GPT-3 is the largest and most advanced language model in the world, clocking in at 175 billion parameters, and is trained on Azure’s AI supercomputer. This allows us to leverage its technical innovations to develop and deliver advanced AI solutions for our customers, as well as create new solutions that harness the power of advanced natural language generation.

We see this as an opportunity to expand our Azure-powered AI platform in a way that democratizes AI technology, enables new products, services and experiences, and increases the positive impact of AI at Scale. We want to make sure that this AI platform is available to everyone – researchers, entrepreneurs, hobbyists, businesses – to empower their ambitions to create something new and interesting. The scope of commercial and creative potential that can be unlocked through the GPT-3 model is profound, with genuinely novel capabilities – most of which we haven’t even imagined yet. These include directly aiding human creativity and ingenuity in areas like writing and composition, describing and summarizing large blocks of long-form data (including code), and converting natural language to another language.

Realizing these benefits at true scale – responsibly, affordably and equitably – is going to require more human input and effort than any one large technology company can bring to bear. OpenAI will continue to offer GPT-3 and other powerful models via its own Azure-hosted API, launched in June. While we’ll be hard at work utilizing the capabilities of GPT-3 in our own products, services and experiences to benefit our customers, we’ll also continue to work with OpenAI to keep looking forward: leveraging and democratizing the power of their cutting-edge AI research as they continue on their mission to build safe artificial general intelligence.

https://blogs.microsoft.com/blog/2020/09/22/microsoft-teams-up-with-openai-to-exclusively-license-gpt-3-language-model/

HubSpot launches new features and updates at INBOUND 2020

HubSpot introduced new features and updates to help businesses meet the challenges brought on by the COVID-19 crisis and seize new opportunities in the post-pandemic world. The additions include an enterprise sales CRM, a scalable contacts pricing model, expanded personalization functionality, and more — giving companies greater ability to unify their marketing, sales, and service efforts and build a delightful customer experience. Today’s announcements include:

  • Sales Hub Enterprise — an enterprise sales CRM with custom objects, advanced permissions, and sophisticated reporting, as well as enhanced sales acceleration tools and configure-price-quote functionality.
  • Scalable pricing — a new pricing model that enables companies to only pay for the contacts they actively market to.
  • Upgrades to Marketing Hub, including advanced personalization tools, a new report builder, and custom objects.
  • Service Hub enhancements, such as logged-in visitor identification, help desk automation, and team management functionality, in addition to a multi-language knowledge base.
  • Improvements to the HubSpot ecosystem, including a redesigned Solutions Directory and new remote work integrations in the App Marketplace.

To learn more about the product announcements HubSpot made at INBOUND, please visit:

https://www.hubspot.com/new

ThoughtSpot launches ThoughtSpot Cloud for access to cloud data warehouses

ThoughtSpot announced the release of ThoughtSpot Cloud, a fully managed SaaS offering that gives business users the flexibility to glean instant insights from data in the cloud with search and AI-driven analytics. With ThoughtSpot Cloud, employees can access insights across all of their cloud data in a matter of minutes, helping organizations maximize their investments in cloud data warehouses like Amazon Redshift and Snowflake. Additional features include:

  • Personalized onboarding: Role-specific onboarding flows tailor the experience for users, accelerating their time to value.
  • Search assist: A digital assistant that provides a step-by-step guide for first-time users to aid in their initial search.
  • Prebuilt SpotApps: Reusable low-code templates to make getting insights from a particular application, like Salesforce, simple and scalable.
  • In-database benefits: Run queries directly in high-performance, zero-management, built-for-the-cloud data warehouses like Amazon Redshift and Snowflake.
  • Pricing: Pay only for the data consumed and analyzed, not for the number of users.

https://www.thoughtspot.com/cloud

MathWorks introduces Release 2020b of MATLAB and Simulink

MathWorks introduced Release 2020b of the MATLAB and Simulink product families. New capabilities in MATLAB simplify working with graphics and apps, and Simulink updates focus on expanded access and speed, including the launch of Simulink Online for access through web browsers. R2020b also introduces new products that build on artificial intelligence (AI) capabilities, speed up autonomous systems development, and accelerate creation of 3D scenes for automated driving simulation. More details are available in the Release 2020b video.

Among the hundreds of new and updated features, MATLAB adds new bubble and swarm charts, the ability to diff and merge App Designer apps with the MATLAB Comparison Tool, and customizable figure icons and components to MATLAB apps. Also, in addition to Simulink Online to view, edit, and simulate Simulink models through web browsers, R2020b adds the ability to generate code up to 2X faster for referenced model hierarchies in Simulink and includes new automerge functionality that helps automate continuous integration workflows.

https://www.mathworks.com/products/new_products/latest_features.html


© 2024 The Gilbane Advisor
