Curated for content, computing, and digital experience professionals


Curated information technology news for content technology, computing, and digital experience professionals. News items are edited to remove hype, unhelpful jargon, iffy statements, and quotes, to create a short summary — mostly limited to 200 words — of the important facts with a link back to a useful source for more information. News items are published using the date of the original source here and in our weekly email newsletter.

We focus on product news, but also include selected company news such as mergers and acquisitions and meaningful partnerships. All news items are edited by one of our analysts under the NewsShark byline. See our Editorial Policy.

Note that we also publish news on X/Twitter. Follow us @gilbane

W3C to become a public-interest non-profit organization

From the W3C…

The World Wide Web Consortium is set to pursue 501(c)(3) non-profit status. The launch as a new legal entity in January 2023 preserves the core mission of the Consortium to shepherd the web by developing open standards with contributions from W3C Members, staff, and the international community.

At the operational level, which is not changing, W3C Members are bound together for our technical work, united around the W3C’s mission to lead the web to its full potential by creating open standards that ensure that the web remains open, accessible, internationalized, secure, and interoperable for everyone around the globe.

We need a structure that lets us meet the demands of new web capabilities at a faster pace and address the urgent problems of the web. The W3C Team is small and bounded in size, and the Hosted model hinders rapid development and the acquisition of skills in new fields.

We need to put governance at the center of the new organization to achieve clearer reporting and accountability, greater diversity, strategic direction, and better global coordination. A Board of Directors will be elected with a W3C Member majority, and will include seats that reflect the multi-stakeholder goals of the Web Consortium. We anticipate continuing joint work with today’s Hosts in a mutually beneficial partnership.

As important as all these points are, they only represent a change to the shell around W3C. The proven standards development process must and will be preserved.

W3C processes promote fairness and enable progress. Our standards work will still be accomplished in the open, under the W3C Process Document and royalty-free W3C Patent Policy, with input from the broader community. Decisions will still be taken by consensus. Technical direction and Recommendations will continue to require review by W3C Members – large and small. The Advisory Board will still guide the community-driven enhancement of the Process Document. The Technical Architecture Group will continue as the highest authority on technical matters.

Our transition to launch the legal entity includes concrete stages – adoption of Bylaws; filing for 501(c)(3) non-profit status; election and seating of a Board of Directors – all to transfer staff, Member contracts, and operations to the new structure.

https://www.w3.org/2022/06/pressrelease-w3c-le.html.en

Tellius and Databricks partner to democratize data analysis

Tellius announced a partnership with Databricks to give joint customers the ability to run Tellius natural language search queries and automated insights directly on the Databricks Lakehouse Platform, powered by Delta Lake, without the need to move any data.

With Tellius, organizations can search and analyze their data to identify what is happening with natural language queries, understand why metrics are changing via AI-powered insights, and determine next best actions with deep insights and AutoML. Connecting to Delta Lake on Databricks takes only a few clicks; users can then run natural language searches over their unaggregated structured and unstructured data to answer their own questions. They can drill down to granular insights, use single-click AI analysis to uncover trends, key drivers, and anomalies in their data, and create predictive models via AutoML in Tellius. Answers and insights can be written back to source applications to operationalize them. Faster data collaboration helps democratize data access across analytics teams, with less concern about performance or IT maintenance.

https://www.tellius.com/tellius-and-databricks-partner-to-deliver-ai-powered-decision-intelligence-for-the-data-lakehouse/

TransPerfect GlobalLink CCMS upgrades Acrolinx Connector for Astoria

TransPerfect, a provider of language and technology solutions for global business, announced that its connector between the GlobalLink CCMS Astoria platform and Acrolinx has been upgraded and recertified by Acrolinx GmbH. This marks the latest milestone in the strategic relationship between TransPerfect and Acrolinx, which began in 2013.

Astoria is a SaaS platform for building, managing, and publishing XML content, and Acrolinx is a platform for content quality optimization. In the joint solution, Acrolinx analyzes Astoria client content for spelling, grammar, style, terminology, reuse, search-engine optimization, and simplified English. By calculating a quality score for content components that reside on Astoria servers, Acrolinx enables Astoria users to automate and unclog typical bottlenecks in the copy-editing operation. Benefits include localization savings – by reducing word count, simplifying sentence structure, and standardizing terminology – and editing efficiency improvements, by automating checks in seven critical areas, including style, spelling, and grammar.
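To illustrate the general idea behind rule-based content quality scoring, here is a toy sketch: run a handful of terminology and style checks over a text and aggregate the findings into a score. The rules, weights, and score scale here are invented for illustration and have nothing to do with Acrolinx's actual scoring model.

```python
import re

# Toy terminology rule: flagged words and their preferred replacements.
BANNED_TERMS = {"utilize": "use", "leverage": "use"}
# Toy simplified-English style rule: maximum words per sentence.
MAX_SENTENCE_WORDS = 25

def quality_score(text: str) -> int:
    """Return a 0-100 score: start at 100, subtract 10 points per issue."""
    issues = 0
    # Terminology check: count occurrences of flagged terms.
    for term in BANNED_TERMS:
        issues += len(re.findall(rf"\b{term}\b", text, re.IGNORECASE))
    # Style check: penalize overly long sentences.
    for sentence in re.split(r"[.!?]+", text):
        if len(sentence.split()) > MAX_SENTENCE_WORDS:
            issues += 1
    return max(0, 100 - 10 * issues)

print(quality_score("We utilize tools to leverage synergy."))  # → 80
```

A real checker would add spelling, grammar, reuse, and SEO checks in the same pattern: each check contributes issues, and the aggregate drives the component's score.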

The Astoria/Acrolinx integration includes both English and non-English content, enabling users to enter and service new markets with higher quality content in new languages faster and more efficiently.

https://www.transperfect.com
http://www.globallinkccms.com

Sinequa adds neural search to Search Cloud

Enterprise search provider Sinequa announced the addition of advanced neural search capabilities to its Search Cloud Platform, delivering better relevance and accuracy to enterprises. An optional capability of the platform, Neural Search uses four deep learning language models. These models are pre-trained and ready to use in combination with Sinequa’s Natural Language Processing (NLP) and semantic search.

Sinequa optimized the models and collaborated with the Microsoft Azure and NVIDIA AI/ML teams to deliver a high performance, cost-efficient infrastructure to support intensive Neural Search workloads without a huge carbon footprint. Neural Search is optimized for Microsoft Azure and the latest NVIDIA A10 or A100 Tensor Core GPUs to efficiently process large amounts of unstructured data as well as user queries.

Sinequa’s Neural Search improves relevance and is often able to directly answer natural language questions. It does this with deep neural nets that go beyond word-based search to better leverage meaning and context. Sinequa’s Search Cloud platform combines neural search with its extensive NLP and statistical search. This unified approach provides more accurate and comprehensive search results across a broader range of content and use cases.
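The core idea behind neural (vector) search can be sketched in a few lines: documents and queries are mapped to dense embeddings, and relevance is measured by vector similarity rather than word overlap, so a query can match a document that shares none of its words. The three-dimensional hand-made "embeddings" below stand in for the output of a deep language model; this is a conceptual toy, not Sinequa's implementation.

```python
from math import sqrt

# Hand-made stand-in embeddings; a real system would produce these with a
# pre-trained deep language model.
DOCS = {
    "reset your password": [0.9, 0.1, 0.0],
    "annual revenue report": [0.0, 0.2, 0.9],
    "change login credentials": [0.8, 0.2, 0.1],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

def search(query_vec):
    """Rank documents by embedding similarity, best match first."""
    return sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)

# A query embedded near the "account access" region of the space matches
# "change login credentials" even with zero word overlap with the query text.
print(search([0.8, 0.2, 0.1])[0])  # → change login credentials
```

Production systems combine this semantic signal with statistical (keyword) relevance, which is the "unified approach" the announcement describes.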

https://www.sinequa.com/product-enterprise-search/neural-search/

Snowflake launches Unistore

Snowflake announced the launch of Unistore, a new workload that expands the capabilities of Snowflake and delivers a modern approach to working with transactional and analytical data together in a single platform. Unistore extends the Snowflake Data Cloud to streamline and simplify the development of transactional applications, while providing consistent governance, performance, and scale to customers.

Transactional and analytical data have typically been siloed, creating complexities when moving data between systems and hindering the speed required for modern development. With Unistore, teams can expand the Data Cloud to include transactional use cases such as application state and data serving. As a part of Unistore, Snowflake is introducing Hybrid Tables, which offer fast single-row operations and allow customers to build transactional business applications directly on Snowflake. Hybrid Tables, currently in private preview, enable customers to perform swift analytics on transactional data for immediate context, and join Hybrid Tables with existing Snowflake Tables for a holistic view across all data. Unistore and Hybrid Tables enable customers to build transactional applications with the same simplicity and performance they’re used to with Snowflake, and a unified approach to data governance and security.
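The hybrid transactional/analytical idea behind Unistore can be illustrated with a toy data structure: a single table that serves both fast single-row operations (via a primary-key index) and analytical scans over all rows, so there is no separate system to move data between. This sketch is purely conceptual and is not Snowflake's implementation or API.

```python
class HybridTable:
    """Toy table supporting both point operations and analytical scans."""

    def __init__(self):
        self._rows = {}  # primary-key index for single-row access

    def upsert(self, key, row):
        """Transactional write: insert or update a single row by key."""
        self._rows[key] = row

    def get(self, key):
        """Fast point lookup by primary key."""
        return self._rows.get(key)

    def aggregate(self, column):
        """Analytical scan: total a column across all rows."""
        return sum(r[column] for r in self._rows.values())

orders = HybridTable()
orders.upsert(1, {"customer": "acme", "amount": 120})
orders.upsert(2, {"customer": "bolt", "amount": 80})
orders.upsert(1, {"customer": "acme", "amount": 150})  # update in place
print(orders.get(1)["amount"], orders.aggregate("amount"))  # → 150 230
```

The point of the hybrid design is that the update and the aggregate run against the same store, giving analytics immediate context on transactional state.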

https://www.snowflake.com/blog/introducing-unistore/

Adobe announces new Adobe Analytics services

Adobe announced new services in Adobe Analytics, delivering a single workspace for brands to unify data and insights across all media types. Adobe also introduced a new service to transition data from other analytics products while preserving historical compliance with regulations such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA).

Streaming media: Adobe is introducing new capabilities for brands to understand how streaming fits into the overall customer journey. Through Customer Journey Analytics (CJA), teams can tie digital media consumption to engagement on other channels like social media, websites and offline channels.

Seamlessly bring data together: With the bulk data insertion API now available, teams can move or activate any volume of historical data into Adobe Analytics. It covers any online or offline channel, allowing brands to transition data sources from point-of-sale devices, CRM systems and mobile applications.

Intelligent data mapping: Adobe Analytics is providing flexibility for brands to bypass the data migration preparation work while avoiding data destruction. As data comes through, Adobe Analytics preserves the underlying structure, and also suggests new ways to measure the customer journey. Brands can also retroactively apply dimensions to historical data, such as new attribution models.

https://news.adobe.com/news/news-details/2022/Next-Generation-Adobe-Analytics-Delivers-Customer-Insights-From-Streaming-Media-and-the-Metaverse/default.aspx

Canto unveils Media Delivery Cloud

Canto, a provider of digital asset management (DAM) software, released Media Delivery Cloud, a new solution that enables customers to directly connect images from their Canto library to their website, e-commerce platform, and other content distribution platforms. With Media Delivery Cloud, companies can deliver images in real time at a global scale – reducing duplicate work between creative and web teams, eliminating the need to create and store duplicate assets, and optimizing web load times.

By publishing assets directly to e-commerce and web, Media Delivery Cloud enables brands to accelerate their digital asset supply chain and ensure consistency across markets. Media Delivery Cloud enables brands to:

  • Automate publication of digital assets directly from your Canto library to your website or e-commerce platform
  • View locally hosted content from servers close to users, with faster page load times and a better end-user experience
  • Remove duplication and cut down on storage costs by displaying a single asset in different formats
  • Auto-resize and crop imagery in the formats needed, removing the burden on creative teams

https://canto.com/product/media-delivery-cloud/

CAI releases digital content provenance tools

The Content Authenticity Initiative (CAI) released a suite of open-source developer tools implementing the Coalition for Content Provenance and Authenticity (C2PA) specification released earlier this year. The tools enable a broad developer community to integrate content provenance into web, desktop, or mobile projects, regardless of their location or level of familiarity with the comprehensive C2PA technical specification. The three tools are:

  • JavaScript SDK – This UI toolkit includes everything developers need to create rich, browser-based experiences displaying content credentials.
  • C2PA Tool – Developers can install this utility to create, verify, and explore content credentials on their command line, or wrap it into a service to quickly equip their processes to interact with content provenance.
  • Rust SDK – Developers can build custom applications across desktop, mobile, and services that create, verify, and display content credentials directly via our powerful Rust library.

CAI Releases Suite of Open-Source Tools to Advance Digital Content Provenance


© 2024 The Gilbane Advisor
