Curated for content, computing, and digital experience professionals

Category: Enterprise software & integration

Algolia acquires Search.io

Algolia, an API-first search and discovery platform, announced the acquisition of Search.io, whose flagship product is Neuralsearch, a vector search engine that applies hashing technology on top of vectors to provide price-performance at scale. Algolia will combine its keyword search with Search.io's Neuralsearch into a single API-first search and discovery platform built on a hybrid search engine that offers both keyword and semantic search through a single API.

The combination of Algolia's keyword search and Search.io's vector-based semantic search enables Algolia to more effectively surface the most accurate and relevant results, whether users type specific keywords or natural human expressions. Many companies claim to offer some form of semantic search, but few provide keyword search and vector-based semantic search in a single API, cost-effectively and at scale. In essence, Algolia lets users search as they think. With Search.io, Algolia aims to empower business users with a better way to automate unique and engaging end-user experiences.
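Conceptually, a hybrid engine blends a keyword relevance score with a vector similarity score for each result. A minimal sketch of that blending, with an illustrative weighting scheme (this is not Algolia's actual API or ranking formula):

```typescript
// Cosine similarity between two embedding vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Blend a keyword score (e.g. a BM25-style score normalized to [0, 1])
// with a semantic score; alpha controls the keyword/semantic balance.
function hybridScore(keywordScore: number, semanticScore: number, alpha = 0.5): number {
  return alpha * keywordScore + (1 - alpha) * semanticScore;
}
```

A query of specific keywords can then rank well on the first term while a natural-language query ranks on the second, which is the point of serving both behind one API.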

https://www.algolia.com/about/news/algolia-disrupts-market-with-search-io-acquisition-ushering-in-a-new-era-of-search-and-discovery/

Apollo GraphQL & MongoDB create stack for app developers

Apollo GraphQL and MongoDB, Inc., announced a technology partnership that helps app developers build richer experiences faster, and reduce technical debt with a graph-native data layer. The partnership makes it easier for developers and teams to directly connect any supergraph powered by Apollo to a MongoDB Atlas database. Together, an Apollo supergraph and MongoDB Atlas create a composable and scalable GraphQL data layer. It provides developers with everything they need to efficiently use GraphQL:

  • A unified API, so app developers can rapidly create new experiences
  • A modular API layer, so each team can independently own their slice of the graph
  • A seamless, high-performance, flexible data layer that scales alongside API consumption

MongoDB’s flexible database paired with the GraphQL query language allows developers to work with the database in the language of their choice with a standardized spec that has large community adoption. With the nested document model, developers can model and query data intuitively without the complexity of mapping GraphQL to relational data and defining relationships across tables. When used with MongoDB Atlas’s multi-region and multi-cloud capabilities, an Apollo supergraph gives its developers a GraphQL layer to create end-user experiences for their apps and services.
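The nested-document point can be seen in a resolver: a GraphQL field that would require a join across relational tables maps directly to a field on one document. A dependency-free sketch, where the in-memory Map stands in for a MongoDB Atlas collection and the schema names are hypothetical:

```typescript
// A product document with reviews nested inside it, as it would be
// stored in a single MongoDB document (no join table needed).
interface Product {
  _id: string;
  name: string;
  reviews: { author: string; rating: number }[];
}

// Stand-in for a MongoDB Atlas collection.
const products = new Map<string, Product>([
  ["p1", { _id: "p1", name: "Desk Lamp", reviews: [{ author: "ana", rating: 5 }] }],
]);

// GraphQL-style resolvers: the nested `reviews` field resolves from the
// parent document itself, with no cross-table relationship to define.
const resolvers = {
  Query: {
    product: (_parent: unknown, args: { id: string }) => products.get(args.id),
  },
  Product: {
    reviews: (parent: Product) => parent.reviews,
  },
};
```

In a real supergraph these resolvers would be registered with Apollo Server and the Map replaced by a MongoDB driver or Atlas connection; the resolver shape stays the same.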

https://www.apollographql.com
https://www.mongodb.com

MariaDB and MindsDB collaborate on machine learning

MariaDB Corporation and MindsDB, a provider of in-database machine learning tools, announced a technology collaboration that makes machine learning predictions easy and accessible to cloud database users. By using MindsDB in SkySQL, MariaDB's fully managed cloud database service, data science and data engineering teams can increase their organization's predictive capabilities to plan for and address business issues. MariaDB database users can now add machine-learning-based predictions directly to their datasets stored in SkySQL. This simplifies the task of analyzing and predicting future trends, putting machine learning capabilities into the hands of MariaDB users, whatever their role. Use cases for business predictions cut across every business function, including finance, sales, risk analysis, logistics, operations, and marketing.
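MindsDB's approach exposes models as virtual tables that are created and queried with plain SQL. A sketch of the pattern, with hypothetical table, column, and model names (the exact predictor syntax may differ by MindsDB version):

```sql
-- Train a model from data already in SkySQL (names are illustrative).
CREATE PREDICTOR mindsdb.churn_model
FROM skysql_data (SELECT * FROM customers)
PREDICT churned;

-- Query the model like an ordinary table to get a prediction.
SELECT churned
FROM mindsdb.churn_model
WHERE plan = 'basic' AND monthly_usage = 42;
```

Because both statements are ordinary SQL, any MariaDB user who can write a query can request a prediction, which is what puts these capabilities "into the hands of MariaDB users, whatever their role."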

https://mindsdb.com
https://mariadb.com/products/skysql/

Komprise automates unstructured data discovery with Smart Data Workflows

Komprise announced Komprise Smart Data Workflows, a systematic process to discover relevant file and object data across cloud, edge and on-premises datacenters and feed data in native format to AI and machine learning (ML) tools and data lakes.

Komprise has expanded Deep Analytics Actions to include copy and confine operations based on Deep Analytics queries, added the ability to execute external functions (such as natural language processing) via API, and expanded global tagging and search to support these workflows. Komprise Smart Data Workflows let you define and execute a process with as many of these steps as needed, in any sequence, including external functions at the edge, in the datacenter, or in the cloud. Together, the Komprise Global File Index and Smart Data Workflows reduce the time it takes to find, enrich, and move the right unstructured data, and they are relevant across many sectors, including pharmaceuticals.
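The workflow idea (query an index, enrich the matches with tags, then act on them) can be sketched as a simple step pipeline. All names below are hypothetical illustrations, not Komprise's API:

```typescript
// Minimal file-metadata record, as a global file index might return it.
interface FileMeta {
  path: string;
  extension: string;
  tags: string[];
}

// A workflow step transforms a batch of files (filtering, tagging, etc.).
type Step = (batch: FileMeta[]) => FileMeta[];

// Run steps in sequence, feeding each step's output to the next.
function runWorkflow(batch: FileMeta[], steps: Step[]): FileMeta[] {
  return steps.reduce((current, step) => step(current), batch);
}

// Example steps: select imaging files, then tag them for an ML pipeline.
const selectImages: Step = (batch) => batch.filter((f) => f.extension === "dcm");
const tagForMl: Step = (batch) =>
  batch.map((f) => ({ ...f, tags: [...f.tags, "ml-ready"] }));
```

A "move to data lake" or "call external NLP function" step would slot into the same pipeline, which is why the steps can run in any sequence.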

https://www.komprise.com/komprise-automates-unstructured-data-discovery-with-smart-data-workflows/

Cloudflare and open source community to create new API standards

Cloudflare, Inc. announced that it is collaborating with Deno and individual core contributors of the Node.js open source project, bringing together three of the largest JavaScript environments, to give developers flexibility and choice while creating the standards of the future of edge computing. By collaborating around a common set of standards, the effort will aim to ensure code developed in one environment will work in another.

The Web-interoperable Runtimes Community Group (or “WinterCG”) is working with organizations including NearForm and Vercel to ensure that developers’ voices are heard in the creation of a new community group working within existing standards bodies. The API standards will allow developers to:

  • Use the best tool or framework for the job: Shared standards make it easier to leverage tools and integrations from the community across runtimes.
  • Write server-side code uniformly: Removing platform-specific nuances, and the need to learn multiple platforms, lets developers focus on functionality and ship better code.
  • Move applications as technology needs change: As application needs evolve over time, teams can add or switch vendors without massive rewrites.
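The practical payoff is code written only against web-interoperable APIs (Request, Response, URL, Headers), which should run unchanged on Cloudflare Workers, Deno, or Node.js 18+. A minimal handler sketch under that assumption:

```typescript
// Uses only WHATWG-standard APIs, so it is not tied to any one runtime:
// the same function can be wired into Workers, Deno, or Node.
function handleRequest(req: Request): Response {
  const url = new URL(req.url);
  const name = url.searchParams.get("name") ?? "world";
  return new Response(`hello, ${name}`, {
    status: 200,
    headers: { "content-type": "text/plain" },
  });
}
```

Only the few lines that register this handler with a particular runtime would differ per platform, which is exactly the portability the group is standardizing.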

https://blog.cloudflare.com/introducing-the-wintercg/

Netlify Edge Functions accelerate web development at the edge

Netlify, a platform for modern web development, announced Netlify Edge Functions, bringing standards-based edge compute to Netlify’s development workflow. Developers can now build fast web experiences in less time, using Edge Functions to run dynamic content or even an entire application from the network edge without compromising performance. Built on Deno, an open source runtime, Edge Functions work out-of-the-box with new server-side features from existing web frameworks like Next.js, Nuxt, Astro, Eleventy, and SvelteKit as well as new edge-first frameworks like Hydrogen and Remix.

The recent macrotrend of edge computing has led to a wave of innovation at the network edge, but many of these new solutions are proprietary, don’t use popular programming languages, and don’t offer integrations with multiple web frameworks. As a result, edge compute has added substantial complexity to the software development lifecycle. Netlify Edge Functions were built to be an antidote, letting development teams avoid this tradeoff and, ultimately, deliver modern web experiences to market much faster.

Netlify’s suite of serverless capabilities – Netlify Functions, Background Functions, Scheduled Functions, and now Edge Functions – gives developers the flexibility to apply compute where and when they need it. Netlify Edge Functions is now available in public beta.

https://www.netlify.com/blog/announcing-serverless-compute-with-edge-functions

Google announces BigLake, to unify data lakes and data warehouses across clouds

From the Google Cloud blog…

The volume of valuable data that organizations have to manage and analyze is growing at an incredible rate. This data is increasingly distributed across many locations, including data warehouses, data lakes, and NoSQL stores.

Today, we’re excited to announce BigLake, a storage engine that allows you to unify data warehouses and lakes. BigLake gives teams the power to analyze data without worrying about the underlying storage format or system, and eliminates the need to duplicate or move data, reducing cost and inefficiencies. With BigLake, users gain fine-grained access controls, along with performance acceleration across BigQuery and multicloud data lakes on AWS and Azure. BigLake also makes that data uniformly accessible across Google Cloud and open source engines with consistent security. BigLake enables you to:

  • Extend BigQuery to multicloud data lakes and open formats such as Parquet and ORC with fine-grained security controls, without needing to set up new infrastructure.
  • Keep a single copy of data and enforce consistent access controls across analytics engines of your choice, including Google Cloud and open-source technologies such as Spark, Presto, Trino, and TensorFlow.
  • Achieve unified governance and management at scale through seamless integration with Dataplex.
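The "without needing to set up new infrastructure" point comes down to DDL: a BigLake table is declared in BigQuery over files that stay in object storage. A sketch of the pattern, where the project, connection, dataset, and bucket names are placeholders:

```sql
-- Define a BigLake table over Parquet files in object storage; access
-- is then governed through BigQuery rather than at the bucket level.
CREATE EXTERNAL TABLE analytics.events
WITH CONNECTION `my-project.us.my-lake-connection`
OPTIONS (
  format = 'PARQUET',
  uris = ['gs://my-bucket/events/*.parquet']
);
```

The data itself is never copied into the warehouse, which is how BigLake keeps a single copy while still applying fine-grained access controls.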

https://cloud.google.com/blog/products/data-analytics/unifying-data-lakes-and-data-warehouses-across-clouds-with-biglake

Airbyte Cloud now available in U.S.

Airbyte, creators of an open-source data integration platform, made its cloud service for data movement and unifying data integration pipelines available in the U.S. Airbyte Cloud’s pricing model is based on compute time, which can be less expensive than the industry-norm volume-based pricing that becomes cost-prohibitive when replicating high volumes of data.

Airbyte’s open-source data integrations focus on solving two problems: first, companies have to build and maintain data connectors on their own, because most less popular “long tail” data connectors are not supported by closed-source ELT technologies; second, data teams often have to do custom work around pre-built connectors to make them fit their unique data infrastructure. In addition to providing hosting and management, Airbyte Cloud enables companies to have multiple workspaces and provides access management for their teams.

The company also announced cooperation with open-source maintainers within its user community: Airbyte will compensate contributors who deliver new features, find software bugs, and provide fixes for its continuously growing list of data connectors.
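What makes community-built connectors feasible is that Airbyte connectors speak a simple JSON protocol: each row a source reads is emitted as a RECORD message, one JSON object per line on standard output. A simplified sketch of that message shape (the stream and fields here are illustrative):

```typescript
// Simplified shape of an Airbyte protocol RECORD message.
interface AirbyteRecordMessage {
  type: "RECORD";
  record: {
    stream: string;                // which stream the data belongs to
    data: Record<string, unknown>; // the row itself, as JSON
    emitted_at: number;            // epoch milliseconds
  };
}

// Wrap a row read from a source into a protocol message.
function toRecordMessage(
  stream: string,
  data: Record<string, unknown>,
): AirbyteRecordMessage {
  return { type: "RECORD", record: { stream, data, emitted_at: Date.now() } };
}

// A connector would print each message as one JSON line on stdout.
function serialize(message: AirbyteRecordMessage): string {
  return JSON.stringify(message);
}
```

Because any program that emits these lines can act as a source, long-tail connectors can be written in any language and maintained by the community.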

https://airbyte.com


© 2025 The Gilbane Advisor
