News
Databricks announced that it would open source Delta Lake, covering both the file format and protocol as well as the Spark SQL implementation of them.
Databricks is open sourcing Delta Lake to counter criticism from rivals and take on Apache Iceberg as well as data warehouse products from Snowflake, Starburst, Dremio, Google Cloud, AWS, Oracle ...
Databricks, the company founded by the creators of Apache Spark, today announced the creation of Delta Lake, a tool for building more reliable data lakes.
Delta Lake, Ghodsi said, is essentially the data layer of the lakehouse pattern. It brings ACID transactions, scalable metadata handling, and data versioning to data lakes.
Databricks unveiled a new edition of its Delta Lake data storage format Wednesday that the data lakehouse platform developer says eliminates data silos through its new Universal Format capability.
Databricks, which had faced criticism of running a closed lakehouse, is open sourcing most of the technology behind Delta Lake, including its APIs, with the launch of Delta Lake 2.0. That was one of a ...
As companies’ data appears in cloud storage from different sources, Databricks Ingest automatically pulls this new data efficiently into Delta Lake.
Big data firm Databricks Inc. wants to clean up companies’ messy data lakes with a new open-source project. Delta Lake, as the project is called, acts similarly to a regular data lake but ...
Databricks announces the launch of an open source project called Delta Sharing at its Data + AI Summit, with the backing of data providers and business intelligence tool suppliers.
“By making Delta Lake open source, developers will be able to easily build reliable data lakes and turn them into ‘Delta Lakes’,” said Ali Ghodsi, co-founder and CEO at Databricks.