
The state of data quality in 2020

O'Reilly Media - Ideas

Those suspicions were confirmed when we quickly received more than 1,900 responses to our mid-November survey request. The responses show a surfeit of concerns around data quality and some uncertainty about how best to address those concerns. Key survey results: The C-suite is engaged with data quality.


Core technologies and tools for AI, big data, and cloud computing

O'Reilly Media - Ideas

This concurs with survey results we plan to release over the next few months. In a forthcoming survey, “Evolving Data Infrastructure,” we found strong interest in machine learning (ML) among respondents across geographic regions. Companies are embracing AI and data technologies in the cloud.



The Modern Data Stack: What It Is, How It Works, Use Cases, and Ways to Implement

Altexsoft

Additionally, this modularity can help prevent vendor lock-in, giving organizations more flexibility and control over their data stack. Many components of a modern data stack (such as Apache Airflow, Kafka, Spark, and others) are open source, free to use, and actively supported by their communities.
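As a rough illustration of how one such open-source component slots into a modern data stack, here is a minimal, hypothetical Apache Airflow DAG that chains an extract step and a load step; the DAG name, task logic, and schedule are illustrative assumptions, not details from the article.

    # Hypothetical Airflow DAG sketch: two Python tasks chained into a daily pipeline.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull raw records from a source system (details assumed).
        return ["record-1", "record-2"]

    def load():
        # Placeholder: write transformed records to a warehouse (details assumed).
        pass

    with DAG(
        dag_id="example_modern_stack_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",  # newer Airflow releases use the "schedule" argument
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # run extract before load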


The Good and the Bad of Apache Spark Big Data Processing

Altexsoft

Maintained by the Apache Software Foundation, Apache Spark is an open-source, unified engine designed for large-scale data analytics. With its native support for in-memory distributed processing and fault tolerance, Spark empowers users to build complex, multi-stage data pipelines with relative ease and efficiency.
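To make the "multi-stage pipeline" point concrete, here is a minimal PySpark sketch; the file path, schema, and column names are assumptions for illustration only. Each transformation stage is lazy, and Spark keeps intermediate results in memory once an action triggers the job.

    # Minimal PySpark sketch of a multi-stage pipeline; paths and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-pipeline").getOrCreate()

    # Stage 1: read raw data (hypothetical file and schema).
    orders = spark.read.csv("orders.csv", header=True, inferSchema=True)

    # Stage 2: filter rows and derive a column; nothing executes yet.
    completed = (
        orders.filter(F.col("status") == "completed")
              .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
    )

    # Stage 3: aggregate per country and cache the result in memory for reuse.
    revenue_by_country = (
        completed.groupBy("country")
                 .agg(F.sum("revenue").alias("total_revenue"))
                 .cache()
    )

    revenue_by_country.show()  # action that runs the whole multi-stage pipeline
    spark.stop()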


The Good and the Bad of Docker Containers

Altexsoft

Docker is an open-source containerization platform used to create, deploy, and manage applications in isolated containers. Launched in 2013 as an open-source project, Docker built on existing containerization concepts, specifically features of the Linux kernel.
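For a sense of how containers are created and managed programmatically, here is a small sketch using the Docker SDK for Python (the "docker" package); the image name and port mapping are illustrative assumptions, and a local Docker daemon must be running.

    # Small sketch using the Docker SDK for Python; image and ports are illustrative.
    import docker

    client = docker.from_env()  # connect to the local Docker daemon

    # Create and start a container from the official nginx image,
    # mapping host port 8080 to container port 80.
    container = client.containers.run("nginx:latest", detach=True, ports={"80/tcp": 8080})
    print(container.name, container.status)

    # Manage the container's lifecycle: stop it and remove it when done.
    container.stop()
    container.remove()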