
Hortonworks New Distribution Strategy and New Streaming Analytics

CTOvision

This week, Hortonworks announced a comprehensive strategy with new product advancements across its Connected Data Platforms, including Hortonworks Data Platform (HDP™) and Hortonworks DataFlow (HDF™), as well as the announcement of HDF 1.2. Progress Report: Bringing Erasure Coding to Apache Hadoop (cloudera.com).


The Good and the Bad of Hadoop Big Data Framework

Altexsoft

The Hadoop Distributed File System (HDFS) is a data storage technology designed to handle gigabytes, terabytes, or even petabytes of data. By default, HDFS splits data into 128 MB blocks, except for the last block of a file, which can be smaller. HDFS is based on the write-once, read-many-times principle. What is Hadoop? Source: Allied Market Research.
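As a rough illustration of the block-size and write-once behavior described above, the sketch below uses Hadoop's Java FileSystem API to read the configured default block size and write a small file; the output path /tmp/example.txt and the class name are hypothetical, and the snippet assumes a reachable HDFS configured via the default Configuration.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsBlockSizeExample {
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml / hdfs-site.xml from the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // dfs.blocksize defaults to 128 MB in recent Hadoop releases;
        // the last block of a file may be smaller than this value.
        long blockSize = fs.getDefaultBlockSize(new Path("/"));
        System.out.println("Default block size: " + blockSize + " bytes");

        // Write-once: the file is streamed in and then closed; HDFS does not
        // support in-place updates afterwards.
        Path out = new Path("/tmp/example.txt"); // hypothetical path
        try (FSDataOutputStream stream = fs.create(out)) {
            stream.writeUTF("hello hdfs");
        }
        fs.close();
    }
}
```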