Skills that matter - Hadoop

The world around us is changing. Traditional approaches to doing business are giving way to modern methods. Most transactions today take place over the internet, and e-commerce has steadily risen to prominence.

Pig Latin Operators in Hadoop

Pig Latin has a simple syntax with powerful semantics that we will use to carry out two primary operations: accessing and transforming data.
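As a quick, assumed illustration of those operators, the sketch below runs LOAD, FILTER, GROUP, FOREACH and STORE through Pig's embedded Java API (PigServer). The file name, field names and output path are made up for this example and are not taken from the article.

```java
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

public class PigLatinOperatorsSketch {
    public static void main(String[] args) throws Exception {
        // Run Pig Latin locally; use ExecType.MAPREDUCE against a real cluster.
        PigServer pig = new PigServer(ExecType.LOCAL);

        // LOAD: access the data (path and schema are hypothetical).
        pig.registerQuery("logs = LOAD 'access_log.txt' AS (user:chararray, bytes:long);");

        // FILTER, GROUP, FOREACH: transform the data.
        pig.registerQuery("big = FILTER logs BY bytes > 1024;");
        pig.registerQuery("per_user = GROUP big BY user;");
        pig.registerQuery("totals = FOREACH per_user GENERATE group, SUM(big.bytes);");

        // STORE: write the result out to a (hypothetical) output directory.
        pig.store("totals", "totals_out");
    }
}
```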

Hadoop integration with R

Developers and programmers continue to explore approaches for leveraging the distributed computation of MapReduce and the almost limitless storage capacity of HDFS in an intuitive way that can be exploited from R.

Pig Architecture and Application Flow in Hadoop

“Simple” often reads as “elegant” when it comes to those remarkable architectural drawings for that new Silicon Valley mansion we have planned for when the money starts rolling in after we implement Hadoop.

Managing files with Hadoop File System Commands

HDFS is one of the two main components of the Hadoop framework; the other is the computational paradigm known as MapReduce.
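To give a feel for what HDFS file management looks like from code, here is a minimal Java sketch using the org.apache.hadoop.fs.FileSystem API. The paths are hypothetical, and the configuration is assumed to be picked up from a core-site.xml on the classpath.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsFileOps {
    public static void main(String[] args) throws Exception {
        // Reads fs.defaultFS and other settings from the cluster configuration.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Create a directory and copy a local file into it (paths are illustrative).
        Path dir = new Path("/user/demo/input");
        fs.mkdirs(dir);
        fs.copyFromLocalFile(new Path("data/sample.txt"), dir);

        // List the directory contents, much like the `hdfs dfs -ls` shell command.
        for (FileStatus status : fs.listStatus(dir)) {
            System.out.println(status.getPath() + "\t" + status.getLen());
        }
        fs.close();
    }
}
```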

Data Replication in Hadoop: Replicating Data Blocks (Part – 1)

In HDFS, the data block size needs to be large enough to warrant the resources dedicated to an individual unit of data processing.
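As a rough sketch of how block size and replication surface in the Java API, the snippet below reads the cluster defaults and raises the replication factor for a single file. The path and the factor of 5 are illustrative assumptions, not values from the article.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Typical defaults are 128 MB blocks and 3 replicas; the path is hypothetical.
        Path file = new Path("/user/demo/input/sample.txt");
        System.out.println("Default block size:  " + fs.getDefaultBlockSize(file));
        System.out.println("Default replication: " + fs.getDefaultReplication(file));

        // Raise the replication factor for this one file, e.g. for a frequently read dataset.
        fs.setReplication(file, (short) 5);
        fs.close();
    }
}
```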

Hadoop Java API for MapReduce

Hadoop went through a major API change in its 0.20 release, and that newer API forms the basis of the interface in the 1.0 version.
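To give a flavour of that newer org.apache.hadoop.mapreduce interface, here is a minimal word-count mapper written against it; the class and variable names are our own and not taken from the article.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// New-style (org.apache.hadoop.mapreduce) mapper: emits (word, 1) for every token in a line.
public class WordCountMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        for (String token : value.toString().split("\\s+")) {
            if (!token.isEmpty()) {
                word.set(token);
                context.write(word, ONE);
            }
        }
    }
}
```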

Input Splits and Key-Value Terminologies for MapReduce

As we already know, in Hadoop files are composed of individual records, which are ultimately processed one by one by mapper tasks.
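As a small, assumed example of how splits become key-value records, the driver below wires up TextInputFormat, which hands each mapper records whose key is the line's byte offset (LongWritable) and whose value is the line text (Text). The input and output paths come from command-line arguments, and the mapper class is the sketch shown above.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCountDriver.class);

        // TextInputFormat splits the input files and turns each split into records:
        // key = byte offset of the line in the file, value = the line itself.
        job.setInputFormatClass(TextInputFormat.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.setMapperClass(WordCountMapper.class);  // mapper from the previous sketch
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```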

Concept of Data compression in Hadoop

The massive data volumes that are very common in a typical Hadoop deployment make compression a necessity.
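As a hedged example of where compression is switched on, the snippet below enables Snappy compression for intermediate map output (cutting shuffle traffic) and gzip for the final job output. The property names are the standard Hadoop 2 ones; the job name is arbitrary, and the rest of the job setup is omitted.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.GzipCodec;
import org.apache.hadoop.io.compress.SnappyCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CompressionSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Compress intermediate map output; Snappy favours speed over ratio
        // and requires the native library to be installed on the cluster.
        conf.setBoolean("mapreduce.map.output.compress", true);
        conf.setClass("mapreduce.map.output.compress.codec",
                SnappyCodec.class, CompressionCodec.class);

        Job job = Job.getInstance(conf, "compressed output");

        // Compress the final job output with gzip.
        FileOutputFormat.setCompressOutput(job, true);
        FileOutputFormat.setOutputCompressorClass(job, GzipCodec.class);

        // Input/output paths, mapper and reducer setup omitted in this sketch.
    }
}
```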

Importance of Map Reduce in Hadoop

From the beginning of Hadoop’s history, MapReduce has been the game changer when it comes to data processing.