Architecture Of Apache Flume



The architecture of Apache Flume:


Flume is a robust, flexible, simple, and extensible tool for moving data from various data producers into Hadoop. In this tutorial we use a simple example to explain the basics of Apache Flume. It is intended for professionals who want to learn how to transfer streaming and log data from various web servers to HDFS or HBase using Apache Flume. To make the most of this tutorial, you should have a good understanding of HDFS and basic Hadoop commands.


What is Apache Flume?


Apache Flume is a data-ingestion mechanism for collecting, aggregating, and transporting large amounts of streaming data, such as events and log files, from various sources to a centralized data store. Flume is a distributed, reliable, and configurable tool designed to copy streaming data from various web servers to HDFS. Apache Flume is easy to pick up when learning Hadoop. Kosmik Technologies provides Hadoop online training in Hyderabad, with experienced faculty who make it easy to learn Hadoop in different ways.

Apache Flume Configuration:

After installing Flume, we have to configure it using a configuration file, which is a Java properties file containing key-value pairs. We need to pass values to the keys in that file.

In the Flume configuration file, we need to (a sample configuration is shown after this list):

> Name the components of the agent.

> Describe or configure the source.

> Describe or configure the sink.

> Describe or configure the channel.

> Bind the source and the sink to the channel.
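As a minimal sketch of such a properties file, the example below wires a hypothetical agent named a1 with a netcat source, a memory channel, and a logger sink; the agent and component names (a1, r1, c1, k1) are illustrative, not prescribed:

```
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Describe the sink
a1.sinks.k1.type = logger

# Describe the channel, which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and the sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

The agent can then be started with the flume-ng launcher, for example: bin/flume-ng agent --conf conf --conf-file example.conf --name a1. The name passed with --name must match the agent name used in the configuration file.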


Features of Flume:

Some of the features of Flume are given below:

1. Flume can be scaled horizontally.

2. Using Flume, we can get data from many servers into Hadoop almost immediately.

3. Flume collects log data from many web servers into a centralized store.

4. Flume supports a large set of source and destination types.

5. Flume supports multi-hop flows, contextual routing, fan-in and fan-out flows, and more (a fan-out sketch follows this list).

6. Flume is used to import large volumes of event data produced by social networking sites like Twitter and Facebook, along with log files.
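As a rough illustration of fan-out and contextual routing (feature 5), the sketch below lets one source feed two channels and routes events by a hypothetical "region" header using the multiplexing channel selector; all names, header values, and mappings here are assumptions for illustration:

```
a1.sources = r1
a1.channels = c1 c2
a1.sinks = k1 k2

# An example source (type and settings are illustrative)
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Fan-out: the single source is bound to both channels
a1.sources.r1.channels = c1 c2

# Contextual routing: route events by the value of the "region" header
a1.sources.r1.selector.type = multiplexing
a1.sources.r1.selector.header = region
a1.sources.r1.selector.mapping.US = c1
a1.sources.r1.selector.mapping.IN = c2
a1.sources.r1.selector.default = c1

# Two channels, each drained by its own sink
a1.channels.c1.type = memory
a1.channels.c2.type = memory
a1.sinks.k1.type = logger
a1.sinks.k1.channel = c1
a1.sinks.k2.type = logger
a1.sinks.k2.channel = c2
```

In practice the region header would be set by the client or by an interceptor; with selector.type = replicating (the default), every event is copied to both channels instead of being routed.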

Applications of Flume:

Assume an e-commerce web application wants to analyze customer behavior from a particular region; it would need to move the available log data into Hadoop for analysis.

Flume is used to move the log data generated by the application servers into HDFS at high speed (a sketch of such an agent is shown below).
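As a hedged sketch of this use case, the hypothetical agent below tails an application server log with an exec source, buffers events in a durable file channel, and writes them into date-partitioned HDFS directories; the agent name, log path, and HDFS path are placeholders:

```
agent.sources = applogs
agent.channels = ch1
agent.sinks = toHdfs

# Tail the application server log (path is a placeholder)
agent.sources.applogs.type = exec
agent.sources.applogs.command = tail -F /var/log/ecommerce/access.log
agent.sources.applogs.channels = ch1

# A file channel keeps events on disk until the sink commits them
agent.channels.ch1.type = file

# Write events into date-partitioned directories in HDFS
agent.sinks.toHdfs.type = hdfs
agent.sinks.toHdfs.hdfs.path = hdfs://namenode:8020/flume/weblogs/%Y-%m-%d
agent.sinks.toHdfs.hdfs.fileType = DataStream
agent.sinks.toHdfs.hdfs.useLocalTimeStamp = true
agent.sinks.toHdfs.hdfs.rollInterval = 300
agent.sinks.toHdfs.channel = ch1
```

The HDFS path here is partitioned by date using escape sequences, which requires a timestamp on each event; hdfs.useLocalTimeStamp = true supplies one from the sink's local clock.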

Advantages of Flume:


Here are the advantages of using Flume:

1. Flume provides the feature of contextual routing.

2. Using Apache Flume, we can store the data in any of the centralized stores.

3. Flume is fault tolerant, reliable, manageable, scalable, and customizable.

4. Transactions in Flume are channel-based: two transactions (one for the sender and one for the receiver) are maintained for each message.
