Hadoop applications can use this information to run work on the node where the data is stored and, failing that, on the same rack/switch, which reduces backbone traffic. HDFS uses the same technique when replicating data, attempting to keep the different copies of a block on different racks. The goal is to reduce the impact of a rack power outage or switch failure, so that even if these events happen the data may still be readable.

A small Hadoop cluster has one master and multiple worker nodes. The master node runs a JobTracker, TaskTracker, NameNode, and DataNode.
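To make the rack-aware replication concrete, here is a minimal Java sketch against the HDFS FileSystem API: it writes a small file with a replication factor of 3 and then asks the NameNode which DataNodes hold each block's replicas. The NameNode address (hdfs://master-node:9000) and the file path are assumptions for illustration only; rack awareness itself is configured on the cluster side, typically via a topology script referenced by the net.topology.script.file.name property.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.nio.charset.StandardCharsets;

/**
 * Minimal sketch: write a small file to HDFS with replication factor 3,
 * then print where each block's replicas ended up. With rack awareness
 * configured, HDFS spreads those replicas across racks so a single rack
 * or switch failure does not make the data unreadable.
 */
public class ReplicaPlacementDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://master-node:9000"); // assumed NameNode address

        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("/demo/replication-example.txt"); // assumed path

        // Create a small file, then request a replication factor of 3
        // (the HDFS default in most distributions).
        try (FSDataOutputStream out = fs.create(path, true)) {
            out.write("rack-aware replication example".getBytes(StandardCharsets.UTF_8));
        }
        fs.setReplication(path, (short) 3);

        // Ask the NameNode which DataNodes hold each block. With a topology
        // script in place, the reported hosts should span more than one rack.
        FileStatus status = fs.getFileStatus(path);
        for (BlockLocation block : fs.getFileBlockLocations(status, 0, status.getLen())) {
            System.out.println("Block replicas on: " + String.join(", ", block.getHosts()));
        }

        fs.close();
    }
}
```

Running this against a multi-rack cluster should show replicas spread across hosts in different racks; on a single-node setup all replicas will of course land on the same machine.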