Integrating Hadoop with LVM to provide elasticity to the cluster

shreyash kotgire
Mar 15, 2021


In this article we are integrating Hadoop with LVM to provide custom elasticity to a Hadoop cluster, so that we can increase or decrease storage according to our requirement without affecting data.

For this demonstration I am creating the cluster locally with the help of VirtualBox, using Red Hat Enterprise Linux 8 as the base OS. For storage, I am attaching a 10 GB virtual disk to the slave node.

Now let's start the node and verify whether the storage is connected.
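The attached disk can be listed with standard Linux commands. A minimal check, assuming the disk shows up as /dev/sdb (the device name may differ in your setup):

lsblk
fdisk -l /dev/sdb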

As you can see, the disk is connected as /dev/sdb.

Now we have to set up LVM on the slave node. For this we need to create a physical volume, a volume group, and a logical volume.
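A minimal sketch of those LVM steps, assuming the attached disk is /dev/sdb and using hypothetical names dn_vg and dn_lv for the volume group and logical volume (pick whatever names and size you prefer):

pvcreate /dev/sdb                       # initialize the disk as a physical volume
vgcreate dn_vg /dev/sdb                 # create a volume group on top of it
lvcreate --size 5G --name dn_lv dn_vg   # carve out a 5 GB logical volume
lvdisplay dn_vg/dn_lv                   # confirm the logical volume was created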

Now that the logical volume is created, let's format it and mount it to a folder so Hadoop can use it for storing data. Here we are using the ext4 file system.
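Continuing with the hypothetical names from above and a hypothetical mount point /dn (use whatever directory your datanode is configured with), the format-and-mount step would look roughly like this:

mkfs.ext4 /dev/dn_vg/dn_lv   # format the logical volume with ext4
mkdir /dn                    # directory Hadoop will use for data
mount /dev/dn_vg/dn_lv /dn   # mount the volume on that directory
df -h /dn                    # verify the mount and its size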

As you can see, the datanode is started for the cluster. Below is the configuration of the datanode.
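As a rough sketch, a Hadoop 1.x-style datanode configuration for this setup would point dfs.data.dir at the mounted folder and fs.default.name at the namenode. The directory /dn and the port 9001 below are assumptions; substitute your own values.

hdfs-site.xml (datanode):
<configuration>
  <property>
    <name>dfs.data.dir</name>
    <value>/dn</value>
  </property>
</configuration>

core-site.xml (datanode):
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.43.30:9001</value>
  </property>
</configuration>

The datanode service can then be started with:

hadoop-daemon.sh start datanode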

Now let's see the configuration of the namenode, i.e. the controller node of the cluster, with IP 192.168.43.30.
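As a hedged sketch, a Hadoop 1.x-style namenode configuration would look roughly like the following; the metadata directory /nn and the port 9001 are assumptions.

hdfs-site.xml (namenode):
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/nn</value>
  </property>
</configuration>

core-site.xml (namenode):
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.43.30:9001</value>
  </property>
</configuration>

The namenode is formatted once and then started:

hadoop namenode -format
hadoop-daemon.sh start namenode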

So our cluster is ready and in running state, and we can check how much storage it has.
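The cluster's total capacity and the storage contributed by each datanode can be checked with the HDFS admin report (on newer Hadoop versions the command is hdfs dfsadmin -report):

hadoop dfsadmin -report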

As you can see, we have created a cluster with a storage size of 5 GB. Now, if the requirement suddenly increases, we can grow the cluster's storage at runtime with the help of LVM: just extend the logical volume within our volume group.
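Assuming the hypothetical volume group and logical volume names used earlier, extending the volume and growing the ext4 file system online would look like this; the datanode keeps serving data while the file system is resized:

lvextend --size +3G /dev/dn_vg/dn_lv   # add 3 GB from the volume group to the logical volume
resize2fs /dev/dn_vg/dn_lv             # grow the ext4 file system into the new space, online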

Here we added 3 GB more storage to our datanode, so it should be contributed to the cluster.
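To confirm that the extra space is visible both to the OS and to the cluster, re-check the mount and the HDFS report:

df -h /dn
hadoop dfsadmin -report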

See, it's working without any downtime!

This type of setup is very useful when a sudden requirement comes up and you want flexibility in your storage without any downtime.

Thank you for reading …!
