Export HBase tables from HDP 2.6 to CDP 7.1
In this post I will describe my journey during a migration from Hortonworks HDP 2.6 to Cloudera CDP 7.1. I had to export the HBase tables from an old, less secure cluster to a more recent and more secure one. The application that uses the HBase tables can’t be stopped for long and has…
HDFS top users by actions
Sometimes it can be difficult to identify the activities that are stressing your NameNodes. Here is how to get the top users by HDFS action on your cluster by querying the NameNode JMX.
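To give an idea of the approach, here is a minimal sketch (not the exact code from the post) that queries the NameNode /jmx servlet for the nntop metrics. The host, the port 50070 (the HDP 2.x default) and the exact JSON layout of the TopUserOpCounts bean are assumptions to check against your own cluster.

```python
#!/usr/bin/env python
"""Minimal sketch: list top users per HDFS operation from the NameNode JMX servlet.

Assumptions to adapt: the NameNode HTTP UI listens on port 50070 (HDP 2.x default)
and exposes the nntop metrics under the TopUserOpCounts bean; the JSON layout can
differ between Hadoop releases, hence the defensive .get() lookups.
"""
import requests

NAMENODE_JMX = "http://namenode.example.com:50070/jmx"   # hypothetical host
BEAN = "Hadoop:service=NameNode,name=TopUserOpCounts"

def top_users():
    # The ?qry= parameter restricts the JMX dump to a single bean.
    resp = requests.get(NAMENODE_JMX, params={"qry": BEAN}, timeout=10)
    resp.raise_for_status()
    for bean in resp.json().get("beans", []):
        for window in bean.get("windows", []):
            print("window: %s ms" % window.get("windowLenMs"))
            for op in window.get("ops", []):
                print("  op: %s" % op.get("opType"))
                for user in op.get("topUsers", []):
                    print("    %-20s %s" % (user.get("user"), user.get("count")))

if __name__ == "__main__":
    top_users()
```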
NameNode monitoring
When troubleshooting your Big Data cluster, it can be useful to automatically generate some monitoring for your NameNode.
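As a starting point, here is a minimal health-check sketch along those lines, assuming the standard /jmx servlet and the FSNamesystem beans; the metric names and the checks themselves are assumptions to verify against your own /jmx output and to adapt to your cluster.

```python
#!/usr/bin/env python
"""Minimal NameNode health-check sketch, assuming the standard /jmx servlet and
the FSNamesystem / FSNamesystemState beans; attribute names may differ slightly
between Hadoop releases, so verify them against your own /jmx output.
"""
import sys
import requests

NAMENODE_JMX = "http://namenode.example.com:50070/jmx"   # hypothetical host

def get_bean(name):
    resp = requests.get(NAMENODE_JMX, params={"qry": name}, timeout=10)
    resp.raise_for_status()
    beans = resp.json().get("beans", [])
    return beans[0] if beans else {}

def check():
    problems = []
    fs = get_bean("Hadoop:service=NameNode,name=FSNamesystem")
    state = get_bean("Hadoop:service=NameNode,name=FSNamesystemState")

    if fs.get("MissingBlocks", 0) > 0:
        problems.append("missing blocks: %s" % fs["MissingBlocks"])
    if state.get("NumDeadDataNodes", 0) > 0:
        problems.append("dead datanodes: %s" % state["NumDeadDataNodes"])
    if state.get("FSState") not in (None, "Operational"):
        problems.append("namenode state: %s" % state["FSState"])

    if problems:
        print("WARNING: " + "; ".join(problems))
        sys.exit(1)
    print("OK")

if __name__ == "__main__":
    check()
```

Run from cron or wrap it in your alerting tool of choice; the non-zero exit code signals that something needs a look.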
Installing Hadoop, YARN and HBase on your Linux box in 10 minutes
This is a how-to for installing Hadoop (HDFS and MapReduce), YARN and HBase on your Linux box in 10 minutes (after the binary download).
BigData – My BigDataApi
I have updated my Python API for managing Hortonworks Ambari and Ranger through their respective APIs.
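For context, a wrapper like that sits on top of plain REST calls such as the ones sketched below. This is not the BigDataApi code itself: the hostnames, ports and credentials are placeholders, and the Ranger path should be checked against your Ranger release.

```python
#!/usr/bin/env python
"""Sketch of the raw REST calls an Ambari/Ranger wrapper builds on.
Hosts, ports and credentials are placeholders; adjust to your cluster.
"""
import requests

AMBARI_URL = "http://ambari.example.com:8080/api/v1"   # hypothetical host
RANGER_URL = "http://ranger.example.com:6080"          # hypothetical host
AUTH = ("admin", "admin")                              # placeholder credentials

def ambari_clusters():
    # Ambari requires the X-Requested-By header on modifying calls;
    # sending it on reads does no harm.
    resp = requests.get(AMBARI_URL + "/clusters",
                        auth=AUTH,
                        headers={"X-Requested-By": "ambari"},
                        timeout=10)
    resp.raise_for_status()
    return [item["Clusters"]["cluster_name"] for item in resp.json().get("items", [])]

def ranger_policies():
    # Ranger public REST API v2; the path may vary with the Ranger release.
    resp = requests.get(RANGER_URL + "/service/public/v2/api/policy",
                        auth=AUTH, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(ambari_clusters())
    print(len(ranger_policies()), "Ranger policies")
```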
Ansible – Modify and join a list
Say you have a list of items, the /data* paths where you want to put your Hadoop HDFS data, and you want to append the rest of the mount-point path to each item. Here is a solution.
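For reference, here is the plain-Python equivalent of that transformation, just to make the expected input and output concrete; the post itself solves it with Ansible/Jinja2 filters, and the /hadoop/hdfs/data suffix is only an example layout.

```python
# Plain-Python equivalent of the list transformation the Ansible post solves.
# The '/hadoop/hdfs/data' suffix is an assumed example of an HDFS data directory layout.
data_disks = ["/data01", "/data02", "/data03"]          # hypothetical mount points
hdfs_dirs = [disk + "/hadoop/hdfs/data" for disk in data_disks]

# dfs.datanode.data.dir expects a comma-separated list of directories.
print(",".join(hdfs_dirs))
# -> /data01/hadoop/hdfs/data,/data02/hadoop/hdfs/data,/data03/hadoop/hdfs/data
```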