In this post I will describe my journey migrating from Hortonworks HDP 2.6 to Cloudera CDP 7.1. I had to export the HBase tables from an old, less secure cluster to a more recent, secure one. The application that uses the HBase tables can’t stop for long and has…
Sometimes you need to: iterate over a payload, call an API for each item, filter the JSON result using json_query, and return a final result with all the filtered items. With that result you can then iterate over another API. This article talks about how to do that using Ansible and the json_query filter.
One of the classic architectures when using a Kerberos KDC with OpenLDAP is to store the KDC database in an OpenLDAP backend. You can find a procedure for this on the MIT Kerberos documentation site. The advantage of storing your KDC database in an OpenLDAP backend is that you benefit from the OpenLDAP replication process, which is easy…
Sometimes it can be difficult to identify the activities that are stressing your NameNodes. The following article shows how to get the top users by HDFS actions on your HDFS cluster by querying the NameNode JMX.
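As a rough sketch of the idea: the NameNode exposes a `TopUserOpCounts` attribute (the NNTop feature) through JMX, a JSON string listing the top users per operation type over rolling windows. The payload below is an abridged, hypothetical sample with made-up users and counts, used only so the aggregation helper can be shown end to end; against a real cluster you would fetch the attribute from the NameNode's `/jmx` endpoint first.

```python
import json

# Abridged, hypothetical sample of the TopUserOpCounts JSON exposed
# via the NameNode JMX (users and counts are invented for the example).
sample = json.dumps({
    "timestamp": 1620000000000,
    "windows": [
        {"windowLenMs": 60000,
         "ops": [
             {"opType": "listStatus",
              "topUsers": [{"user": "etl", "count": 4200},
                           {"user": "hive", "count": 310}]},
             {"opType": "create",
              "topUsers": [{"user": "hive", "count": 900}]},
         ]},
    ],
})

def top_users(top_user_op_counts, window_ms=60000):
    """Sum per-user op counts for one rolling window, sorted descending."""
    data = json.loads(top_user_op_counts)
    totals = {}
    for window in data["windows"]:
        if window["windowLenMs"] != window_ms:
            continue
        for op in window["ops"]:
            for tu in op["topUsers"]:
                totals[tu["user"]] = totals.get(tu["user"], 0) + tu["count"]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(top_users(sample))  # [('etl', 4200), ('hive', 1210)]
```

Summing across operation types gives a single "most active users" ranking, which is usually what you want when hunting for the workload that is hammering the NameNode.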
When troubleshooting your Big Data cluster, it can be useful to automatically generate some monitoring for your NameNode.
In this article, I will describe the impact of enabling Hive ACID on a running HDP 2.6.4 cluster.
In this article I will show you how to add custom metrics to Hortonworks Ambari Metrics in order to follow ZooKeeper health with Grafana.
In this article we will discover ZooKeeper's four-letter commands to help us monitor ZooKeeper.
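Four-letter commands such as `ruok`, `stat`, or `mntr` are just plain text sent over ZooKeeper's client port, so they need nothing more than a raw TCP connection (the classic one-liner is `echo ruok | nc localhost 2181`). Below is a minimal Python sketch of the same exchange; the host and port are assumptions for a locally running server.

```python
import socket

def four_letter_word(host, port, word, timeout=5.0):
    """Send a ZooKeeper four-letter command (ruok, stat, mntr, ...)
    over a raw TCP connection and return the text reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(word.encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # ZooKeeper closes the connection when done
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

# Against a live server you would call, for example:
# four_letter_word("localhost", 2181, "ruok")  # a healthy server answers "imok"
```

Note that recent ZooKeeper versions whitelist these commands via the `4lw.commands.whitelist` property, so a missing reply may mean the command is disabled rather than the server being down.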
This is a how-to for installing Hadoop (HDFS and MapReduce), YARN, and HBase on your Linux box in 10 minutes (after the binary download).