One of the classic architectures when using a Kerberos KDC with OpenLDAP is to store the KDC database in the OpenLDAP backend. You can find a procedure for doing that on the MIT Kerberos documentation site. The advantage of storing your KDC database in the OpenLDAP backend is that you benefit from the OpenLDAP replication process, which is easy…
Sometimes it can be difficult to identify the activities that are stressing your NameNodes. This article shows how to find the top users by HDFS operations on your cluster by querying the NameNode JMX.
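As a minimal illustration of the idea behind that article, the sketch below aggregates per-user operation counts from a list of records. The record shape (`user`/`op` fields) and the sample data are assumptions for the example; the article itself derives the activity data by querying the NameNode JMX endpoint (typically http://&lt;namenode&gt;:50070/jmx).

```python
import json
from collections import Counter

# Hypothetical per-user operation records; in practice these would be
# derived from NameNode activity data, not hard-coded like this.
SAMPLE_RECORDS = json.loads("""
[
  {"user": "alice", "op": "open"},
  {"user": "bob",   "op": "listStatus"},
  {"user": "alice", "op": "create"},
  {"user": "alice", "op": "open"},
  {"user": "bob",   "op": "open"}
]
""")

def top_users(records, n=10):
    """Return the n users with the most HDFS operations, most active first."""
    counts = Counter(record["user"] for record in records)
    return counts.most_common(n)

if __name__ == "__main__":
    for user, count in top_users(SAMPLE_RECORDS):
        print(f"{user}: {count}")
```

Once the records reflect real NameNode activity, the same aggregation immediately shows which users are putting the most load on the service.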
When troubleshooting your Big Data cluster, it can be useful to automatically generate some monitoring for your NameNode.
In this article, I will describe the impact of activating Hive ACID on a running HDP 2.6.4 cluster.
In this article, I will show you how to add custom metrics to Hortonworks Ambari Metrics in order to follow ZooKeeper health with Grafana.
In this article, we will discover ZooKeeper's four-letter commands, which can help us monitor ZooKeeper.
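To give a flavor of those commands, here is a small sketch that sends a four-letter command (such as ruok or mntr) to a ZooKeeper server over a raw TCP socket and parses the tab-separated output of mntr. The host and port in the usage comment are assumptions (2181 is ZooKeeper's default client port); the helper function names are mine, not part of any library.

```python
import socket

def four_letter(host, port, cmd, timeout=5.0):
    """Send a ZooKeeper four-letter command (e.g. ruok, mntr, stat)
    and return the server's reply as text."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(cmd.encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # server closes the connection after answering
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

def parse_mntr(output):
    """Parse the tab-separated key/value lines returned by mntr into a dict."""
    stats = {}
    for line in output.splitlines():
        if "\t" in line:
            key, value = line.split("\t", 1)
            stats[key] = value
    return stats

# Example usage (assumes a ZooKeeper server on localhost:2181):
#   print(four_letter("localhost", 2181, "ruok"))  # a healthy server answers imok
#   stats = parse_mntr(four_letter("localhost", 2181, "mntr"))
#   print(stats.get("zk_server_state"))
```

The same checks can of course be done from the shell with nc or telnet; the Python version is convenient when you want to feed the mntr values into your own monitoring.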
This is a howto for installing Hadoop (HDFS and MapReduce), YARN, and HBase on your Linux box in 10 minutes (after downloading the binaries).
I have updated my Python client for managing Hortonworks Ambari and Ranger through their APIs.
Since Ansible 2.5, there has been a change in the way include_tasks works with become_user.