Primary skills: Big Data
Years of experience: 8+ Years
- Total experience: 8+ years, of which 5+ years should be in Big Data systems.
- Strong experience designing, building, and optimizing Hadoop platforms (cloud or on-premises)
- Knowledge of and experience with at least two cloud platforms (AWS, Azure, or Google Cloud Platform)
- Experience designing Hadoop clusters, sizing clusters, and designing data ingestion mechanisms is a must.
- HBase experience is an added advantage.
- Strong experience with Apache Spark, Storm, and Kafka is a must.
- Experience with Python, Pig, Hive, Kafka, Knox, Tomcat, and Ambari; DevOps experience (Git, CI/CD, IaC)
- Strong understanding of enterprise security solutions such as LDAP and/or Kerberos
- Familiarity with scripting (Bash shell scripts, Python, and/or Perl) and configuration management tools (Ansible, Chef, Puppet)