Sai Sandeep Dudam
5 years of experience in the IT industry, including Big Data and Hadoop technologies.
Collaborate closely with the platform engineering team to create, manage, upgrade, and secure Hadoop clusters.
Set up the Hadoop ecosystem (Hadoop, Hive, Pig, Oozie, HBase, Ambari, HDP) using both automated toolsets and manual processes.
Maintain, support, and upgrade Hadoop clusters.
Monitor jobs, queues, and HDFS capacity.
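The monitoring duties above can be sketched with the stock HDFS and YARN CLIs (a minimal sketch against a live cluster; the queue name "default" is a placeholder):

```shell
# Overall HDFS capacity, used/remaining space, and DataNode health
hdfs dfsadmin -report

# Running YARN applications and the queues they occupy
yarn application -list -appStates RUNNING

# Status and capacity of a specific scheduler queue
yarn queue -status default

# Per-directory usage, human-readable, to spot runaway growth
hdfs dfs -du -h /
```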
Apply Kerberos security, integrated with Active Directory.
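A minimal sketch of authenticating to a Kerberized cluster backed by an AD realm (the principal, keytab path, and realm name are hypothetical examples):

```shell
# Obtain a ticket for the HDFS service principal from its keytab
kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM

# Verify the ticket cache holds a valid TGT
klist

# Confirm group resolution for the authenticated user as Hadoop sees it
hdfs groups hdfs
```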
Manage configuration, access control, disk quotas, and permissions.
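Quota and permission management can be illustrated with the standard `hdfs dfsadmin` and `hdfs dfs` commands (directory paths, owner, and group below are examples):

```shell
# Cap a project directory at 10,000 names and 1 TB of raw space
hdfs dfsadmin -setQuota 10000 /data/project
hdfs dfsadmin -setSpaceQuota 1t /data/project

# Verify quotas and current consumption
hdfs dfs -count -q -h /data/project

# Tighten ownership and permissions recursively
hdfs dfs -chown -R project:analysts /data/project
hdfs dfs -chmod -R 750 /data/project
```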
Address all issues, apply upgrades and security patches.
Commission/decommission nodes; perform backup and restore.
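Graceful decommissioning can be sketched via the NameNode's exclude list (the hostname and excludes-file path are examples; the file must match `dfs.hosts.exclude` in `hdfs-site.xml`):

```shell
# Add the node to the excludes file referenced by dfs.hosts.exclude
echo "worker05.example.com" >> /etc/hadoop/conf/dfs.exclude

# Tell the NameNode to re-read include/exclude lists; its blocks
# are re-replicated elsewhere before the node is retired
hdfs dfsadmin -refreshNodes

# Watch for the node to reach the "Decommissioned" state
hdfs dfsadmin -report | grep -A 1 worker05
```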
Apply "rolling" cluster node upgrades in a Production-level environment.
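An HDFS rolling upgrade follows the documented `hdfs dfsadmin -rollingUpgrade` sequence; this is a sketch of the control steps only, with the daemon restarts elided:

```shell
# Create a rollback fsimage before touching any daemons
hdfs dfsadmin -rollingUpgrade prepare

# Poll until the rollback image is reported as ready
hdfs dfsadmin -rollingUpgrade query

# ...restart NameNodes, then DataNodes, one at a time on the new version...

# Finalize once every daemon is running the upgraded software
hdfs dfsadmin -rollingUpgrade finalize
```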
Work with platform engineering team to provision / manage HDP cluster components.
Provide guidance to junior engineers to implement services and systems that are highly available, scalable, and self-recoverable on cloud platforms
Introduce emerging technologies, tools & processes to drive continuous innovation and greater business value
Work in conjunction with an expert team of Data Scientists and Data Engineers in a fun-filled, highly energetic environment to build world-class Data Science solutions for clients.