Job description
Location: Bangalore/Pune
Experience: 8 - 12 Years
Type: Work from Office
Salary: 10-25 LPA
Notice Period: Immediate to 30 Days
We are looking to hire a talented Hadoop Administrator to join our dynamic team at NetConnect Global. As a Hadoop Administrator, you will be responsible for installing, configuring, and maintaining our Hadoop ecosystem. In addition, you will work closely with developers and business analysts to ensure that our Hadoop cluster is running efficiently.
If you are a Hadoop Administrator who is passionate about administration and thrives in a fast-paced environment, apply for this position. You will work with a team of talented professionals on cutting-edge big data projects.
Skills Required:
- 6+ years of relevant experience in Hadoop administration.
- Hands-on experience with Hadoop ecosystem components such as HBase, Hive, Pig, and Mahout.
- Good knowledge of Linux, as Hadoop runs on Linux.
- Familiarity with open-source configuration management and deployment tools such as Puppet or Chef, and with Linux shell scripting.
- Knowledge of troubleshooting core Java applications is a plus.
Responsibilities:
- Deploy the Hadoop cluster; add and remove nodes; track running jobs; monitor critical components of the cluster; configure NameNode high availability; schedule and configure jobs; and take backups.
- The ideal candidate has general operational expertise: good troubleshooting skills, an understanding of system capacity and bottlenecks, and the basics of memory, CPU, OS, storage, and networking.
- Responsible for implementation and ongoing administration of Hadoop infrastructure.
- Align with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing settings.
- Work with data delivery teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
- Handle cluster maintenance and the creation and removal of nodes using tools such as Cloudera Manager Enterprise and Dell OpenManage.
- Perform performance tuning of Hadoop clusters and Hadoop MapReduce routines.
- Monitor Hadoop cluster job performance and carry out capacity planning.
- Monitor Hadoop cluster connectivity and security.
- Manage and review Hadoop log files.
- File system management and monitoring.
- HDFS support and maintenance.
- Work closely with the infrastructure, network, database, application, and business intelligence teams to ensure high data quality and availability.
- Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
- Act as the point of contact for vendor escalations.
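To give a flavour of the day-to-day work, the new-user onboarding duty above can be sketched with standard Hadoop and Kerberos commands. This is a minimal illustration, assuming a Kerberos-secured cluster with an MIT KDC; the username "jdoe", group "hadoop", realm "EXAMPLE.COM", and example jar path are placeholders, not part of this role's actual environment.

```shell
# Sketch: onboarding a new Hadoop user (placeholder names throughout).

# 1. Create the Linux account on the edge node.
sudo useradd -m -G hadoop jdoe

# 2. Create a Kerberos principal for the user (run on the KDC host).
sudo kadmin.local -q "addprinc jdoe@EXAMPLE.COM"

# 3. Provision an HDFS home directory with the right ownership.
sudo -u hdfs hdfs dfs -mkdir /user/jdoe
sudo -u hdfs hdfs dfs -chown jdoe:hadoop /user/jdoe

# 4. As the new user, smoke-test HDFS, Hive, and MapReduce access.
kinit jdoe@EXAMPLE.COM
hdfs dfs -put /etc/hosts /user/jdoe/smoke_test
hive -e "SHOW DATABASES;"
hadoop jar /opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples.jar pi 2 10
```

Exact paths and tooling vary by distribution (e.g. Cloudera clusters typically manage accounts and Kerberos integration through Cloudera Manager rather than by hand).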
What We Offer:
- Career and competence support
- Clearly defined career paths
- Personal Accident Policy
- Paid Maternity Leave and Paternity Leave
- Employee Assistance Program
- Gratuity
- Relocation Assistance
- Open Door Policy
- Disability Income Protection
- Equal Employment Opportunity