Hadoop Administrator/Engineer in Charlotte, North Carolina at AccruePartners

Date Posted: 7/27/2020

Job Snapshot

Job Description

AccruePartners values our contract and consulting employees. We offer a competitive benefits package to meet the diverse needs of all of our contractor and consulting employees and their family members. Here is a listing of what our company offers: 401(k), Medical, Dental, Vision, Life Insurance, Employee Assistance Program, Medical and Prescription Drug, Short- and Long-Term Disability Insurance.

THE TEAM YOU WILL BE JOINING:

  • Fortune 100 Financial Services Company
  • 100-year history of dedication to customer satisfaction, success and growth
  • Tremendous growth and new business strategy leading to the need for new talent
  • Significant investments in cutting-edge technology

WHAT THEY OFFER YOU:

  • Culture: Excellent work environment that fosters collaboration
  • Growth: Ability to make an impact on the direction of the organization
  • Opportunity: Gain hands-on experience working with cutting-edge technology
  • Stability: The company's recent financial performance shows record profits

WHERE THE POSITION IS LOCATED:

  • Charlotte, NC or Iselin, NJ

WHY THIS ROLE IS IMPORTANT:

  • Work as an admin on Hadoop ecosystem components such as HDFS, Hive, MapReduce, YARN, Impala, Spark, Sqoop, HBase, Sentry, Hue, and Oozie
  • Install, configure, and upgrade the Cloudera distribution of Hadoop; exposure to Kafka and Apache NiFi
  • Responsible for implementation and ongoing administration of the Hadoop infrastructure
  • Work on Hadoop security aspects, including Kerberos setup and RBAC authorization using Apache Sentry; file system management and cluster monitoring using Cloudera Manager
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Troubleshoot MapReduce, YARN, and Sqoop job failures and drive them to resolution
  • Analyze and resolve multi-tenancy job execution issues
  • Implement backup and disaster recovery solutions for the Hadoop cluster
  • Troubleshoot connectivity issues between BI tools such as Datameer, SAS, and Tableau and the Hadoop cluster
  • Work with data delivery teams to set up new Hadoop users (includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users)
  • Serve as the point of contact for vendor escalations; be available for 24x7 Hadoop support issues
  • Participate in new data product or new technology evaluation; manage certification process.
  • Evaluate and implement new initiatives on technology and process improvements.
  • Interact with Security Engineering to design solutions, tools, testing and validation for controls
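The user-onboarding duty listed above (Linux account, Kerberos principal, HDFS access checks) can be sketched as a shell script. All usernames, realms, hostnames, and paths below are illustrative assumptions, not details from this posting; `DRY_RUN=1` (the default) only prints each command so the sequence can be reviewed before running it against a real cluster.

```shell
#!/usr/bin/env sh
# Hypothetical onboarding sketch for a new Hadoop user -- names and realm are assumptions.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

NEW_USER=jdoe            # assumed username
REALM=EXAMPLE.COM        # assumed Kerberos realm

# 1. Create the Linux account on the gateway/edge node.
run useradd -m -G hadoop "$NEW_USER"

# 2. Create the Kerberos principal (run on the KDC host).
run kadmin.local -q "addprinc -randkey ${NEW_USER}@${REALM}"

# 3. Create the user's HDFS home directory and set ownership.
run hdfs dfs -mkdir -p "/user/${NEW_USER}"
run hdfs dfs -chown "${NEW_USER}:${NEW_USER}" "/user/${NEW_USER}"

# 4. Smoke-test access: list the home directory and run a trivial Hive query.
run hdfs dfs -ls "/user/${NEW_USER}"
run beeline -u "jdbc:hive2://hiveserver:10000/default;principal=hive/_HOST@${REALM}" -e "SHOW DATABASES;"
```

In dry-run mode the script simply echoes the command sequence; flipping `DRY_RUN=0` executes it, which of course requires the relevant hosts and privileges.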

THE BACKGROUND THAT FITS:

  • Bachelor's degree, or equivalent work experience, in Computer Science, Information Systems, or a related field
  • 1 to 2 years of experience with Hadoop data stores/cluster administration
  • Relational database experience
  • Excellent performance tuning skills for large workloads inside a Hadoop cluster
  • Scripting Skills - Shell and Python
  • Experience in upgrading Cloudera Hadoop distributions is preferred
  • Experience in performance tuning and troubleshooting using a drill-down approach across the O/S, database, and application layers; end-to-end application connectivity
  • Familiarity with NoSQL data stores (MongoDB/Cassandra/HBase)
  • Familiarity with cloud architecture (public and private clouds); AWS and Azure familiarity
  • Prior experience administering Teradata or other relational databases