Hadoop Admin and Support in Charlotte, North Carolina at AccruePartners

Date Posted: 8/10/2019

Job Snapshot

Job Description

AccruePartners values our contract and consulting employees. We offer a competitive benefits package to meet the diverse needs of all of our contract and consulting employees and their family members. Here is a listing of what our company offers: 401(k), Medical, Dental, Vision, Life Insurance, Employee Assistance Program, Medical and Prescription Drug coverage, and Short- and Long-Term Disability Insurance.

WHO OUR CLIENT IS:

  • Fortune 100 Financial Services Company
  • 100-year history of dedication to customer satisfaction, success and growth
  • Tremendous growth and new business strategy leading to the need for new talent
  • Significant investments in cutting-edge technology

WHY YOU SHOULD CONSIDER THIS OPPORTUNITY:

  • Culture: Excellent work environment that fosters collaboration
  • Growth: Ability to make an impact on the direction of the organization
  • Opportunity: Gain hands-on experience working with cutting-edge technology
  • Stability: The company has recently reported record profits

WHERE THE POSITION IS LOCATED:

  • Charlotte, North Carolina

WHAT YOU WILL DO:

  • Manage applications handling data across development, testing, and production environments
  • Execute and monitor Hadoop jobs, analyze Hadoop job failures, fix bugs, and address environment issues like performance tuning and enhancements
  • Manage and administer business projects to ensure the business requirements and configurations are in place
  • Work on database change requests, incidents, service access requests and change requests
  • Be responsible for users, roles, profiles and databases in all environments
  • Promote projects and jobs between development, unit testing, integration testing, system testing, performance testing and production servers using elevation requests
  • Set up and clean environments
  • Develop test plans and test cases
  • Plan, execute, and report system testing
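
The job-monitoring duties above often start from the output of the YARN CLI (`yarn application -list`). As a minimal sketch, assuming a tab-separated listing in the shape shown below (the sample rows and application IDs are hypothetical), a failed-job summary might look like:

```python
# Sketch: summarize failed YARN applications from `yarn application -list`-style output.
# SAMPLE_OUTPUT is illustrative only; real output would come from a live cluster.

SAMPLE_OUTPUT = """\
application_1565430000000_0001\tetl-ingest\tMAPREDUCE\thadoop\tdefault\tFAILED
application_1565430000000_0002\tdaily-agg\tSPARK\thadoop\tdefault\tFINISHED
application_1565430000000_0003\tetl-ingest\tMAPREDUCE\thadoop\tdefault\tFAILED
"""

def failed_applications(listing: str) -> list[str]:
    """Return IDs of applications whose final state is FAILED."""
    failed = []
    for line in listing.splitlines():
        fields = line.split("\t")
        # Assumed column order: id, name, type, user, queue, final state
        if len(fields) >= 6 and fields[5] == "FAILED":
            failed.append(fields[0])
    return failed

print(failed_applications(SAMPLE_OUTPUT))
```

A support engineer would typically follow up on each returned ID with `yarn logs -applicationId <id>` to diagnose the failure.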

HOW YOU ARE QUALIFIED:

  • 7 years of experience
  • Required skills in Kafka, NiFi, Pig, Hive, Java, MapReduce, Sqoop, Spark, Splunk, Hue, Impala, Tableau, Tez, Ambari, Python, YARN, and AutoSys scheduling
  • Required skills in Hadoop, the Hadoop Distributed File System (HDFS), and Hadoop fundamentals including Kerberos authentication