Senior Big Data Devops Engineer

Company:

Experian (Malaysia) Sdn Bhd



Details of the offer

Job Description
As a key aide to both the IT Infrastructure and Development teams, you will help support existing systems 24x7 and administer our current Big Data environments. You will be responsible for managing Big Data cluster environments and will work with teammates to maintain, optimize, develop, and integrate working solutions for our big data tech stack. You will also support the product development process in line with the product roadmap for product maintenance and enhancement, so that the quality of software deliverables maintains excellent customer relationships and grows the customer base.
If you have the skills and a "can do" attitude, we would love to talk to you!
What you'll be doing
Responsible for implementation and ongoing administration of Hadoop infrastructure
Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments
Expert knowledge of delivering Big Data Cloudera solutions in the cloud on AWS or Azure
Hands-on day-to-day expert experience in administering a Cloudera cluster with Cloudera Manager, Cloudera Director, Cloudera Navigator
Working with data delivery teams to set up new Hadoop users. This includes setting up Linux users, creating Kerberos principals, and testing HDFS, Hive, HBase and YARN access for the new users
Cluster maintenance as well as creation and removal of nodes using tools like Cloudera Manager Enterprise, etc.
Performance tuning of Hadoop clusters and Hadoop MapReduce routines
Screening Hadoop cluster job performance and capacity planning
Monitor Hadoop cluster connectivity and security
Manage and review Hadoop log files, File system management and monitoring
HDFS support and maintenance
Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability
Collaborating with application teams to perform Hadoop updates, patches, version upgrades when required
General operational expertise such as good troubleshooting skills, understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks
The most essential requirements: the ability to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure jobs, and take backups
Solid understanding of on-premise and cloud network architectures
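The user-onboarding responsibility above (Linux account, Kerberos principal, access testing) can be sketched with standard Linux, Kerberos, and Hadoop CLI commands. This is a minimal illustration only, assuming a Kerberized cluster; the user `jdoe`, the realm `EXAMPLE.COM`, the keytab path, and the Hive host are hypothetical placeholders, not details from this posting:

```shell
# Create the OS account (in practice often provisioned via LDAP/SSSD)
useradd -m jdoe

# Create a Kerberos principal and export a keytab (run on the KDC)
kadmin.local -q "addprinc -randkey jdoe@EXAMPLE.COM"
kadmin.local -q "xst -k /etc/security/keytabs/jdoe.keytab jdoe@EXAMPLE.COM"

# Provision the user's HDFS home directory (as the hdfs superuser)
sudo -u hdfs hdfs dfs -mkdir -p /user/jdoe
sudo -u hdfs hdfs dfs -chown jdoe:jdoe /user/jdoe

# Smoke-test access as the new user
kinit -kt /etc/security/keytabs/jdoe.keytab jdoe@EXAMPLE.COM
hdfs dfs -ls /user/jdoe        # HDFS access
beeline -u "jdbc:hive2://hive-host:10000/default;principal=hive/_HOST@EXAMPLE.COM" \
        -e "SHOW DATABASES;"   # Hive access
echo "list" | hbase shell      # HBase access
yarn application -list         # YARN access
```

These commands assume direct shell access to cluster nodes; in a Cloudera-managed environment, much of this (HDFS directories, role configuration) would typically be driven through Cloudera Manager instead.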

Requirements


Knowledge:

  • Linux
  • Access
