Sorry, the offer is not available,
but you can perform a new search or explore similar offers:

Overseas Career Opportunity!!! Customer Service

**JOB OVERVIEW** - The role interfaces with customers via inbound calls, outbound calls, or through the Internet, depending upon client requirements. This posi...


From Timesconsult - Kuala Lumpur

Published 22 days ago

Technical Support Center Specialist

Job Description Summary - **Job Description**: We are the makers of possible. BD is one of the largest global medical technology companies in the world. _Advanci...


From Bd - Kuala Lumpur

Published 22 days ago

Customer Service Specialist

**Responsibilities**: Answering queries and resolving client issues regarding our products, platforms, and promotions. Addressing customer feedback as well as ...


From Ebc Group - Kuala Lumpur

Published 22 days ago

Finance Specialist

**Title**: Finance Specialist. **Department**: General Ledger. **Location**: Malaysia. **Job Responsibilities**: Performing bank reconciliations; managing fixed a...


From Manpowergroup - Kuala Lumpur

Published 22 days ago

Hadoop Developer

Company: Tata Consultancy Services


Details of the offer

Job Description:
Responsibilities:
The developer will write high-quality ETL code for Hadoop projects and should be able to tune queries and work on performance enhancement. The candidate will be responsible for delivering code, setting up environments and connectivity, and deploying the code to production after it has been properly tested. The candidate should have strong functional and technical knowledge to deliver what is required and should be well acquainted with banking terminology. Occasionally, the candidate may also act as the primary contact and/or driver for small to medium-sized projects.
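
By way of illustration, a minimal sketch of this kind of Spark/Scala ETL job follows; it assumes a Spark-on-Hive setup, and the table names, columns, and partition scheme are hypothetical rather than taken from the posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TransactionEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TransactionEtl")
      .enableHiveSupport()            // read/write Hive tables stored on HDFS
      .getOrCreate()

    val businessDate = args(0)        // e.g. "2024-05-01", passed in by the scheduler

    // Read only the partition being processed so Hive does not scan the whole table.
    val txns = spark.table("staging.card_transactions")
      .where(col("business_date") === businessDate)

    // Basic cleansing before loading the curated layer.
    val cleaned = txns
      .dropDuplicates("transaction_id")
      .withColumn("amount", col("amount").cast("decimal(18,2)"))

    // Write back partitioned by business date so downstream queries stay fast.
    // Note: mode("overwrite") replaces the whole table; a production job would
    // typically overwrite only the target partition.
    cleaned.write
      .mode("overwrite")
      .partitionBy("business_date")
      .saveAsTable("curated.card_transactions")

    spark.stop()
  }
}
```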
Skills Required:

1. Good development, analytical, and technical expertise; skilled in HDFS, Hive, SQL/HQL, Spark, Scala, Python, and other Hadoop ecosystem components (see the sketch after this list).

2. Experience in Unix shell scripting and the Control-M scheduler.

3. A minimum of 2-4 years of development experience in Big Data technologies, with 4-5 years of overall working experience. A hunger for knowledge, self-initiative, and a willingness to learn.

4. Good understanding of and experience with Agile projects.

5. Familiarity with the SDLC and the banking domain is an added advantage.

6. Knowledge of cloud platforms would be preferable.
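
For item 1, a minimal sketch of two common Spark SQL tuning techniques (partition pruning and a broadcast join) follows; it assumes Hive-backed tables, and all table and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

object TuningSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TuningSketch")
      .enableHiveSupport()
      .getOrCreate()

    // Partition pruning: filtering on the partition column means only the
    // matching HDFS directories are read, not the entire table.
    val daily = spark.sql(
      """SELECT account_id, amount
        |FROM curated.card_transactions
        |WHERE business_date = '2024-05-01'
        |""".stripMargin)

    // Broadcast join: ship the small dimension table to every executor so the
    // large fact table is not shuffled across the cluster.
    val accounts = spark.table("curated.accounts")
    val joined = daily.join(broadcast(accounts), Seq("account_id"))

    // Inspect the physical plan to confirm the broadcast actually happened.
    joined.explain()

    spark.stop()
  }
}
```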


Source: Whatjobs
