- Create analysis and design documents and unit and integration test plans based on the SAD (Solution Architecture Document), following Agile Scrum methodology
- Develop ETL jobs using Java, Java routines, Talend Studio 5.6.2, Oracle, Teradata, DB2, and Splunk
- Capture full and incremental data feeds from different sources using IBM CDC and securely transfer them to the data fabric landing zone
- Tune the performance of MapReduce jobs, Hadoop 2.4.1, HDFS, HBase 0.94.21, Hive 0.13, and Talend jobs
- Design and develop jobs to extract the entity name from file names using custom Talend Java routines and UNIX shell scripting with regex patterns (see the routine sketch after this list)
- Create jobs to expose big data files in Hadoop HDFS to Hive with the appropriate schema, simplifying data extraction and report creation (see the Hive table sketch after this list).
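
For illustration, a minimal sketch of a custom Talend Java routine for this kind of file-name parsing. The class name, method, and file-naming pattern below are assumptions, not from the posting; only the package name follows Talend's convention for custom routines.

    package routines;

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class FileNameParser {

        // Assumed naming convention: ENTITY_YYYYMMDD_SEQ.dat,
        // e.g. CUSTOMER_20170424_001.dat
        private static final Pattern ENTITY_PATTERN =
                Pattern.compile("^([A-Za-z]+)_\\d{8}_\\d+\\.dat$");

        // Returns the entity name (e.g. "CUSTOMER"),
        // or null when the file name does not match the pattern
        public static String extractEntity(String fileName) {
            if (fileName == null) {
                return null;
            }
            Matcher m = ENTITY_PATTERN.matcher(fileName);
            return m.matches() ? m.group(1) : null;
        }
    }

Called from a Talend component, FileNameParser.extractEntity("CUSTOMER_20170424_001.dat") would yield "CUSTOMER".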
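Likewise, a hedged sketch of exposing an HDFS landing directory to Hive, here issued as DDL over the Hive JDBC driver. The host, credentials, table, columns, delimiter, and HDFS path are illustrative assumptions.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class ExposeHdfsToHive {

        public static void main(String[] args) throws Exception {
            // Register the HiveServer2 JDBC driver (needed on older Hive releases)
            Class.forName("org.apache.hive.jdbc.HiveDriver");

            // Hypothetical HiveServer2 endpoint and credentials
            try (Connection conn = DriverManager.getConnection(
                         "jdbc:hive2://hive-host:10000/default", "etl_user", "");
                 Statement stmt = conn.createStatement()) {

                // External table over an existing HDFS landing directory:
                // Hive reads the files in place, so no data is copied
                stmt.execute(
                        "CREATE EXTERNAL TABLE IF NOT EXISTS customer_feed ("
                      + "  customer_id BIGINT, customer_name STRING, updated_ts STRING) "
                      + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '|' "
                      + "STORED AS TEXTFILE "
                      + "LOCATION '/data/landing/customer'");
            }
        }
    }

An EXTERNAL table suits a landing zone that other jobs also consume: dropping the table later removes only the Hive metadata, leaving the underlying HDFS files intact.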
Qualifications:
- Hold a minimum of a Bachelor's degree in Computer Science, Information Systems, Mathematics, or any Engineering discipline, with 2 years of relevant experience (U.S. equivalent education accepted). Extended travel/relocation to unanticipated client locations in the USA required.
- Standard company benefits.
Posted 24-APR-2017.
