ETL Developer

Experience: 6 - 8 years

Salary: 13 - 23 Lacs

Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description


  • ETL Tools: Talend (Nice to have)

  • Databases: Snowflake, Oracle, Amazon RDS (Aurora, Postgres), DB2, SQL Server, and Cassandra

  • Big Data and AWS Services: Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, Apache Spark

  • Data Modeling Tools: ArchiMate (secondary/preferred, not mandatory), Erwin, Oracle Data Modeler (secondary/preferred)

  • Scheduling Tools: Autosys, SFTP, Airflow (preferred, but not a blocker; any candidate can learn it on the job, and a minimal sketch follows the note below)

Note: Skills highlighted in yellow in the original posting are mandatory; candidates lacking any of the non-highlighted skills are expected to learn them after onboarding.
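
Since Airflow is listed above as a preferred scheduler, here is a minimal sketch of what a two-task pipeline definition looks like, assuming Airflow 2.x; the DAG id, task names, and the extract/load callables are hypothetical illustrations, not details from this posting.

    # Hypothetical two-task DAG; assumes Airflow 2.x (the "schedule"
    # argument replaced "schedule_interval" in Airflow 2.4).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_orders():
        """Placeholder: pull source rows (e.g. from Oracle or DB2) into S3 staging."""


    def load_to_snowflake():
        """Placeholder: copy the staged files into a Snowflake table."""


    with DAG(
        dag_id="orders_daily_etl",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_orders)
        load = PythonOperator(task_id="load", python_callable=load_to_snowflake)
        extract >> load  # load runs only after extract succeeds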

Key Responsibilities:

  • Designing, building, and automating ETL processes using AWS services and big data tools such as Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, and Apache Spark (a minimal PySpark sketch follows this list).

  • Developing and maintaining data pipelines to move and transform data from diverse sources into data warehouses or data lakes.

  • Ensuring data quality and integrity through validation, cleansing, and monitoring ETL processes.

  • Optimizing ETL workflows for performance, scalability, and cost efficiency within the AWS environment.

  • Troubleshooting and resolving issues related to data processing and ETL workflows.

  • Implementing and maintaining security measures and compliance standards for data pipelines and infrastructure.

  • Documenting ETL processes, data mappings, and system architecture.

  • Implementing security measures such as IAM roles and access controls (see the IAM policy sketch after this list).

  • Diagnosing and resolving issues related to AWS services, infrastructure, and applications.
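
As a rough illustration of the first few responsibilities, the sketch below reads raw files from S3, applies a simple data-quality filter, transforms with Spark SQL, and writes partitioned Parquet back to a data lake. It assumes a PySpark job running on Amazon EMR; all bucket names, paths, and column names are hypothetical.

    # Hypothetical PySpark ETL job for Amazon EMR; bucket names, paths,
    # and column names are illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw CSV files staged in S3.
    raw = spark.read.csv("s3://example-raw-bucket/orders/", header=True, inferSchema=True)

    # Validate: drop rows missing the key and deduplicate on it, a simple
    # stand-in for the data-quality checks described above.
    clean = raw.filter(F.col("order_id").isNotNull()).dropDuplicates(["order_id"])

    # Transform: derive a partition column with Spark SQL.
    clean.createOrReplaceTempView("orders")
    transformed = spark.sql(
        "SELECT *, date_format(order_ts, 'yyyy-MM-dd') AS order_date FROM orders"
    )

    # Load: write partitioned Parquet to the curated zone of the data lake.
    transformed.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-curated-bucket/orders/"
    )

Writing Parquet partitioned by date keeps downstream scans cheap, which is one common lever for the performance and cost tuning mentioned above.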
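
For the IAM responsibility above, this is a minimal sketch of attaching a least-privilege inline policy with boto3; the role name, policy name, and bucket ARNs are hypothetical.

    # Hypothetical least-privilege policy for an ETL role; assumes boto3
    # with credentials allowed to manage IAM. Names and ARNs are illustrative.
    import json

    import boto3

    iam = boto3.client("iam")

    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:ListBucket", "s3:GetObject"],
                "Resource": [
                    "arn:aws:s3:::example-raw-bucket",
                    "arn:aws:s3:::example-raw-bucket/orders/*",
                ],
            }
        ],
    }

    iam.put_role_policy(
        RoleName="example-etl-role",
        PolicyName="read-orders-staging",
        PolicyDocument=json.dumps(policy),
    )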

Required Skills:

  • Proficiency in big data tools and AWS services: Including Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, and Apache Spark, as relevant to data storage and processing.

  • Strong SQL skills: For querying databases and manipulating data during the transformation process.

  • Programming and scripting proficiency: Primarily Python, for automating tasks, developing custom transformations, and interacting with AWS services via SDKs and APIs (a boto3 sketch follows this list).

  • Data warehousing and modeling expertise: Understanding data warehousing concepts, dimensional modeling, and schema design to optimize data storage and retrieval.

  • Good to have: Experience with ETL tools and technologies such as Talend.

  • Data quality management skills: Ensuring data accuracy, completeness, and consistency throughout the ETL process.

  • Familiarity with DevOps practices: Including CI/CD pipelines and infrastructure as code.
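
To illustrate the Python-plus-SDK skill above, the sketch below uploads a job script to S3 and submits it as a Spark step on an existing EMR cluster via boto3; the bucket, cluster id, and file names are hypothetical.

    # Hypothetical boto3 script: stage a PySpark job in S3 and run it on EMR.
    # Assumes AWS credentials are configured; all identifiers are illustrative.
    import boto3

    s3 = boto3.client("s3")
    emr = boto3.client("emr")

    # Upload the Spark job script to S3 via the AWS SDK.
    s3.upload_file("orders_etl.py", "example-artifacts-bucket", "jobs/orders_etl.py")

    # Submit the script as a step on a running EMR cluster.
    response = emr.add_job_flow_steps(
        JobFlowId="j-EXAMPLE12345",  # hypothetical cluster id
        Steps=[
            {
                "Name": "orders_etl",
                "ActionOnFailure": "CONTINUE",
                "HadoopJarStep": {
                    "Jar": "command-runner.jar",
                    "Args": [
                        "spark-submit",
                        "s3://example-artifacts-bucket/jobs/orders_etl.py",
                    ],
                },
            }
        ],
    )
    print("Submitted step:", response["StepIds"][0])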
