
3 ETL Migration Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

15.0 - 20.0 years

20 - 30 Lacs

Pune

Work from Office

We are seeking a seasoned Technical Project Manager to oversee and guide large service engagements involving teams of 35-50 people. This role requires a balance of technical know-how, exceptional leadership abilities, and proven project management skills.

Posted 3 weeks ago

3 - 5 years

18 - 20 Lacs

Delhi NCR, Mumbai, Bengaluru

Work from Office

- Build and maintain ETL pipelines using Azure Data Factory, Dataflows, etc.
- Work with teams to understand and solve data needs (ingestion, transformation, integration)
- Manage data lakes and data warehouses in Azure
- Optimize ADF/Spark jobs for speed and cost
- Proficient in Microsoft Azure services
- Automate data workflows using Azure Data Factory (ADF) pipelines; troubleshoot and fix ADF job issues
- Stay updated on new features in Azure
- Good to have: knowledge of any ETL tool such as SSIS, Informatica, or DataStage
- Good to have: ETL migration experience
- Strong analytical and problem-solving skills
- Excellent writing skills, with the ability to create clear requirements, specifications, and documentation
- Proficient in SQL
- Willing to work on R&D projects and various other technologies

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad, Remote
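As a rough illustration of the ingestion-and-transformation work this role describes, the sketch below shows a minimal PySpark job of the kind an ADF pipeline might trigger on a Spark compute (for example, Databricks or a Synapse Spark pool). The storage paths, column names, and app name are hypothetical placeholders, not details from the listing.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal PySpark ETL sketch: ingest raw files from a data lake,
# apply simple transformations, and write a curated, partitioned output.
# All paths and column names are hypothetical placeholders.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingestion: read raw CSV landed in the lake (e.g. by an ADF copy activity)
raw = spark.read.option("header", True).csv(
    "abfss://raw@examplelake.dfs.core.windows.net/orders/"
)

# Transformation: type casting, filtering, and a derived partition column
curated = (
    raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .withColumn("order_year", F.year("order_date"))
)

# Integration: write partitioned Parquet for downstream warehouse loads
(curated.write
        .mode("overwrite")
        .partitionBy("order_year")
        .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

Partitioning the output by a date-derived column is one common way to keep downstream ADF/Spark reads fast and cheap, in line with the cost-optimization point above.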

Posted 2 months ago

2 - 7 years

3 - 6 Lacs

Mumbai

Work from Office

1. Good knowledge of S3.
2. Data lake concepts and performance optimization in data lakes.
3. Data warehouse concepts and Amazon Redshift.
4. Athena and Redshift Spectrum.
5. Strong understanding of Glue concepts and the Glue Data Catalog; experienced in implementing end-to-end ETL solutions using AWS Glue with a variety of source and target systems.
6. Very strong in PySpark; able to implement standard and complex ETL transformations and apply performance optimization techniques using Spark and Spark SQL.
7. Good knowledge of SQL is a must; able to implement standard data transformations in SQL and analyze data stored in the Redshift data warehouse and data lakes.
8. Good understanding of Athena and Redshift Spectrum.
9. Understanding of RDS.
10. Understanding of Database Migration Service and experience migrating from diverse databases.
11. Understanding of writing Lambda functions and layers for connecting to various services.
12. Understanding of CloudWatch, CloudWatch Events, EventBridge, and orchestration tools in AWS such as Step Functions and Apache Airflow.
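To make the Glue/PySpark requirements above concrete, here is a minimal sketch of an AWS Glue ETL job that reads a table from the Glue Data Catalog, remaps columns, and writes Parquet to S3. The database, table, column, and bucket names are hypothetical placeholders rather than details from the listing.

```python
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

# Standard Glue job boilerplate: resolve arguments and set up contexts
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Source: a table registered in the Glue Data Catalog (hypothetical names)
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Transformation: rename and cast columns with ApplyMapping
mapped = ApplyMapping.apply(
    frame=orders,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("order_amount", "string", "amount", "double"),
        ("order_ts", "string", "order_ts", "timestamp"),
    ],
)

# Target: Parquet on S3, queryable via Athena or Redshift Spectrum
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)

job.commit()
```

Where a transformation is easier to express in Spark SQL, the DynamicFrame can be converted to a Spark DataFrame with .toDF(), which lines up with the Spark SQL point in item 6.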

Posted 3 months ago

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
