Experience: 4 - 8 years
Salary: 6 - 10 Lacs
Posted: 1 week ago
Work from Office
Full Time
Senior Data Migration Engineer

About Oracle FSGIU - Finergy:
The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.

Job Summary:
We are seeking a skilled Senior Data Migration Engineer with expertise in AWS, Databricks, Python, PySpark, and SQL to lead and execute complex data migration projects. The ideal candidate will design, develop, and implement data migration solutions to move large volumes of data from legacy systems to modern cloud-based platforms, ensuring data integrity, accuracy, and minimal downtime.

Job Responsibilities:

Software Development:
- Design, develop, test, and deploy high-performance, scalable data solutions using Python, PySpark, and SQL.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Implement efficient and maintainable code using best practices and coding standards.

AWS & Databricks Implementation:
- Work with the Databricks platform for big data processing and analytics.
- Develop and maintain ETL processes using Databricks notebooks.
- Implement and optimize data pipelines for data transformation and integration.
- Utilize AWS services (e.g., S3, Glue, Redshift, Lambda) and Databricks to build and optimize data migration pipelines (an illustrative sketch follows this section).
- Leverage PySpark for large-scale data processing and transformation tasks.

Continuous Learning:
- Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks.
- Share knowledge with the team and contribute to a culture of continuous improvement.

SQL Database Management:
- Utilize expertise in SQL to design, optimize, and maintain relational databases.
- Write complex SQL queries for data retrieval, manipulation, and analysis (an example reconciliation check also follows this section).

Qualifications & Skills:
- Education: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field; an advanced degree is a plus.
- 4 to 8 years of experience with Databricks and big data frameworks.
- Proficiency with AWS services and data migration.
- Experience with Unity Catalog.
- Familiarity with batch and real-time processing.
- Data engineering experience with strong skills in Python, PySpark, and SQL.
- Certifications: AWS Certified Solutions Architect, Databricks Certified Professional, or similar are a plus.
- Soft Skills: Strong problem-solving and analytical skills; excellent communication and collaboration abilities; ability to work in a fast-paced, agile environment.

Career Level - IC2
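For illustration, a minimal sketch of the kind of PySpark migration job described under "AWS & Databricks Implementation" above. The S3 path, column names, and target table are hypothetical placeholders assumed for the example, not details from this posting.

from pyspark.sql import SparkSession, functions as F

# Illustrative sketch only: load a legacy extract from S3 and land it in a Delta table.
spark = SparkSession.builder.appName("legacy-accounts-migration").getOrCreate()

# Read the legacy CSV extract staged in S3 (hypothetical bucket and prefix).
legacy_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://legacy-exports/accounts/")
)

# Basic cleansing before loading the target: de-duplicate, normalise dates, stamp lineage.
migrated_df = (
    legacy_df
    .dropDuplicates(["account_id"])
    .withColumn("opened_date", F.to_date("opened_date", "dd-MM-yyyy"))
    .withColumn("migrated_at", F.current_timestamp())
)

# Write to a Delta table in the target catalog (hypothetical schema and table name).
migrated_df.write.format("delta").mode("overwrite").saveAsTable("finergy_core.accounts")

On Databricks, a job of this shape would typically run as a scheduled workflow or notebook task, with the target table governed through Unity Catalog or AWS Glue metadata.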
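In the same spirit, a hedged example of the kind of SQL reconciliation check implied by the data-integrity and "SQL Database Management" responsibilities; the source and target table names are assumptions for illustration only.

from pyspark.sql import SparkSession

# Illustrative sketch only: verify that no source rows were lost during the migration.
spark = SparkSession.builder.getOrCreate()

# Rows present in the legacy staging table but absent from the migrated target
# (both table names are hypothetical placeholders).
missing = spark.sql("""
    SELECT s.account_id
    FROM legacy_staging.accounts AS s
    LEFT ANTI JOIN finergy_core.accounts AS t
      ON s.account_id = t.account_id
""")

missing_count = missing.count()
if missing_count > 0:
    raise ValueError(f"{missing_count} source rows are missing from the target table")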