Posted: 3 days ago
Work from Office | Full Time
We are seeking a Data Engineer to build and maintain scalable data pipelines that support our data platform. This role will focus on developing ETL processes, integrating data sources, and ensuring data consistency and quality for financial analytics.
Requirements:
- 4+ years of experience in data engineering or similar roles.
- Strong experience with Snowflake, Kafka, and Debezium.
- Proficiency in SQL, Python, and ETL frameworks.
- Experience with data warehousing, data modeling, and pipeline optimization.
- Strong problem-solving skills and attention to detail.
- Experience in the financial services or fintech industry is highly desirable.
Responsibilities:
- Design, implement, and maintain data pipelines that handle both batch and real-time data ingestion.
- Integrate various data sources (databases, APIs, third-party data) into Snowflake and other data systems.
- Work closely with data scientists and analysts to ensure data availability, quality, and performance.
- Troubleshoot and resolve issues related to data pipeline performance, scalability, and integrity.
- Optimize data processes for speed, scalability, and cost efficiency.
- Ensure data governance and security best practices are implemented.
Company: GlobalLogic
Salary: 15.0 - 25.0 Lacs P.A.
Location: Kolkata, Bengaluru