Posted: 1 month ago
Work from Office
Full Time
Job Summary

We are seeking a Sr. Data Engineer with 6 to 11 years of experience to join our team. The ideal candidate will have expertise in Spark in Scala, Apache Airflow, Python, and Databricks SQL. Experience in Asset Management Operations is a plus. This is a work-from-office role with day shifts.

Responsibilities

• Develop and maintain scalable data pipelines using Spark in Scala to ensure efficient data processing.
• Implement and manage workflows using Apache Airflow to automate data tasks and ensure timely execution.
• Write clean, efficient, and maintainable code in Python to support various data engineering tasks.
• Use Databricks SQL to perform complex queries and data transformations for analytics and reporting.
• Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
• Conduct code reviews to ensure code quality and adherence to best practices.
• Troubleshoot and resolve data pipeline issues to maintain data integrity and availability.
• Optimize data processing performance to improve system efficiency and reduce latency.
• Document data workflows, processes, and code to ensure knowledge sharing and maintainability.
• Stay current with industry trends and technologies to continuously improve data engineering practices.
• Provide technical guidance and mentorship to junior developers to foster team growth.
• Participate in agile ceremonies and contribute to sprint planning, retrospectives, and daily stand-ups.
• Ensure compliance with data governance and security policies to protect sensitive information.

Qualifications

• Must have strong experience in Spark in Scala for developing scalable data pipelines.
• Must have hands-on experience with Apache Airflow for workflow automation.
• Must be proficient in Python for data engineering tasks.
• Must have experience with Databricks SQL for data querying and transformation.
• Nice to have: experience in the Asset Management Operations domain.
• Must have excellent problem-solving skills and attention to detail.
• Must have strong communication and collaboration skills.
• Must be able to work from the office on day shifts.
• Must be able to work independently and as part of a team.
• Must have a proactive attitude and a willingness to learn new technologies.
• Must have a strong understanding of data governance and security practices.
• Must be able to document processes and code effectively.

Certifications (Add-on)

• Certified Spark Developer
• Apache Airflow Certification
• Python Certification
Scriptbees