Hyderabad
INR 20.0 - 35.0 Lacs P.A.
Work from Office
Full Time
Job Summary

We are seeking a Sr. Data Engineer with 6 to 11 years of experience to join our team. The ideal candidate will have expertise in Spark (Scala), Apache Airflow, Python, and Databricks SQL. Experience in Asset Management Operations is a plus. This is a work-from-office role with day shifts.

Responsibilities
• Develop and maintain scalable data pipelines using Spark in Scala to ensure efficient data processing.
• Implement and manage workflows using Apache Airflow to automate data tasks and ensure timely execution.
• Write clean, efficient, and maintainable code in Python to support various data engineering tasks.
• Utilize Databricks SQL to perform complex queries and data transformations for analytics and reporting.
• Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
• Conduct code reviews to ensure code quality and adherence to best practices.
• Troubleshoot and resolve data pipeline issues to maintain data integrity and availability.
• Optimize data processing performance to improve system efficiency and reduce latency.
• Document data workflows, processes, and code to ensure knowledge sharing and maintainability.
• Stay updated with the latest industry trends and technologies to continuously improve data engineering practices.
• Provide technical guidance and mentorship to junior developers to foster team growth.
• Participate in agile ceremonies and contribute to sprint planning, retrospectives, and daily stand-ups.
• Ensure compliance with data governance and security policies to protect sensitive information.

Qualifications
• Must have strong experience in Spark in Scala for developing scalable data pipelines.
• Must have hands-on experience with Apache Airflow for workflow automation.
• Must be proficient in Python for data engineering tasks.
• Must have experience with Databricks SQL for data querying and transformation.
• Nice to have: experience in the Asset Management Operations domain.
• Must have excellent problem-solving skills and attention to detail.
• Must have strong communication and collaboration skills.
• Must be able to work from office with day shifts.
• Must be able to work independently and as part of a team.
• Must have a proactive attitude and a willingness to learn new technologies.
• Must have a strong understanding of data governance and security practices.
• Must be able to document processes and code effectively.

Certifications (Add-on, on Request)
• Certified Spark Developer
• Apache Airflow Certification
• Python Certification
Hyderabad, Chennai, Bengaluru
INR 10.0 - 20.0 Lacs P.A.
Work from Office
Full Time
Job Description: Java Developer

Location: Hyderabad
Experience: 2 to 10 years
Salary Range: INR 7 Lacs P.A. to 33 Lacs P.A.

We are hiring a Java Developer with hands-on experience in Apache Kafka and SQL to build real-time data processing solutions.

Key Responsibilities:
• Develop microservices and streaming applications using Java and Kafka
• Design and implement data pipelines for real-time and batch processing
• Write and optimize SQL queries for data access and transformation
• Ensure scalability, reliability, and performance of data systems
• Work closely with data engineers and architects

Skills Required:
• Strong programming skills in Java and Selenium testing
• Experience with Kafka (producer, consumer, stream processing)
• Proficiency in SQL and working with relational databases
• Knowledge of event-driven architecture and REST APIs