Software Engineer (Python, SQL, ETL)

3 - 7 years

5.0 - 9.0 Lacs P.A.

Hyderabad

Posted: 2 months ago | Platform: Naukri


Skills Required

Automation, Wealth Management, Data Modeling, Project Management, Analytical, Data Processing, Asset Management, Apache, Financial Services, SQL

Work Mode

Work from Office

Job Type

Full Time

Job Description

Delivery of the core data warehousing infrastructure and data processing pipelines. Assist in a firmwide effort to update, streamline, and optimize current dataflows and reporting capabilities. Take ownership of assigned tasks and support the development of an automated, dynamic, self-service reporting platform. Assist in the implementation of new platforms, processes, and reporting. Organize and participate in creative design sessions. Gather and track business requirements for initiatives. Analyze output to ensure data integrity and accuracy.

Key Responsibilities

- ETL & Data Pipeline Development: Design, implement, and optimize end-to-end data pipelines and ETL processes using Python and SQL. Transform, cleanse, and load data into the data warehouse, ensuring data quality, consistency, and reliability throughout the pipeline.
- Data Warehouse Expansion: Contribute to the growth of the data warehouse by integrating new data sources and applying best practices for schema design, data modeling, and architecture.
- SQL & Python Development: Write complex SQL queries, stored procedures, and automation scripts in Python to process and transform large datasets.
- Automation with Apache Airflow: Leverage Apache Airflow to automate, schedule, and monitor ETL workflows, ensuring reliable and timely data processing. Actively monitor job execution, troubleshoot failed tasks, and implement retry mechanisms or alerting to keep data pipeline operations running smoothly.
- Collaboration & Communication: Work closely with cross-functional teams to define data requirements, troubleshoot issues, and deliver actionable insights.
- Proactive Problem Solving: Identify and resolve data engineering challenges proactively, driving continuous improvements in data workflows and systems.

What ideal qualifications, skills & experience would help someone be successful?
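The ETL responsibility above can be sketched in miniature. This is a hedged illustration only, not the firm's actual stack: it uses an in-memory SQLite database standing in for the data warehouse, and the table and column names (`positions`, `account`, `asset`, `value`) are hypothetical.

```python
import sqlite3

def run_etl(raw_rows):
    """Toy extract-transform-load: cleanse raw records and load them into a
    warehouse table (SQLite stands in for the warehouse here)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE positions (account TEXT, asset TEXT, value REAL)")

    # Transform: drop rows with missing fields, normalize casing, cast values.
    cleansed = [
        (r["account"].strip().upper(), r["asset"].strip(), float(r["value"]))
        for r in raw_rows
        if r.get("account") and r.get("asset") and r.get("value") is not None
    ]

    # Load: bulk insert into the warehouse table.
    conn.executemany("INSERT INTO positions VALUES (?, ?, ?)", cleansed)
    conn.commit()
    return conn

rows = [
    {"account": " acc1 ", "asset": "AAPL", "value": "100.5"},
    {"account": "acc2", "asset": "MSFT", "value": None},  # dropped: missing value
]
conn = run_etl(rows)
print(conn.execute("SELECT account, asset, value FROM positions").fetchall())
# → [('ACC1', 'AAPL', 100.5)]
```

In practice each stage here would be a separate, monitored task rather than one function, but the cleanse-then-bulk-load shape is the same.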
Required Qualifications

- A Bachelor's degree (or higher) in Computer Science, Software Engineering, or a related field from a reputable institution.
- 4+ years of hands-on experience as a Data Engineer or SQL/Python Developer, with a focus on building and optimizing data pipelines and ETL processes.
- Strong proficiency in Python for developing and automating ETL workflows, data processing, and pipeline management.
- Advanced knowledge of SQL, including stored procedures, CTEs, and subqueries for efficient data transformation and querying.
- Experience in data pipeline architecture and managing end-to-end ETL workflows from ingestion through transformation and loading.
- Self-starter with the ability to work independently and collaborate in a team environment.
- Strong problem-solving skills, with a proactive approach to identifying and addressing data issues.
- Excellent written and verbal communication skills for working with both technical and non-technical stakeholders.

Desirable Qualifications

- Experience with cloud technologies, particularly AWS, and an understanding of cloud-based data solutions.
- Solid understanding of Apache Airflow for automating and orchestrating data workflows.
- Project management experience, with the ability to drive projects and manage timelines effectively.
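As a small illustration of the CTE knowledge listed above, here is a hedged sketch of a common transformation pattern: aggregate inside a CTE, then filter on the aggregate in the outer query. The table and columns (`trades`, `account`, `amount`) and the threshold are hypothetical, and SQLite stands in for the warehouse engine.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("acc1", 50.0), ("acc1", 70.0), ("acc2", 30.0)],
)

# The CTE names the per-account aggregation; the outer query then filters on
# it, which keeps multi-step transformations readable compared to nesting
# the same logic as a subquery.
query = """
WITH totals AS (
    SELECT account, SUM(amount) AS total
    FROM trades
    GROUP BY account
)
SELECT account, total FROM totals WHERE total > 40 ORDER BY account
"""
print(conn.execute(query).fetchall())
# → [('acc1', 120.0)]
```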