Python Developer (ETL Pandas, SQL, Data Pipelines & Automation)

Experience: 3 - 8 years

Salary: 12 - 16 Lacs

Posted: 9 hours ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Summary

Synechron is seeking a detail-oriented and analytical Python Developer to join our data team. In this role, you will design, develop, and optimize data pipelines, analysis tools, and workflows that support key business and analytical functions. Your expertise in data manipulation, database management, and scripting will enable the organization to enhance data accuracy, efficiency, and insights. This position offers an opportunity to work closely with data analysts and scientists to build scalable, reliable data solutions that contribute directly to business decision-making and operational excellence.

Software Requirements

Required Skills:

  • Python (version 3.7 or higher) with experience in data processing and scripting
  • Pandas library (experience in large dataset manipulation and analysis)
  • SQL (proficiency in writing performant queries for data extraction and database management)
  • Data management tools and databases such as MySQL, PostgreSQL, or similar relational databases

Preferred Skills:

  • Experience with cloud data services (AWS RDS, Azure SQL, GCP Cloud SQL)
  • Knowledge of additional Python libraries and tools such as NumPy, Matplotlib, and Jupyter Notebooks for data analysis and visualization
  • Data pipeline orchestration tools (e.g., Apache Airflow)
  • Version control tools like Git

Overall Responsibilities

  • Develop, test, and maintain Python scripts for ETL processes and data workflows
  • Utilize Pandas to clean, analyze, and transform large datasets efficiently
  • Write, optimize, and troubleshoot SQL queries for data extraction, updates, and management
  • Collaborate with data analysts and scientists to create data-driven analytic tools and solutions
  • Automate repetitive data workflows to increase operational efficiency and reduce errors
  • Maintain detailed documentation of data processes, pipelines, and procedures
  • Troubleshoot data discrepancies, pipeline failures, and database-related issues efficiently
  • Support ongoing data quality initiatives by identifying and resolving data inconsistencies
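The Pandas cleaning and transformation work described above can be sketched minimally as follows (the `clean_orders` helper, table, and column names are hypothetical illustrations, not part of any actual Synechron codebase):

```python
import pandas as pd

def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Typical ETL cleaning step: deduplicate, drop incomplete rows, fix types."""
    out = df.drop_duplicates(subset="order_id")          # keep first row per order
    out = out.dropna(subset=["order_id", "amount"])      # drop rows missing key fields
    out["amount"] = out["amount"].astype(float)          # normalize numeric strings
    out["order_date"] = pd.to_datetime(out["order_date"], errors="coerce")
    return out.reset_index(drop=True)

raw = pd.DataFrame({
    "order_id": [1, 1, 2, 3, None],
    "amount": ["10.5", "10.5", "20.0", None, "5.0"],
    "order_date": ["2024-01-01", "2024-01-01", "2024-01-02", "2024-01-03", "bad"],
})
clean = clean_orders(raw)
print(len(clean))  # → 2 (duplicate and incomplete rows removed)
```

Doing type normalization and deduplication once, in a single reusable function, is what keeps pipelines like these testable and easy to troubleshoot when discrepancies appear downstream.
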

Technical Skills (By Category)

Programming Languages:

  • Required: Python (3.7+), proficiency with data manipulation and scripting
  • Preferred: Additional languages such as R, or familiarity with other programming environments

Databases/Data Management:

  • Relational databases: MySQL, PostgreSQL, or similar
  • Experience with query optimization and database schema design

Cloud Technologies:

  • Preferred: Basic experience with cloud data services (AWS, Azure, GCP) for data storage and processing

Frameworks and Libraries:

  • Pandas, NumPy, Matplotlib, Jupyter Notebooks for data analysis and visualization
  • Airflow or similar orchestration tools (preferred)

Development Tools and Methodologies:

  • Git or similar version control tools
  • Agile development practices and collaborative workflows

Security Protocols:

  • Understanding of data privacy, confidentiality, and secure coding practices

Experience Requirements

  • 3+ years of experience in Python development with a focus on data processing and management
  • Proven hands-on experience in building and supporting ETL workflows and data pipelines
  • Strong experience working with SQL and relational databases
  • Demonstrated ability to analyze and manipulate large datasets efficiently
  • Familiarity with cloud data services is advantageous but not mandatory

Day-to-Day Activities

  • Write and enhance Python scripts to perform ETL, data transformation, and automation tasks
  • Design and optimize SQL queries for data extraction and updates
  • Collaborate with data analysts, scientists, and team members during daily stand-ups and planning sessions
  • Investigate and resolve data quality issues or pipeline failures promptly
  • Document data pipelines, workflows, and processes for clarity and future maintenance
  • Assist in developing analytical tools and dashboards for business insights
  • Review code changes through peer reviews and ensure adherence to best practices
  • Participate in continuous improvement initiatives related to data workflows and processing techniques
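The SQL extraction work listed above might look like the sketch below, which uses an in-memory SQLite database to stand in for MySQL/PostgreSQL (the `sales` table, its columns, and the index are hypothetical):

```python
import sqlite3
import pandas as pd

# In-memory database stands in for a production MySQL/PostgreSQL instance.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
    CREATE INDEX idx_sales_region ON sales(region);  -- keeps the filter performant
    INSERT INTO sales (region, amount) VALUES
        ('north', 100.0), ('north', 50.0), ('south', 75.0);
""")

# Parameterized query: aggregation happens in the database, not in Python,
# and the placeholder avoids SQL injection.
query = "SELECT region, SUM(amount) AS total FROM sales WHERE region = ? GROUP BY region"
df = pd.read_sql_query(query, conn, params=("north",))
print(df.to_dict("records"))  # → [{'region': 'north', 'total': 150.0}]
```

Pushing filtering and aggregation into the database, rather than pulling full tables into Pandas, is the usual first step when optimizing extraction queries.
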

Qualifications

  • Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field
  • Relevant certifications or training in Python, data engineering, or database management are a plus
  • Proven track record of working on data pipelines, analysis, and automation projects

Professional Competencies

  • Strong analytical and problem-solving skills with attention to detail
  • Effective communication skills, able to collaborate across teams and explain technical concepts clearly
  • Ability to work independently and prioritize tasks effectively
  • Continuous learner, eager to adopt new tools, techniques, and best practices in data processing
  • Adaptability to changing project requirements and proactive in identifying process improvements
  • Focused on delivering high-quality work with a results-oriented approach

Synechron

Information Technology and Services

New York
