We are hiring Python Developers skilled in Django/Flask, Pandas/NumPy, SQL & REST APIs. Experience with AWS, Docker & Jenkins (CI/CD) is a plus. Apply now by filling in your details here: https://docs.google.com/forms/d/1IVbOZwSyvlLHXp7Qqs6Xw8oifq5HhxEtrijLSO6Sy-I/edit
We have 55 openings for this role! Please fill out the form below at your earliest convenience so our HR team can contact you for the next steps. G-Form: https://docs.google.com/forms/d/1IVbOZwSyvlLHXp7Qqs6Xw8oifq5HhxEtrijLSO6Sy-I/edit
Responsibilities: * Design, develop, and maintain data pipelines using Python, Pandas, NumPy, Django, and Power BI/Tableau. * Optimize database performance with PostgreSQL, PySpark, and Snowflake. Work from home.
Responsibilities: * Design, develop & maintain data pipelines using Snowflake, Python, SQL & AWS. * Optimize performance & ensure data accuracy through ETL processes.
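For context, a minimal sketch of the kind of pipeline work described in the two postings above: extract a CSV, clean it with Pandas, run a basic accuracy check, and load it into a SQL table. All names here (sales.csv, the connection string, the "sales" table, the "amount" column) are illustrative assumptions, not part of the postings.

# Minimal ETL sketch (illustrative only).
import pandas as pd
from sqlalchemy import create_engine

def run_pipeline(csv_path: str, db_url: str) -> int:
    # Extract: read raw data from a CSV file.
    df = pd.read_csv(csv_path)

    # Transform: drop duplicate rows and fill missing amounts with 0.
    df = df.drop_duplicates()
    df["amount"] = df["amount"].fillna(0)

    # Basic data-accuracy check before loading.
    if (df["amount"] < 0).any():
        raise ValueError("Negative amounts found; aborting load")

    # Load: write the cleaned frame to a SQL table.
    engine = create_engine(db_url)
    df.to_sql("sales", engine, if_exists="replace", index=False)
    return len(df)

if __name__ == "__main__":
    rows = run_pipeline("sales.csv", "postgresql://user:pass@localhost/analytics")
    print(f"Loaded {rows} rows")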
1. Data Pipeline Development and Maintenance 2. Database and Data Warehouse Management 3. Data Transformation and Integration. Required Candidate profile: https://docs.google.com/spreadsheets/d/1dTIagGAwtsK_iZwlVS1fN3d_ZXfYqyFgjuOba1hhUtc/edit?gid=1273277659#gid=1273277659
Seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and warehouses. Strong in SQL, Python, ETL, and cloud platforms (AWS/Azure/GCP). Experience with Spark or Airflow preferred. Required Candidate profile: Candidates should have strong skills in SQL, Python, ETL, and cloud data platforms, be comfortable with big data tools and pipelines, and ensure data quality and performance optimization.
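As a rough illustration of the orchestration side mentioned above (Airflow is listed as preferred), a minimal DAG sketch assuming Airflow 2.4+; the dag_id, schedule, and task callables are hypothetical placeholders, not part of the posting.

# Minimal Airflow DAG sketch (illustrative only).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull raw data from a source system.
    print("extracting")

def transform_and_load():
    # Placeholder: clean the data and load it into the warehouse.
    print("transforming and loading")

with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="transform_and_load", python_callable=transform_and_load)

    # Run extract before transform/load.
    extract_task >> load_task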