#WeAreHiring

Position: Senior Analytics Engineer
Location: Chennai
Notice Period: Immediate to 30 days

About the Role
We’re looking for a passionate Senior Analytics Engineer to join our team in Chennai. You’ll design and scale modern data pipelines that power business insights, working with a modern data stack to build reliable, scalable, and impactful analytics solutions.

Key Responsibilities
- Build and optimize data pipelines and workflows
- Design robust data models and transformations
- Enable business intelligence through scalable solutions

Must-Have Skills
- Strong proficiency in SQL and Python
- Experience with cloud data warehouses (Snowflake / BigQuery)
- Expertise in dbt (data modeling and transformation)
- Workflow orchestration with Airflow
- Familiarity with BI tools (Looker, Power BI, Tableau, Domo)

Nice-to-Haves
- 3–6 years of experience
- Exposure to GCP, Prefect, and data governance

How to Apply
📧 Send your resume to: recruitment@analytixhub.ai
Visit: www.analytixhub.ai
Subject Line: Senior Analytics Engineer

#Hiring #SeniorAnalyticsEngineer #DataEngineering #Analytics #SQL #Python #Snowflake #BigQuery #DBT #Airflow #BI #ChennaiJobs #CareerOpportunity
Job Title: Data Engineer
Experience: 2–3 Years
Location: Chennai
Notice Period: Immediate Joiners Preferred

Job Summary:
We are looking for a skilled and motivated Data Engineer with hands-on experience in ETL tools, cloud technologies, and data engineering best practices. The ideal candidate will have strong expertise in SQL and Python, excellent problem-solving abilities, and effective communication skills.

Key Responsibilities:
- Design, develop, and maintain scalable ETL pipelines using tools such as Informatica, Talend, or Pentaho.
- Build and optimize data solutions on Snowflake or other cloud platforms (AWS, Azure, or GCP).
- Write and optimize complex SQL queries for data extraction, transformation, and analytics.
- Automate data workflows and transformations using Python and scripting tools.
- Collaborate with cross-functional teams to ensure data accuracy, availability, and governance.
- Troubleshoot data pipeline and performance issues efficiently.

Required Skills and Qualifications:
- 2–3 years of hands-on experience in Data Engineering or ETL Development.
- Strong expertise in at least one ETL tool: Informatica, Talend, or Pentaho.
- Experience with Snowflake or any major cloud platform (AWS, Azure, or GCP).
- Proficiency in SQL and Python for data transformation and automation.
- Excellent communication, analytical, and critical thinking skills.
- Ability to work independently and deliver high-quality results under tight timelines.