Posted: 1 month ago
Work from Office
Full Time
Job Summary
Auriga is looking for a Data Engineer to design and maintain cloud-native data pipelines supporting real-time analytics and machine learning. You'll work with cross-functional teams to build scalable, secure data solutions using GCP (BigQuery, Looker), SQL, Python, and orchestration tools like Dagster and DBT. Mentoring junior engineers and ensuring data best practices will also be part of your role.

WHAT YOU'LL DO:
- Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads.
- Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources.
- Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications.
- Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency.
- Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations.
- Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities.
- Develop and maintain data documentation, including data dictionaries, lineage tracking, and metadata management.
- Monitor, troubleshoot, and optimize data pipelines, ensuring high availability and reliability.
- Stay up to date with emerging data engineering technologies and best practices, continuously improving our data infrastructure.

WHAT WE'RE LOOKING FOR:
- Strong proficiency in English (written and verbal communication).
- Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones.
- 5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures.
- Strong proficiency in SQL for data modeling, transformation, and performance optimization.
- Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio).
- Expertise in Python for data processing, automation, and pipeline development.
- Experience with cloud data platforms, particularly Google Cloud Platform (GCP).
- Hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub.
- Strong knowledge of ETL/ELT frameworks such as DBT, Dataflow, or Apache Beam.
- Familiarity with workflow orchestration tools like Dagster, Apache Airflow, or Google Cloud Workflows.
- Understanding of data privacy, security, and compliance best practices.
- Strong problem-solving skills, with the ability to debug and optimize complex data workflows.
- Excellent communication and collaboration skills.

NICE TO HAVES:
- Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis).
- Familiarity with machine learning workflows and MLOps best practices.
- Knowledge of Terraform for Infrastructure as Code (IaC) in data environments.
- Familiarity with data integrations involving Contentful, Algolia, and Segment.
Auriga IT
Location: Bengaluru
Salary: 12.0 - 22.0 Lacs P.A.