2 Pub/Sub Jobs

JobPe aggregates listings for easy access; you apply directly on the original job portal.

8 - 13 years

10 - 15 Lacs

Jaipur, Rajasthan

Work from Office

Job Summary:
Auriga is looking for a Data Engineer to design and maintain cloud-native data pipelines supporting real-time analytics and machine learning. You'll work with cross-functional teams to build scalable, secure data solutions using GCP (BigQuery, Looker), SQL, Python, and orchestration tools like Dagster and DBT. Mentoring junior engineers and ensuring data best practices will also be part of your role.

WHAT YOU'LL DO:
- Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads.
- Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources.
- Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications.
- Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency.
- Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations.
- Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities.
- Develop and maintain data documentation, including data dictionaries, lineage tracking, and metadata management.
- Monitor, troubleshoot, and optimize data pipelines, ensuring high availability and reliability.
- Stay up to date with emerging data engineering technologies and best practices, continuously improving our data infrastructure.

WHAT WE'RE LOOKING FOR:
- Strong proficiency in English (written and verbal communication).
- Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones.
- 5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures.
- Strong proficiency in SQL for data modeling, transformation, and performance optimization.
- Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio).
- Expertise in Python for data processing, automation, and pipeline development.
- Experience with cloud data platforms, particularly Google Cloud Platform (GCP); hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub.
- Strong knowledge of ETL/ELT frameworks such as DBT, Dataflow, or Apache Beam.
- Familiarity with workflow orchestration tools like Dagster, Apache Airflow, or Google Cloud Workflows.
- Understanding of data privacy, security, and compliance best practices.
- Strong problem-solving skills, with the ability to debug and optimize complex data workflows.
- Excellent communication and collaboration skills.

NICE TO HAVES:
- Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis).
- Familiarity with machine learning workflows and MLOps best practices.
- Knowledge of Terraform for Infrastructure as Code (IaC) in data environments.
- Familiarity with data integrations involving Contentful, Algolia, Segment, and .
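The stack named above (BigQuery, Dagster, DBT, Python) typically comes together as orchestrated SQL transformations in the warehouse. Below is a minimal sketch of that pattern, not Auriga's actual codebase: a single Dagster asset that rebuilds a rollup table in BigQuery. The project, dataset, and table names are hypothetical.

```python
from dagster import Definitions, asset
from google.cloud import bigquery


@asset
def daily_signups() -> None:
    """Rebuild a daily signup rollup table (dataset/table names are hypothetical)."""
    client = bigquery.Client()  # picks up application-default credentials
    client.query(
        """
        CREATE OR REPLACE TABLE `analytics.daily_signups` AS
        SELECT DATE(created_at) AS day, COUNT(*) AS signups
        FROM `analytics.raw_events`
        WHERE event_type = 'signup'
        GROUP BY day
        """
    ).result()  # block until the query job completes


# Register the asset so `dagster dev` can discover and materialize it.
defs = Definitions(assets=[daily_signups])
```

In a real deployment the SQL would more likely live in a DBT model that the asset invokes; this collapsed form only illustrates the orchestrator-plus-warehouse split the posting describes.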

Posted 1 month ago

Apply

7 - 11 years

9 - 14 Lacs

Pune

Work from Office

Role Description:
The DB Cloud FinOps function drives financial accountability for cloud consumption, providing distributed teams with insights into their consumption, spend, and optimisation / control options to ensure cloud usage is managed efficiently. We are seeking an individual with a strong background in GCP services, FinOps, and IT application management. The applicant will be the IT Owner of the FinOps Application platform on our Google Cloud Platform (GCP) environment. The FinOps Application consists of our GCP landing zone (GCP projects, resources, datasets) and our FinOps Looker instance. This, together with our team's skillset, forms the strategic cloud cost management function for the bank. The role covers end-to-end platform management, including creation and maintenance of BigQuery datasets and views, administration of billing exports, deployment of Cloud Functions, API integrations, and compliance adherence.

Your key responsibilities:
- Ensure the FinOps application is maintained in accordance with the Bank's IT Security, Risk, Audit, and Compliance requirements.
- Provision and maintain GCP billing, recommender, and database services to support FinOps internal and external capabilities (e.g. BigQuery, GCS buckets, and billing exports).
- Provision serverless GCP services to support the automation of key FinOps insights and optimisation / recommendation capabilities (e.g. Pub/Sub topics, Cloud Run, API queries, and Cloud Functions).
- Manage FinOps GCP IAM roles, permissions, and GitHub repositories.
- Support the integration of our FinOps platform with other tools used within the bank (e.g. Jira, Looker).

Your skills and experience:
- Infrastructure and cloud technology industry experience (7+ years).
- Strong understanding of Software Development Lifecycle methodology.
- Proficient with Terraform, Python, and GitHub (3+ years).
- Proficient in GCP infrastructure and GCP services.
- Proficient in data analysis tools (e.g. Excel, PowerQuery, SQL).
- Strong analytical and problem-solving skills.
- Experience in FinOps preferred.
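To give a flavor of the platform work described here, below is a minimal sketch of querying the standard GCP billing export in BigQuery from Python. The table id is a placeholder, since real billing export tables are account-specific, and the column names follow the standard billing export schema.

```python
from google.cloud import bigquery

# Placeholder: real billing export table ids look like
# <project>.<dataset>.gcp_billing_export_v1_<billing-account-id>.
BILLING_TABLE = "my-project.billing.gcp_billing_export_v1_XXXXXX"

client = bigquery.Client()
job = client.query(
    f"""
    SELECT project.id AS project_id,
           service.description AS service,
           ROUND(SUM(cost), 2) AS gross_cost
    FROM `{BILLING_TABLE}`
    WHERE invoice.month = @month
    GROUP BY project_id, service
    ORDER BY gross_cost DESC
    """,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("month", "STRING", "202501")]
    ),
)
for row in job.result():
    print(f"{row.project_id:<24} {row.service:<40} {row.gross_cost}")
```

Credits are ignored here for brevity; a production FinOps view would also UNNEST the export's credits array to report net rather than gross cost.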

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies