70 Dataform Jobs - Page 2

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

Role Overview: FIS Clouds is hiring a Senior BigQuery Developer in Hyderabad. Your role will involve designing, developing, and maintaining robust, scalable data pipelines and advanced analytics solutions using BigQuery and other GCP-native services. Collaborating closely with data scientists, analysts, and business stakeholders, you will ensure efficient and secure access to enterprise data. Key Responsibilities: - Design, develop, and optimize BigQuery data warehouses and data marts to support analytical and business intelligence workloads. - Implement data modeling and best practices for partitioning, clustering, and table design in BigQuery. - Integrate Bi...
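For readers unfamiliar with the partitioning and clustering work this posting describes, here is a minimal, illustrative sketch using the google-cloud-bigquery Python client; the project, dataset, and column names are hypothetical and not taken from the listing.

```python
# Illustrative only: create a date-partitioned, clustered BigQuery table of the
# kind this role designs. Project, dataset, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumed project id

ddl = """
CREATE TABLE IF NOT EXISTS `example-project.analytics_mart.orders` (
  order_id    STRING,
  customer_id STRING,
  order_ts    TIMESTAMP,
  amount      NUMERIC
)
PARTITION BY DATE(order_ts)       -- prune scans to the dates a query touches
CLUSTER BY customer_id, order_id  -- co-locate rows for common filter columns
"""

client.query(ddl).result()  # blocks until the DDL job completes
print("Partitioned, clustered table is in place.")
```

Under these assumptions, partitioning keeps query cost proportional to the date range scanned, while clustering speeds up the customer-level filters that such analytical workloads typically apply.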

Posted 1 month ago

5.0 - 9.0 years

0 Lacs

karnataka

On-site

Role Overview: Join an exciting initiative with our customer's Marketing Performance Team, where your engineering expertise will play a key role in building a next-generation Marketing Analytics Platform. This ecosystem will consolidate marketing data across various formats to drive intelligent decision-making and deliver AI-powered insights. We are seeking a seasoned AdTech Engineer with strong data infrastructure experience and a passion for performance-driven marketing technology. Key Responsibilities: - Data Integration & Transformation: Design and build robust ETL/ELT pipelines to unify data across disparate AdTech platforms and marketing tools. - Platform Development: Contribute to bui...

Posted 1 month ago

6.0 - 8.0 years

0 Lacs

pune, maharashtra, india

On-site

Role Title: Data Analyst | Function: Data & Analytics | Role Type: Permanent. About this Role: As a Data Analyst within the Data & Analytics (D&A) team, you will play a pivotal role in developing and maintaining the Application Layer that interfaces directly with our D&A customers. This role requires a strong understanding of business context and the ability to translate evolving requirements into robust data solutions. You will be responsible for backend development, including the creation of data structures, data marts, and transformation pipelines that support regular service operations. The role also involves developing ETLs from source systems into the Data Warehouse and ensuring data consist...

Posted 1 month ago

5.0 - 7.0 years

0 Lacs

pune, maharashtra, india

On-site

At Swarovski, where innovation meets inspiration, our people desire to explore, experience and create. We are looking for a Data Engineer where you will get a chance to work in a rewarding role within a diverse team that is pushing boundaries. Be part of a truly iconic global brand, learn and grow with us. We're bold and inventive, revealing astonishing things like no one else can. A world of wonder awaits you. About The Job Maintain and support data transformation pipelines from the Landing Zone to the Consumption Layer. Monitor the health and performance of data pipelines, infrastructure, and associated systems within GCP and SAP BW environments, ensuring seamless and efficient operations....

Posted 2 months ago

5.0 - 10.0 years

0 - 0 Lacs

hyderabad, pune, bengaluru

Hybrid

Location: Chennai, Bangalore, Hyderabad, Pune. DevOps engineer with experience in BigQuery, Dataform, Qlik Replicate, or other GCP-native tools. Role & responsibilities: Design, build, and maintain ETL/ELT pipelines using GCP services including BigQuery, Dataform, Qlik Replicate, or other GCP-native tools. Integrate DevSecOps best practices into CI/CD pipelines. Knowledge of creating CDC jobs in ETL tools. Experience with DevOps toolchains and integrating security into CI/CD pipelines. Expertise or proficiency in writing SQL queries. Good to have: knowledge of implementing batch and streaming data pipelines using GCP Dataflow. Schedule and automate data workflows and jobs using Google Cloud Schedule...
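As a rough illustration of the batch Dataflow pipelines the posting mentions as good-to-have, the sketch below uses the Apache Beam Python SDK; the bucket, table, and schema are assumed placeholders, and running it on Dataflow would additionally require runner, project, region, and temp-location options.

```python
# Illustrative Apache Beam batch pipeline (the SDK behind GCP Dataflow): read a
# CSV from Cloud Storage and append rows to BigQuery. Bucket, table, and schema
# are placeholders; pass runner/project/region/temp options via PipelineOptions
# to execute on Dataflow instead of the local runner.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line: str) -> dict:
    """Turn one CSV line into a BigQuery row dict."""
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}


with beam.Pipeline(options=PipelineOptions()) as pipeline:
    (
        pipeline
        | "ReadCsv" >> beam.io.ReadFromText(
            "gs://example-bucket/orders.csv", skip_header_lines=1)
        | "ParseRows" >> beam.Map(parse_row)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:staging.orders",
            schema="order_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```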

Posted 2 months ago

5.0 - 7.0 years

0 Lacs

pune, maharashtra, india

On-site

Data Engineer At Swarovski, where innovation meets inspiration, our people desire to explore, experience and create. We are looking for a Data Engineer where you will get a chance to work in a rewarding role within a diverse team that is pushing boundaries. Be part of a truly iconic global brand, learn and grow with us. We're bold and inventive, revealing astonishing things like no one else can. A world of wonder awaits you. About The Job Maintain and support data transformation pipelines from the Landing Zone to the Consumption Layer. Monitor the health and performance of data pipelines, infrastructure, and associated systems within GCP and SAP BW environments, ensuring seamless and efficie...

Posted 2 months ago

4.0 - 6.0 years

0 Lacs

pune, maharashtra, india

On-site

Who We Are: VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value for customers by delivering intelligent solutions through Talent, Technology & Transformation. As the largest shared services organisation in the global telco industry, with 30,000 FTE, we design our portfolio of next-generation solutions and services in partnership with customers across Vodafone Group, local markets, and partner markets to simplify and drive growth. With our strategic partner Accenture, we work alongside our Vodafone customers and other telco and tech companies to drive transformation, meet the challenges of our industry, and ensure we stay relevant and resilient. This pa...

Posted 2 months ago

5.0 - 10.0 years

0 - 0 Lacs

hyderabad, pune, bengaluru

Hybrid

Location: Chennai, Bangalore, Hyderabad, Pune. DevOps engineer with experience in BigQuery, Dataform, Qlik Replicate, or other GCP-native tools. Role & responsibilities: Design, build, and maintain ETL/ELT pipelines using GCP services including BigQuery, Dataform, Qlik Replicate, or other GCP-native tools. Integrate DevSecOps best practices into CI/CD pipelines. Knowledge of creating CDC jobs in ETL tools. Experience with DevOps toolchains and integrating security into CI/CD pipelines. Expertise or proficiency in writing SQL queries. Good to have: knowledge of implementing batch and streaming data pipelines using GCP Dataflow. Schedule and automate data workflows and jobs using Google Cloud Schedule...

Posted 2 months ago

0.0 years

0 Lacs

gurugram, haryana, india

On-site

Context on the BI Team: The BI Team is currently migrating its existing reporting and data tools from Snowflake / Tableau over to Google Cloud Platform (BigQuery / Looker). The wider central data team (Data Engineering, Data Governance, Data Science, Data Analytics) is working to build a Medallion data architecture, and the BI Team will be responsible for building and maintaining the Gold Layer and BI Self-Serve Tools. What we're looking for: Looker - understanding of the Looker Semantic Layer and LookML; experience building Models, Explores, and Dashboards in Looker; experience creating Derived Tables within Looker / BigQuery; experience / knowledge of Hub and Spoke Archit...
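By way of illustration of the Gold Layer work described above, the sketch below materialises a BigQuery aggregate that a Looker explore or derived table could sit on; all project, dataset, and column names are assumptions, not details from the role.

```python
# Illustrative only: materialise a Gold Layer aggregate in BigQuery that a
# Looker explore or derived table could reference. All names are assumptions.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
CREATE OR REPLACE TABLE `example-project.gold.daily_sales` AS
SELECT
  DATE(order_ts)           AS order_date,
  channel,
  SUM(amount)              AS revenue,
  COUNT(DISTINCT order_id) AS orders
FROM `example-project.silver.orders`
GROUP BY order_date, channel
"""

client.query(sql).result()  # wait for the materialisation to finish
```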

Posted 2 months ago

15.0 - 20.0 years

30 - 35 Lacs

bengaluru

Work from Office

Job Summary: We're looking for a Principal Cloud Engineer with a strong foundation in multi-cloud & multi-region deployment, data architecture, distributed systems, and modern cloud-native platforms to architect, build, and maintain intelligent infrastructure and systems that power our AI, GenAI, and data-intensive workloads. You'll work closely with cross-functional teams, including data scientists, ML & software engineers, and product managers, and play a key role in designing a highly scalable platform to manage the lifecycle of data pipelines, APIs, real-time streaming, and agentic GenAI workflows, while enabling federated data architectures. The ideal candidate will have a strong background in...

Posted 2 months ago

5.0 - 10.0 years

19 - 22 Lacs

bengaluru

Work from Office

Job Summary: We are seeking an experienced GCP Data Engineer to join the Data Team at Equinix, who will lead end-to-end development of complex Data Engineering use cases and drive the evolution of Equinix's Data Lake platform. You will design and build enterprise-scale data infrastructure and analytics solutions on Google Cloud Platform while providing technical mentorship to the data engineering team. The ideal candidate combines deep technical expertise in cloud-native data technologies with proven leadership skills and a passion for building robust, scalable data platforms that drive strategic business insights. Responsibilities: Participate in design and implementation of enterpris...

Posted 2 months ago

3.0 - 5.0 years

0 Lacs

noida, uttar pradesh, india

On-site

Position Overview: We are seeking a skilled and passionate Data Engineer with deep expertise in Google Cloud Platform (GCP) and Google BigQuery. In this role, you will architect, build, and maintain the scalable data pipelines that are the foundation of our analytics and data science initiatives. You will be responsible for the entire data lifecycle, from ingestion and processing to warehousing and serving. You will work on creating a reliable, efficient, and high-quality data ecosystem that empowers our data analysts, data scientists, and business leaders to make critical data-informed decisions. ShyftLabs is a growing data product company that was founded in early 2020 and works primarily ...
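As a hedged sketch of the ingestion step this posting describes (files landing in Cloud Storage and loaded into BigQuery), the snippet below uses the google-cloud-bigquery client; the URIs and table ids are hypothetical.

```python
# Illustrative ingestion step: load newline-delimited JSON files from Cloud
# Storage into BigQuery. URIs and table ids are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/events/*.json",
    "example-project.raw.events",
    job_config=job_config,
)
load_job.result()  # block until the load completes

table = client.get_table("example-project.raw.events")
print(f"Table now holds {table.num_rows} rows.")
```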

Posted 2 months ago

3.0 - 5.0 years

0 Lacs

noida, uttar pradesh, india

On-site

Position Overview: We are seeking a skilled and passionate Data Engineer with deep expertise in Google Cloud Platform (GCP) and Google BigQuery. In this role, you will architect, build, and maintain the scalable data pipelines that are the foundation of our analytics and data science initiatives. You will be responsible for the entire data lifecycle, from ingestion and processing to warehousing and serving. You will work on creating a reliable, efficient, and high-quality data ecosystem that empowers our data analysts, data scientists, and business leaders to make critical data-informed decisions. ShyftLabs is a growing data product company that was founded in early 2020 and works primarily ...

Posted 2 months ago

7.0 - 11.0 years

0 Lacs

coimbatore, tamil nadu

On-site

As a highly skilled and motivated Cloud Data Engineering Manager at Merkle, your role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns on Google Cloud Platform (GCP). **Key Responsibilities:** - **Data Engineering & Development:** - Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data. - Implement enterprise-level data solutions using GCP services such as BigQuery, Dataform, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer. - Develop and optimize data architectures that support real-time and batch data process...

Posted 2 months ago

2.0 - 6.0 years

0 Lacs

chennai, tamil nadu

On-site

Role Overview: As a Full Stack Data Engineer, you will collaborate with Data Scientists and Product Development teams to create cutting-edge data products that align with Company Objectives. Your responsibilities will include landing data, developing new data products, enhancing existing solutions, and validating solutions with Analytics & Business partners for production release. Key Responsibilities: - Work with Data Scientists and Product Development teams to develop industry-leading data products - Land data and develop new data products - Enhance existing data products - Collaborate with Analytics & Business partners to validate solutions for production release - Utilize GCP services su...

Posted 2 months ago

5.0 - 7.0 years

0 Lacs

pune, maharashtra, india

On-site

At Swarovski, where innovation meets inspiration, our people desire to explore, experience and create. We are looking for a Data Engineer where you will get a chance to work in a rewarding role within a diverse team that is pushing boundaries. Be part of a truly iconic global brand, learn and grow with us. We're bold and inventive, revealing astonishing things like no one else can. A world of wonder awaits you. About The Job Maintain and support data transformation pipelines from the Landing Zone to the Consumption Layer. Monitor the health and performance of data pipelines, infrastructure, and associated systems within GCP and SAP BW environments, ensuring seamless and efficient operations....

Posted 2 months ago

3.0 - 12.0 years

0 Lacs

chandigarh

On-site

As a Data Engineer with over 12 years of experience, you will be responsible for leading Data Engineering teams in developing enterprise-grade data processing pipelines on Google Cloud. You will lead projects migrating ETL pipelines and data warehouses to the cloud, with a focus on medium- to high-complexity projects. You should have a minimum of 3 to 4 years of experience in this capacity and at least 3 to 5 years with premium consulting companies. Your expertise should include hands-on experience with Google Cloud Platform services such as BigQuery, Dataform, Dataplex, etc. Additionally, you should possess exceptional communication skills to effectively engage...

Posted 2 months ago

7.0 - 11.0 years

0 Lacs

chennai, tamil nadu

On-site

Role Overview: You will be responsible for designing, building, and maintaining data solutions including data infrastructure, pipelines, etc. for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately. Collaborating with business and technology stakeholders to understand current and future data requirements will be a key aspect of your role. Key Responsibilities: - Collaborate with business and technology stakeholders to understand current and future data requirements - Design, build, and maintain reliable, efficient, and scalable data infrastructure for data collection, storage, transformation, and analysis - Plan, design, build, and maintain scalabl...

Posted 2 months ago

5.0 - 10.0 years

5 - 15 Lacs

hyderabad, chennai, bengaluru

Hybrid

GCP Dataflow, GCP Cloud Composer, GCP BigQuery, GCP Cloud Storage, Dataproc. Java, Python, Scala. ETL/ELT, Big Data Hadoop Ecosystem, ANSI-SQL. DevOps, CI/CD, API, Agile. GCP Datastream, Dataform, Datafusion, Workflows, Pub/Sub, and DMS.

Posted 2 months ago

4.0 - 6.0 years

12 - 16 Lacs

pune

Work from Office

Experience Level: 4-6 years. Note: please include a 3-minute self-introduction video. Key Responsibilities: Design, build, and maintain scalable and reliable data pipelines using GCP services such as BigQuery, Cloud Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer (Airflow). Implement data ingestion from diverse sources (e.g., APIs, databases, files) into a cloud data lake/warehouse. Optimize data processing workflows for performance, reliability, and cost efficiency. Develop and manage ETL/ELT processes using tools like Dataform, dbt, or custom scripts. Collaborate with data scientists, analysts, and stakeholders to deliver clean, well-documented datasets. Implement data quality, governance, and observabili...
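To illustrate the Cloud Composer (Airflow) scheduling this role involves, here is a minimal example DAG that runs a daily BigQuery transformation; the DAG id, table names, and SQL are placeholders, not part of the job description.

```python
# Illustrative Cloud Composer (Airflow) DAG running one daily BigQuery
# transformation. DAG id, table names, and SQL are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_refresh",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    refresh_orders_mart = BigQueryInsertJobOperator(
        task_id="refresh_orders_mart",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `example-project.mart.orders_daily` AS "
                    "SELECT DATE(order_ts) AS order_date, SUM(amount) AS revenue "
                    "FROM `example-project.staging.orders` "
                    "GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```

The same scheduling pattern extends to triggering Dataform or dbt runs from a DAG task instead of a raw query, though that choice is an assumption rather than something the listing specifies.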

Posted 3 months ago

6.0 - 11.0 years

6 - 15 Lacs

chennai

Hybrid

Position Description: Employees in this job function are responsible for designing, building, and maintaining data solutions including data infrastructure, pipelines, etc. for collecting, storing, processing and analyzing large volumes of data efficiently and accurately. Key Responsibilities: 1) Collaborate with business and technology stakeholders to understand current and future data requirements 2) Design, build and maintain reliable, efficient and scalable data infrastructure for data collection, storage, transformation, and analysis 3) Plan, design, build and maintain scalable data solutions including data pipelines, data models, and applications for efficient and reliable data workflow ...

Posted 3 months ago

2.0 - 6.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Full Stack Data Engineer at our company, you will collaborate with Data Scientists and Product Development business partners to create cutting-edge data products that align with our Company Objectives. Your responsibilities will include landing data, developing new data products, enhancing existing ones, and collaborating with Analytics & Business partners to validate solutions for production release. You should have a Bachelor's Degree and at least 2+ years of experience in GCP services such as BigQuery, Dataproc, Dataplex, DataFusion, Terraform, Tekton, Airflow, Cloud Storage, and Pub/Sub. Proficiency in Git or any other version control tool is required. While API knowledge is consi...
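As a small, illustrative example of the Pub/Sub usage listed among the required GCP services, the sketch below publishes one message and then pulls and acknowledges pending messages; the project, topic, and subscription ids are assumed to already exist.

```python
# Illustrative Pub/Sub round trip: publish one message, then pull and
# acknowledge whatever is waiting. Project, topic, and subscription ids
# are hypothetical.
from google.cloud import pubsub_v1

project_id = "example-project"
topic_id = "orders-events"
subscription_id = "orders-events-sub"

# Publish a single JSON payload to the topic.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, topic_id)
publisher.publish(topic_path, data=b'{"order_id": "42"}').result()

# Synchronously pull and acknowledge pending messages on the subscription.
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(project_id, subscription_id)
response = subscriber.pull(request={"subscription": sub_path, "max_messages": 10})
for received in response.received_messages:
    print("received:", received.message.data)
if response.received_messages:
    subscriber.acknowledge(request={
        "subscription": sub_path,
        "ack_ids": [m.ack_id for m in response.received_messages],
    })
```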

Posted 3 months ago

5.0 - 9.0 years

0 Lacs

karnataka

On-site

Join an exciting initiative with our customer's Marketing Performance Team, where your engineering expertise will play a key role in building a next-generation Marketing Analytics Platform. This ecosystem will consolidate marketing data across various formats to drive intelligent decision-making and deliver AI-powered insights. We are seeking a seasoned AdTech Engineer with strong data infrastructure experience and a passion for performance-driven marketing technology. Key Responsibilities: - Data Integration & Transformation: Design and build robust ETL/ELT pipelines to unify data across disparate AdTech platforms and marketing tools. - Platform Development: Contribute to building a marketi...

Posted 3 months ago

10.0 - 12.0 years

0 Lacs

mumbai, maharashtra, india

On-site

We are seeking a highly skilled and motivated GCP Data Architect to join our team. The Google Cloud Platform (GCP) Data Architect will be responsible for designing and implementing cloud-based solutions for enterprise-level clients using GCP. The role involves understanding clients' business requirements and translating them into technical solutions that meet their needs. The GCP Data Architect should have a strong understanding of cloud architecture, including compute, networking, storage, security, and automation. They should also have a deep understanding of GCP services, such as App Engine, Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, Cloud SQL, and Cloud Pub/Sub, and tools suc...

Posted 3 months ago

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

We are urgently hiring a Senior BigQuery Developer (Google Cloud Platform) with 5-8 years of experience, based in Hyderabad. In this role, you will be responsible for designing, developing, and maintaining robust, scalable data pipelines and advanced analytics solutions using BigQuery and other GCP-native services. Your primary focus will be on designing, developing, and optimizing BigQuery data warehouses and data marts to support analytical and business intelligence workloads. You will also need to implement data modeling and best practices for partitioning, clustering, and table design in BigQuery. Integration of BigQuery with tools such as Dataform, Airflow, Cloud Compos...

Posted 3 months ago
