5523 BigQuery Jobs

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 10.0 years

100 Lacs

hyderabad, chennai, bengaluru

Work from Office

Role & responsibilities: Experience with Google Cloud Platform, data integration, and orchestration mechanisms; ability to design BI solutions using Cloud Storage, BigQuery, Cloud SQL, and Bigtable. Programming experience in PL/SQL and Python. Provide resolutions for critical issues. Worked on end-to-end development projects. Exposure to Airflow, Composer, BigQuery, APIs, Cloud Functions, and GitHub. Strong analytical skills to comprehend business requirements, using Python.
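
For context on the Airflow/Composer plus BigQuery stack this listing names, here is a minimal orchestration sketch. It assumes Airflow 2.4+ with the Google provider installed (as on Cloud Composer 2); the DAG id, dataset, and SQL are hypothetical.

```python
# Minimal Cloud Composer / Airflow DAG sketch: orchestrating a BigQuery
# transformation job. Dataset/table names and the SQL are hypothetical;
# assumes Airflow 2.4+ with apache-airflow-providers-google installed.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_bq_transform",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    build_summary = BigQueryInsertJobOperator(
        task_id="build_daily_summary",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_summary AS
                    SELECT DATE(event_ts) AS event_date, COUNT(*) AS events
                    FROM raw.events
                    GROUP BY event_date
                """,
                "useLegacySql": False,
            }
        },
        location="US",
    )
```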

Posted 15 hours ago

3.0 - 7.0 years

0 Lacs

pune, all india

On-site

As a Data Visualization Specialist, your primary responsibility will be to create detailed dashboards and interactive visual reports using Looker Studio. You will play a crucial role in enhancing data visualization, transparency, and driving insightful business strategies by collaborating with various business units. Your proficiency in SQL and data modeling will be essential in translating business requirements into effective dashboards and reports. Moreover, defining key performance indicators (KPIs) with specific objectives and benchmarking performance will also be a key aspect of your role. Your key responsibilities will include: - Creating detailed dashboards and interactive visual repo...

Posted 16 hours ago

6.0 - 10.0 years

0 Lacs

all india, gurugram

On-site

As a Data Engineer at Infogain, your role involves analyzing existing Hadoop, Pig, and Spark scripts from Dataproc and refactoring them into Databricks-native PySpark. You will be responsible for implementing data ingestion and transformation pipelines using Delta Lake best practices, applying conversion rules and templates for automated code migration and testing, and conducting data validation between legacy and migrated environments. Additionally, you will collaborate on developing AI-driven tools for code conversion, dependency extraction, and error remediation. It is essential to ensure best practices for code versioning, error handling, and performance optimization, as well as actively...
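
As a rough illustration of the Dataproc-to-Databricks refactoring described here, the sketch below rewrites a simple legacy aggregation step as Databricks-native PySpark writing a Delta table. Paths and table names are hypothetical, and it assumes a Databricks runtime where Delta Lake is preconfigured.

```python
# Sketch of a Dataproc-to-Databricks refactor target: a legacy ingestion step
# rewritten as PySpark writing a Delta table. Paths and table names are
# hypothetical; assumes a Databricks runtime with Delta Lake available.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw files landed from the legacy pipeline
orders = spark.read.format("json").load("/mnt/raw/orders/")  # hypothetical mount

# Silver: cleaned and deduplicated, stored as Delta
clean = (
    orders.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
)
(
    clean.write.format("delta")
    .mode("overwrite")
    .saveAsTable("silver.orders")        # hypothetical schema.table
)
```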

Posted 16 hours ago

5.0 - 9.0 years

0 Lacs

all india, gurugram

On-site

Role Overview: You have 5+ years of QA experience with a focus on automation and cloud-native environments. Your expertise lies in automation tools such as Selenium, Cypress, or Playwright. You are proficient in writing and executing complex SQL queries for data validation and have experience with ETL/data pipeline testing, GCP services, and API testing. You have scripting knowledge in Python, JavaScript, or Java for test automation and experience working in CI/CD pipelines and version control. Key Responsibilities: - Design, implement, and maintain automated test frameworks using tools such as Selenium, Cypress, Playwright, or equivalent. - Write and execute test plans and test cases for fu...
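
To illustrate the ETL/data-pipeline testing and SQL validation this role calls for, here is one possible pytest-style check against BigQuery. Table names are hypothetical; it assumes the google-cloud-bigquery client library and default credentials, and a real suite would add many more checks (schema, nulls, referential integrity).

```python
# Sketch of an automated data-validation check for an ETL pipeline, run as a
# pytest test against BigQuery. Table names are hypothetical; assumes the
# google-cloud-bigquery client library and application default credentials.
from google.cloud import bigquery

client = bigquery.Client()


def _scalar(sql: str):
    """Run a query and return the single value from the first row."""
    row = next(iter(client.query(sql).result()))
    return row[0]


def test_row_counts_match_between_staging_and_curated():
    source = _scalar("SELECT COUNT(*) FROM staging.orders")   # hypothetical
    target = _scalar("SELECT COUNT(*) FROM curated.orders")   # hypothetical
    assert source == target, f"row count mismatch: {source} vs {target}"


def test_no_null_primary_keys():
    nulls = _scalar("SELECT COUNT(*) FROM curated.orders WHERE order_id IS NULL")
    assert nulls == 0
```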

Posted 17 hours ago

6.0 - 10.0 years

6 - 8 Lacs

pune, chennai, bengaluru

Hybrid

Role & responsibilities: Responsible for the design, development, and delivery of Talend jobs. Unit testing and SIT/pre-prod defect fixing. Collaborate closely with stakeholders and teams to define and implement business metrics and KPIs and deliver actionable analyses. Interact with the client and other external teams for clarification and requirement gathering. Requirement/dependency analysis. Guide and mentor new or junior team members. Own the design, development, validation, and maintenance of ongoing metrics, dashboards, analyses, and related roadmaps to drive key business decisions. Qualification: Partner with stakeholders to understand their current and future analytical goals and tra...

Posted 19 hours ago

4.0 - 9.0 years

8 - 15 Lacs

hyderabad

Work from Office

Skill set: GCP technologies (BigQuery, Infoworks, GCS buckets), SQL, and data warehousing basics. Responsibilities: development in GCP; requirement analysis and detailed design; testing and deployment; mentoring on GCP technologies.

Posted 19 hours ago

4.0 - 9.0 years

8 - 15 Lacs

bengaluru

Work from Office

Skill set: GCP technologies (BigQuery, Infoworks, GCS buckets), SQL, and data warehousing basics. Responsibilities: development in GCP; requirement analysis and detailed design; testing and deployment; mentoring on GCP technologies.

Posted 19 hours ago

4.0 - 9.0 years

8 - 15 Lacs

pune

Work from Office

Skill set: GCP technologies (BigQuery, Infoworks, GCS buckets), SQL, and data warehousing basics. Responsibilities: development in GCP; requirement analysis and detailed design; testing and deployment; mentoring on GCP technologies.

Posted 19 hours ago

4.0 - 9.0 years

8 - 15 Lacs

bangalore rural

Work from Office

Skill set: GCP technologies (BigQuery, Infoworks, GCS buckets), SQL, and data warehousing basics. Responsibilities: development in GCP; requirement analysis and detailed design; testing and deployment; mentoring on GCP technologies.

Posted 19 hours ago

3.0 - 6.0 years

5 - 8 Lacs

udaipur

Work from Office

Role: Data Engineer. Industry: Software/IT. Location: Udaipur (Rajasthan). Job Type: Full Time. Experience: 3-6 years. Skills: data engineering, ETL/ELT processes, data ingestion, cloud platforms, big data tools, CI/CD pipelines. Salary: best in the industry. Education: BTech (CS/IT/EC), BCA, MCA. Description: 3+ years of experience in a data engineering or related role, managing complex data workflows and large-scale datasets. Expertise in SQL, with the ability to write, optimize, and troubleshoot complex queries for relational databases like MySQL, PostgreSQL, Oracle, or Snowflake. Proficiency in Python, including libraries like Pandas, PySpark, and others used for data manipulation an...

Posted 20 hours ago

8.0 - 13.0 years

15 - 25 Lacs

chennai

Work from Office

Duration: 1 year Contract. Required Skills & Experience: Hands-on experience with Google Cloud Platform (GCP). Proficient with Python, Go, or Node.js for scripting and integration. Experience working with Terraform or Deployment Manager for GCP IaC. Familiarity with containerization and deployment using Docker and Kubernetes. Understanding of basic telecom network layers (Core, Transport). Experience with network telemetry protocols (SNMP, gNMI, NETCONF/YANG). Knowledge of TM Forum Open APIs or other network management interfaces. Experience setting up monitoring and alerting using Prometheus, Grafana, or GCP Cloud Monitoring. Ability to implement automated workflows using event-driven archi...

Posted 20 hours ago

6.0 - 11.0 years

20 - 35 Lacs

pune, chennai

Work from Office

Job Details: Role: Senior Agentic AI Engineer. Work Location: Chennai/Pune. Job Type: Work from Office. Department: Technology. Responsibilities: Design, develop, and implement agentic AI applications using Python and relevant frameworks. Architect and build robust and scalable backend systems. Integrate and manage interactions with various Large Language Models (LLMs) through APIs and other interfaces. Utilize and optimize vector databases for efficient information retrieval and knowledge management within AI agents. Develop and implement sophisticated agentic workflows using frameworks like AutoGen, AG2, LangGraph, Agno, or CrewAI. Collaborat...

Posted 20 hours ago

8.0 - 10.0 years

11 - 15 Lacs

pune

Work from Office

Job Title: SR DATA ENGINEER Company Name: Kohler Co. Job Description: As a Senior Data Engineer at Kohler Co., you will play a critical role in designing, developing, and maintaining robust data pipelines and architectures that enable data-driven decision-making across the organization. You will collaborate with cross-functional teams to establish data standards, improve data quality, and ensure data accessibility. Your expertise in data engineering practices will help us scale our data capabilities and leverage advanced analytics. Key Responsibilities: - Design, develop, and implement high-performance data pipelines to process large volumes of data from various sources. - Collaborate with d...

Posted 20 hours ago

8.0 - 13.0 years

15 - 25 Lacs

bengaluru

Work from Office

Duration: 1 year Contract. Required Skills & Experience: Hands-on experience with Google Cloud Platform (GCP). Proficient with Python, Go, or Node.js for scripting and integration. Experience working with Terraform or Deployment Manager for GCP IaC. Familiarity with containerization and deployment using Docker and Kubernetes. Understanding of basic telecom network layers (Core, Transport). Experience with network telemetry protocols (SNMP, gNMI, NETCONF/YANG). Knowledge of TM Forum Open APIs or other network management interfaces. Experience setting up monitoring and alerting using Prometheus, Grafana, or GCP Cloud Monitoring. Ability to implement automated workflows using event-driven archi...

Posted 20 hours ago

8.0 - 13.0 years

15 - 25 Lacs

pune

Work from Office

Duration: 1 year Contract. Required Skills & Experience: Hands-on experience with Google Cloud Platform (GCP). Proficient with Python, Go, or Node.js for scripting and integration. Experience working with Terraform or Deployment Manager for GCP IaC. Familiarity with containerization and deployment using Docker and Kubernetes. Understanding of basic telecom network layers (Core, Transport). Experience with network telemetry protocols (SNMP, gNMI, NETCONF/YANG). Knowledge of TM Forum Open APIs or other network management interfaces. Experience setting up monitoring and alerting using Prometheus, Grafana, or GCP Cloud Monitoring. Ability to implement automated workflows using event-driven archi...

Posted 20 hours ago

7.0 - 10.0 years

8 - 12 Lacs

mumbai

Work from Office

Description: Industry & Sector: Technology consulting firm operating in the blockchain, fintech, and distributed ledger technology (DLT) space. We deliver enterprise-grade DLT integrations, ledger analytics, and data engineering solutions for global clients, transforming on-chain and off-chain data into actionable business insights. Role: Senior Data Engineer (DLT), Remote (India). Role & Responsibilities: - Design and build scalable ETL/ELT pipelines to ingest, process, and transform on-chain and off-chain ledger data using streaming and batch frameworks. - Integrate with DLT nodes, RPC endpoints, and indexers to capture transactions, blocks, and smart-contract events into analytical stores. ...
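
As one hedged example of the on-chain ingestion step described above, the sketch below pulls the latest block from a node over JSON-RPC. It assumes an Ethereum-compatible endpoint; the URL is hypothetical and field names differ across ledgers.

```python
# Sketch of pulling on-chain data from a DLT node over JSON-RPC, as a first
# step of the ingestion pipelines described above. Assumes an
# Ethereum-compatible endpoint; the URL is hypothetical.
import requests

RPC_URL = "https://rpc.example.org"  # hypothetical node/RPC endpoint


def rpc_call(method: str, params: list):
    resp = requests.post(
        RPC_URL,
        json={"jsonrpc": "2.0", "id": 1, "method": method, "params": params},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]


# Latest block number arrives hex-encoded per the Ethereum JSON-RPC spec
latest = int(rpc_call("eth_blockNumber", []), 16)

# Fetch the full block (with transactions) for downstream batch loading
block = rpc_call("eth_getBlockByNumber", [hex(latest), True])
print(latest, len(block["transactions"]))
```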

Posted 20 hours ago

8.0 - 13.0 years

11 - 15 Lacs

bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering. Service Line: Infosys Quality Engineering. Key Responsibilities: Architecture & Strategy - Define TDM architecture aligned with enterprise data governance and testing needs. - Design modular, scalable TDM frameworks using tools like Broadcom TDM, GenRocket, Delphix, and K2. - Establish synthetic data generation strategies and subsetting/masking policies. - Integrate TDM into test automation and CI/CD pipelines. Tooling & Integration - Implement data provisioning workflows across environments (ephemeral, containerized). - Automate data refresh, rollback, and versioning using TDM tools. - Enable self-service data access for testers...
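
The masking and subsetting ideas behind TDM can be illustrated with a small, generic Python sketch; this is not the API of Broadcom TDM, GenRocket, Delphix, or K2, and the column names are hypothetical.

```python
# Generic illustration of masking/subsetting in test data management: hash a
# PII column and take a deterministic subset of rows before provisioning a
# test environment. Plain Python/pandas only; column names are hypothetical.
import hashlib

import pandas as pd


def mask(value: str, salt: str = "test-env") -> str:
    """Deterministically pseudonymize a value so joins still line up."""
    return hashlib.sha256(f"{salt}:{value}".encode()).hexdigest()[:12]


customers = pd.DataFrame(
    {
        "customer_id": [101, 102, 103],
        "email": ["a@example.com", "b@example.com", "c@example.com"],
        "country": ["IN", "IN", "US"],
    }
)

masked = customers.assign(email=customers["email"].map(mask))
subset = masked[masked["country"] == "IN"]     # simple subsetting rule
print(subset)
```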

Posted 20 hours ago

6.0 - 8.0 years

7 - 11 Lacs

hyderabad, chennai

Work from Office

What you'll do You are a capable, self-motivated data engineer. You will be a member of the data engineering team working on tasks ranging from design, development, operations of data warehouse, data platform and data pipelines. We enjoy working closely with each other utilizing an Agile development methodology. Priorities can change quickly, but our team members are able to stay ahead to delight every one of our customers whether they are internal or external to Viasat. The day-to-day Develop ETL processes delivering high-quality code, following coding best practices. Create and implement complex analytical data models that support reporting and analytics business requirements. Support and ...

Posted 20 hours ago

7.0 - 12.0 years

4 - 8 Lacs

bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering. Service Line: Infosys Quality Engineering. Key Responsibilities: Architecture & Strategy - Define TDM architecture aligned with enterprise data governance and testing needs. - Design modular, scalable TDM frameworks using tools like Broadcom TDM, GenRocket, Delphix, and K2. - Establish synthetic data generation strategies and subsetting/masking policies. - Integrate TDM into test automation and CI/CD pipelines. Tooling & Integration - Implement data provisioning workflows across environments (ephemeral, containerized). - Automate data refresh, rollback, and versioning using TDM tools. - Enable self-service data access for testers...

Posted 20 hours ago

4.0 - 6.0 years

9 - 10 Lacs

pune, thiruvananthapuram

Work from Office

Job Title - Developer. Work Location - Pune/Trivandrum. Skill Required - Spring Boot, Core Java. Experience Range - 4 to 6 years. Role & responsibilities - Strong Core Java proficiency: • In-depth knowledge of Java concepts such as streams, threads, and collections. • Advanced proficiency in Java programming, including experience with Spring Boot frameworks and web services. • Awareness of the various Java versions and key changes (Java 8 vs Java 11 vs Java 17 vs Java 21, etc.). Strong on setting up Java Maven projects: • Proficiency in setting up Java projects using Maven or Gradle, awareness of the BOM in pom.xml, and the ability to handle version conflicts effectively through POM.xm...

Posted 21 hours ago

6.0 - 11.0 years

17 - 30 Lacs

noida, hyderabad, thiruvananthapuram

Hybrid

The Senior GCP Data Engineer leads the design and implementation of scalable data pipelines and medallion-style data transformations on Google Cloud Platform (GCP), including ingestion to DataHub (BigQuery) and downstream processing for Unity, TDM, Power BI, and MDM. Key Responsibilities: Design and implement robust ingestion pipelines (batch, streaming, pub/sub, API-driven) from core systems (e.g., Silverlake, CIF/2020, Core Director, Symitar) into DataHub (BigQuery). Implement medallion architectures (Raw → Curated → Production/Gold) aligned to the Unity data model and BIAN/ISO 20022 mappings. Integrate tokenization and TDM processes into data pipelines for production and non-production environments...
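
A minimal sketch of one medallion hop (raw to curated), expressed as a BigQuery MERGE driven from Python; dataset, table, and column names are hypothetical, and it assumes the google-cloud-bigquery client library with appropriate IAM permissions.

```python
# Sketch of a raw -> curated medallion step as a BigQuery MERGE run from
# Python. Dataset and column names are hypothetical; assumes the
# google-cloud-bigquery client library and default credentials.
from google.cloud import bigquery

client = bigquery.Client()

merge_sql = """
MERGE curated.accounts AS c
USING (
  SELECT * EXCEPT(rn) FROM (
    SELECT account_id, customer_id, balance, _ingested_at,
           ROW_NUMBER() OVER (PARTITION BY account_id
                              ORDER BY _ingested_at DESC) AS rn
    FROM raw.accounts
  )
  WHERE rn = 1                      -- keep only the latest record per account
) AS r
ON c.account_id = r.account_id
WHEN MATCHED THEN
  UPDATE SET balance = r.balance, _ingested_at = r._ingested_at
WHEN NOT MATCHED THEN
  INSERT (account_id, customer_id, balance, _ingested_at)
  VALUES (r.account_id, r.customer_id, r.balance, r._ingested_at)
"""

client.query(merge_sql).result()    # blocks until the merge job completes
```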

Posted 22 hours ago

8.0 - 13.0 years

12 - 16 Lacs

hyderabad, pune

Work from Office

Role - GCP Data Engineer. Location - Pune/Hyderabad. Migrate and re-engineer existing services from on-premises data centers to the cloud (GCP/AWS). Understand the business requirements and provide real-time solutions. Follow the project development tools like JIRA, Confluence, and Git. Write Python/shell scripts to automate operations and server management. Build and maintain operations tools for monitoring, notifications, trending, and analysis. Define, create, test, and execute operations procedures. Document current and future configuration processes and policies. To be successful in this role, you should meet the following requirements: Data Engineer Developer with Cloud (GCP); Experts in Py...

Posted 22 hours ago

6.0 - 10.0 years

2 - 7 Lacs

bengaluru

Work from Office

Greetings from TCS! TCS has always been in the spotlight for being adept in the next big technologies. What we can offer you is a space to explore varied technologies and quench your techie soul. Role: GCP Big Data Engineer. Location: Bangalore. Experience: 6-10 years. Required technical skills: GCP big data with Python and BigQuery. Responsibilities: Engineers for the Cornerstone to Lumi migration initiative; engineers with strong SQL knowledge; experienced resources with Big Data/Hadoop implementation; well versed with GCP BigQuery implementation. Minimum qualification: 15 years of full-time education; minimum percentile of 50% in 10th, 12th, UG & PG (if applicable).

Posted 1 day ago

5.0 - 10.0 years

5 - 15 Lacs

hyderabad, chennai, bengaluru

Hybrid

Job description: Hiring for GCP developer. Mandatory skills: GCP, BigQuery. Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, and design and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes and detailed functional design...

Posted 1 day ago

4.0 - 8.0 years

18 - 25 Lacs

bengaluru

Hybrid

Role & responsibilities: Design, develop, and maintain scalable data pipelines and ETL/ELT workflows using Databricks, Google BigQuery, SQL, and cloud-native tools. Build and optimize batch and streaming data pipelines to support analytics, reporting, and business intelligence use cases. Collaborate with business stakeholders, product teams, analytics engineers, and data analysts to gather requirements and deliver data solutions. Develop and manage data models, schemas, and transformations, ensuring data quality, integrity, and consistency. Optimize SQL queries, partitioning, clustering, and indexing for performance and cost efficiency. Support BI tools and dashboards by providing clean, reli...
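
One concrete form of the cost-efficiency point above is estimating bytes scanned with a BigQuery dry run before executing a query. A sketch, with a hypothetical query and table, assuming the google-cloud-bigquery client library:

```python
# Sketch of query-cost discipline: a BigQuery dry run estimates bytes scanned
# before the query is actually executed. Table and query are hypothetical;
# assumes the google-cloud-bigquery client library.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT customer_id, SUM(amount) AS total
FROM analytics.orders
WHERE order_date >= '2024-01-01'       -- filter on the partition column
GROUP BY customer_id
"""

job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(sql, job_config=job_config)

gib = job.total_bytes_processed / 1024**3
print(f"Estimated scan: {gib:.2f} GiB")
```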

Posted 1 day ago

Exploring BigQuery Jobs in India

BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
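
For orientation, querying BigQuery from Python takes only a few lines. A minimal sketch, assuming the google-cloud-bigquery client library, application default credentials, and a billable GCP project; it reads one of Google's public datasets:

```python
# Minimal BigQuery query from Python against a Google public dataset.
# Assumes `pip install google-cloud-bigquery` and application default
# credentials pointing at a billable GCP project.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_2013`
GROUP BY name
ORDER BY total DESC
LIMIT 5
"""

for row in client.query(sql).result():
    print(row["name"], row["total"])
```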

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.

Related Skills

Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.

Interview Questions

  • What is BigQuery and how does it differ from traditional databases? (basic)
  • How can you optimize query performance in BigQuery? (medium)
  • Explain the concepts of partitions and clustering in BigQuery. (medium; see the sketch after this list)
  • What are some best practices for designing schemas in BigQuery? (medium)
  • How does BigQuery handle data encryption at rest and in transit? (advanced)
  • Can you explain how BigQuery pricing works? (basic)
  • What are the limitations of BigQuery in terms of data size and query complexity? (medium)
  • How can you schedule and automate tasks in BigQuery? (medium)
  • Describe your experience with BigQuery ML and its applications. (advanced)
  • How does BigQuery handle nested and repeated fields in a schema? (basic)
  • Explain the concept of slots in BigQuery and how they impact query processing. (medium)
  • What are some common use cases for BigQuery in real-world scenarios? (basic)
  • How does BigQuery handle data ingestion from various sources? (medium)
  • Describe your experience with BigQuery scripting and stored procedures. (medium)
  • What are the benefits of using BigQuery over traditional on-premises data warehouses? (basic)
  • How do you troubleshoot and optimize slow-running queries in BigQuery? (medium)
  • Can you explain the concept of streaming inserts in BigQuery? (medium)
  • How does BigQuery handle data security and access control? (advanced)
  • Describe your experience with BigQuery Data Transfer Service. (medium)
  • What are the differences between BigQuery and other cloud-based data warehousing solutions? (basic)
  • How do you handle data versioning and backups in BigQuery? (medium)
  • Explain how you would design a data pipeline using BigQuery and other GCP services. (advanced)
  • What are some common challenges you have faced while working with BigQuery and how did you overcome them? (medium)
  • How do you monitor and optimize costs in BigQuery? (medium)
  • Can you walk us through a recent project where you used BigQuery to derive valuable insights from data? (advanced)
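
To ground a few of these questions (partitioning and clustering, and streaming inserts), here is a minimal sketch using the google-cloud-bigquery Python client; the dataset, table, and column names are hypothetical.

```python
# Illustration for the partitioning/clustering and streaming-insert questions
# above. Dataset/table/column names are hypothetical; assumes the
# google-cloud-bigquery client library and an existing dataset `analytics`.
from google.cloud import bigquery

client = bigquery.Client()

# Partitioned + clustered table: queries that filter on event_date and
# customer_id scan fewer bytes and run faster.
client.query("""
CREATE TABLE IF NOT EXISTS analytics.events (
  event_ts    TIMESTAMP,
  event_date  DATE,
  customer_id STRING,
  amount      FLOAT64
)
PARTITION BY event_date
CLUSTER BY customer_id
""").result()

# Streaming insert: rows become queryable within seconds, at a per-row cost,
# in contrast to free batch load jobs.
errors = client.insert_rows_json(
    "analytics.events",
    [{"event_ts": "2024-01-01T00:00:00Z",
      "event_date": "2024-01-01",
      "customer_id": "c-123",
      "amount": 12.5}],
)
assert not errors, errors
```

Filtering on the partition column in WHERE clauses then prunes partitions, which is also the usual first lever when answering the cost-optimization question.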

Closing Remark

As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!
