4186 BigQuery Jobs - Page 38

Set up a Job Alert
JobPe aggregates job results for easy access, but you apply directly on the employer's job portal.

3.0 - 8.0 years

5 - 10 Lacs

Karnataka

Work from Office

Description: Key Responsibilities: Design, implement, and maintain CI/CD pipelines to support automated testing, building, and deployment processes for applications hosted on Google Cloud Platform (GCP). Develop and manage infrastructure as code (IaC) using tools like Terraform, Ansible, or Google Cloud Deployment Manager. Collaborate with software engineers to improve deployment processes, enhance development efficiency, and resolve technical issues related to build, deployment, and monitoring. Implement monitoring, logging, and alerting solutions to ensure the health and performance of the cloud environment. Work with security teams to integrate security best practices into the CI/CD pipeline and ensu...

Posted 4 weeks ago

2.0 - 5.0 years

4 - 7 Lacs

Uttar Pradesh

Work from Office

GCP Data Engineer. Primary Skills: BigQuery, Dataflow, Cloud Composer, SQL, Python, PySpark. Proven experience as a Data Engineer with a focus on GCP services. Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub. Hands-on experience with Terraform for provisioning and managing GCP infrastructure. Proficiency in PySpark, SQL, and Python for data manipulation and analysis. Solid understanding of data warehousing concepts and ETL processes. Experience with real-time data processing and streaming analytics. Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills.
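The BigQuery, Dataflow, and PySpark stack in this listing centres on batch ETL. As a rough, stdlib-only illustration (not the GCP or Spark APIs themselves), the transform step of such a pipeline might look like this; the record and field names (`user_id`, `amount`) are invented for the example:

```python
# Stdlib-only sketch of an ETL "transform" step. Names are illustrative;
# a production pipeline would use Apache Beam (Dataflow) or PySpark instead.
from typing import Iterable

def transform(records: Iterable[dict]) -> list[dict]:
    """Drop rows missing a user_id and normalise amounts to 2 decimals."""
    out = []
    for rec in records:
        if not rec.get("user_id"):
            continue  # quality gate: discard unusable rows
        out.append({"user_id": rec["user_id"],
                    "amount": round(float(rec.get("amount", 0)), 2)})
    return out

raw = [{"user_id": "u1", "amount": "19.991"},
       {"user_id": None, "amount": "5"},
       {"user_id": "u2"}]
clean = transform(raw)
```

The same cleaning logic would typically run inside a Beam `ParDo` or a PySpark `map`, with BigQuery as the sink.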

Posted 4 weeks ago

2.0 - 5.0 years

4 - 7 Lacs

Maharashtra

Work from Office

GCP Data Engineer. Primary Skills: BigQuery, Dataflow, Cloud Composer, SQL, Python, PySpark. Proven experience as a Data Engineer with a focus on GCP services. Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub. Hands-on experience with Terraform for provisioning and managing GCP infrastructure. Proficiency in PySpark, SQL, and Python for data manipulation and analysis. Solid understanding of data warehousing concepts and ETL processes. Experience with real-time data processing and streaming analytics. Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills.

Posted 4 weeks ago

2.0 - 5.0 years

4 - 7 Lacs

Telangana

Work from Office

The primary skillset is Java and the secondary is API development with GCP services like GKE, BigQuery, and Spanner; strength in writing complex SQL queries. Being a self-starter and a quick learner with problem-solving skills is mandatory. Secondary: C#/.NET or Python, OpenShift. SPSF's objectives are to reduce misloads through the use of RFID-enabled scanners and handheld wands. It increases Package Loader productivity, as manual scanning is eliminated and the efficiency of the Sort and Load process is increased. Customer satisfaction is improved as packages are delivered on time with the elimination of misloaded packages. UPS also benefits from SPSF as it helps mitigate fraud by uniquely identifying each Package Label. Mandatory Skills:Mandatory Skill...

Posted 4 weeks ago

2.0 - 7.0 years

4 - 9 Lacs

Maharashtra

Work from Office

Description: Google Logging Engineer. We are looking for a GCP Engineer with specialist skills in Google Cloud Logging to develop, test, and implement data integration, alerting, and logging with the Google Cloud platform. In addition, this role will be expected to develop reporting and dashboards that illustrate the activity and performance of data being ingested by GCP Looker. Specific tasks include: Design and build dashboards, reports, and alerts using Google Cloud Logging, BigQuery, and Looker based upon customer requirements. Integrate log data from various sources into BigQuery via Google Cloud Logging and ensure data compatibility. Implement performance-optimized Looker models to enable real-time and ...

Posted 4 weeks ago

2.0 - 7.0 years

4 - 9 Lacs

Karnataka

Work from Office

Description Skills: Proficiency in SQL is a must. PL/SQL to understand the integration SP (stored procedure) part. Experience in PostgreSQL is a must. Basic knowledge of Google Cloud Composer (or Apache Airflow); Composer is the managed GCP service for Apache Airflow, and all pipelines are orchestrated and scheduled through Composer. GCP basics: a high-level understanding of the GCP UI and services like Cloud SQL for PostgreSQL, Cloud Composer, Cloud Storage, and Dataproc. Airflow DAGs are written in Python, so basic knowledge of Python code for DAGs is needed. Dataproc is managed Spark in GCP, so a bit of PySpark knowledge is also nice to have. Named Job Posting? (if Yes - needs to be approved by SCSC) Additional Details Global Grade C Level To Be Def...
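The listing notes that Composer is managed Airflow and that all pipelines are scheduled through it. The core idea an orchestrator implements is running tasks in dependency order. This stdlib-only sketch shows that idea with Python's `graphlib`, not the Airflow API itself (real DAGs use `airflow.DAG` and operators such as `PythonOperator`), and the task names are invented:

```python
# Stdlib-only illustration of DAG scheduling: tasks run in dependency
# order, which is what Composer/Airflow does for real pipelines.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on (names are illustrative)
dag = {
    "extract": set(),
    "load_raw": {"extract"},
    "transform": {"load_raw"},
    "publish": {"transform"},
}

# static_order() yields a valid execution order for the whole graph
run_order = list(TopologicalSorter(dag).static_order())
```

In Airflow the same shape is expressed with operator chaining (`extract >> load_raw >> transform >> publish`), and the scheduler handles retries and timing on top.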

Posted 4 weeks ago

5.0 - 10.0 years

37 - 40 Lacs

Jaipur

Work from Office

Job Description: Job Title: Data Steward - Global Procurement, VP. Location: Jaipur, India. Corporate Title: VP. Role Description: The role supports the Third Party Contracts (TPC) Centre of Excellence, with responsibilities also within the Data and Analytics team. The role will predominantly focus on: creating data mining architectures/models/protocols, statistical reports, and data analysis methodologies to identify trends in large data sets; researching and applying knowledge of existing and emerging data science principles, theories, and techniques to inform business decisions. As Data Steward within the TPC Centre of Excellence team, they will ensure the quality, accuracy, and security o...

Posted 4 weeks ago

6.0 - 11.0 years

30 - 35 Lacs

Pune

Work from Office

Job Description: Job Title: Database Engineer, AVP. Location: Pune, India. Role Description: As a Database Specialist / Engineer you will be part of the development team and work closely with production and operations units. You bring database skills to reinforce the development team within a Squad. You will make extensive use of Continuous Integration tools in the context of Deutsche Bank's digitalization journey. You will have to ensure the maintenance of the Oracle database and the changes needed in the database for the application, and also support teams in database migration activities. The Engineer is responsible for managing or performing work across multiple areas of the bank'...

Posted 4 weeks ago

4.0 - 8.0 years

10 - 15 Lacs

Pune

Work from Office

Job Description: Job Title: GCP - Senior Engineer - PD. Location: Pune. Role Description: Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via >2k interfaces. From a technical perspective, we focus on the mainframe but also build solutions on on-premise cloud, RESTful services, and an Angular frontend. Next to the maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a very mot...

Posted 4 weeks ago

6.0 - 10.0 years

27 - 32 Lacs

Bengaluru

Work from Office

About The Role: Job Title: Business Functional Analyst. Location: Bangalore, India. Corporate Title: AVP. Role Description: Within the Securities Services division, the Fund Services product family has new opportunities for suitable candidates to be part of their data and digital transformation program. The Business Functional Analyst will help us build and maintain cloud-based digital data analytics solutions, including microservices. You will be responsible for designing, developing, testing, and deploying web services and APIs, working in close collaboration with business, product, and operations in an Agile culture. Successful candidates are expected to be experts in their fields and hands-on, and passi...

Posted 4 weeks ago

6.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

About The Role: Job Title: Business Functional Analyst. Location: Bangalore, India. Corporate Title: AS. Role Description: Within the Securities Services division, the Fund Services product family has new opportunities for suitable candidates to be part of their data and digital transformation program. The Business Functional Analyst will help us build and maintain cloud-based digital data analytics solutions, including microservices. You will be responsible for designing, developing, testing, and deploying web services and APIs, working in close collaboration with business, product, and operations in an Agile culture. Successful candidates are expected to be experts in their fields and hands-on, and passio...

Posted 4 weeks ago

5.0 - 9.0 years

9 - 19 Lacs

Gurugram

Work from Office

We are looking for a GCP Big Data Engineer skilled in BigQuery and Airflow.

Posted 4 weeks ago

2.0 - 5.0 years

22 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

WORK YOU'LL DO As a Data Engineer within Deloitte’s Customer Strategy & Design (CS&D) practice, you will design, build, and optimize large-scale data ingestion, transformation, and orchestration systems that support analytics, AI, and digital transformation initiatives. You will play a critical role in building resilient data ecosystems, driving automation, and enabling actionable insights for global clients. 1. Data Pipeline Development & Architecture: Design, build, and optimize ETL/ELT pipelines using Python/PySpark/SQL on distributed data platforms. Engineer streaming and batch workflows using Azure Data Factory/Databricks/Apache Spark/Snowflake. Implement ingestion frameworks for structur...

Posted 4 weeks ago

8.0 - 13.0 years

13 - 17 Lacs

Pune

Work from Office

We are seeking a highly skilled Cloud Engineer with expertise in both Amazon Web Services (AWS) and Google Cloud Platform (GCP). The ideal candidate will be responsible for designing, implementing, and managing secure, scalable, and cost-efficient cloud solutions to support our enterprise applications and infrastructure. Key Responsibilities: Design, deploy, and manage cloud-native solutions across AWS and GCP environments. Build and optimize infrastructure as code (IaC) using tools such as Terraform / CloudFormation / Deployment Manager. Configure and manage networking, VPCs, load balancers, VPNs, firewalls, IAM, and hybrid connectivity across AWS and GCP. Implement monitoring, logging, a...

Posted 4 weeks ago

3.0 - 8.0 years

0 - 1 Lacs

Bengaluru

Remote

Working knowledge of GCP and BigQuery. Proficiency in tools like Excel and statistical software, plus basic knowledge of R or Python. Ability to prepare and conduct user training and solution handover sessions. Basic SQL scripting and query optimization.
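For the basic SQL scripting this listing asks for, a grouped aggregate is the bread-and-butter pattern. This sketch uses the stdlib `sqlite3` module so it runs anywhere; BigQuery uses a similar Standard SQL dialect through the `google-cloud-bigquery` client instead, and the table and column names here are made up:

```python
# Minimal SQL scripting sketch using stdlib sqlite3 (table/columns invented).
# BigQuery's Standard SQL accepts the same GROUP BY / ORDER BY shape.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 100.0), ("south", 50.0), ("north", 25.0)])

# Aggregate query: total sales per region, largest first
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales "
    "GROUP BY region ORDER BY SUM(amount) DESC"
).fetchall()
```

On BigQuery, query optimization then mostly means partitioning/clustering the table and selecting only needed columns, since billing is scan-based.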

Posted 4 weeks ago

7.0 - 9.0 years

8 - 12 Lacs

Bengaluru

Work from Office

API Engineer, Reporting Segment: Technology Solutions Group. Job Description: CoreLogic is seeking an API Engineer who is passionate about all things data and has the desire to turn API data migration into a competitive advantage. Employees in this role need to be well versed in Google, GCP, and APIs, which can include data management, data quality, and data migrations. If you like being at the forefront and want your work to be impactful, this is the position you want. This is a visible role with direct business impact, as data is our product. This role also provides a tremendous opportunity to learn while making an impactful contribution to the success of the company. Job Responsibilities: M...

Posted 4 weeks ago

3.0 - 5.0 years

7 - 11 Lacs

Bengaluru

Work from Office

2 years of Data Engineering experience and 3 years of consulting experience. Completed the Data Engineering Associate certification and required classes. Minimum 1 project delivered with hands-on development experience on Databricks. Completed Apache Spark Programming with Databricks, Data Engineering with Databricks, and Optimizing Apache Spark on Databricks. SQL delivery experience and familiarity with BigQuery, Synapse, or Redshift. Proficient in Python, with knowledge of additional Databricks programming languages (Scala).

Posted 4 weeks ago

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Reporting Specialist at Randstad, you will play a crucial role in creating comprehensive market reports, conducting global team and leadership reports, visualizing data effectively, collecting and analyzing data from various sources, and collaborating with teams across the globe. Your attention to detail, analytical skills, and ability to translate complex data into actionable insights will be key to your success in this role. **Key Responsibilities:** - Develop comprehensive market reports based on given templates for different programs to report performance, KPIs achievement, and trends. - Collect data related to the performance and engagement of global programs to identify key trends...

Posted 4 weeks ago

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As an AI/ML Engineer, you will play a pivotal role in identifying, defining, and delivering AI/ML and GenAI use cases in collaboration with both business and technical stakeholders. Your responsibilities will include designing, developing, and deploying models using Google Cloud's Vertex AI platform. You will be entrusted with fine-tuning and evaluating Large Language Models (LLMs) for domain-specific applications, while ensuring responsible AI practices and governance in solution delivery. - Collaborate closely with data engineers and architects to establish robust and scalable pipelines. - Thoroughly document workflows and experiments to ensure reproducibility and smooth handover readiness...

Posted 4 weeks ago

6.0 - 9.0 years

27 - 42 Lacs

Bengaluru

Work from Office

Job Description Role: Informatica IDMC. Experience: 6 to 9 Years. Summary: We are looking for an experienced Informatica IDMC Developer to design, develop, and maintain cloud-based data integration and ETL solutions. The ideal candidate will have hands-on expertise in Informatica Intelligent Data Management Cloud (IDMC), including Cloud Data Integration (CDI), Cloud Application Integration (CAI), and Cloud Data Quality (CDQ), along with a solid background in data warehousing and cloud platforms (AWS / Azure / GCP). Key Responsibilities: • Design, develop, and maintain ETL/ELT workflows using Informatica IDMC components (CDI, CAI, CDQ). • Integrate data from various on-premises and cloud data ...

Posted 4 weeks ago

2.0 - 7.0 years

5 - 12 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Job description: Hiring for GCP Developer. Mandatory Skills: GCP, BigQuery. Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes and detailed functional design...

Posted 4 weeks ago

2.0 - 7.0 years

5 - 12 Lacs

Bhubaneswar, Pune, Delhi/NCR

Hybrid

Job description: Hiring for GCP Developer. Mandatory Skills: GCP, BigQuery. Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes and detailed functional design...

Posted 4 weeks ago

2.0 - 7.0 years

5 - 12 Lacs

Mangaluru, Mysuru, Coimbatore

Hybrid

Job description: Hiring for GCP Developer. Mandatory Skills: GCP, BigQuery. Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes and detailed functional design...

Posted 4 weeks ago

2.0 - 7.0 years

5 - 12 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Job description: Hiring for GCP Developer. Mandatory Skills: GCP, BigQuery. Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes and detailed functional design...

Posted 4 weeks ago

6.0 - 9.0 years

27 - 42 Lacs

Chennai

Work from Office

Job Description Role: Looker + GCP. Experience: 6 to 9 years. Summary: We are seeking a skilled Looker Developer with strong experience in Google Cloud Platform (GCP) to design, build, and optimize scalable data models, dashboards, and analytics solutions. The ideal candidate should be proficient in LookML, data visualization, and GCP data services such as BigQuery, Cloud Storage, and Dataflow. Key Responsibilities: • Develop and maintain Looker dashboards, Looks, and Explores to provide business insights. • Create and optimize LookML models, views, and derived tables. • Collaborate with business and data engineering teams to understand reporting needs and translate them into scalable BI solu...

Posted 4 weeks ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies