
520 BigQuery Jobs - Page 17

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3 - 6 years

10 - 16 Lacs

Hyderabad

Hybrid

Requirements:
- Must have worked in a QA role for ETL/data transformation
- Minimum 3+ years of QA experience
- Strong SQL skills
- Proficiency in using BigQuery functions and operators for data comparison, aggregation, and validation across different stages of the transformation
- Analytical skills to understand complex requirements and perform in-depth data validation
- Able to work independently to create artifacts such as test strategy, test plan, etc.
- Good understanding of data mapping and data requirements
- Inclined to do rigorous, repeatable testing with enthusiasm for finding bugs
- Willing to do manual SQL QA work with complex queries
- Intuitive and able to work independently
- Strong communication skills are a must
- Experience with, and desire to work in, a global delivery environment

Location: Hyderabad
Shift: 1:00 PM to 10:00 PM
Notice Period: Immediate to 15 days
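
The validation work described above typically reduces to comparing row sets between pipeline stages. As a hedged sketch (the project, dataset, and table names are hypothetical), a minimal comparison probe using the google-cloud-bigquery Python client might look like this:

```python
# A minimal sketch of a BigQuery data-validation check, assuming the
# google-cloud-bigquery client library; tables and columns are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Rows present in staging but absent from the transformed target indicate a
# transformation defect; run the probe in both directions for full coverage.
VALIDATION_SQL = """
SELECT order_id, amount FROM `my_project.staging.orders`
EXCEPT DISTINCT
SELECT order_id, amount FROM `my_project.mart.orders`
"""

missing = list(client.query(VALIDATION_SQL).result())
if missing:
    print(f"Validation FAILED: {len(missing)} rows missing from target")
else:
    print("Validation passed: staging and target agree")
```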

Posted 1 month ago

Apply

5 - 10 years

30 - 45 Lacs

Bengaluru

Work from Office

Company Overview
Lifesight is a fast-growing SaaS company focused on helping businesses leverage data and AI to improve customer acquisition and retention. We have a team of 130 serving 300+ customers across 5 offices in the US, Singapore, India, Australia, and the UK. Our mission is to make it easy for non-technical marketers to leverage advanced, AI-powered data activation and marketing measurement tools to improve their performance and achieve their KPIs. Our product is being adopted rapidly worldwide, and we need the best people on board to accelerate our growth.

Position Overview
The ideal candidate is a self-motivated, self-managed multi-tasker and a demonstrated team player. You will be a lead developer responsible for developing new software products and improving numerous non-functional requirements of the products as well. If you're looking to be part of a dynamic, highly analytical team and an opportunity to hone your Java, cloud engineering, and distributed systems skills, look no further. As our Senior Software Engineer for the platform team, you will be handed the reins to build the core microservices. Along with building services to query billions of rows in Google's BigQuery, you will be in charge of building scalable APIs for building user segments and evolving the architecture to send millions of notifications per day through varied streams such as email, SMS, and in-app notifications.

What you'll do:
- Be responsible for the overall development of the modules/services you work on
- Code, design, prototype, perform reviews, and consult in the process of building highly scalable, reliable, and fault-tolerant systems
- Continuously refactor applications and architectures to maintain high quality levels
- Stay abreast of the latest technologies in distributed systems and caching, and research new technologies and tools that enable building next-generation systems
- Enjoy writing readable, concise, reusable, extensible code every day
- Discuss and articulate requirements with product management, and scope and execute the feature roadmap
- Participate in the team's hiring process as an interview panelist

Requirements
What you'll need:
- Ideally 7+ years of hands-on experience designing, developing, testing, and deploying large-scale applications and microservices in any language or stack (preferably Java/Spring Boot)
- Good knowledge in one or more of these areas: cloud, NoSQL stores; we use Google Cloud, Kubernetes, BigQuery, and messaging systems
- Excellent attitude and passion for working in a team, with a willingness to learn
- Experience building low-latency, high-volume REST API request handling
- Experience working with distributed caches like Redis
- Ability to get stuff done!

Bonus points if you have:
- Experience with containerization technologies like Docker and Kubernetes
- Experience working on any cloud platform (preferably GCP)
- Experience with NoSQL stores (like Cassandra, ClickHouse, BigQuery)

Benefits
What is in it for the candidate: As a team, we are concerned not only with the growth of the company, but with each other's personal growth and well-being too. Along with our desire to use smart technology and innovative engineering to make people's lives easier, our team bonds over a shared love for all kinds of tea, movies, and fun-filled Friday events, while prioritizing a healthy work-life balance.
1. Work for one of the fastest-growing and most successful MarTech companies
2. Opportunity to be an early member of the core team, building a product from scratch: making tech stack choices and driving and influencing ways to simplify building complex products
3. Enjoy working in small teams and a non-bureaucratic environment
4. Enjoy an environment that provides high levels of empowerment and space to achieve your objectives and grow with the organization
5. Work in a highly profitable and growing organization, with opportunities to accelerate and shape your career
6. Great benefits, apart from competitive compensation
7. Above all, a "fun" working environment

Posted 1 month ago

Apply

4 - 9 years

15 - 30 Lacs

Pune, Gurugram, Bengaluru

Hybrid

5+ years of experience in software development using C#, MSSQL, and GCP/BigQuery; Python experience is good to have.
- Contribute to the design and development of innovative software solutions that meet business requirements.
- Develop and maintain applications using the specified technologies.
- Participate in code reviews to ensure high-quality code and adherence to best practices.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Experience in code reviews and maintaining code quality.
- Ability to mentor and guide junior developers.
- Bachelor's degree in Computer Science, Engineering, or a related field.

Posted 1 month ago

Apply

3 - 8 years

12 - 15 Lacs

Mumbai

Work from Office

Responsibilities:
- Develop and maintain data pipelines using GCP.
- Write and optimize queries in BigQuery.
- Utilize Python for data processing tasks.
- Manage and maintain SQL Server databases.

Must-Have Skills:
- Experience with Google Cloud Platform (GCP).
- Proficiency in BigQuery query writing.
- Strong Python programming skills.
- Expertise in SQL Server.

Good to Have:
- Knowledge of MLOps practices.
- Experience with Vertex AI.
- Background in data science.
- Familiarity with any data visualization tool.
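
For illustration, the first responsibilities above (querying BigQuery and processing results in Python) are commonly combined through the client library's pandas integration. A minimal sketch, assuming hypothetical table and column names and the pandas/db-dtypes extras installed:

```python
# A minimal sketch: run a BigQuery query and hand the result to pandas.
# Dataset, table, and columns are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT DATE(event_ts) AS day, COUNT(*) AS events
FROM `my_project.analytics.events`
GROUP BY day
ORDER BY day
"""

# to_dataframe() materializes the result for downstream Python processing.
df = client.query(sql).to_dataframe()
print(df.describe())
```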

Posted 1 month ago

Apply

2 - 4 years

3 - 8 Lacs

Kolkata

Remote

Data Quality Analyst
Experience: 2-4 years
Salary: Competitive
Preferred Notice Period: Within 30 days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Remote
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)

Must-have skills: Data Validation, BigQuery, SQL, Communication Skills
Good-to-have skills: Data Visualisation, Power BI, Tableau

Forbes Advisor (one of Uplers' clients) is looking for a Data Quality Analyst who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview

Short-term objectives: We know the importance data validation can play in creating better reporting for our business; we have identified areas where we want you to make an impact within the first 3 months:
- Push 40% of partners through the ingestion validation process
- Push 40% of partners through the mapping validation process

Data Team Culture: Our team requires four areas of focus from every team member (see below). We use these focus areas to guide our decision-making and career growth. To give you an idea of these requirements, the top three from each area are:

Mastery:
- Demonstrate expertise in a relevant tool (e.g., GA, Tableau) or code language (e.g., SQL)
- Think about the wider impact and value of decisions
- Understand and anticipate the need for scalability, stability, and security

Communication:
- Provide clear, actionable feedback from peer reviews
- Communicate effectively with wider teams and stakeholders
- Proactively share knowledge every day

Ownership:
- Lead complex initiatives that drive challenging goals
- Create and push forward cross-cutting concerns between teams
- Demonstrate consistently sound judgement

Behaviours:
- Challenge yourself and others through questioning, assessing business benefits, and understanding the cost of delay
- Own your workload and decisions; show leadership to others
- Innovate to find new solutions or improve existing ways of working; push yourself to learn every day

Responsibilities:
- Report directly to the Senior Business Analyst and work closely with the Data & Revenue Operations functions to support key deliverables
- Reconcile affiliate network revenue by vertical and publisher brand at a monthly level
- Where discrepancies exist, investigate to isolate whether specific days, products, providers, or commission values are responsible
- Validate new tickets going onto the Data Engineering JIRA board to ensure requests are complete, accurate, and as descriptive as possible
- Update investigation results in JIRA tickets and save all outputs in the mapping Google Sheet
- Use APIs (e.g., via Postman) and webhooks to pull revenue data from partner portals and verify it against the portals and BigQuery
- Monitor API failures, rate limits, and response inconsistencies impacting revenue ingestion
- As necessary, seek revenue clarifications from the vertical's RevOps team member
- As necessary, clarify JIRA commentary for data engineers
- Understand requirements, goals, and priorities, and communicate progress towards data goals to stakeholders
- Ensure outputs are on time and on target

Required competencies:
- At least two (2) years of data quality analysis experience
- A strong understanding of SQL and how it can be used to validate data (experience with BigQuery is a plus)
- An understanding of large relational databases and how to navigate these datasets to find the data required
- Ability to communicate data to non-technical audiences through reports and visualisations
- Strong interpersonal and communication skills
- Comfortable working remotely and collaboratively with teammates across multiple geographies and time zones

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leave

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal
2. Upload an updated resume and complete the screening form
3. Increase your chances of being shortlisted and meeting the client for an interview

About Our Client: Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help talent find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities on the portal apart from this one.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
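
One reconciliation step of the kind described, pulling revenue from a partner portal's REST API and checking a daily total against BigQuery, could be sketched as below. The endpoint, authentication, and table are hypothetical placeholders, not Forbes Advisor's actual systems:

```python
# A hedged sketch of a single-day revenue reconciliation. Everything named
# here (URL, auth, JSON shape, table) is a hypothetical placeholder.
import requests
from google.cloud import bigquery

resp = requests.get(
    "https://api.example-partner.com/v1/revenue",
    params={"date": "2024-06-01"},
    headers={"Authorization": "Bearer <token>"},
    timeout=30,
)
resp.raise_for_status()  # surface API failures and rate limits loudly
portal_total = sum(item["commission"] for item in resp.json()["records"])

client = bigquery.Client()
row = next(iter(client.query(
    "SELECT SUM(commission) AS total "
    "FROM `my_project.revenue.affiliate_daily` "
    "WHERE day = '2024-06-01'"
).result()))

if abs(portal_total - (row.total or 0)) > 0.01:
    print("Discrepancy found: isolate by day, product, or provider next")
```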

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Patna

Work from Office

Role: Data Engineer

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.
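
For a rough sense of the migration work this listing describes, one common pattern is a PySpark job on Dataproc that reads a table from Oracle over JDBC and writes it to BigQuery via the spark-bigquery connector. This is a hedged sketch, not a definitive implementation; connection strings, table names, and the staging bucket are placeholders, and it assumes the Oracle JDBC driver and the connector are available on the cluster:

```python
# A hedged sketch of one Oracle-to-BigQuery migration step on Dataproc.
# Hosts, credentials, tables, and the bucket are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle_to_bq").getOrCreate()

# Read the source table from Oracle over JDBC.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB")
    .option("driver", "oracle.jdbc.OracleDriver")
    .option("dbtable", "SALES.ORDERS")
    .option("user", "migration_user")
    .option("password", "<secret>")
    .load()
)

# Write to BigQuery, staging through a GCS bucket (indirect write method).
(
    orders.write.format("bigquery")
    .option("table", "my_project.sales.orders")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```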

Posted 1 month ago

Apply

6 - 11 years

10 - 18 Lacs

Noida, Indore

Work from Office

Role & Responsibilities

Job Description: We are looking for a GCP Data Engineer and SQL Programmer with good working experience in PostgreSQL and PL/SQL programming, and the following technical skills:
- PL/SQL and PostgreSQL programming; ability to write complex SQL queries and stored procedures
- Migration: working experience migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL
- Working experience with Cloud SQL/AlloyDB
- Working experience tuning autovacuum in PostgreSQL
- Working experience tuning AlloyDB/PostgreSQL for better performance
- Working experience with BigQuery, Firestore, Memorystore, Spanner, and bare-metal setup for PostgreSQL
- Ability to tune the AlloyDB/Cloud SQL database for better performance
- Experience with the GCP Database Migration Service
- Working experience with MongoDB
- Working experience with Cloud Dataflow
- Working experience with database disaster recovery
- Working experience with database job scheduling
- Working experience with database logging techniques
- Knowledge of OLTP and OLAP

Desirable: GCP Database Engineer Certification

Other skills:
- Out-of-the-box thinking
- Problem-solving skills
- Ability to make tech choices (build vs. buy)
- Performance management (profiling, benchmarking, testing, fixing)
- Enterprise architecture
- Project management/delivery capability/quality mindset
- Scope management
- Planning (phasing, critical path, risk identification)
- Schedule management/estimation
- Leadership skills

Other soft skills:
- Learning ability
- Innovative/shows initiative

Preferred Candidate Profile - Roles & Responsibilities:
- Develop, construct, test, and maintain data architectures
- Migrate enterprise Oracle databases from on-premises to GCP cloud
- Tune autovacuum in PostgreSQL
- Tune AlloyDB/PostgreSQL for better performance
- Performance-tune PostgreSQL stored procedure code and queries
- Convert Oracle stored procedures and queries to PostgreSQL stored procedures and queries
- Create a hybrid data store with data warehouse and NoSQL GCP solutions along with PostgreSQL
- Migrate Oracle table data from Oracle to AlloyDB
- Lead the database team

Mandatory Skills: PostgreSQL, PL/SQL, BigQuery
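
As an illustration of the autovacuum tuning mentioned above, PostgreSQL allows per-table storage parameters so a hot table is vacuumed more aggressively than the instance-wide default. A minimal sketch via psycopg2; the DSN, table, and values are hypothetical illustrations, not recommendations:

```python
# A hedged sketch of per-table autovacuum tuning on Cloud SQL/AlloyDB for
# PostgreSQL. The DSN, table name, and parameter values are placeholders.
import psycopg2

conn = psycopg2.connect("host=10.0.0.5 dbname=appdb user=dba password=<secret>")
conn.autocommit = True
with conn.cursor() as cur:
    # Lower the scale factor so vacuum triggers after fewer dead tuples.
    cur.execute("""
        ALTER TABLE public.orders SET (
            autovacuum_vacuum_scale_factor = 0.02,
            autovacuum_vacuum_threshold   = 1000
        )
    """)
conn.close()
```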

Posted 1 month ago

Apply

5 - 10 years

0 - 3 Lacs

Hyderabad

Hybrid

Job Profile

We are seeking a Senior Data Engineer with proven expertise in designing and maintaining scalable, efficient, and reliable data pipelines. The ideal candidate should have strong proficiency in SQL, dbt, BigQuery, Python, and Airflow, along with a solid foundation in data warehousing principles. In this role, you will be instrumental in managing and optimizing data workflows, ensuring high data quality, and supporting data-driven decision-making across the organization. Experience with Oracle ERP systems and knowledge of data migration to a data warehouse environment will be considered a valuable advantage.

Years of Experience: 5 to 10 years
Shift Timings: 1 PM to 10 PM IST

Skill Set:
- SQL: Advanced proficiency in writing optimized queries, working with complex joins, CTEs, window functions, etc.
- dbt (Data Build Tool): Experience modelling data with dbt, managing data transformations, and maintaining project structure.
- Python: Proficient in writing data processing scripts and building Airflow DAGs using Python.
- BigQuery: Strong experience with GCP's BigQuery, including dataset optimization, partitioning, and query cost management.
- Apache Airflow: Experience building and managing DAGs, handling dependencies, scheduling jobs, and error handling.
- Data Warehousing Concepts: Strong grasp of ETL/ELT, dimensional modelling (star/snowflake), fact/dimension tables, slowly changing dimensions, etc.
- Version Control: Familiarity with Git/GitHub for code collaboration and deployment.
- Cloud Platforms: Working knowledge of Google Cloud Platform (GCP).

Job Description - Roles & Responsibilities:
- Design, build, and maintain robust ETL/ELT data pipelines using Python, Airflow, and dbt.
- Develop and manage dbt models to enable efficient, reusable, and well-documented data transformations.
- Collaborate with stakeholders to gather data requirements and design data marts comprising fact and dimension tables in a well-structured star schema.
- Manage and optimize data models and transformation logic in BigQuery, ensuring high performance and cost-efficiency.
- Implement and uphold robust data quality checks, logging, and alerting mechanisms to ensure reliable data delivery.
- Maintain the BigQuery data warehouse, including routine optimizations and updates.
- Enhance and support the data warehouse architecture, including the use of star/snowflake schemas, partitioning strategies, and data mart structures.
- Proactively monitor and troubleshoot production pipelines to minimize downtime and ensure data accuracy.
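
As a small illustration of the Airflow-plus-dbt workflow this role centres on, a DAG might sequence a raw load ahead of the dbt transformations. This is a hedged sketch; the schedule, file paths, and commands are assumptions, not a known setup:

```python
# A minimal Airflow DAG sketch: extract to BigQuery, then run dbt models.
# Paths, commands, and the schedule are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_refresh",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_to_bigquery",
        bash_command="python /opt/pipelines/extract.py",  # hypothetical loader
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run --target prod",
    )
    # dbt models should only build after the raw data lands.
    extract >> transform
```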

Posted 1 month ago

Apply

3 - 8 years

15 - 30 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 15 to 30 LPA
Experience: 3 to 8 years
Location: Gurgaon/Bangalore/Pune/Chennai
Notice: Immediate to 30 days

Key Responsibilities & Skillsets:

Common skillsets:
- 3+ years of experience in analytics, SAS, PySpark, Python, Spark, SQL, and associated data engineering jobs
- Must have experience managing and transforming big data sets using PySpark, Spark-Scala, NumPy, and pandas
- Excellent communication and presentation skills
- Experience managing Python code and collaborating with customers on model evolution
- Good knowledge of database management and Hadoop/Spark, SQL, Hive, Python (expertise)
- Superior analytical and problem-solving skills
- Able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision
- Good communication skills for client interaction

Data management skillsets:
- Ability to understand data models and identify ETL optimization opportunities; exposure to ETL tools is preferred
- Strong grasp of advanced SQL functionality (joins, nested queries, and procedures)
- Strong ability to translate functional specifications/requirements into technical requirements

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Pune

Work from Office

Role: Data Engineer

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Lucknow

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Bengaluru

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 1 month ago

Apply

5 - 9 years

18 - 20 Lacs

Noida

Work from Office

Experience: 5-7 years
Location: Noida
Position: Data Analyst

Technical Skills:
- Strong proficiency in Python (pandas, NumPy, Matplotlib, Seaborn, etc.)
- Advanced SQL skills for querying large datasets
- Experience with data visualization tools (Looker, etc.)
- Hands-on experience with data wrangling, cleansing, and transformation
- Familiarity with ETL processes and working with structured/unstructured data

Analytical & Business Skills:
- Strong problem-solving skills with the ability to interpret complex data
- Business acumen to connect data insights with strategic decision-making
- Excellent communication and presentation skills

Preferred (Nice to Have):
- Knowledge of machine learning concepts (scikit-learn, TensorFlow, etc.)
- Exposure to cloud platforms (GCP) for data processing

Posted 1 month ago

Apply

4 - 9 years

15 - 19 Lacs

Pune

Work from Office

About The Role

Job Title: Technical Specialist - GCP Developer
Location: Pune, India

Role Description
This role is for an engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure that good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. The candidate should come from a strong technological background, have good working experience in Spark and GCP technology, be hands-on, and be able to work independently with minimal technical/tool guidance. They should also be able to technically guide and mentor junior resources in the team. As a developer, you will bring extensive design and development skills to reinforce the group of developers within the team, and will extensively use and apply Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complementary health screening for those 35 and above

Your key responsibilities
- Design and discuss your own solutions for addressing user stories and tasks.
- Develop, unit-test, integrate, deploy, maintain, and improve software.
- Perform peer code reviews.
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meetings, sprint planning, retrospectives, etc.
- Apply continuous integration best practices (SCM, build automation, unit testing, dependency management).
- Collaborate with other team members to achieve the sprint objectives.
- Report progress and update Agile team management tools (JIRA/Confluence).
- Manage individual task priorities and deliverables.
- Take responsibility for the quality of the solutions you provide.
- Contribute to planning and continuous improvement activities, and support the PO, ITAO, developers, and Scrum Master.

Your skills and experience
- Engineer with good development experience on Google Cloud Platform for at least 4 years.
- Hands-on experience in BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions.
- Experience in the set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps; able to create and maintain fully automated CI build processes and write build and deployment scripts.
- Experience with development platforms (OpenShift/Kubernetes/Docker) configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR.
- Good knowledge of core SDLC processes and tools such as HP ALM, Jira, and ServiceNow.
- Knowledge of working with APIs and microservices, integrating external and internal web services, including SOAP, XML, REST, and JSON.
- Strong analytical skills.
- Proficient communication skills; fluent in English (written/verbal).
- Ability to work in virtual teams and in matrixed organizations.
- Excellent team player; open-minded and willing to learn business and technology.
- Keeps pace with technical innovation and understands the relevant business area.
- Ability to share information and transfer knowledge to team members.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

Posted 1 month ago

Apply

7 - 12 years

32 - 37 Lacs

Jaipur

Work from Office

About The Role

Job Title: Analytics Senior Analyst
Location: Jaipur, India
Corporate Title: AVP

Role Description
You will be joining the Data & Analytics team as part of the Global Procurement division. The team's purpose is to:
- Deliver trusted third-party data and insights to unlock commercial value and identify risk
- Develop and execute the Global Procurement Data Strategy
- Deliver the golden source of Global Procurement data, analysis, and insights via dbPi, our Tableau self-service platform, leveraging automation and scalability on Google Cloud
- Provide data and analytical support to Global Procurement's prioritised change initiatives

The team leverages several tools and innovative techniques to create value-added insights for stakeholders across end-to-end procurement processes, including, but not limited to, third-party risk, contracting, spend, and performance management.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complementary health screening for those 35 and above

Your key responsibilities
- Develop a sound understanding of the various tools and the entire suite of analytical offerings on the standard procurement insights platform, dbPi.
- Support our stakeholders by understanding their requirements, challenging appropriately where needed in order to scope the problem, conceptualizing the optimum approach, and developing solutions using appropriate tools and visualisation techniques.
- Lead small project teams in delivering the analytics change book of work, keeping internal and external stakeholders updated on project progress while driving forward the key change topics.
- For requests which are more complex in nature, connect the dots and devise a solution by establishing linkages across different systems and processes.
- Take end-to-end responsibility for any change request in an existing analytical product/dashboard: understanding the requirement, development, testing, QA, and final delivery to stakeholders' satisfaction.
- Deliver automation and clean-data initiatives, such as deployment of a rules engine, data quality checks enabled through Google Cloud, and bringing procurement data sources into GCP.
- Act as a thought partner to the Chief Information Office's deployment of Google Cloud Platform to migrate the data infrastructure layer (ETL processes) currently managed by the Analytics team.
- Work in close collaboration with cross-functional teams, including developers, system administrators, and business stakeholders.

Your skills and experience
- A degree (or equivalent) in Engineering, Mathematics, Statistics, or Sciences from an accredited college or university, to develop analytical solutions that support strategic decision-making. Any professional certification in advanced analytics, data visualisation, or a data science-related domain is a plus.
- A natural curiosity for numbers and strong quantitative and logical thinking skills; you ensure results are of high data quality and accuracy.
- Working experience on Google Cloud, including working with cross-functional teams to enable data source and process migration to GCP; working experience with SQL.
- Adaptability to emerging technologies, such as leveraging machine learning and AI to drive innovation.
- Procurement experience (useful, though not essential) across vendor management, sourcing, risk, contracts, and purchasing, preferably within a global and complex environment.
- Aptitude to understand stakeholders' requirements, identify relevant data sources, integrate data, perform analysis, and interpret the results by identifying trends and patterns.
- Enjoyment of the problem-solving process: thinking outside the box and breaking a problem down into its constituent parts with a view to developing end-to-end solutions.
- Enthusiasm to work in the data analytics domain and strive for continuous learning and improvement of your technical and soft skills.
- Working knowledge of different analytical tools (Tableau, databases, Alteryx, Pentaho, Looker, BigQuery) in order to work with large datasets and derive insights for decision-making.
- Enjoy working in a team, with convincing English language skills that make it easy to work in an international environment and with global, virtual teams.

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Posted 1 month ago

Apply

10 - 15 years

25 - 40 Lacs

Pune

Work from Office

Introduction: We are seeking a highly skilled and experienced Google Cloud Platform (GCP) Solution Architect. As a Solution Architect, you will play a pivotal role in designing and implementing cloud-based solutions for our team using GCP. The ideal candidate will have a deep understanding of cloud architecture, a proven track record of delivering cloud-based solutions, and experience with GCP technologies. You will work closely with technical teams and clients to ensure the successful deployment and optimization of cloud solutions.

Responsibilities:
- Lead the design and architecture of GCP-based solutions, ensuring scalability, security, performance, and cost-efficiency.
- Collaborate with business stakeholders, engineering teams, and clients to understand technical requirements and translate them into cloud-based solutions.
- Provide thought leadership and strategic guidance on cloud technologies, best practices, and industry trends.
- Design and implement cloud-native applications, data platforms, and microservices on GCP.
- Ensure cloud solutions are aligned with clients' business goals and requirements, with a focus on automation and optimization.
- Conduct cloud assessments, identifying areas for improvement, migration strategies, and cost-saving opportunities.
- Oversee and manage the implementation of GCP solutions, ensuring seamless deployment and operational success.
- Create detailed documentation of cloud architecture, deployment processes, and operational guidelines.
- Engage in pre-sales activities, including solution design, proofs of concept (PoCs), and presenting GCP solutions to clients.
- Ensure compliance with security and regulatory requirements in the cloud environment.

Requirements:
- At least 2+ years of experience as a Cloud Architect or in a similar role, with strong expertise in Google Cloud Platform.
- In-depth knowledge of GCP services, including Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, Cloud Functions, and networking.
- Experience with infrastructure-as-code tools such as Terraform.
- Strong understanding of cloud security, identity management, and compliance frameworks (e.g., GDPR, HIPAA).
- Hands-on experience with GCP networking, IAM, and logging/monitoring tools (Cloud Monitoring, Cloud Logging).
- Strong experience in designing and deploying highly available, fault-tolerant, and scalable solutions.
- Proficiency in programming languages like Java and Golang.
- Experience with containerization and orchestration technologies such as Docker, Kubernetes, and GKE (Google Kubernetes Engine).
- Experience in cloud cost management and optimization using GCP tools.

Thanks,
Pratap

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Mumbai

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Surat

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Kanpur

Work from Office

Role: Data Engineer

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Hyderabad

Work from Office

Role: Data Engineer

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 1 month ago

Apply

5 - 10 years

20 - 35 Lacs

Bengaluru

Hybrid

GCP Data Engineer
- 5+ years of experience
- GCP (all services needed for Big Data pipelines, e.g., BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine), Spark, Scala, Hadoop
- Python, PySpark, orchestration (Airflow), SQL
- CI/CD (experience with deployment pipelines)
- Architecture and design of cloud-based Big Data pipelines, and exposure to any ETL tools

Nice to have: GCP certifications

Posted 1 month ago

Apply

3 - 5 years

9 - 18 Lacs

Chennai

Hybrid

Role & Responsibilities
As a part of the Tonik Data Analytics team, the candidate will be responsible for:
- Developing and enhancing the Data Lake framework for ingestion of data from various sources and providing reports to downstream systems/users.
- Working with the different Tonik IT teams to implement the data requirements.
- Developing data pipelines based on requirements on key GCP services, i.e., BigQuery, Airflow, and GCS, using Python/SQL.
- Ensuring the proper GCP standards are followed for the implementation.

Preferred Candidate Profile
- Hands-on experience in at least one programming language (Python, Java).
- Working experience in a cloud platform (AWS/GCP/Azure).
- Experience in design patterns and designing scalable solutions.
- Hands-on experience in SQL; able to translate requirements into standard, scalable, cost-effective, and performant queries.
- Awareness of data engineering principles and data pipeline techniques.
- Communicates effectively with stakeholders and other team members.
- Has implemented end-to-end automated pipelines to ingest data in different formats (CSV, fixed width, JSON, etc.).
- Works closely with various business teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.
- Works with Agile implementation approaches in delivery.
- Hands-on experience with the following key GCP offerings or equivalent AWS services: Composer/Airflow, BigQuery, Dataflow, Cloud Storage, Apache Beam, Dataproc.
- Good understanding of security-related configuration in BigQuery and how to handle data securely while sharing.
- Nice to have: exposure to or knowledge of ML and ML pipelines.
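
One ingestion step of the kind listed above, loading a CSV that has landed in GCS into BigQuery, could look roughly like this with the google-cloud-bigquery client. The bucket, table, and CSV layout are hypothetical:

```python
# A hedged sketch of a GCS-to-BigQuery CSV load; bucket, table, and schema
# handling are hypothetical placeholders, not Tonik's actual setup.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # header row
    autodetect=True,      # infer the schema; pin an explicit schema in prod
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-landing/customers/2024-06-01.csv",
    "my_project.lake.customers_raw",
    job_config=job_config,
)
load_job.result()  # block until the load completes (raises on failure)
```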

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Nagpur

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Chennai

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 1 month ago

Apply

2 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office

We are seeking a Data Analyst: an experienced analytics professional who is passionate about unleashing the power of data to inform decision-making, achieve strategic objectives, and support hiring and retention of world-class talent. As an integral part of the team, the Data Analyst will use analytical skills and business acumen to turn data into knowledge and drive business success.

Requirements and Qualifications:
- Minimum 5+ years of experience in Data Analytics, BI Analytics, or BI Engineering, preferably in a globally recognized organization.
- Expert-level proficiency in writing complex SQL queries to create views in data warehouses like Snowflake, Redshift, SQL Server, Oracle, or BigQuery.
- Advanced skills in designing and developing data models and dashboards using BI tools such as Tableau, Domo, Looker, etc.
- Intermediate-level skills with analytical tools such as Excel, Google Sheets, or Power BI (e.g., complex formulas, lookups, pivot tables).
- Bachelor's or advanced degree in Data Analytics, Data Science, Information Systems, Computer Science, Applied Mathematics, Statistics, or a related field.
- Willingness to collaborate with internal team members and stakeholders across different time zones.

Roles and Responsibilities:
- Perform advanced analytics such as cohort analysis, scenario analysis, time series analysis, and predictive analysis, and create powerful data visualizations.
- Clearly articulate assumptions, data interpretations, and analytical findings in a variety of formats for different audiences.
- Design data models that define the structure and relationships of data elements across various sources, based on reporting and analytics needs.
- Collaborate with BI Engineers to build scalable, high-performing reporting and analytics solutions.
- Write SQL queries to extract and manipulate data from warehouses such as Snowflake.
- Conduct data validation and quality assurance checks to ensure high standards of data integrity.
- Investigate and resolve data issues, including root cause analysis when inconsistencies arise in reporting.
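
As a flavour of the cohort analysis mentioned in the responsibilities, a monthly retention cohort can be computed in BigQuery SQL and pulled into pandas for visualization. A hedged sketch; the events table and column names are hypothetical:

```python
# A hedged sketch of a monthly cohort query run via the BigQuery client;
# the dataset, table, and columns are hypothetical placeholders.
from google.cloud import bigquery

COHORT_SQL = """
WITH first_seen AS (
  SELECT user_id, DATE_TRUNC(MIN(DATE(event_ts)), MONTH) AS cohort_month
  FROM `my_project.analytics.events`
  GROUP BY user_id
)
SELECT
  f.cohort_month,
  DATE_DIFF(DATE_TRUNC(DATE(e.event_ts), MONTH), f.cohort_month, MONTH) AS month_n,
  COUNT(DISTINCT e.user_id) AS active_users
FROM `my_project.analytics.events` e
JOIN first_seen f USING (user_id)
GROUP BY f.cohort_month, month_n
ORDER BY f.cohort_month, month_n
"""

# Each row counts users from a given signup cohort active N months later.
df = bigquery.Client().query(COHORT_SQL).to_dataframe()
print(df.head())
```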

Posted 1 month ago

Apply

Exploring BigQuery Jobs in India

BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.

Related Skills

Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.

Interview Questions

  • What is BigQuery and how does it differ from traditional databases? (basic)
  • How can you optimize query performance in BigQuery? (medium)
  • Explain the concepts of partitions and clustering in BigQuery (a code sketch follows this list). (medium)
  • What are some best practices for designing schemas in BigQuery? (medium)
  • How does BigQuery handle data encryption at rest and in transit? (advanced)
  • Can you explain how BigQuery pricing works? (basic)
  • What are the limitations of BigQuery in terms of data size and query complexity? (medium)
  • How can you schedule and automate tasks in BigQuery? (medium)
  • Describe your experience with BigQuery ML and its applications. (advanced)
  • How does BigQuery handle nested and repeated fields in a schema? (basic)
  • Explain the concept of slots in BigQuery and how they impact query processing. (medium)
  • What are some common use cases for BigQuery in real-world scenarios? (basic)
  • How does BigQuery handle data ingestion from various sources? (medium)
  • Describe your experience with BigQuery scripting and stored procedures. (medium)
  • What are the benefits of using BigQuery over traditional on-premises data warehouses? (basic)
  • How do you troubleshoot and optimize slow-running queries in BigQuery? (medium)
  • Can you explain the concept of streaming inserts in BigQuery? (medium)
  • How does BigQuery handle data security and access control? (advanced)
  • Describe your experience with BigQuery Data Transfer Service. (medium)
  • What are the differences between BigQuery and other cloud-based data warehousing solutions? (basic)
  • How do you handle data versioning and backups in BigQuery? (medium)
  • Explain how you would design a data pipeline using BigQuery and other GCP services. (advanced)
  • What are some common challenges you have faced while working with BigQuery and how did you overcome them? (medium)
  • How do you monitor and optimize costs in BigQuery? (medium)
  • Can you walk us through a recent project where you used BigQuery to derive valuable insights from data? (advanced)
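
For the partitioning and clustering question flagged above, a minimal sketch with the google-cloud-bigquery Python client is shown below; the project, dataset, and fields are hypothetical:

```python
# A minimal sketch: create a day-partitioned, clustered BigQuery table.
# Project, dataset, and field names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

table = bigquery.Table(
    "my_project.analytics.page_views",
    schema=[
        bigquery.SchemaField("event_date", "DATE"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("url", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_date"
)
table.clustering_fields = ["customer_id"]

client.create_table(table)
```

The design intuition: partition pruning skips whole date partitions at scan time, cutting bytes scanned and therefore cost, while clustering sorts data within each partition so filters on the clustered columns read fewer blocks.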

Closing Remark

As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!
