
962 BigQuery Jobs - Page 10

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

10 - 20 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid


Salary: 8 to 24 LPA | Experience: 3 to 7 years | Location: Gurgaon (Hybrid) | Notice: Immediate to 30 days

Job Title: Senior Data Engineer

Job Summary: We are looking for an experienced Senior Data Engineer with 5+ years of hands-on experience in cloud data engineering platforms, specifically AWS, Databricks, and Azure. The ideal candidate will play a critical role in designing, building, and maintaining scalable data pipelines and infrastructure to support our analytics and business intelligence initiatives.

Key Responsibilities:
  • Design, develop, and optimize scalable data pipelines using AWS services (e.g., S3, Glue, Redshift, Lambda).
  • Build and maintain ETL/ELT workflows leveraging Databricks and Apache Spark for processing large datasets.
  • Work extensively with Azure data services such as Azure Data Lake, Azure Synapse, Azure Data Factory, and Azure Databricks.
  • Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver high-quality data solutions.
  • Ensure data quality, reliability, and security across multiple cloud platforms.
  • Monitor and troubleshoot data pipelines, implement performance tuning, and optimize resource usage.
  • Implement best practices for data governance, metadata management, and documentation.
  • Stay current with emerging cloud data technologies and industry trends to recommend improvements.

Required Qualifications:
  • 5+ years of experience in data engineering with strong expertise in AWS, Databricks, and Azure cloud platforms.
  • Hands-on experience with big data processing frameworks, particularly Apache Spark.
  • Proficiency in building complex ETL/ELT pipelines and managing data workflows.
  • Strong programming skills in Python, Scala, or Java.
  • Experience working with structured and unstructured data in cloud storage solutions.
  • Knowledge of SQL and experience with relational and NoSQL databases.
  • Familiarity with CI/CD pipelines and DevOps practices in cloud environments.
  • Strong analytical and problem-solving skills with an ability to work independently and in teams.

Preferred Skills:
  • Experience with containerization and orchestration tools (Docker, Kubernetes).
  • Familiarity with machine learning pipelines and tools.
  • Knowledge of data modeling, data warehousing, and analytics architecture.
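To make the pipeline work above concrete, here is a minimal PySpark ETL sketch of the kind such a role involves: read raw JSON from object storage, deduplicate and clean it, and write a date-partitioned curated table. The bucket paths and column names are illustrative assumptions, not details from the posting.

```python
# Minimal PySpark ETL sketch (illustrative; paths and columns are invented).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw order events from a landing bucket (hypothetical path).
raw = spark.read.json("s3://example-raw-bucket/orders/")

# Deduplicate, derive a date column, and drop invalid rows.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write a curated, date-partitioned table for downstream analytics.
(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-bucket/orders/"))
```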

Posted 2 weeks ago

Apply

3.0 - 7.0 years

10 - 20 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid


Salary: 8 to 24 LPA | Experience: 3 to 7 years | Location: Gurgaon (Hybrid) | Notice: Immediate to 30 days

Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.

Responsibilities:
  • Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
  • Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
  • Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
  • Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
  • Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
  • Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud.
  • Manage scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost.

Must have:
  • Client engagement experience and collaboration with cross-functional teams.
  • Data engineering background in Databricks.
  • Capable of working effectively as an individual contributor or in collaborative team environments.
  • Effective communication and thought leadership with a proven record.

Candidate Profile:
  • Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas.
  • 3+ years of experience in data engineering.
  • Hands-on experience with SQL, Python, Databricks, and cloud platforms like Azure.
  • Prior experience managing and delivering end-to-end projects.
  • Outstanding written and verbal communication skills.
  • Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges.
  • Able to understand cross-cultural differences and work with clients across the globe.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office


In this role, you will play a key part in designing, building, and optimizing scalable data products within the Telecom Analytics domain. You will collaborate with cross-functional teams to implement AI-driven analytics, autonomous operations, and programmable data solutions. This position offers the opportunity to work with cutting-edge Big Data and Cloud technologies, enhance your data engineering expertise, and contribute to advancing Nokia's data-driven telecom strategies. If you are passionate about creating innovative data solutions, mastering cloud and big data platforms, and working in a fast-paced, collaborative environment, this role is for you!

You have:
  • Bachelor's or Master's degree in computer science, data engineering, or a related field, with 8+ years of experience in data engineering focused on Big Data, Cloud, and Telecom Analytics.
  • Hands-on expertise in Ab Initio for data cataloguing, metadata management, and lineage.
  • Skills in data warehousing, OLAP, and modelling using BigQuery, ClickHouse, and SQL.
  • Experience with data persistence technologies like S3, HDFS, and Iceberg.
  • Hands-on experience with Python and scripting languages.

It would be nice if you also had:
  • Experience with data exploration and visualization using Superset or BI tools.
  • Knowledge of ETL processes and streaming tools such as Kafka.
  • Background in building data products for the telecom domain and an understanding of AI and machine learning pipeline integration.

Responsibilities:
  • Data Governance: Manage source data within the Metadata Hub and Data Catalog.
  • ETL Development: Develop and execute data processing graphs using Express It and the Co-Operating System.
  • ETL Optimization: Debug and optimize data processing graphs using the Graphical Development Environment (GDE).
  • API Integration: Leverage Ab Initio APIs for metadata and graph artifact management.
  • CI/CD Implementation: Implement and maintain CI/CD pipelines for metadata and graph deployments.
  • Team Leadership & Mentorship: Mentor team members and foster best practices in Ab Initio development and deployment.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Chennai

Work from Office


Minimum Qualifications:
  • BE or equivalent degree.
  • 4+ years of experience with BigQuery, Python, any RDBMS, and the GCP cloud platform.
  • 8 years of total work experience, with current or previous experience leading a team.
  • 3+ years of experience building data engineering pipelines in Python using libraries.
  • 3+ years of experience building out data pipelines from scratch in a highly distributed and fault-tolerant manner.
  • Comfortable with a broad array of relational and non-relational databases.
  • Proven track record of building applications in a data-focused role (cloud and traditional data warehouse).
  • Inquisitive, proactive, and interested in learning new tools and techniques.
  • Minimum 2+ years of hands-on experience in GCP and scripting.

Preferred Skills:
  • Good to have experience in Infrastructure as Code (Terraform) and CI/CD pipelines.
  • Hands-on experience with 2+ GCP cloud services, including Dataflow, Cloud Functions, Pub/Sub, Airflow, etc.
  • Familiarity with big data and machine learning tools and platforms.
  • Comfortable with open-source technologies including Apache Spark, Hadoop, and Kafka.
  • Should be able to guide the team (tech anchor).
  • Strong oral, written, and interpersonal communication skills.
  • Comfortable working in a dynamic environment where problems are not always well defined.

Skills: Terraform, GCP, Apache Spark, Python, Cloud, Big Data, RDBMS

Posted 2 weeks ago

Apply

5.0 - 8.0 years

17 - 20 Lacs

Pune

Remote


At Codvo, software and people transformations go together. We are a global empathy-led technology services company with a core DNA of product innovation and mature software engineering. We uphold the values of Respect, Fairness, Growth, Agility, and Inclusiveness in everything we do.

About The Role: We are looking for a Data & BI Solution Architect to lead data analytics initiatives in the retail domain. The candidate should be skilled in data modeling, ETL, visualization, and big data technologies.

Responsibilities:
  • Architect end-to-end data and BI solutions for retail analytics.
  • Define data governance, security, and compliance frameworks.
  • Work with stakeholders to design dashboards and reports for business insights.
  • Implement data pipelines and integrate with cloud platforms.

Skills Required:
  • Proficiency in SQL, Python, and Spark.
  • Experience with ETL tools (Informatica, Talend, AWS Glue).
  • Knowledge of Power BI, Tableau, and Looker.
  • Hands-on experience with cloud data platforms (Snowflake, Redshift, BigQuery).

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years or more of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google BigQuery. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable solutions to meet the needs of our clients.

Roles & Responsibilities:
  • Design, build, and configure applications to meet business process and application requirements using Google BigQuery.
  • Collaborate with cross-functional teams to analyze business requirements and develop scalable solutions to meet the needs of our clients.
  • Develop and maintain technical documentation, including design documents, test plans, and user manuals.
  • Ensure the quality of deliverables by conducting thorough testing and debugging of applications.

Professional & Technical Skills:
  • Must-have skills: Proficiency in Google BigQuery.
  • Good-to-have skills: Experience with other cloud-based data warehousing solutions such as Amazon Redshift or Snowflake.
  • Strong understanding of SQL and database design principles.
  • Experience with ETL tools and processes.
  • Experience with programming languages such as Python or Java.

Additional Information:
  • The candidate should have a minimum of 5 years of experience in Google BigQuery.
  • The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
  • This position is based at our Bengaluru office.

Qualifications: 15 years or more of full-time education
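For context on what day-to-day BigQuery development in a role like this looks like, here is a minimal sketch using the official google-cloud-bigquery Python client; the project, dataset, and table names are placeholders, not details from this posting.

```python
# Querying BigQuery with the official Python client (names are placeholders).
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `example-project.sales.orders`  -- hypothetical table
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

# client.query() starts the job; result() waits for it and iterates rows.
for row in client.query(query).result():
    print(row.customer_id, row.total_spend)
```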

Posted 2 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Coimbatore

Work from Office


Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; may also include deep learning, neural networks, chatbots, and image processing.
Must-have skills: Google Cloud Machine Learning Services
Good-to-have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.

Roles & Responsibilities:
  • Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage.
  • Optimize and monitor data workflows for performance, scalability, and reliability.
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions.
  • Implement data security and governance measures, ensuring compliance with industry standards.
  • Automate data workflows and processes for operational efficiency.
  • Troubleshoot and resolve technical issues related to data pipelines and platforms.
  • Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing.

Professional & Technical Skills:
a) Must have:
  • Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
  • Expertise in SQL and experience with data modeling and query optimization.
  • Solid programming skills in Python for data processing and ETL development.
  • Experience with CI/CD pipelines and version control systems (e.g., Git).
  • Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming.
  • Strong understanding of data security, encryption, and IAM policies on GCP.
b) Good to have:
  • Experience with Dialogflow or CCAI tools.
  • Knowledge of machine learning pipelines and integration with AI/ML services on GCP.
  • Certifications such as Google Professional Data Engineer or Google Cloud Architect.

Additional Information:
  • The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services, with 3-5 years of overall experience.
  • The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.

Qualifications: 15 years full-time education
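As a rough illustration of the Dataflow-style pipeline described above, the following Apache Beam (Python SDK) sketch reads messages from Pub/Sub and streams them into BigQuery. The topic, table, and schema are assumptions made up for the example.

```python
# Streaming Pub/Sub -> BigQuery with Apache Beam (illustrative names only).
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # use DataflowRunner in production

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events")  # hypothetical topic
        | "ParseJson" >> beam.Map(json.loads)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",  # hypothetical table
            schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```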

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Chennai

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
  • Expected to perform independently and become an SME.
  • Active participation/contribution in team discussions is required.
  • Contribute to providing solutions for work-related problems.
  • Develop and implement software solutions to meet business requirements.
  • Collaborate with cross-functional teams to ensure successful application deployment.
  • Conduct code reviews and provide technical guidance to junior team members.
  • Stay updated on industry trends and best practices in application development.
  • Assist in troubleshooting and resolving application issues.

Professional & Technical Skills:
  • Must-have skills: Proficiency in Google BigQuery.
  • Strong understanding of cloud computing concepts.
  • Experience with SQL and database management systems.
  • Knowledge of software development lifecycle methodologies.
  • Hands-on experience in application design and development.

Additional Information:
  • The candidate should have a minimum of 3 years of experience in Google BigQuery.
  • This position is based at our Chennai office.
  • 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Nagpur

Work from Office


Project Role: Advanced Application Engineer
Project Role Description: Utilize modular architectures, next-generation integration techniques, and a cloud-first, mobile-first mindset to provide vision to application development teams. Work with an Agile mindset to create value across projects of multiple scopes and scale.
Must-have skills: SAP FI CO Finance
Good-to-have skills: SAP CO Product Cost Controlling
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

About The Role: Sr. SAP S4H FICO Consultant

Job Duties & Responsibilities:
  • In-depth SAP solutions and process knowledge, including industry best practices.
  • Leads fit/gap and other types of working sessions to understand needs driven by business process requirements.
  • Translates requirements into solutions, using SAP Best Practices or Navisite solutions as a baseline.
  • Leads their respective workstream on assigned projects.
  • Works in conjunction with the Navisite Service Delivery Lead to establish the overall plan for their respective work for the customer.
  • SAP configuration experience, primarily in the FI/CO modules: configures SAP CO systems to meet client business requirements, including connection points with SD, PP, MM, and other modules, and implementation of SAP best practices.
  • At least two full lifecycle implementations as an SAP CO functional consultant and a minimum of 5 support projects. S4 HANA experience is a must.
  • Applies strong knowledge of business processes for designing, developing, and testing SAP functions associated with financial operations, including expertise in cost center accounting (CCA), internal order accounting (IOA), product cost controlling (CO-PC), profitability analysis (CO-PA), and profit center accounting (PCA).
  • Focuses on business process re-engineering efforts and technology enablement.
  • Serves as the subject matter expert on product systems, processes, network architecture, and interface capabilities.
  • Should have in-depth understanding and execution skills in the FI and CO sub-modules. SAP FI: FI General Ledger accounting, Accounts Receivable, Accounts Payable, Asset Accounting.
  • Experience in developing specifications for interfaces and custom reports; creates functional specifications for development objects.
  • Conducts unit testing on the overall solution, including technical objects.
  • Supports integration testing and user acceptance testing with the customer.
  • Explores new SAP applications as a subject matter expert and may be a first adopter of emerging SAP technologies.
  • Supports Navisite Application Managed Services (AMS) by working and resolving tickets as assigned.
  • Sustains adequate product knowledge through formal training, webinars, SAP publications, collaboration among colleagues, and self-study.
  • Enforces the core competencies and professional standards of Navisite in all client engagements.
  • Supports internal projects as assigned; collaborates with colleagues to grow product knowledge; assists in the continual improvement of Navisite methods and tools; adheres to Navisite professional standards.
  • Willing to travel as per business needs.

Key Competencies: Customer Focus, Results Driven, Business Acumen, Trusted Advisor, Task Management, Problem-Solving Skills, Communication Skills, Priority Setting, Presentation Skills, Mentorship and Collaboration. Ability to work regularly scheduled shifts, with after-hours coverage for critical issues as needed.

Qualifications: 15 years full-time education

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office


Project Role: Application Tech Support Practitioner
Project Role Description: Act as the ongoing interface between the client and the system or application. Dedicated to quality, using exceptional communication skills to keep our world-class systems running. Can accurately define a client issue and can interpret and design a resolution based on deep product knowledge.
Must-have skills: Microsoft 365, Microsoft 365 Security & Compliance
Good-to-have skills: Microsoft PowerShell
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Tech Support Practitioner, you will act as the ongoing interface between the client and the system or application. You will be dedicated to quality, using exceptional communication skills to keep our world-class systems running. With your deep product knowledge, you will accurately define client issues and interpret and design resolutions.

Roles & Responsibilities:
  • Expected to perform independently and become an SME.
  • Active participation/contribution in team discussions is required.
  • Contribute to providing solutions for work-related problems.
  • Assist clients in troubleshooting and resolving technical issues.
  • Collaborate with cross-functional teams to ensure smooth system operations.
  • Provide technical support and guidance to clients.
  • Identify and analyze system or application issues and propose solutions.
  • Document and maintain accurate records of client interactions and issue resolutions.

Professional & Technical Skills:
  • Must-have skills: Proficiency in Microsoft 365, Microsoft 365 Security & Compliance.
  • Good-to-have skills: Experience with Microsoft PowerShell.
  • Strong understanding of Microsoft 365 security and compliance features.
  • Knowledge of Microsoft 365 administration and configuration.
  • Familiarity with troubleshooting Microsoft 365 applications and services.
  • Excellent problem-solving and analytical skills.
  • Effective communication and interpersonal skills.

Additional Information:
  • The candidate should have a minimum of 3 years of experience in Microsoft 365.
  • This position is based at our Bengaluru office.
  • 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: No Function Specialty
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable solutions using Google BigQuery. Your typical day will involve collaborating with the team, analyzing business requirements, designing and implementing application features, and ensuring the applications meet quality standards and performance goals.

Roles & Responsibilities:
  1. Design, create, code, and support a variety of data pipelines and models on GCP cloud technology.
  2. Strong hands-on exposure to GCP services like BigQuery, Composer, etc.
  3. Partner with business/data analysts, architects, and other key project stakeholders to deliver data requirements.
  4. Develop data integration and ETL (Extract, Transform, Load) processes.
  5. Support existing data warehouses and related pipelines.
  6. Ensure data quality, security, and compliance.
  7. Optimize data processing and storage efficiency; troubleshoot issues in the data space.
  8. Seek to learn new skills/tools utilized in the data space (e.g., dbt, Monte Carlo, etc.).
  9. Excellent communication skills, verbal and written; excellent analytical skills with an Agile mindset.
  10. Strong attention to detail and delivery accuracy.
  11. Self-motivated team player with the ability to overcome challenges and achieve desired results.
  12. Work effectively in a globally distributed environment.

Professional & Technical Skills:
  • Skill proficiency expectation: expert in data storage, BigQuery, SQL, Composer, and data warehousing concepts; intermediate in Python; basic/preferred in DB, Kafka, and Pub/Sub.
  • Must-have skills: Proficiency in Google BigQuery.
  • Strong understanding of statistical analysis and machine learning algorithms.
  • Experience with data visualization tools such as Tableau or Power BI.
  • Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
  • Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
  • The candidate should have a minimum of 5 years of experience in Google BigQuery.
  • This position is based at our Hyderabad office.
  • 15 years of full-time education is required.

Qualifications: 15 years full-time education

Posted 2 weeks ago

Apply

6.0 - 9.0 years

12 - 16 Lacs

Bengaluru

Work from Office


Overview: We are looking for an experienced GCP BigQuery Lead to architect, develop, and optimize data solutions on Google Cloud Platform, with a strong focus on BigQuery. The role involves leading warehouse setup initiatives, collaborating with stakeholders, and ensuring scalable, secure, and high-performance data infrastructure.

Responsibilities:
  • Lead the design and implementation of data pipelines using BigQuery, Datorama, Dataflow, and other GCP services.
  • Architect and optimize data models and schemas to support analytics and reporting use cases.
  • Implement best practices for performance tuning, partitioning, and cost optimization in BigQuery.
  • Collaborate with business stakeholders to translate requirements into scalable data solutions.
  • Ensure data quality, governance, and security across all BigQuery data assets.
  • Automate workflows using orchestration tools.
  • Mentor junior resources and lead script reviews, documentation, and knowledge sharing.

Qualifications:
  • 6+ years of experience in data analytics, with 3+ years on GCP and BigQuery.
  • Strong proficiency in SQL, with experience writing complex queries and optimizing performance.
  • Hands-on experience with ETL/ELT tools and frameworks.
  • Deep understanding of data warehousing, dimensional modeling, and data lake architectures.
  • Good exposure to data governance, lineage, and metadata management.
  • GCP Data Engineer certification is a plus.
  • Experience with BI tools (e.g., Looker, Power BI).
  • Good communication and team-lead skills.
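One concrete instance of the cost-optimization practice this role calls for: cap the bytes a query may scan, and filter on the partition column so BigQuery can prune partitions. This is a minimal sketch with the google-cloud-bigquery client; the project and table names are invented for illustration.

```python
# Guarding query cost in BigQuery (illustrative names; sketch only).
from google.cloud import bigquery

client = bigquery.Client()

# Fail fast if a query would scan more than 10 GB.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)

sql = """
    SELECT user_id, COUNT(*) AS sessions
    FROM `example-project.analytics.events`  -- hypothetical partitioned table
    WHERE event_date = '2024-01-01'          -- filter on the partition column
    GROUP BY user_id
"""

job = client.query(sql, job_config=job_config)
job.result()  # wait for completion
print(f"Bytes processed: {job.total_bytes_processed}")
```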

Posted 2 weeks ago

Apply

6.0 - 8.0 years

10 - 12 Lacs

Pune

Remote


Job Responsibilities:
  • Design and Develop Dashboards: Create visually appealing and interactive dashboards that help users quickly grasp critical insights from data.
  • Data Integration: Connect Tableau to various data sources (Google BigQuery, Azure Synapse, etc.), ensuring data accuracy and integrity.
  • Performance Optimization: Improve load times and responsiveness of Tableau dashboards and reports.
  • Data Analysis: Analyse and interpret data to create meaningful visualizations.
  • Collaboration: Work closely with stakeholders to understand and translate business requirements into functional specifications.

Skills:
  • Proficiency in Tableau: Strong understanding of Tableau Desktop and Tableau Server and their respective functionalities. Minimum 5 years of experience in Tableau.
  • JavaScript: Must have experience customising Tableau dashboards (handling of events, filters, etc.) using the Tableau Embedding API in JavaScript.
  • Technical Skills: Knowledge of SQL and experience connecting to a data warehouse.
  • Cloud: Experience working in a cloud-based environment.
  • Communication: Excellent communication skills to effectively collaborate with stakeholders and present data insights.

Education: Bachelor's degree in computer science, information systems, or a related field (or equivalent experience).

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Open roles:
  • GCP Looker Developer
  • GCP Business Data Analyst
  • Data Engineer - GCP - BigQuery
  • GCP Application Developer

Required candidate profile: Good knowledge of the respective area (GCP Looker development, GCP business data analysis, GCP BigQuery data engineering, or GCP application development).

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Chennai, Bengaluru

Work from Office


We are looking for a Senior GCP Data Engineer / GCP Technical Lead with strong expertise in Google Cloud Platform (GCP), Apache Spark, and Python to join our growing data engineering team. The ideal candidate will have extensive experience working with GCP data services and should be capable of leading technical teams, designing robust data pipelines, and interacting directly with clients to gather requirements and ensure project delivery.

Project Duration: 1 year, extendable

Role & Responsibilities:
  • Design, develop, and deploy scalable data pipelines and solutions using GCP services like Dataproc and BigQuery.
  • Lead and mentor a team of data engineers to ensure high-quality deliverables.
  • Collaborate with cross-functional teams and client stakeholders to define technical requirements and deliver solutions aligned with business goals.
  • Optimize data processing and transformation workflows for performance and cost-efficiency.
  • Ensure adherence to best practices in cloud data architecture, data security, and governance.

Mandatory Skills:
  • Google Cloud Platform (GCP), especially Dataproc and BigQuery
  • Apache Spark
  • Python programming

Preferred Skills:
  • Experience working with large-scale data processing frameworks.
  • Exposure to DevOps/CI-CD practices in a cloud environment.
  • Hands-on experience with other GCP tools like Cloud Composer, Pub/Sub, or Cloud Storage is a plus.

Soft Skills:
  • Strong communication and client interaction skills.
  • Ability to work independently and as part of a distributed team.
  • Excellent problem-solving and team management capabilities.
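A hedged sketch of the Dataproc-plus-BigQuery pattern this role centres on: a PySpark job that reads a BigQuery table, aggregates it with Spark, and writes the result back. It assumes the spark-bigquery connector is available on the cluster; all project, table, and bucket names are illustrative.

```python
# PySpark on Dataproc reading/writing BigQuery via the spark-bigquery
# connector (assumed installed on the cluster; names are illustrative).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bq-transform").getOrCreate()

orders = (spark.read.format("bigquery")
          .option("table", "example-project.sales.orders")  # hypothetical
          .load())

daily = (orders
         .groupBy(F.to_date("order_ts").alias("day"))
         .agg(F.sum("amount").alias("revenue")))

(daily.write.format("bigquery")
      .option("table", "example-project.sales.daily_revenue")
      .option("temporaryGcsBucket", "example-staging-bucket")  # hypothetical
      .mode("overwrite")
      .save())
```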

Posted 2 weeks ago

Apply

5.0 - 8.0 years

1 - 1 Lacs

Bengaluru

Remote


Note: BigQuery and JDK experience are must-haves for this role.

Scope of the project:
  • Strong proficiency in Java (including reading and writing complex code)
  • Experience with Spring Framework and MVC architecture
  • Proficiency in Maven, Tomcat, and Java 17

Posted 2 weeks ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Skill required: Data Management - Google BigQuery
Designation: Data Eng, Mgmt & Governance Sr Analyst
Qualifications: BE/BTech
Years of Experience: 5 to 8 years

About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. Visit us at www.accenture.com.

What would you do: Data & AI: a cloud-based big data analytics web service for processing large read-only data sets, designed for analyzing data on the order of large row counts using a SQL-like syntax.

What are we looking for:
  • GCP BigQuery
  • Data modeling and warehousing
  • Structured Query Language (SQL)
  • ETL (Matillion/SSIS/Alteryx)
  • Adaptable and flexible
  • Ability to work well in a team
  • Commitment to quality
  • Agility for quick learning
  • Strong analytical skills

Roles and Responsibilities:
  • In this role you are required to analyze and solve increasingly complex problems.
  • Your day-to-day interactions are with peers within Accenture.
  • You are likely to have some interaction with clients and/or Accenture management.
  • You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.
  • Decisions that you make impact your own work and may impact the work of others.
  • In this role you would be an individual contributor and/or oversee a small work effort and/or team.

Qualification: BE, BTech

Posted 2 weeks ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Kolkata

Work from Office


Project Role: Application Support Engineer
Project Role Description: Act as a software detective; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: SAP Ariba
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your typical day will involve collaborating with various teams to troubleshoot and resolve software-related challenges, ensuring that business operations run smoothly and efficiently. You will engage in problem-solving activities, analyze system performance, and contribute to the continuous improvement of application support processes, all while maintaining a focus on delivering exceptional service to stakeholders.

Roles & Responsibilities:
  • Expected to be an SME.
  • Collaborate with and manage the team to perform.
  • Responsible for team decisions.
  • Engage with multiple teams and contribute to key decisions.
  • Provide solutions to problems for the immediate team and across multiple teams.
  • Facilitate knowledge-sharing sessions to enhance team capabilities.
  • Monitor system performance and proactively address potential issues.

Professional & Technical Skills:
  • Must-have skills: Proficiency in SAP Ariba.
  • Strong understanding of application support processes and methodologies.
  • Experience with troubleshooting and resolving software issues.
  • Familiarity with system integration and data flow management.
  • Ability to work collaboratively in a team-oriented environment.

Additional Information:
  • The candidate should have a minimum of 5 years of experience in SAP Ariba.
  • This position is based at our Kolkata office.
  • 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 22 Lacs

Chennai, Bengaluru

Work from Office


  • Minimum 4+ years of experience implementing data migration programs from Hadoop (with Java and Spark) to GCP BigQuery and Dataproc.
  • Minimum 4+ years of experience integrating GitHub Actions plugins into a CI/CD platform to ensure software quality.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google Cloud Platform Architecture
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work on developing innovative solutions to enhance user experience and streamline processes.

Roles & Responsibilities:
  • Expected to perform independently and become an SME.
  • Active participation/contribution in team discussions is required.
  • Contribute to providing solutions for work-related problems.
  • Collaborate with cross-functional teams to design and develop applications.
  • Implement best practices for application development.
  • Troubleshoot and debug applications to ensure optimal performance.
  • Stay updated with the latest technologies and trends in application development.
  • Provide technical guidance and mentorship to junior team members.

Professional & Technical Skills:
  • Must-have skills: Proficiency in Google Cloud Platform Architecture.
  • Strong understanding of cloud computing principles.
  • Experience with designing scalable and secure cloud-based applications.
  • Hands-on experience with Google Cloud services such as Compute Engine, BigQuery, and Cloud Storage.
  • Knowledge of DevOps practices for continuous integration and deployment.

Additional Information:
  • The candidate should have a minimum of 3 years of experience in Google Cloud Platform Architecture.
  • This position is based at our Bengaluru office.
  • 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the defined requirements effectively.

Roles & Responsibilities:
  • Expected to perform independently and become an SME.
  • Active participation/contribution in team discussions is required.
  • Contribute to providing solutions for work-related problems.
  • Collaborate with stakeholders to gather and analyze requirements for application design.
  • Participate in the testing and validation of applications to ensure they meet business needs.

Professional & Technical Skills:
  • Must-have skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
  • Good to have: SAP ABAP, CDS views.
  • Strong understanding of data modeling concepts and best practices.
  • Experience with application design methodologies and tools.
  • Ability to analyze and interpret complex business requirements.
  • Familiarity with integration techniques and data flow management.

Additional Information:
  • The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development.
  • This position is based at our Bengaluru office.
  • 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 2 weeks ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Hyderabad

Remote


Job Description: We are seeking experienced Python developers with hands-on expertise in Google Cloud Vertex AI. The ideal candidate will have a strong background in machine learning model development, deployment pipelines, and cloud-native applications.

Key Skills:
  • Advanced proficiency in Python
  • Experience with Vertex AI (training, deployment, pipelines, model registry)
  • Familiarity with Google Cloud Platform (GCP) services like BigQuery, Cloud Functions, AI Platform
  • Understanding of the ML lifecycle, including data preprocessing, training, evaluation, and monitoring
  • CI/CD experience with ML workflows (e.g., Kubeflow, TFX, or Vertex Pipelines)

Preferred:
  • Experience integrating Vertex AI with dbt, Airflow, or Looker
  • Exposure to MLOps and model governance
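As a rough sketch of the Vertex AI workflow named in the key skills, the google-cloud-aiplatform SDK can run a custom training job and deploy the resulting model. Everything here (project, training script, container images, machine type) is an assumption for illustration, not a requirement from the posting.

```python
# Vertex AI custom training + deployment sketch (all names are assumptions).
from google.cloud import aiplatform

aiplatform.init(project="example-project", location="us-central1")

job = aiplatform.CustomTrainingJob(
    display_name="churn-train",
    script_path="train.py",  # hypothetical training script
    container_uri="us-docker.pkg.dev/vertex-ai/training/sklearn-cpu.1-0:latest",
    model_serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
)

# Train, register the model, then deploy it behind an endpoint.
model = job.run(model_display_name="churn-model", replica_count=1)
endpoint = model.deploy(machine_type="n1-standard-2")
print(endpoint.resource_name)
```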

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office


Position Description: The candidate will have a strong background in .NET Core and Angular and possess experience working within the media domain. Additionally, experience with TM Forum Open APIs and GCP skills would be a significant advantage.

Responsibilities:
  • Design, develop, and maintain high-quality .NET Core applications using best practices and industry standards.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Leverage the Angular framework to build robust and user-friendly web interfaces.
  • Integrate with TM Forum Open APIs to facilitate interoperability with other systems.
  • Utilize GCP services and technologies to optimize application performance and scalability.
  • Provide technical guidance and mentorship to junior team members.
  • Stay updated on the latest .NET Core, Angular, and media adtech trends and technologies.

Required Skills and Experience:
  • Strong proficiency in .NET Core and the C# programming language.
  • In-depth knowledge of the Angular framework and its ecosystem.
  • Experience working with media adtech platforms or related domains.
  • Understanding of TM Forum Open APIs and their applications.
  • Proficiency in GCP services and technologies (e.g., Cloud Functions, App Engine, BigQuery).
  • Excellent problem-solving and debugging skills.
  • Strong communication and collaboration skills.
  • Ability to work independently and as part of a team.
  • Experience with version-control tools (GitLab, TFS).

Preferred Skills and Experience:
  • Experience with containerization technologies (e.g., Docker, Kubernetes).
  • Knowledge of cloud-native development practices.
  • Experience with microservices architecture.
  • Familiarity with Agile methodologies (e.g., Scrum, Kanban).

Skills: Angular, .NET, .NET Remoting, .NET Reporting, SQLite, Telecommunications

Posted 2 weeks ago

Apply

7.0 - 9.0 years

13 - 17 Lacs

Chennai

Work from Office


Key Responsibilities:
  • Design and implement scalable and efficient full-stack solutions using Java and cloud technologies.
  • Develop and maintain cloud-based solutions on Google Cloud Platform (GCP), utilizing services like BigQuery, Astronomer, Terraform, Airflow, and Dataflow.
  • Architect and implement complex data engineering solutions using GCP services.
  • Collaborate with cross-functional teams to develop, deploy, and optimize cloud-based applications.
  • Utilize Python for data engineering and automation tasks within the cloud environment.
  • Ensure alignment with GCP architecture best practices and contribute to the design of high-performance systems.
  • Lead and mentor junior developers, fostering a culture of learning and continuous improvement.

Required Skills:
  • Full-Stack Development (7+ years): Strong expertise in full-stack Java development, with experience building and maintaining complex web applications.
  • Google Cloud Platform (GCP): Hands-on experience with GCP services like BigQuery, Astronomer, Terraform, Airflow, and Dataflow, and with GCP architecture.
  • Python: Proficiency in Python for automation and data engineering tasks.
  • Cloud Architecture: Solid understanding of GCP architecture principles and best practices.
  • Strong problem-solving skills and the ability to work in a dynamic, fast-paced environment.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Nagpur

Work from Office


Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.

Posted 2 weeks ago

Apply

Exploring BigQuery Jobs in India

BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.

Related Skills

Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.

Interview Questions

  • What is BigQuery and how does it differ from traditional databases? (basic)
  • How can you optimize query performance in BigQuery? (medium)
  • Explain the concepts of partitions and clustering in BigQuery (see the worked sketch after this list). (medium)
  • What are some best practices for designing schemas in BigQuery? (medium)
  • How does BigQuery handle data encryption at rest and in transit? (advanced)
  • Can you explain how BigQuery pricing works? (basic)
  • What are the limitations of BigQuery in terms of data size and query complexity? (medium)
  • How can you schedule and automate tasks in BigQuery? (medium)
  • Describe your experience with BigQuery ML and its applications. (advanced)
  • How does BigQuery handle nested and repeated fields in a schema? (basic)
  • Explain the concept of slots in BigQuery and how they impact query processing. (medium)
  • What are some common use cases for BigQuery in real-world scenarios? (basic)
  • How does BigQuery handle data ingestion from various sources? (medium)
  • Describe your experience with BigQuery scripting and stored procedures. (medium)
  • What are the benefits of using BigQuery over traditional on-premises data warehouses? (basic)
  • How do you troubleshoot and optimize slow-running queries in BigQuery? (medium)
  • Can you explain the concept of streaming inserts in BigQuery? (medium)
  • How does BigQuery handle data security and access control? (advanced)
  • Describe your experience with BigQuery Data Transfer Service. (medium)
  • What are the differences between BigQuery and other cloud-based data warehousing solutions? (basic)
  • How do you handle data versioning and backups in BigQuery? (medium)
  • Explain how you would design a data pipeline using BigQuery and other GCP services. (advanced)
  • What are some common challenges you have faced while working with BigQuery and how did you overcome them? (medium)
  • How do you monitor and optimize costs in BigQuery? (medium)
  • Can you walk us through a recent project where you used BigQuery to derive valuable insights from data? (advanced)
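For the partitioning and clustering question above, a worked sketch helps: the DDL below creates a table partitioned by event date and clustered by the columns most often filtered on, executed through the google-cloud-bigquery Python client. The dataset and column names are invented for illustration.

```python
# Creating a partitioned, clustered BigQuery table (illustrative names).
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE TABLE `example-project.analytics.events_partitioned`
(
  event_id STRING,
  user_id  STRING,
  event_ts TIMESTAMP,
  country  STRING
)
PARTITION BY DATE(event_ts)   -- date-filtered queries scan fewer bytes
CLUSTER BY user_id, country   -- co-locates rows for selective filters
"""

client.query(ddl).result()
```

Partitioning limits which data blocks a date-filtered query scans (and therefore what it costs), while clustering sorts rows within each partition so filters on the clustered columns read less data.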

Closing Remark

As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!
