
962 BigQuery Jobs - Page 8

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and contributing to key decisions.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the development and implementation of new features
- Conduct code reviews and ensure coding standards are met

Professional & Technical Skills:
- Must-have: Proficiency in Google BigQuery
- Strong understanding of data modeling and database design
- Experience with cloud-based data warehousing solutions
- Hands-on experience with SQL and query optimization
- Knowledge of ETL processes and data integration

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions and collaborating with various teams to ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Lead the application development process
- Implement best practices for application design and development
- Conduct code reviews and ensure code quality standards are met

Professional & Technical Skills:
- Must-have: Proficiency in Google BigQuery
- Strong understanding of cloud computing concepts
- Experience in developing scalable and efficient applications
- Knowledge of database management systems
- Hands-on experience with data modeling and optimization techniques

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google BigQuery
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Gurugram

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: MySQL, Python (Programming Language), Google BigQuery
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by delivering high-quality applications that align with business objectives.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Mentor junior team members to enhance their skills and knowledge
- Continuously evaluate and improve application performance and user experience

Professional & Technical Skills:
- Must-have: Proficiency in Apache Spark
- Good to have: Experience with Python (Programming Language), MySQL, Google BigQuery
- Strong understanding of distributed computing principles and frameworks
- Experience in developing data processing pipelines and ETL processes
- Familiarity with cloud platforms and services related to big data processing

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Apache Spark
- This position is based at our Gurugram office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Navi Mumbai

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions, providing insights and solutions to enhance application performance and user experience. Your role will require you to stay updated with the latest technologies and methodologies to ensure the applications are built using best practices, ultimately contributing to the success of the projects you oversee.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation in and contribution to team discussions
- Contribute to providing solutions to work-related problems
- Facilitate knowledge-sharing sessions to enhance team capabilities
- Mentor junior professionals to foster their growth and development

Professional & Technical Skills:
- Must-have: Proficiency in Google BigQuery
- Strong understanding of data warehousing concepts and architecture
- Experience with SQL and data manipulation techniques
- Familiarity with cloud computing platforms and services
- Ability to design and implement ETL processes for data integration

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery
- This position is based at our Mumbai office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: Python (Programming Language), Apache Spark, Google BigQuery
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by delivering high-quality applications that enhance operational efficiency and user experience.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Mentor junior team members to enhance their skills and knowledge
- Continuously evaluate and improve application performance and user experience

Professional & Technical Skills:
- Must-have: Proficiency in PySpark
- Good to have: Experience with Apache Spark, Python (Programming Language), Google BigQuery
- Strong understanding of data processing frameworks and distributed computing
- Experience in developing and deploying scalable applications
- Familiarity with cloud platforms and services

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in PySpark
- This position is based in Chennai
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the team in implementing new technologies
- Conduct regular code reviews to ensure quality standards are met

Professional & Technical Skills:
- Must-have: Proficiency in Google BigQuery
- Good to have: Experience with Oracle Procedural Language Extensions to SQL (PL/SQL)
- Strong understanding of data warehousing concepts
- Experience in optimizing query performance on large datasets
- Knowledge of ETL processes and data modeling

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google BigQuery
- This position is based at our Mumbai office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation in and contribution to team discussions
- Contribute to providing solutions to work-related problems
- Assist in the documentation of application processes and workflows
- Engage in code reviews to ensure quality and adherence to best practices

Professional & Technical Skills:
- Must-have: Proficiency in Google BigQuery
- Strong understanding of data warehousing concepts and architecture
- Experience with SQL and database management
- Familiarity with application development frameworks and methodologies
- Ability to troubleshoot and optimize application performance

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery
- This position is based at our Mumbai office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Mumbai. You will play a crucial role in developing innovative solutions to drive business success.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the application development process
- Implement best practices for application design and development
- Conduct code reviews and ensure code quality

Professional & Technical Skills:
- Must-have: Proficiency in Google BigQuery
- Strong understanding of cloud-based data warehousing solutions
- Experience in designing and optimizing large-scale data pipelines
- Hands-on experience with SQL and data modeling
- Knowledge of data security and compliance standards

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google BigQuery
- This position is based at our Mumbai office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements in Mumbai. You will play a crucial role in developing innovative solutions to address business needs and enhance user experience.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the application development process
- Conduct code reviews and ensure coding standards are met
- Implement best practices for application security

Professional & Technical Skills:
- Must-have: Proficiency in Google BigQuery
- Strong understanding of data analytics and data modeling
- Experience with cloud-based data warehousing solutions
- Hands-on experience in SQL and database management
- Knowledge of ETL processes and data integration

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery
- This position is based at our Mumbai office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: MySQL, Python (Programming Language), Google BigQuery
Minimum 12 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with various stakeholders to gather requirements, developing application features, and ensuring that the applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking opportunities for improvement and innovation in application design and functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Facilitate knowledge-sharing sessions to enhance team capabilities
- Monitor project progress and ensure alignment with business objectives

Professional & Technical Skills:
- Must-have: Proficiency in Apache Spark
- Good to have: Experience with MySQL, Python (Programming Language), Google BigQuery
- Strong understanding of distributed computing principles
- Experience with data processing frameworks and tools
- Familiarity with cloud platforms and services

Additional Information:
- The candidate should have a minimum of 12 years of experience in Apache Spark
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!

IBM's Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. It encompasses Cloud Advisory, Architecture, Cloud Native Development, Application Portfolio Migration, Modernization, and Rationalization, as well as Cloud Operations. Cloud Services supports all public/private/hybrid cloud deployments: IBM Bluemix/IBM Cloud/Red Hat/AWS/Azure/Google and client private environments. Cloud Services has the best Cloud developer, architect, complex SI, Sys Ops, and delivery talent, delivered through our GEO CIC Factory model.

As a member of our Cloud Practice, you will be responsible for defining and implementing application cloud migration, modernisation, and rationalisation solutions for clients across all sectors. You will support mobilisation and help to lead the quality of our programmes and services, liaise with clients, and provide consulting services, including:
- Create cloud migration strategies: defining delivery architecture, creating migration plans, designing orchestration plans, and more.
- Assist in creating and executing migration run books.
- Evaluate source (physical, virtual, and cloud) and target workloads.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Cloud data engineer with GCP PDE certification and working experience with GCP.
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions.
- Experience in logging and monitoring of GCP services; experience in Terraform and infrastructure automation.
- Expertise in the Python coding language.
- Develops, supports, and maintains data engineering solutions on the Google Cloud ecosystem.

Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools.
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices.
- Troubleshoot and debug issues, and deploy applications to the cloud platform.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

10 - 20 Lacs

Kolkata, Gurugram, Bengaluru

Work from Office

Job Opportunity for GCP Data Engineer

Role: Data Engineer
Location: Gurugram / Bangalore / Kolkata (5 days work from office)
Experience: 4+ years

Key Skills:
- Data Analysis / Data Preparation - Expert
- Dataset Creation / Data Visualization - Expert
- Data Quality Management - Advanced
- Data Engineering - Advanced
- Programming / Scripting - Intermediate
- Data Storytelling - Intermediate
- Business Analysis / Requirements Analysis - Intermediate
- Data Dashboards - Foundation
- Business Intelligence Reporting - Foundation
- Database Systems - Foundation
- Agile Methodologies / Decision Support - Foundation

Technical Skills:
- Cloud - GCP - Expert
- Database systems (SQL and NoSQL / BigQuery / DBMS) - Expert
- Data warehousing solutions - Advanced
- ETL tools - Advanced
- Data APIs - Advanced
- Python, Java, Scala, etc. - Intermediate
- Basic understanding of distributed systems - Foundation
- Knowledge of algorithms and optimal data structures for analytics - Foundation
- Soft skills and time management skills - Foundation

Posted 2 weeks ago

Apply

7.0 - 10.0 years

15 - 20 Lacs

Chennai

Work from Office

Job Title: Data Architect / Engagement Lead
Location: Chennai
Reports To: CEO

About the Company: Ignitho Inc. is a leading AI and data engineering company with a global presence, including offices in the US, UK, India, and Costa Rica. Visit our website to learn more about our work and culture: www.ignitho.com. Ignitho is a portfolio company of Nuivio Ventures Inc., a venture builder dedicated to developing Enterprise AI product companies across various domains, including AI, Data Engineering, and IoT. Learn more about Nuivio at: www.nuivio.com.

Job Summary: As the Data Architect and Engagement Lead, you will define the data architecture strategy and lead client engagements, ensuring alignment between data solutions and business goals. This dual role blends technical leadership with client-facing responsibilities.

Key Responsibilities:
- Design scalable data architectures, including storage, processing, and integration layers.
- Lead technical discovery and requirements-gathering sessions with clients.
- Provide architectural oversight for data and AI solutions.
- Act as a liaison between technical teams and business stakeholders.
- Define data governance, security, and compliance standards.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or similar.
- 7+ years of experience in data architecture, with client-facing experience.
- Deep knowledge of data modelling, cloud data platforms (Snowflake / BigQuery / Redshift / Azure), and orchestration tools.
- Excellent communication, stakeholder management, and technical leadership skills.
- Familiarity with AI/ML systems and their data requirements is a strong plus.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

15 - 25 Lacs

Kolkata

Work from Office

Job Summary

We are seeking a skilled Cloud Engineer with 2 to 5 years of experience to join our dynamic team. The ideal candidate will have expertise in Operations Suite, Container Analysis, Cloud Deploy, Artifact Registry, Cloud Build, Cloud Run, and Google Kubernetes Engine. Experience in the food services or airlines domain is a plus. This is a hybrid work model with day shifts and no travel required.

Responsibilities:
- Design, implement, and manage cloud infrastructure using Google Cloud Platform services.
- Oversee the deployment and maintenance of applications on Google Kubernetes Engine.
- Provide support for container analysis and ensure security compliance.
- Utilize Cloud Build and Cloud Deploy for continuous integration and continuous delivery pipelines.
- Manage and maintain Artifact Registry for efficient storage and retrieval of container images.
- Implement and manage serverless applications using Cloud Run.
- Monitor and optimize cloud resources to ensure high availability and performance.
- Collaborate with cross-functional teams to understand and address their cloud infrastructure needs.
- Troubleshoot and resolve issues related to cloud infrastructure and services.
- Develop and maintain documentation for cloud infrastructure and processes.
- Stay updated with the latest cloud technologies and best practices.
- Contribute to the improvement of cloud infrastructure and deployment processes.
- Ensure that cloud infrastructure aligns with company goals and objectives.

Qualifications:
- Strong understanding of Google Cloud Platform services.
- Hands-on experience with Operations Suite, Container Analysis, Cloud Deploy, Artifact Registry, Cloud Build, Cloud Run, and Google Kubernetes Engine.
- Demonstrated ability to design and implement scalable and secure cloud infrastructure.
- Proficiency in troubleshooting and resolving cloud infrastructure issues.
- Excellent collaboration and communication skills.
- Experience in the food services or airlines domain (nice to have).
- A proactive approach to learning and implementing new cloud technologies.

Certifications Required:
- Google Cloud Professional Cloud Architect
- Google Cloud Professional DevOps Engineer

Posted 2 weeks ago

Apply

4.0 - 8.0 years

5 - 9 Lacs

Hyderabad, Bengaluru

Work from Office

What's in it for you?
- Pay above market standards
- The role is contract-based, with project timelines of 2-12 months, or freelancing
- Be part of an elite community of professionals who can solve complex AI challenges

Work location could be:
- Remote (highly likely)
- Onsite at the client location
- Deccan AI's office: Hyderabad or Bangalore

Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools
- Develop real-time and batch data pipelines to support analytics and machine learning
- Define and enforce data governance strategies to ensure security, integrity, and compliance, and optimize data pipelines for high performance, scalability, and cost efficiency in cloud environments
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices

Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP)
- Proficient in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA)
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana)

Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions
- Contributions to open-source data engineering communities

What are the next steps?
Register on our Soul AI website.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Noida

Remote

Role & Responsibilities

As a Data Engineer with a focus on pipeline migration from SAS to Google Cloud Platform (GCP) technologies, you will tackle intricate problems and create value for our business by designing and deploying reliable, scalable solutions tailored to the company's data landscape. You will be responsible for the development of custom-built data pipelines on the GCP stack, ensuring seamless migration of existing SAS pipelines.

Responsibilities:
- Design, develop, and implement data pipelines on the GCP stack, with a focus on migrating existing pipelines from SAS to GCP technologies.
- Develop modular and reusable code to support complex ingestion frameworks, simplifying the process of loading data into data lakes or data warehouses from multiple sources.
- Collaborate with analysts and business process owners to translate business requirements into technical solutions.
- Utilize your coding expertise in scripting languages (Python, SQL, PySpark) to extract, manipulate, and process data effectively.
- Leverage your expertise in various GCP technologies, including BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, and Vertex AI, to enhance data warehousing solutions.
- Maintain high standards of development practice, including technical design, solution development, systems configuration, testing, documentation, issue identification and resolution, writing clean, modular, and sustainable code.
- Understand and implement CI/CD processes using tools like Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
- Participate in data quality and validation processes to ensure data integrity and reliability.
- Optimize performance of data pipelines and storage solutions, addressing bottlenecks.
- Collaborate with security teams to ensure compliance with industry standards for data security and governance.
- Communicate technical solutions to engineering teams and business stakeholders.

Required Skills & Qualifications:
- 5-13 years of experience in software development, data engineering, business intelligence, or a related field, with a proven track record in manipulating, processing, and extracting value from large datasets.
- Extensive experience with GCP technologies in the data warehousing space, including BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, and Vertex AI.
- Proficient in Python, SQL, and PySpark for data manipulation and pipeline creation.
- Experience with SAS, SQL Server, and SSIS is a significant advantage, particularly for transitioning legacy systems to modern GCP solutions.
- Ability to develop reusable, modular code for complex ingestion frameworks and multi-use pipelines.
- Understanding of CI/CD processes and tools, such as Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
- Proven experience in migrating data pipelines from SAS to GCP technologies.
- Strong problem-solving abilities and a proactive approach to identifying and implementing solutions.
- Familiarity with industry best practices for data security, data governance, and compliance in cloud environments.
- Bachelor's degree in Computer Science, Information Technology, or a related technical field, or equivalent practical experience.
- GCP Certified Data Engineer (preferred).
- Excellent verbal and written communication skills, with the ability to advocate for technical solutions to a diverse audience, including engineering teams and business stakeholders.
- Willingness to work the afternoon shift, from 3 PM to 12 AM IST.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

EPAM has a presence across 40+ countries globally, with 55,000+ professionals and numerous delivery centers. Key locations are North America, Eastern Europe, Central Europe, Western Europe, APAC, and the Middle East, with development centers in India (Hyderabad, Pune & Bangalore).

Location: Gurgaon / Pune / Hyderabad / Bengaluru / Chennai
Work Mode: Hybrid (2-3 days in office per week)

Job Description:
- 5-14 years of experience in Big Data and data-related technologies
- Expert-level understanding of distributed computing principles
- Expert-level knowledge of and experience in Apache Spark
- Hands-on programming with Python
- Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop
- Experience with building stream-processing systems, using technologies such as Apache Storm or Spark Streaming
- Good understanding of Big Data querying tools, such as Hive and Impala
- Experience with integration of data from multiple data sources such as RDBMS (SQL Server, Oracle), ERP, and files
- Good understanding of SQL queries, joins, stored procedures, relational schemas
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
- Knowledge of ETL techniques and frameworks
- Performance tuning of Spark jobs
- Experience with native cloud data services (AWS/Azure/GCP)
- Ability to lead a team efficiently
- Experience with designing and implementing Big Data solutions
- Practitioner of Agile methodology

WE OFFER:
- Opportunity to work on technical challenges that may have impact across geographies
- Vast opportunities for self-development: online university, global knowledge-sharing opportunities, learning opportunities through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored Tech Talks & Hackathons
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package: health and medical benefits, retirement benefits, paid time off, flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)

Posted 2 weeks ago

Apply

3.0 - 8.0 years

8 - 16 Lacs

Bengaluru

Work from Office

Role & Responsibilities

Qualifications:
- Experience: 3-6 years
- Education: B.E/B.Tech/MCA/M.Tech

Minimum Qualifications:
- Bachelor's degree in Computer Science, CIS, or a related field (or equivalent work experience in a related field)
- 3 years of experience in software development or a related field
- 2 years of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC)

You will be responsible for designing, building, and maintaining our data infrastructure, ensuring data quality, and enabling data-driven decision-making across the organization. The ideal candidate will have a strong background in data engineering, excellent problem-solving skills, and a passion for working with data.

Responsibilities:
- Design, build, and maintain our data infrastructure, including data pipelines, warehouses, and databases
- Ensure data quality and integrity by implementing data validation, testing, and monitoring processes
- Collaborate with cross-functional teams to understand data needs and translate them into technical requirements
- Develop and implement data security and privacy policies and procedures
- Optimize data processing and storage performance, ensuring scalability and reliability
- Stay up to date with the latest data engineering trends and technologies
- Provide mentorship and guidance to junior data engineers and analysts
- Contribute to the development of data-driven solutions and products

Requirements:
- 3+ years of experience in data engineering, with a Bachelor's degree in Computer Science, Engineering, or a related field
- Strong knowledge of data engineering tools and technologies, including SQL and GCP
- Experience with big data processing frameworks such as Spark or Hadoop, or with Python
- Experience with data warehousing solutions: BigQuery
- Strong problem-solving skills, with the ability to analyze complex data sets and identify trends and insights
- Excellent communication and collaboration skills, with the ability to work with cross-functional teams and stakeholders
- Strong data security and privacy knowledge and experience
- Experience with agile development methodologies is a plus

Preferred candidate profile:
- 3-4 years of experience: maximum 12 LPA budget
- 4-6 years of experience: maximum 14-16 LPA budget

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 - 1 Lacs

Hyderabad

Work from Office

Roles and Responsibilities:
- Design, develop, and maintain scalable and efficient cloud infrastructure on Google Cloud Platform (GCP) using Kubernetes Engine, Cloud Run, and other services.
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
- Develop automation scripts using Ansible or Terraform to deploy applications on GCP.
- Troubleshoot issues related to application deployment, networking, storage, and compute resources.
- Ensure compliance with security best practices and company policies.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Kochi, Bhubaneswar, Indore

Hybrid

- 6+ years of professional experience
- Experience developing microservices and cloud-native apps using Java/J2EE, REST APIs, Spring Core, Spring MVC Framework, Spring Boot Framework, JPA (Java Persistence API) or any other ORM, Spring Security, and similar tech stacks (open source and proprietary)
- Experience with unit testing using frameworks such as JUnit, Mockito, JBehave
- Build and deploy services using Gradle, Maven, Jenkins, etc. as part of the CI/CD process
- Experience working in Google Cloud Platform
- Experience with any relational database (Oracle, PostgreSQL, etc.)

Posted 2 weeks ago

Apply

6.0 - 11.0 years

13 - 23 Lacs

Noida, Kolkata, Pune

Hybrid

- 6+ years of professional experience
- Experience developing microservices and cloud-native apps using Java/J2EE, REST APIs, Spring Core, Spring MVC Framework, Spring Boot Framework, JPA (Java Persistence API) or any other ORM, Spring Security, and similar tech stacks (open source and proprietary)
- Experience with unit testing using frameworks such as JUnit, Mockito, JBehave
- Build and deploy services using Gradle, Maven, Jenkins, etc. as part of the CI/CD process
- Experience working in Google Cloud Platform
- Experience with any relational database (Oracle, PostgreSQL, etc.)

Posted 2 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

- 6+ years of professional experience
- Experience developing microservices and cloud-native apps using Java/J2EE, REST APIs, Spring Core, Spring MVC Framework, Spring Boot Framework, JPA (Java Persistence API) or any other ORM, Spring Security, and similar tech stacks (open source and proprietary)
- Experience with unit testing using frameworks such as JUnit, Mockito, JBehave
- Build and deploy services using Gradle, Maven, Jenkins, etc. as part of the CI/CD process
- Experience working in Google Cloud Platform
- Experience with any relational database (Oracle, PostgreSQL, etc.)

Posted 2 weeks ago

Apply

5.0 - 10.0 years

30 - 35 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Work from Office

Strong hands-on experience as a GCP Data Engineer, with very strong skills in SQL and PySpark, as well as BigQuery, Dataform, Dataplex, etc. Looking only for candidates who can join immediately or are currently serving their notice period.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Chennai (Guindy)

Work from Office

Data Modeller
Chennai - Guindy, India | Information Technology | 17074

Overview:
A Data Modeller is responsible for designing, implementing, and managing data models that support the strategic and operational needs of an organization. This role involves translating business requirements into data structures, ensuring consistency, accuracy, and efficiency in data storage and retrieval processes.

Responsibilities:
- Develop and maintain conceptual, logical, and physical data models.
- Collaborate with business analysts, data architects, and stakeholders to gather data requirements.
- Translate business needs into efficient database designs.
- Optimize and refine existing data models to support analytics and reporting.
- Ensure data models support data governance, quality, and security standards.
- Work closely with database developers and administrators on implementation.
- Document data models, metadata, and data flows.

Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- Data modeling tools: ER/Studio, ERwin, SQL Developer Data Modeler, or similar.
- Database technologies: proficiency in SQL and familiarity with databases like Oracle, SQL Server, MySQL, PostgreSQL.
- Data warehousing: experience with dimensional modeling, star and snowflake schemas.
- ETL processes: knowledge of Extract, Transform, Load processes and tools.
- Cloud platforms: familiarity with cloud data services (e.g., AWS Redshift, Azure Synapse, Google BigQuery).
- Metadata management & data governance: understanding of data cataloging and governance principles.
- Strong analytical and problem-solving skills.
- Excellent communication skills to work with business stakeholders and technical teams.
- Ability to document models clearly and explain complex data relationships.
- 5+ years in data modeling, data architecture, or related roles.
- Experience working in Agile or DevOps environments is often preferred.
- Understanding of normalization/denormalization.
- Experience with business intelligence and reporting tools.
- Familiarity with master data management (MDM) principles.

Posted 2 weeks ago

Apply

10.0 - 18.0 years

10 - 15 Lacs

Chennai

Work from Office

Product Tech Lead
Chennai, India | Information Technology | 16282

Overview:
We are seeking an experienced Technical Product Lead with 10-18 years of experience who embodies a product mindset and has a strong ability to design, implement, and support product features. The ideal candidate will have deep expertise in Python, PySpark, Node.js, and cloud technologies, alongside a proven track record of leading and mentoring teams of 10+ members. This role involves not only technical leadership but also contributing to the product lifecycle, from feature design to troubleshooting product issues in production environments.

Key Responsibilities:

Leadership & Product-Oriented Team Management:
- Lead a team of 10+ engineers, fostering a culture of collaboration, ownership, and high-performance delivery with a focus on building features that meet product goals.
- Provide technical guidance and mentorship to team members, resolving complex product-related technical issues and driving continuous improvement.
- Ensure the team is aligned with product requirements and objectives, translating these into technical designs and tasks.

Product Feature Design & Development:
- Design and implement scalable, high-performance product features using Python, PySpark, and Node.js, ensuring they meet customer needs and business objectives.
- Collaborate with product managers, business stakeholders, and other technical teams to define and refine product features.
- Ensure the technical feasibility of product features and functionality, from ideation to deployment.

Product Issue Support & Troubleshooting:
- Act as a go-to technical expert, supporting the team and the product in resolving any issues that arise in production.
- Troubleshoot product-related technical challenges, offering solutions and optimizations to improve product performance and user experience.
- Ensure that the product is delivered with the highest level of quality and reliability.

Cloud Integration & Infrastructure:
- Architect and deploy cloud-native applications, ensuring scalability, security, and efficiency for the product.
- Ensure the cloud infrastructure aligns with the evolving needs of the product.

Stakeholder Collaboration & Communication:
- Collaborate closely with product managers, business teams, and other technical teams to translate business requirements into technical solutions.
- Communicate progress, challenges, and product solutions clearly and effectively to both technical and non-technical stakeholders.

Mandatory Skills & Experience:
- Core technical skills: strong expertise in Python, PySpark, and Node.js; comprehensive knowledge of cloud platforms (e.g., AWS, GCP, or Azure).
- Product mindset & design: strong understanding of the end-to-end product lifecycle, including design, development, and post-launch support; proven experience in designing product features that align with business objectives and customer needs; ability to collaborate with product teams to prioritize and refine features and functionality.
- Cloud & product support expertise: hands-on experience with cloud platforms and services, including deployment, scaling, and performance optimization in the cloud; ability to troubleshoot complex product issues and provide timely, effective solutions.
- Leadership & problem-solving: proven experience in managing and mentoring a team of 10 or more engineers, ensuring they have the tools and support to succeed; ability to help the team navigate technical challenges, offering guidance and solutions; strong problem-solving skills with a focus on delivering product-driven solutions.

Preferred Skills (Nice to Have):
- Familiarity with Google Cloud Storage (GCS) for storage solutions.
- Knowledge of BigQuery (BQ) for data analytics and processing.
- Experience with Apache Airflow for orchestrating workflows.
- Exposure to Google Cloud Dataproc for processing large-scale data.
- Understanding of Generative AI (GenAI) technologies and their application in product development.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10-15 years of hands-on experience in software development, technical leadership, and product-oriented roles.
- Expertise in modern software development practices, cloud computing, and agile methodologies.

Posted 2 weeks ago

Apply

Exploring BigQuery Jobs in India

BigQuery, Google Cloud's fully managed, serverless data warehouse, is in high demand in the Indian job market. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
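
For context, running a query against BigQuery typically takes only a few lines of Python with the official google-cloud-bigquery client. The snippet below is a minimal sketch, not a production pattern: it assumes the client library is installed, application-default credentials are configured, and it queries a Google public sample dataset rather than a real project table.

```python
# Minimal sketch: run a query against BigQuery from Python.
# Assumes `pip install google-cloud-bigquery` and application-default
# credentials (e.g. via `gcloud auth application-default login`).
from google.cloud import bigquery

client = bigquery.Client()  # picks up the default project from credentials

query = """
    SELECT corpus, SUM(word_count) AS total_words
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY corpus
    ORDER BY total_words DESC
    LIMIT 5
"""

# client.query() submits the job; result() blocks until it finishes.
for row in client.query(query).result():
    print(f"{row.corpus}: {row.total_words}")
```

The same pattern extends to parameterized queries, load jobs, and scheduled queries, which come up repeatedly in the roles listed above.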

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.

Related Skills

Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.

Interview Questions

  • What is BigQuery and how does it differ from traditional databases? (basic)
  • How can you optimize query performance in BigQuery? (medium)
  • Explain the concepts of partitions and clustering in BigQuery. (medium) (see the first sketch after this list)
  • What are some best practices for designing schemas in BigQuery? (medium)
  • How does BigQuery handle data encryption at rest and in transit? (advanced)
  • Can you explain how BigQuery pricing works? (basic)
  • What are the limitations of BigQuery in terms of data size and query complexity? (medium)
  • How can you schedule and automate tasks in BigQuery? (medium)
  • Describe your experience with BigQuery ML and its applications. (advanced)
  • How does BigQuery handle nested and repeated fields in a schema? (basic)
  • Explain the concept of slots in BigQuery and how they impact query processing. (medium)
  • What are some common use cases for BigQuery in real-world scenarios? (basic)
  • How does BigQuery handle data ingestion from various sources? (medium)
  • Describe your experience with BigQuery scripting and stored procedures. (medium)
  • What are the benefits of using BigQuery over traditional on-premises data warehouses? (basic)
  • How do you troubleshoot and optimize slow-running queries in BigQuery? (medium)
  • Can you explain the concept of streaming inserts in BigQuery? (medium) (see the second sketch after this list)
  • How does BigQuery handle data security and access control? (advanced)
  • Describe your experience with BigQuery Data Transfer Service. (medium)
  • What are the differences between BigQuery and other cloud-based data warehousing solutions? (basic)
  • How do you handle data versioning and backups in BigQuery? (medium)
  • Explain how you would design a data pipeline using BigQuery and other GCP services. (advanced)
  • What are some common challenges you have faced while working with BigQuery and how did you overcome them? (medium)
  • How do you monitor and optimize costs in BigQuery? (medium)
  • Can you walk us through a recent project where you used BigQuery to derive valuable insights from data? (advanced)
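
Several of the questions above (query optimization, partitions and clustering, cost monitoring) come down to how a table is laid out and how much data a query scans. The sketch below is illustrative only: the dataset and table names (my_dataset.events) are hypothetical, and it assumes the google-cloud-bigquery Python client with default credentials. It creates a date-partitioned, clustered table with standard BigQuery DDL, then uses a dry run to estimate the bytes a partition-pruned, parameterized query would process.

```python
# Sketch only: hypothetical dataset/table names, run via the Python client.
from google.cloud import bigquery

client = bigquery.Client()

# Partition by the event date and cluster by commonly filtered columns.
ddl = """
CREATE TABLE IF NOT EXISTS my_dataset.events (
  event_ts    TIMESTAMP,
  customer_id STRING,
  event_type  STRING
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id, event_type
"""
client.query(ddl).result()

# A filter on the partitioning column lets BigQuery prune partitions,
# which is usually the first lever when optimizing query cost.
sql = """
SELECT event_type, COUNT(*) AS n
FROM my_dataset.events
WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND customer_id = @customer
GROUP BY event_type
"""
job_config = bigquery.QueryJobConfig(
    dry_run=True,            # estimate cost without running the query
    use_query_cache=False,
    query_parameters=[
        bigquery.ScalarQueryParameter("customer", "STRING", "C-123"),
    ],
)
job = client.query(sql, job_config=job_config)
print(f"Estimated bytes processed: {job.total_bytes_processed}")
```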
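
Likewise, for the streaming-inserts question flagged above, the following sketch (again with hypothetical table and bucket names) contrasts streaming inserts, which make rows queryable within seconds but are billed separately by data volume, with a batch load job from Cloud Storage, which has higher latency but no streaming charge.

```python
# Sketch only: hypothetical table and bucket names.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my_project.my_dataset.events"

# 1) Streaming inserts via the tabledata.insertAll API: rows become
#    queryable almost immediately, but ingestion is billed by volume.
#    (The newer Storage Write API is generally recommended for new
#    high-throughput streaming workloads.)
rows = [
    {"event_ts": "2024-01-01T00:00:00Z", "customer_id": "C-123", "event_type": "click"},
]
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("Streaming insert errors:", errors)

# 2) Batch load from Cloud Storage: higher latency, no streaming charges.
load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/events/2024-01-01/*.json", table_id, job_config=load_config
)
load_job.result()  # wait for the load job to finish
```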

Closing Remark

As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!
