41 Big Query Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 8.0 years

5 - 8 Lacs

Chennai

Work from Office

Source: Naukri

Responsibilities

What you'll do:
- Engineer, test, document, and manage the GCP Dataproc, Dataflow, and Vertex AI services used in high-performance data processing and machine learning pipelines.
- Help developers optimize data processing jobs using Spark, Python, and Java.
- Collaborate with development teams to integrate data processing pipelines with other cloud services and applications.
- Use Terraform and Tekton for infrastructure as code (IaC) and CI/CD pipelines, ensuring efficient deployment and management.

Good to have:
- Experience with Spark for large-scale data processing.
- Solid understanding of GitHub for version control and collaboration.
- Experience with Terraform for infrastructure management and Tekton for continuous integration and deployment.
- Experience with Apache NiFi for data flow automation.
- Knowledge of Apache Kafka for real-time data streaming.
- Familiarity with Google Cloud Pub/Sub for event-driven systems and messaging.
- Familiarity with Google BigQuery.

Mandatory Key Skills: Python, Java, Google Cloud Pub/Sub, Apache Kafka, BigQuery, CI/CD, Machine Learning, Spark
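Several listings on this page ask for familiarity with event-driven messaging (Google Cloud Pub/Sub, Kafka). As a minimal, library-free sketch of the publish/subscribe pattern those services implement — the class and topic names here are illustrative, not the GCP client API:

```python
from collections import defaultdict

class InMemoryBroker:
    """Toy publish/subscribe broker illustrating the pattern behind
    Pub/Sub-style systems (not the google-cloud-pubsub client API)."""

    def __init__(self):
        # topic name -> list of subscriber callbacks
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to receive every message on `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver `message` to every subscriber of `topic`."""
        for callback in self._subscribers[topic]:
            callback(message)

broker = InMemoryBroker()
received = []
broker.subscribe("orders", received.append)
broker.publish("orders", {"id": 1, "amount": 250})
```

In a real Pub/Sub or Kafka deployment the broker is a managed service and delivery is asynchronous and durable; the decoupling of publishers from subscribers is the part this sketch preserves.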

Posted 4 weeks ago

Apply

7.0 - 12.0 years

19 - 22 Lacs

Chennai

Work from Office

Key Responsibilities:
- Design and implement scalable, efficient full-stack solutions using Java and cloud technologies.
- Develop and maintain cloud-based solutions on Google Cloud Platform (GCP), using services such as BigQuery, Astronomer, Terraform, Airflow, and Dataflow.
- Architect and implement complex data engineering solutions using GCP services.
- Collaborate with cross-functional teams to develop, deploy, and optimize cloud-based applications.
- Use Python for data engineering and automation tasks within the cloud environment.
- Ensure alignment with GCP architecture best practices and contribute to the design of high-performance systems.
- Lead and mentor junior developers, fostering a culture of learning and continuous improvement.

Required Skills:
- Full-Stack Development (7+ years): strong expertise in full-stack Java development, with experience building and maintaining complex web applications.
- Google Cloud Platform (GCP): hands-on experience with services such as BigQuery, Astronomer, Terraform, Airflow, and Dataflow, and with GCP architecture.
- Python: proficiency in Python for automation and data engineering tasks.
- Cloud Architecture: solid understanding of GCP architecture principles and best practices.
- Strong problem-solving skills and the ability to work in a dynamic, fast-paced environment.

Mandatory Key Skills: BigQuery, Astronomer, Terraform, Airflow, Cloud Architecture, Java, Google Cloud Platform, Python

Posted 4 weeks ago

Apply

5.0 - 7.0 years

15 - 20 Lacs

Chennai

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming Languages: Java; scripting languages such as Python, Shell Script, and SQL
- 5+ years of experience in IT application delivery, with proven experience in agile development methodologies
- 1 to 2 years of experience with Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow)

Mandatory Key Skills: agile development, data processing, Python, Shell Script, SQL, Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow

Posted 4 weeks ago

Apply

7.0 - 9.0 years

19 - 22 Lacs

Chennai

Work from Office

This role is for a software engineer with 7+ years of experience, data engineering knowledge, and the following skill set:
1. End-to-end full stack
2. GCP services such as BigQuery, Astronomer, Terraform, Airflow, and Dataflow; GCP architecture
3. Python; full-stack Java with cloud

Mandatory Key Skills: Software Engineering, BigQuery, Terraform, Airflow, Dataflow, GCP Architecture, Java, Cloud, data engineering

Posted 4 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Hybrid

Company Overview: Headquartered in Dublin, Ohio, Cardinal Health, Inc. (NYSE: CAH) is a global, integrated healthcare services and products company connecting patients, providers, payers, pharmacists, and manufacturers for integrated care coordination and better patient management. Backed by nearly 100 years of experience, with more than 50,000 employees in nearly 60 countries, Cardinal Health ranks among the top 20 on the Fortune 500.

Department Overview: The Advanced Analytics and Automation Team builds automation, analytics, and artificial intelligence solutions that drive success for Cardinal Health by creating material savings, efficiencies, and revenue growth opportunities. The team drives business innovation by leveraging emerging technologies and turning them into differentiating business capabilities. We are seeking an experienced engineer to join the Cloud Data Engineering team, an agile team focused on building platform engineering and data ingestion solutions in Google Cloud.

Responsibilities:
- Provide monitoring, support, and maintenance of the Cloud Data Engineering Platform.
- Support the global 24x7 operations team with technical support for business operations by proactively overseeing, researching, and resolving production issues in automation solutions and platforms.
- Lead SRT calls on escalated production issues.
- Monitor and maintain weekly and monthly operational KPIs and metrics.
- Support efforts to define monitoring metrics and practices for data platforms.
- Ensure the right levels of logging, monitoring, and alerting in solutions to help operations.
- Analyze system monitoring data and make corrections/enhancements to the run processes of the Data and MLOps Platform.
- Work on performance optimization and maintenance by following operational procedures.
- Coordinate with developers and support resources to meet the SLAs defined in the support governance document.
- Set up continuous monitoring solutions for projects.
- Contribute to continuous improvement initiatives for data platform operations.

Qualifications:
- Bachelor's degree in computer science, information systems, a related technical degree, or equivalent industry experience, and a minimum of five years of IT experience
- 5+ years of experience in production operations support
- 3+ years of hands-on experience in data analytics and data integration
- 5+ years of experience in support and monitoring for data engineering/integration/data platform teams
- Working knowledge of infrastructure/application architecture
- Working knowledge of public cloud infrastructure (GCP preferred)
- Working knowledge of cloud technologies and services (GCP preferred)
- Python and Terraform experience is nice to have
- Agile development skills and experience
- Experience with CI/CD pipelines such as Concourse or Jenkins
- Google Cloud Platform certification is a plus
- Excellent analytical ability to troubleshoot issues in a production environment
- Outstanding work ethic and commitment to organizational success
- Willingness to adapt to and self-learn new technologies and deliver on them
- Excellent verbal and written communication skills, with the ability to clearly articulate the status of requests and issues to both IT and business partners

Posted 4 weeks ago

Apply

4.0 - 9.0 years

9 - 19 Lacs

Hyderabad, Bengaluru

Work from Office

Key Responsibilities:
- Python & PySpark:
  - Writing efficient ETL (Extract, Transform, Load) pipelines.
  - Implementing data transformations using PySpark DataFrames and RDDs.
  - Optimizing Spark jobs for performance and scalability.
- Apache Spark:
  - Managing distributed data processing.
  - Implementing batch and streaming data processing.
  - Tuning Spark configurations for efficient resource utilization.
- Unix Shell Scripting:
  - Automating data workflows and job scheduling.
  - Writing shell scripts for file management and log processing.
  - Managing cron jobs for scheduled tasks.
- Google Cloud Platform (GCP) & BigQuery:
  - Designing data warehouse solutions using BigQuery.
  - Writing optimized SQL queries for analytics.
  - Integrating Spark with BigQuery for large-scale data processing.
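The extract/transform/load structure this role describes can be sketched in plain Python. A production pipeline would use PySpark DataFrames and write to BigQuery, but the three-stage shape is the same; all function and field names below are hypothetical:

```python
def extract(rows):
    """Extract: in production this would read from GCS or a database."""
    return list(rows)

def transform(rows):
    """Transform: drop invalid records and derive a new field."""
    return [
        {**row, "total": row["qty"] * row["price"]}
        for row in rows
        if row["qty"] > 0  # filter out zero-quantity records
    ]

def load(rows, sink):
    """Load: in production this would write to BigQuery; here, a list."""
    sink.extend(rows)
    return len(rows)

raw = [{"qty": 2, "price": 10.0}, {"qty": 0, "price": 5.0}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)  # loaded == 1
```

Keeping each stage a pure function makes the pipeline easy to unit-test before it is ported to Spark, where `transform` would become DataFrame operations.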

Posted 1 month ago

Apply

8.0 - 12.0 years

35 - 50 Lacs

Chennai

Remote

- Experience with modern data warehouses (Snowflake, BigQuery, Redshift) and graph databases.
- Experience designing and building efficient data pipelines for the ingestion and transformation of data into a data warehouse.
- Proficiency in Python, dbt, git, SQL, AWS, and Snowflake.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Chennai

Hybrid

Duration: 8 Months
Work Type: Onsite

Position Description: Looking for qualified data scientists who can develop scalable solutions to complex real-world problems using machine learning, big data, statistics, and optimization. Potential candidates should have hands-on experience applying first-principles methods, machine learning, data mining, and text mining techniques to build analytics prototypes that work on massive datasets. Candidates should have experience manipulating both structured and unstructured data in various formats, sizes, and storage mechanisms; excellent problem-solving skills with an inquisitive mind to challenge existing practices; and exposure to multiple programming languages and analytical tools, with the flexibility to use the requisite tools/languages for the problem at hand.

Skills Required: Machine Learning, GenAI, LLM
Skills Preferred: Python, Google Cloud Platform, BigQuery
Experience Required: 3+ years of hands-on experience using machine learning/text mining tools and techniques such as clustering, classification, decision trees, random forests, support vector machines, deep learning, neural networks, reinforcement learning, and other numerical algorithms
Experience Preferred: 3+ years of experience in at least one of the following languages: Python, R, MATLAB, SAS. Experience with Google Cloud Platform (GCP), including Vertex AI, BigQuery, dbt, NoSQL databases, and the Hadoop ecosystem
Education Required: Bachelor's Degree

Posted 1 month ago

Apply

5.0 - 7.0 years

3 - 5 Lacs

Chennai

Work from Office

Duration: 12 Months

Position Description:
- Experience with SQL and data warehousing.
- Experience using BI, ETL, and reporting/visualization/dashboard tools such as QlikSense and Power BI.
- Good understanding of visualization techniques and the ability to solve business problems through visualization.
- Ability to derive insights from data, provide visualizations, and tell a story.
- Experience with data handling using SAS, R, or Python is an added advantage.
- Exposure to big-data-based analytical solutions and hands-on experience with data lakes, data cleansing, and data management.

Skills Required: Python, BigQuery, PostgreSQL, Data/Analytics dashboards
Skills Preferred: GCP Cloud Run, GCP
Experience Required: 5-7 years of experience in data visualization and insights
Education Required: Bachelor's Degree
Education Preferred: Master's Degree

Posted 1 month ago

Apply

15.0 - 20.0 years

7 - 17 Lacs

Hyderabad, Bangalore Rural, Chennai

Work from Office

Job Description: GCP-centric Solution Architect & Java/GCP Salesforce Integration Architects
- GCP Cloud (mandatory)
- Worked on large-scale modernization programs in the last 3-4 years (mandatory)
- Worked extensively with C4 modelling (mandatory)
- Able to present architecture options with recommendations and justifications (mandatory)
- Well versed in handling multiple stakeholders: leadership, application architects, SI partners, vendor teams, product teams (mandatory)

Location: PAN India
Experience: 15+ years
General shift (9 am-6 pm); immediate joiner to 30 days' notice

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Chennai, Delhi / NCR, Bengaluru

Work from Office

We are looking for an experienced Data Engineer with a strong background in data engineering, storage, and cloud technologies. The role involves designing, building, and optimizing scalable data pipelines, ETL/ELT workflows, and data models for efficient analytics and reporting.
- Strong SQL expertise, including complex joins, stored procedures, and certificate-auth-based queries (must have).
- Experience with NoSQL databases such as Firestore, DynamoDB, or MongoDB.
- Proficiency in data modeling and warehousing solutions such as BigQuery (preferred), Redshift, or Snowflake.
- Hands-on experience building ETL/ELT pipelines using Airflow, dbt, Kafka, or Spark.
- Proficiency in scripting languages such as PySpark, Python, or Scala.
- Strong hands-on experience with Google Cloud Platform (GCP) is a must.
- Experience with visualization tools such as Google Looker Studio, LookML, Power BI, or Tableau is preferred.
- Good to have: exposure to Master Data Management (MDM) systems and an interest in Web3 data and blockchain analytics.

Posted 1 month ago

Apply

5 - 9 years

1 - 2 Lacs

Hyderabad, Gurugram, Bengaluru

Hybrid

Role: BI Specialist (Looker Admin)
Location: Gurgaon, Hyderabad, Bangalore (hybrid mode)
Key Skills: Looker, BigQuery, SQL, LookML
Experience: 5 to 9 years

Roles and responsibilities:
- Participate in business analysis activities to gather business needs, translate them into technical specifications, and drive the implementation of reports, dashboards, KPI scorecards, etc. that provide insights for strategic and tactical decision-making.
- Design, develop, test, and deploy reports and dashboards using data from various data stores/data warehouses.
- Provide ongoing operational support as necessary and ensure the availability and performance of BI reports and dashboards.
- Ensure user security through proper authentication and authorization for the Power BI environment.
- Develop and execute database queries for analysis and ad hoc reports as requested.
- Optimize queries and capacity to improve report performance.
- Research solutions to implement new and/or enhance existing reporting processes.
- Educate and onboard end users on the usage and capabilities of Power BI as required.
- Monitor and address data quality issues.

Must-have skills:
- Minimum of 6-8 years authoring high-performing, reliable, scalable, and secure data visualizations and dashboards.
- Strong SQL knowledge is a must; experience designing database schemas and optimizing query performance is required.
- Good administration experience with visualization tools, including at least 2+ years as a Looker admin.
- Ability to interpret database schemas to extract data for reports.
- Deep understanding of database fundamentals, including relational and multidimensional database design.
- Experience exporting and integrating Power BI reports with other platforms.
- Organized, with a proven ability to prioritize workload, meet deadlines, and use time effectively.
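Query optimization in BigQuery, called out in the skills above, usually starts with partition pruning: filtering on the table's partition column so only the needed partitions are scanned and billed. A sketch that builds such a query, with a hypothetical table and column names (`_PARTITIONDATE` is BigQuery's pseudo-column for ingestion-time partitioned tables; date-partitioned tables would filter on their own date column instead):

```python
def daily_report_query(table, start_date, end_date):
    """Build a BigQuery SQL string that filters on the partition
    pseudo-column so unneeded partitions are pruned from the scan."""
    return (
        f"SELECT region, SUM(revenue) AS total_revenue\n"
        f"FROM `{table}`\n"
        f"WHERE _PARTITIONDATE BETWEEN '{start_date}' AND '{end_date}'\n"
        f"GROUP BY region"
    )

sql = daily_report_query("project.dataset.sales", "2024-01-01", "2024-01-31")
```

In real code the dates should be passed as query parameters via the BigQuery client rather than interpolated into the string; the interpolation here keeps the sketch self-contained.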

Posted 1 month ago

Apply

4 - 8 years

0 - 0 Lacs

Chennai

Work from Office

Overview: TekWissen is a global workforce management provider in India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Web Development Associate
Location: Chennai
Duration: 12 Months
Work Type: Onsite

Position Description:
- Serve as a core member of the secure coding product team that enables the design, development, and creation of secure coding practices.
- Develop application software and RESTful services using GCP and the Spring Framework.
- Experience building distributed, service-oriented, cloud microservice-based architectures.
- Use Test-Driven Development and code pairing/mobbing practices.
- Develop components across all tiers of the application stack.
- Continuously integrate and deploy developed software.
- Modify CI/CD pipelines and scripts as necessary to improve continuous integration practices.
- Consult with the product manager to identify the minimal viable product and decompose features by story slicing.
- Collaborate with other product teams on integrations, testing, and deployments.

Skills Required: React, JavaScript, Application Support, BigQuery, Application Testing, Application Design, Coding, Angular, Spring, Application Development, Developer, Java, Web Services

Experience Required:
- Experience in cloud services engineering, including Pivotal Cloud Foundry (GCP, JFrog, GitHub, Spring, Angular), RESTful services, and CI/CD pipelines (Tekton or similar).
- Experience with Swagger, logging/tracing, Conformance, Dynatrace, Spring Security, and SonarQube.
- Understanding of Spring Cloud Data, Spring Security, OAuth, and service monitoring on the cloud.
- Experience in application testing, release management, and support activities.
- Experience with various software development life cycle methods, such as Agile.

Experience Preferred:
- 4+ years of development experience (purchasing/automotive industry experience a plus), preferably using Java, Spring, Angular, React, Web Services, etc.
- 3 years of experience designing and building technical solutions using Java technologies such as Spring, Spring Boot, Web Services, and microservice architecture.
- Comprehensive understanding of relational databases (Microsoft SQL Server, PostgreSQL), NoSQL databases, and flat-file processing concepts.
- Strong knowledge of design patterns and principles; experience developing web services, REST APIs, and related architectures.
- Exposure to automated testing concepts, tools, and frameworks.
- Excellent communication skills: the ability to engage in deep technical discussions with customers and peers and become a trusted technical advisor.

Education Required: Bachelor's Degree

TekWissen Group is an equal opportunity employer supporting workforce diversity.

Posted 1 month ago

Apply

5 - 8 years

0 Lacs

Chennai

Work from Office

Overview: TekWissen is a global workforce management provider in India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Senior
Location: Chennai
Duration: 12 Months
Work Type: Onsite
Position Description: Primary SMW SME to FEDE Security, esp. in UARS/UAMS areas.
Skills Required: BigQuery, Postgres, Airflow, Tekton, Terraform
Experience Required: 5 to 8 years
Education Required: Bachelor's Degree

TekWissen Group is an equal opportunity employer supporting workforce diversity.

Posted 1 month ago

Apply

5 - 8 years

0 Lacs

Chennai

Work from Office

Overview: TekWissen is a global workforce management provider in India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Senior
Location: Chennai
Duration: 12 Months
Work Type: Onsite
Position Description: Primary SMW SME to FEDE Security, esp. in UARS/UAMS areas.
Skills Required: CP4D, Postgres, BigQuery
Experience Required: 5 to 8 years
Education Required: Bachelor's Degree

TekWissen Group is an equal opportunity employer supporting workforce diversity.

Posted 1 month ago

Apply

11 - 21 years

25 - 40 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing Services Private Limited!

We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 3-20 years
Location: Pan India

Job Description:
- Skills: GCP, BigQuery, Cloud Composer, Cloud Data Fusion, Python, SQL
- 5-20 years of overall experience, mainly in the data engineering space
- 2+ years of hands-on experience in GCP cloud data implementation
- Experience working in client-facing roles in a technical capacity as an architect
- Must have implementation experience on a GCP-based cloud data project/program as a solution architect
- Proficiency in using the Google Cloud Architecture Framework in a data context
- Expert knowledge and experience with the core GCP data stack, including BigQuery, Dataproc, Dataflow, Cloud Composer, etc.
- Exposure to the overall Google tech stack of Looker/Vertex AI/Dataplex, etc.
- Expert-level knowledge of Spark; extensive hands-on experience working with data using SQL and Python
- Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms (both cloud and on-premise)
- Excellent communication skills, with the ability to clearly present ideas, concepts, and solutions

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in, or you can reach me at 8939853050.

With Regards,
Sankar G
Sr. Executive - IT Recruitment

Posted 1 month ago

Apply