
73 Google BigQuery Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3.0 - 5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must Have Skills: Google BigQuery, Microsoft SQL Server, GitHub, Google Cloud Data Services
Good to Have Skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Must Have Skills: Google BigQuery (SSI / NON SSI)
Good to Have Skills: SSI: No Technology Specialization; NON SSI: NA

Job Requirements - Roles & Responsibilities:
1. Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX)
2. Proven track record of delivering data integration and data warehousing solutions
3. Strong hands-on SQL skills (No FLEX)
4. Experience with data integration and migration projects
5. Proficient in the BigQuery SQL dialect (No FLEX)
6. Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer, and Kubernetes
7. Experience with cloud solutions, mainly data platform services; GCP certifications
8. Experience in shell scripting, Python (No FLEX), Oracle, SQL

Professional & Technical Skills:
1. Expert in Python (No FLEX): strong hands-on knowledge of SQL (No FLEX) and Python programming using Pandas and NumPy, with a deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage is preferred
2. Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (No FLEX)
3. Proficiency with tools that automate AZDO CI/CD pipelines, such as Control-M, GitHub, Jira, and Confluence
4. Open mindset and the ability to quickly adopt new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment

Professional Attributes:
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision, or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills

Educational Qualification: 15 years of full-time education
Additional Information: The candidate should be ready for Shift B and work as an individual contributor.

Posted 1 day ago


3.0 - 8.0 years

3 - 12 Lacs

Pune, Maharashtra, India

On-site

Job Summary: We are seeking a detail-oriented and technically proficient BigQuery Project Administrator with 3+ years of experience on Google Cloud Platform (GCP), specifically BigQuery. This role focuses on project and cost governance, query optimization, and performance improvements. You will collaborate closely with engineering, architecture, and finance teams to ensure efficient, scalable, and cost-effective use of BigQuery resources.

Key Responsibilities:

Optimization & Performance Tuning
- Analyze query patterns, access logs, and usage metrics to recommend schema optimizations (partitioning, clustering, materialized views)
- Identify opportunities to improve BigQuery performance and reduce storage/computation costs
- Collaborate with engineers to refactor inefficient queries and optimize data workloads

Project & Cost Governance
- Monitor and manage BigQuery project structures, billing configurations, quotas, and resource usage
- Implement and enforce cost control measures, quotas, and budget alerts
- Create regular reports on usage, cost trends, and anomalies for stakeholders

Collaboration & Support
- Serve as a bridge between engineering and finance teams for BigQuery-related planning and decisions
- Support onboarding of new teams into the BigQuery environment
- Provide training and guidance on best practices and cost-efficient usage

Best Practices & Compliance
- Define and advocate for BigQuery usage standards and best practices
- Ensure adherence to data governance, security, and privacy policies
- Maintain documentation on project setup, governance workflows, and optimization strategies

Qualifications:
- 3+ years of hands-on experience with Google Cloud Platform (GCP), with a focus on BigQuery
- Strong understanding of SQL, data warehousing, and cloud cost management
- Experience with GCP billing, IAM (Identity and Access Management), and resource configuration

Preferred Certifications: Google Cloud Professional Data Engineer
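To give a flavor of the levers this role pulls, here is a minimal sketch using the google-cloud-bigquery Python client: it creates a date-partitioned, clustered table and dry-runs a query to estimate scanned bytes before any money is spent. The project, dataset, table, and column names are invented placeholders, not details from the posting.

```python
# A minimal sketch, assuming the google-cloud-bigquery library is
# installed and credentials are configured; all names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Partitioning + clustering cuts scanned bytes for date-ranged,
# customer-filtered queries -- the main lever for BigQuery cost control.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.events_optimized
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id AS
SELECT * FROM analytics.events_raw
"""
client.query(ddl).result()

# Dry-run a query first: BigQuery reports the bytes it *would* scan,
# which is what on-demand pricing bills for.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    "SELECT customer_id, COUNT(*) FROM analytics.events_optimized "
    "WHERE DATE(event_ts) = '2024-01-01' GROUP BY customer_id",
    job_config=job_config,
)
print(f"Estimated bytes scanned: {job.total_bytes_processed:,}")
```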

Posted 2 days ago


5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You will be working as an AI Platform Engineer in Bangalore as part of the GenAI COE Team. Your key responsibilities will involve developing and promoting scalable AI platforms for customer-facing applications. It will be essential to evangelize the platform with customers and internal stakeholders, ensuring scalability, reliability, and performance that meet business needs.

Your role will also entail designing machine learning pipelines for experiment management, model management, feature management, and model retraining. Implementing A/B testing of models and designing APIs for model inferencing at scale will be crucial. You should have proven expertise with MLflow, SageMaker, Vertex AI, and Azure AI.

As an AI Platform Engineer, you will serve as a subject matter expert in LLM serving paradigms, with in-depth knowledge of GPU architectures. Expertise in distributed training and serving of large language models is required, along with proficiency in model- and data-parallel training using frameworks like DeepSpeed and serving frameworks like vLLM. You will be expected to apply model fine-tuning and optimization techniques to achieve better latencies and accuracies in model results, and to reduce the training and resource requirements for fine-tuning LLM and LVM models.

Extensive knowledge of different LLM models and the ability to provide insights on their applicability based on use cases are crucial, as is proven experience delivering end-to-end solutions from engineering to production for specific customer use cases. Proficiency in DevOps and LLMOps practices, knowledge of Kubernetes, Docker, and container orchestration, and a deep understanding of LLM orchestration frameworks such as Flowise, LangFlow, and LangGraph are also required.

In terms of skills, you should be familiar with LLM models such as Hugging Face OSS LLMs, GPT, Gemini, Claude, Mixtral, and Llama, as well as LLMOps tools like MLflow, LangChain, LangGraph, LangFlow, Flowise, LlamaIndex, SageMaker, AWS Bedrock, Vertex AI, and Azure AI. Knowledge of databases and data warehouse systems (DynamoDB, Cosmos, MongoDB, RDS, MySQL, PostgreSQL, Aurora, Google BigQuery) and of cloud platforms such as AWS, Azure, and GCP is essential. Proficiency in DevOps tools like Kubernetes, Docker, FluentD, Kibana, Grafana, and Prometheus, along with cloud certifications such as AWS Professional Solutions Architect or Azure Solutions Architect Expert, will be beneficial. Strong programming skills in Python, SQL, and JavaScript are required for this full-time, in-person role.
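Since the posting names vLLM as a serving framework, here is a minimal offline-inference sketch of the kind of work involved. It assumes vLLM is installed on a CUDA-capable GPU; the model ID and sampling settings are placeholder assumptions, not requirements from the listing.

```python
# Minimal vLLM offline-inference sketch -- illustrative only.
# Assumes `pip install vllm` and a CUDA-capable GPU.
from vllm import LLM, SamplingParams

llm = LLM(
    model="meta-llama/Llama-3.1-8B-Instruct",  # hypothetical model choice
    tensor_parallel_size=1,  # raise for multi-GPU model parallelism
)
params = SamplingParams(temperature=0.7, max_tokens=128)

outputs = llm.generate(
    ["Summarize what an AI platform engineer does."], params
)
for out in outputs:
    print(out.outputs[0].text)
```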

Posted 2 days ago


4.0 - 8.0 years

0 Lacs

Haryana

On-site

As a Data Engineer at GreyOrange, you will be responsible for designing, developing, and maintaining ETL pipelines to ensure efficient data flow for high-scale data processes. Your primary focus will be on managing and optimizing data storage and retrieval in Google BigQuery, while ensuring performance efficiency and cost-effectiveness. Additionally, you will set up quick analytics dashboards using tools like Metabase, Looker, or any other preferred platform.

Collaboration with internal analysts and stakeholders, including customers, is key to understanding data needs and implementing robust solutions. You will play a crucial role in monitoring, troubleshooting, and resolving data pipeline issues to maintain data integrity and availability. Implementing data quality checks and maintaining data governance standards across the ETL processes will also be part of your responsibilities. Automation will be a significant aspect of your role: you will develop scripts for repetitive tasks and optimize manual processes. Documenting the entire analytics implementation and data structures will provide a guide for all users. Staying updated with industry best practices and emerging technologies in data engineering and cloud-based data management is essential.

In addition to these responsibilities, the following are required for this role:
- 4+ years of experience as a Data Engineer or in a similar role
- Strong experience with ETL tools and frameworks such as Apache Airflow, Dataflow, Estuary
- Proficiency in SQL and extensive experience with Google BigQuery
- Experience setting up analytics dashboards using tools like Looker or Metabase
- Knowledge of data warehousing concepts and best practices
- Experience with cloud platforms, particularly Google Cloud Platform (GCP)
- Strong analytical and problem-solving skills, with a focus on cost optimization in cloud environments
- Familiarity with Python or other scripting languages for automation and data processing
- Excellent communication skills and the ability to work collaboratively in a team environment
- Experience with data modeling and schema design

You will also have the opportunity to provide guidance and mentorship to junior data engineers and data analysts when needed.
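ETL work of this kind is typically expressed as an orchestrated DAG. Below is a hedged sketch of what a daily extract-load-check pipeline might look like in Apache Airflow (2.4+ for the `schedule` argument); the task bodies, DAG ID, and BigQuery destination are invented placeholders, not GreyOrange's actual pipeline.

```python
# Minimal Airflow DAG sketch for a daily extract -> load -> check
# pipeline. All names and stubbed bodies are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(**context):
    """Pull yesterday's orders from the source system (stubbed)."""
    ...

def load_to_bigquery(**context):
    """Load the extracted batch into a partitioned BigQuery table (stubbed)."""
    ...

def check_row_counts(**context):
    """Fail the run if source and destination row counts diverge (stubbed)."""
    ...

with DAG(
    dag_id="orders_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    load = PythonOperator(task_id="load", python_callable=load_to_bigquery)
    check = PythonOperator(task_id="quality_check", python_callable=check_row_counts)

    extract >> load >> check  # the data-quality gate runs last
```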

Posted 3 days ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Database Developer and Designer, you will be responsible for building and maintaining Customer Data Platform (CDP) databases to ensure performance and stability. Your role will involve optimizing SQL queries to improve performance, creating visual data models, and administering database security. Troubleshooting and debugging SQL code issues will be a crucial part of your responsibilities.

You will be involved in data integration tasks, importing and exporting events, user profiles, and audience changes to Google BigQuery. Utilizing BigQuery for querying, reporting, and data visualization will be essential. Managing user and service account authorizations, as well as integrating Lytics with BigQuery and other data platforms, will also be part of your duties. Handling data export and import between Lytics and BigQuery, configuring authorizations for data access, and utilizing data from various source systems to integrate with CDP data models are key aspects of the role.

Preferred candidates will have experience with Lytics CDP and CDP certification. Hands-on experience with at least one Customer Data Platform technology and a solid understanding of the digital marketing ecosystem are required. Your skills should include proficiency in SQL and database management, strong analytical and problem-solving abilities, experience with data modeling and database design, and the capability to optimize and troubleshoot SQL queries. Expertise in Google BigQuery and data warehousing, knowledge of data integration and ETL processes, familiarity with Google Cloud Platform services, and a strong grasp of data security and access management are essential. You should also be proficient in Lytics and its integration capabilities, have experience with data import/export processes, know authorization methods and security practices, possess strong communication and project management skills, and be able to learn new CDP technologies and deliver in a fast-paced environment.

Ultimately, your role is crucial for efficient data management and enabling informed decision-making through optimized database design and integration.

Posted 3 days ago


10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an experienced professional with 9.5 to 13 years of experience, you will be responsible for designing and developing robust database solutions to support business applications. Expertise in Postgres development is mandatory, along with the ability to optimize SQL queries for high performance and efficiency. Collaborating with cross-functional teams to gather and analyze requirements will be a key aspect of your role.

Ensuring that database security measures are implemented and maintained to protect sensitive data is crucial. You will also be required to conduct regular database performance tuning and troubleshooting, as well as develop and maintain documentation for database systems and processes. Providing technical support and guidance to junior developers is part of the responsibilities. Maintaining data integrity and consistency across all database systems, performing data migration and transformation tasks, and monitoring database systems for optimal performance and availability are essential tasks, as are developing and implementing backup and recovery strategies and staying updated with industry trends and best practices in database management.

Participation in code reviews and contribution to continuous-improvement initiatives are expected. A strong understanding of database architecture and design principles, proficiency in SQL and ANSI SQL, experience in database performance tuning and optimization, the ability to troubleshoot and resolve complex database issues, excellent problem-solving and analytical skills, and strong communication and collaboration abilities are required for this role.
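As a rough illustration of the tuning loop this role describes, the sketch below uses psycopg2 against PostgreSQL: inspect a plan with EXPLAIN ANALYZE, add a targeted index, and re-check. The connection string, table, and column names are hypothetical.

```python
# Query-tuning sketch with psycopg2; all names are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=dev")  # placeholder DSN
conn.autocommit = True  # CREATE INDEX CONCURRENTLY cannot run in a transaction

with conn.cursor() as cur:
    # 1. Inspect the plan: a Seq Scan on a large table filtered by
    #    customer_id usually signals a missing index.
    cur.execute(
        "EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = %s", (42,)
    )
    for (line,) in cur.fetchall():
        print(line)

    # 2. Add a targeted index; CONCURRENTLY avoids blocking writers.
    cur.execute(
        "CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_customer "
        "ON orders (customer_id)"
    )

    # 3. Re-run step 1 and confirm an Index Scan with a lower cost.
```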

Posted 3 days ago


2.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: Senior Data Analyst - Project Management
Location: Bengaluru, Karnataka, India
Experience: 2-3 Years

About the Company & Role: We are one of India's premier integrated political consulting firms, specializing in building data-driven 360-degree election campaigns. We help our clients with strategic advice and implementation, bringing together data-backed insights and in-depth ground intelligence into a holistic electoral campaign. We are passionate about our democracy and the politics that shape the world around us. We draw on some of the sharpest minds from distinguished institutions and diverse professional backgrounds to help us achieve our goal. The team brings 7 years of experience in building electoral strategies that spark conversations, effect change, and help shape electoral and legislative ecosystems in our country.

Job Summary: We are seeking a highly motivated and skilled Data Analyst to join our dynamic Project Management Office (PMO). This critical role involves developing, maintaining, and enhancing insightful PMO dashboards while also designing, implementing, and managing automated data pipelines. The ideal candidate will possess a strong blend of data analysis, visualization, and technical automation skills to ensure the PMO has timely, accurate data for tracking project performance, identifying trends, and making data-driven decisions.

Key Responsibilities:

PMO Dashboard Development & Management:
- Design, build, and maintain interactive dashboards using BI tools (e.g., Looker Studio, Tableau) to visualize key project metrics, resource allocation, timelines, risks, and overall PMO performance KPIs.
- Collaborate with PMO leadership and project managers to gather reporting requirements and translate them into effective data models and visualizations.
- Ensure data accuracy, consistency, and reliability within dashboards and reports.
- Perform data analysis to identify trends, potential issues, and areas for process improvement within project execution.
- Generate regular performance reports and support ad-hoc data requests from stakeholders.

Data Management:
- Design, develop, implement, and maintain robust, automated data pipelines for Extract, Transform, Load (ETL/ELT) processes.
- Automate data collection from various sources, including project management software, spreadsheets, databases, and APIs (e.g., the Slack API).
- Load and process data efficiently into our data warehouse environment (e.g., Google BigQuery).
- Write and optimize SQL queries for data manipulation, transformation, and aggregation.
- Implement data quality checks, error handling, and monitoring for automated pipelines.
- Troubleshoot and resolve issues related to data extraction, transformation, loading, and pipeline failures.
- Document data sources, data models, pipeline architecture, and automation workflows.

Required Qualifications & Skills:
- Bachelor's degree in Computer Science, Data Science, Statistics, Information Systems, Engineering, or a related quantitative field.
- Proven experience (approx. 2-3 years) in data analysis, business intelligence, data engineering, or a similar role.
- Strong proficiency in SQL for complex querying, data manipulation, and performance tuning.
- Hands-on experience building and maintaining dashboards using Tableau.
- Demonstrable experience designing and automating data pipelines using scripting languages (Python preferred) and/or ETL/ELT tools.
- Solid understanding of data warehousing concepts, ETL/ELT principles, and data modeling.
- Excellent analytical, problem-solving, and critical-thinking skills.
- Strong attention to detail and commitment to data accuracy.
- Good communication and collaboration skills, with the ability to interact with technical and non-technical stakeholders.
- Ability to work independently and manage priorities effectively.

Preferred Qualifications & Skills:
- Experience working directly within a Project Management Office (PMO) or supporting project management functions.
- Familiarity with project management tools (e.g., Jira, Asana, MS Project) and concepts (Agile, Waterfall).
- Experience with cloud platforms, particularly Google Cloud Platform (GCP) and BigQuery.
- Experience with workflow orchestration tools (e.g., Airflow, Cloud Composer, Cloud Functions).
- Experience integrating data via APIs from various business systems.
- Basic understanding of data governance and data quality management practices.

If you are a driven professional seeking a high-impact challenge and are interested in joining a team of like-minded, motivated individuals who think strategically, act decisively, and get things done, email us at [HIDDEN TEXT].
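One concrete shape the "automated collection via APIs" duty can take is pulling Slack messages into BigQuery for downstream SQL models. The sketch below uses slack_sdk and google-cloud-bigquery; the channel ID, destination table, and environment-variable token are assumptions for illustration.

```python
# Hedged sketch: Slack API -> BigQuery landing table. All IDs,
# table names, and the env-var token are placeholder assumptions.
import os

from google.cloud import bigquery
from slack_sdk import WebClient

slack = WebClient(token=os.environ["SLACK_BOT_TOKEN"])
bq = bigquery.Client()

# Fetch recent messages from a project channel (placeholder channel ID).
resp = slack.conversations_history(channel="C0123456789", limit=200)
rows = [
    {"ts": m["ts"], "user": m.get("user", ""), "text": m.get("text", "")}
    for m in resp["messages"]
]

# Append into a raw landing table; downstream SQL cleans and models it.
errors = bq.insert_rows_json("pmo_raw.slack_messages", rows)
if errors:
    raise RuntimeError(f"BigQuery insert failed: {errors}")
```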

Posted 3 days ago


0.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: MySQL, SQL Writing, PL/SQL
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners.

Primary Responsibility: Collect, clean, and analyze data from various sources. Assist in creating dashboards, reports, and visualizations.

We are looking for a SQL Developer Intern to join our team remotely. As an intern, you will work with our database team to design, optimize, and maintain databases while gaining hands-on experience in SQL development. This is a great opportunity for someone eager to build a strong foundation in database management and data analysis. This is a remote position.

Responsibilities:
- Write, optimize, and maintain SQL queries, stored procedures, and functions.
- Assist in designing and managing relational databases.
- Perform data extraction, transformation, and loading (ETL) tasks.
- Ensure database integrity, security, and performance.
- Work with developers to integrate databases into applications.
- Support data analysis and reporting by writing complex queries.
- Document database structures, processes, and best practices.

Requirements:
- Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
- Strong understanding of SQL and relational database concepts.
- Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
- Ability to write efficient and optimized SQL queries.
- Basic knowledge of indexing, stored procedures, and triggers.
- Understanding of database normalization and design principles.
- Good analytical and problem-solving skills.
- Ability to work independently and in a team in a remote setting.

Preferred Skills (Nice to Have):
- Experience with ETL processes and data warehousing.
- Knowledge of cloud-based databases (AWS RDS, Google BigQuery, Azure SQL).
- Familiarity with database performance tuning and indexing strategies.
- Exposure to Python or other scripting languages for database automation.
- Experience with business intelligence (BI) tools like Power BI or Tableau.

What We Offer:
- Fully remote internship with flexible working hours.
- Hands-on experience with real-world database projects.
- Mentorship from experienced database professionals.
- Certificate of completion and potential for a full-time opportunity based on performance.

Company Description: Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: there are many more opportunities on the portal; depending on your skills, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
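For an intern-level picture of "write and optimize SQL queries," here is a small sketch using mysql-connector-python: a parameterized query followed by EXPLAIN to sanity-check the plan. Credentials and the schema are placeholder assumptions.

```python
# Entry-level sketch: parameterized query + EXPLAIN with
# mysql-connector-python. All connection details are placeholders.
import os

import mysql.connector

conn = mysql.connector.connect(
    user="intern",
    password=os.environ.get("DB_PASSWORD", ""),  # never hard-code secrets
    database="shop",
)
cur = conn.cursor()

# Parameterized queries prevent SQL injection and keep plans reusable.
cur.execute(
    "SELECT id, total FROM orders WHERE status = %s "
    "ORDER BY created_at DESC LIMIT 10",
    ("shipped",),
)
for order_id, total in cur.fetchall():
    print(order_id, total)

# EXPLAIN shows whether the query uses an index or scans the whole table.
cur.execute("EXPLAIN SELECT id FROM orders WHERE status = %s", ("shipped",))
print(cur.fetchall())

cur.close()
conn.close()
```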

Posted 3 days ago


3.0 - 5.0 years

3 - 5 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a skilled and experienced DBT (Data Build Tool) / DataStage Developer to design, develop, and maintain robust data transformation and integration solutions. The ideal candidate will have hands-on expertise with either DBT for data modeling in data warehouses or IBM DataStage for ETL processes, coupled with a strong understanding of data warehousing concepts. This role is crucial for building and optimizing our data pipelines, ensuring data quality, and supporting analytical initiatives.

Roles and Responsibilities:
- Design, develop, and implement data transformation logic using either DBT for data modeling within data warehouses (e.g., Snowflake, BigQuery, Redshift) or IBM DataStage for complex ETL (Extract, Transform, Load) processes.
- Create, maintain, and optimize data pipelines to ensure efficient and reliable data flow from source systems to target data platforms.
- Write, test, and deploy SQL-based data transformations using DBT, ensuring data integrity and performance. For DataStage, develop and fine-tune DataStage jobs, sequences, and routines for data extraction, transformation, and loading.
- Collaborate with data architects, data engineers, and business analysts to understand data requirements and translate them into technical solutions.
- Perform data profiling, data quality checks, and validation to ensure accuracy and consistency of data.
- Troubleshoot and resolve data pipeline issues, performance bottlenecks, and data discrepancies.
- Implement and maintain version control for data models and ETL jobs.
- Contribute to the development of data governance and data quality standards.
- Create and maintain comprehensive technical documentation for data models, ETL processes, and data flows.
- Participate in code reviews and provide constructive feedback to peers.

Required Skills and Qualifications:
- Proven experience as a Data Engineer / Developer with a focus on data transformation.
- Strong expertise in either DBT for data modeling and transformation in modern data warehouses, or extensive experience with IBM DataStage for enterprise-level ETL.
- Solid understanding of data warehousing concepts, dimensional modeling, and ETL/ELT methodologies.
- High proficiency in SQL (Structured Query Language) for complex data manipulation and querying.
- Experience with cloud data platforms like Snowflake, Google BigQuery, or Amazon Redshift (if DBT-focused).
- Familiarity with version control systems (e.g., Git).
- Strong analytical and problem-solving skills with attention to detail.
- Excellent written and verbal communication skills and the ability to work collaboratively in a team environment.
- Knowledge of scripting languages (e.g., Python) is a plus.
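dbt models are SQL, and incremental models ultimately compile down to statements like the MERGE below. As a rough stand-in for that transformation layer, this sketch runs such a statement directly with the BigQuery client; the datasets, tables, and columns are invented for illustration.

```python
# Sketch of an incremental upsert of the kind a dbt model compiles to,
# executed with the BigQuery client. All names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

merge_sql = """
MERGE `warehouse.dim_customers` AS target
USING (
  SELECT customer_id, email, updated_at
  FROM `staging.customers`
  WHERE updated_at > (SELECT MAX(updated_at) FROM `warehouse.dim_customers`)
) AS source
ON target.customer_id = source.customer_id
WHEN MATCHED THEN
  UPDATE SET email = source.email, updated_at = source.updated_at
WHEN NOT MATCHED THEN
  INSERT (customer_id, email, updated_at)
  VALUES (source.customer_id, source.email, source.updated_at)
"""
client.query(merge_sql).result()  # block until the transformation finishes
```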

Posted 1 week ago


2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Junior Data Analyst at our company in Hyderabad, you will play a crucial role in our Data & Analytics team. We are looking for a motivated and intellectually curious individual with a solid foundation in data analysis, business intelligence, and critical thinking. Your responsibilities will include interpreting data, generating insights, and supporting strategic initiatives across various business units. You will collaborate with internal stakeholders to understand business requirements, work with large datasets, build dashboards, and deliver actionable insights that drive informed business decisions. Your key responsibilities will span data extraction and transformation, reporting and dashboarding, data analysis, stakeholder collaboration, data quality and governance, and communication and documentation.

To excel in this role, you must possess a Bachelor's degree in Computer Science, Mathematics, Statistics, Economics, Engineering, or a related field, along with 2-3 years of hands-on experience in data analytics or business intelligence roles. Strong analytical thinking, proficiency in SQL, experience with ETL processes, and knowledge of Excel and data visualization tools like Tableau or Power BI are essential, as are excellent communication skills, attention to detail, and the ability to manage multiple priorities and deadlines.

Preferred or bonus skills include exposure to scripting languages like Python or R, experience with cloud platforms and tools (e.g., AWS, Snowflake, Google BigQuery), prior experience in the financial services or fintech domain, and an understanding of data modeling and warehousing concepts. In return, we offer a collaborative, inclusive, and intellectually stimulating work environment, opportunities for learning and growth through hands-on projects and mentorship, and the chance to work with data that drives real business impact.

Posted 1 week ago


2.0 - 6.0 years

0 Lacs

Karnataka

On-site

At TRKKN, we are dedicated to driving our clients' data-driven businesses to new heights. As the leading European Google partner and reseller of Google Cloud and the Google Marketing Platform, we serve as a catalyst for cutting-edge digital marketing solutions. Our team is diverse and international, and operates in a fast-growing digital and technological environment, providing an ideal and dynamic work setting for individuals with a hunger for growth, responsibility, and shaping their future. With Google in close proximity, the learning and development opportunities at TRKKN are limitless.

As a Junior Consultant Digital Analytics at TRKKN, you will play a crucial role in enabling our clients to maximize the potential of their data and achieve their digital marketing objectives. You will lead the implementation, configuration, and auditing of Google Analytics 4 (GA4) and Google Tag Manager (GTM) setups, focusing on large e-commerce and hospitality clients. Your responsibilities will include designing and executing comprehensive measurement strategies, collaborating with clients' teams to define KPIs, developing data visualization dashboards, troubleshooting tracking issues, performing advanced analysis, and potentially contributing to server-side GTM implementations and data integration projects.

The ideal candidate holds a Bachelor's degree in a related field, has a minimum of 2 years of hands-on experience with Google Analytics and Google Tag Manager, and is experienced in working with large-scale e-commerce and hospitality clients. A strong understanding of web analytics principles, data layers, and digital marketing concepts, proficiency in debugging and troubleshooting using developer tools, familiarity with data visualization tools, exceptional communication skills, and the ability to manage multiple projects in a fast-paced consulting environment are key qualifications we are looking for.

At TRKKN, we empower you to take responsibility from the start, offering enormous growth potential within the company. We value your contribution to our success and provide a competitive salary, knowledge sharing, personal development plans, leadership training, and a hybrid work model. If you identify with our DNA and are looking to grow with us, we invite you to connect with us for a personal chat.
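Server-side measurement work of the kind this role may involve often means sending events to GA4 outside the browser. Below is a hedged sketch using the GA4 Measurement Protocol over plain HTTP; the measurement ID, API secret, and event payload are placeholder assumptions, not client details.

```python
# GA4 Measurement Protocol sketch -- server-side event delivery.
# MEASUREMENT_ID and API_SECRET are hypothetical placeholders.
import requests

MEASUREMENT_ID = "G-XXXXXXX"   # a GA4 web stream's measurement ID
API_SECRET = "replace-me"      # created in the GA4 admin UI

payload = {
    "client_id": "555.1234567890",  # anonymous device/browser ID
    "events": [
        {
            "name": "purchase",
            "params": {"currency": "EUR", "value": 129.0, "transaction_id": "T-1001"},
        }
    ],
}

resp = requests.post(
    "https://www.google-analytics.com/mp/collect",
    params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
    json=payload,
    timeout=10,
)
resp.raise_for_status()
# Note: the endpoint returns 2xx even for malformed events; use the
# /debug/mp/collect endpoint during development to validate payloads.
```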

Posted 1 week ago


5.0 - 10.0 years

8 - 12 Lacs

Pune, Maharashtra, India

On-site

Job Description
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must Have Skills: Google BigQuery
Good to Have Skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Job Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions to support data generation, collection, and processing. This role involves creating data pipelines, ensuring data quality, and implementing ETL (Extract, Transform, Load) processes to migrate and deploy data across systems. Your day will primarily revolve around working on data solutions, optimizing data pipelines, and ensuring data compliance and quality.

Roles & Responsibilities:
- Subject Matter Expertise (SME): Act as an SME in the field of data engineering, leveraging your expertise to guide the team and make informed decisions.
- Team Collaboration & Management: Collaborate with and manage the team to ensure optimal performance and meet project goals. Take responsibility for team decisions and engage with multiple teams, contributing to key decisions impacting the project and the organization.
- Data Architecture & Design: Lead the data architecture design and implementation for various data solutions. Work on data modeling and database design, ensuring systems are efficient, scalable, and meet business needs.
- Pipeline Optimization: Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data flow across systems.
- Data Quality & Compliance: Conduct data quality assessments and ensure the data meets quality standards. Ensure data compliance with regulatory standards and organizational policies.
- ETL Implementation: Lead the design and implementation of ETL processes for migrating and integrating data between various systems.

Professional & Technical Skills:
- Proficiency in Google BigQuery (must have): expertise in Google BigQuery for large-scale data processing and querying.
- Data Modeling & Database Design: strong understanding of data modeling concepts and database design principles.
- Cloud-Based Data Solutions: experience working with cloud-based data solutions, especially those integrated with BigQuery.
- SQL & Scripting Languages: proficiency in SQL and scripting languages used for data processing and transformation.
- Data Warehousing: hands-on experience with data warehousing concepts, ensuring data is properly stored, organized, and easily accessible.

Additional Information:
- Experience: a minimum of 7.5 years of experience working with Google BigQuery and related data technologies.
- Location: this position is based at our Pune office.
- Education: 15 years of full-time education is required.
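The "data quality assessments" responsibility above often reduces to a set of rule checks run as SQL. Here is a minimal sketch of that pattern against BigQuery; the rules and table names are illustrative assumptions, not the employer's actual checks.

```python
# Rule-based data-quality checks as SQL against BigQuery.
# Tables, columns, and thresholds are invented placeholders.
from google.cloud import bigquery

client = bigquery.Client()

CHECKS = {
    "no_null_keys": "SELECT COUNT(*) FROM `sales.orders` WHERE order_id IS NULL",
    "no_future_dates": "SELECT COUNT(*) FROM `sales.orders` WHERE order_date > CURRENT_DATE()",
    "no_duplicate_keys": """
        SELECT COUNT(*) FROM (
          SELECT order_id FROM `sales.orders`
          GROUP BY order_id HAVING COUNT(*) > 1
        )
    """,
}

failures = {}
for name, sql in CHECKS.items():
    bad_rows = list(client.query(sql).result())[0][0]  # single COUNT(*) cell
    if bad_rows:
        failures[name] = bad_rows

if failures:
    raise RuntimeError(f"Data-quality checks failed: {failures}")
print("All data-quality checks passed")
```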

Posted 1 week ago


5.0 - 10.0 years

5 - 10 Lacs

Pune, Maharashtra, India

On-site

Job Description
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must Have Skills: Databricks Unified Data Analytics Platform
Good to Have Skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Job Summary: The Data Platform Engineer role falls under the data engineering or data architecture function, with a focus on data platform design, integration, and the application of machine learning algorithms. Below is a detailed overview.

Key Responsibilities:
- Assist with the Data Platform Blueprint and Design: collaborate with Integration Architects and Data Architects to create a cohesive design and blueprint for the data platform, ensuring seamless integration between systems and data models.
- Collaborate with Cross-functional Teams: work closely with teams to make key decisions and provide solutions to challenges across teams, ensuring alignment and smooth execution of the project.
- Develop and Maintain Data Platform Components: contribute to the development and maintenance of key data platform components that enable efficient data processing, storage, and analytics.
- Drive the Success of the Project: play an active role in ensuring the project's success by applying expertise in platform design, integration, and data management.

Professional & Technical Skills:
- Proficiency in the Databricks Unified Data Analytics Platform (must have): a deep understanding of Databricks, particularly how it integrates with big data platforms and analytics workflows.
- Statistical Analysis and Machine Learning Algorithms: a strong foundation in implementing and applying machine learning models such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Data Visualization: experience with tools like Tableau or Power BI for presenting data insights and analytics visually.
- Data Munging and Cleaning: hands-on experience in cleaning, transforming, and normalizing data to ensure it is ready for analysis and reporting.

Additional Information:
- Experience: a minimum of 5 years of experience working with the Databricks Unified Data Analytics Platform.
- Location: this role is based in Pune.
- Education: a 15 years full-time education (typically equivalent to a bachelor's degree) is required.
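As a hedged sketch of one algorithm the posting names, here is logistic regression on Spark MLlib, as you might run it in a Databricks notebook. The feature columns, label column, and data path are invented placeholders.

```python
# Logistic regression with Spark MLlib -- illustrative only.
# Assumes a numeric 'churned' label column; all names are placeholders.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("churn-demo").getOrCreate()

df = spark.read.parquet("/mnt/lake/churn_features")  # placeholder path

# MLlib expects features packed into a single vector column.
assembler = VectorAssembler(
    inputCols=["tenure_months", "monthly_spend", "support_tickets"],
    outputCol="features",
)
train, test = assembler.transform(df).randomSplit([0.8, 0.2], seed=42)

model = LogisticRegression(labelCol="churned", featuresCol="features").fit(train)
auc = model.evaluate(test).areaUnderROC  # held-out discrimination metric
print(f"Test AUC: {auc:.3f}")
```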

Posted 1 week ago


5.0 - 10.0 years

4 - 8 Lacs

Indore, Madhya Pradesh, India

On-site

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation in and contribution to team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects.
- Ensure cohesive integration between systems and data models.
- Implement data platform components.
- Troubleshoot and resolve data platform issues.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Posted 1 week ago


3.0 - 5.0 years

4 - 8 Lacs

Chennai, Tamil Nadu, India

On-site

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation in and contribution to team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects.
- Ensure cohesive integration between systems and data models.
- Implement data platform components.
- Troubleshoot and resolve data platform issues.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Posted 1 week ago


2.0 - 5.0 years

4 - 8 Lacs

Indore, Madhya Pradesh, India

On-site

Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must Have Skills: Google BigQuery
Good to Have Skills: Microsoft SQL Server, Google Cloud Data Services

Job Requirements:

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to perform independently and become a subject matter expert (SME).
- Actively participate in team discussions and contribute to problem-solving.
- Develop and maintain data pipelines.
- Ensure data quality throughout the data lifecycle.
- Implement ETL processes for data migration and deployment.
- Collaborate with cross-functional teams to understand data requirements.
- Optimize data storage and retrieval processes.

Professional & Technical Skills:
- Must have: proficiency in Google BigQuery.
- Strong understanding of data engineering principles.
- Experience with cloud-based data services.
- Knowledge of SQL and database management systems.
- Hands-on experience with data modeling and schema design.

Additional Information: The candidate should have a minimum of 3 years of experience in Google BigQuery. A 15 years full-time education is required.

Posted 1 week ago


2.0 - 6.0 years

0 Lacs

Karnataka

On-site

At TRKKN, we are dedicated to driving our clients' data-driven businesses to new heights. As the leading European Google partner and reseller of Google Cloud and the Google Marketing Platform, we serve as a catalyst for cutting-edge digital marketing solutions. Our team is diverse and international, and operates in a fast-growing digital and technological environment, providing an ideal work setting for individuals with a hunger for growth and responsibility. The close proximity to Google offers limitless learning and development opportunities at TRKKN.

As a Junior Consultant Digital Analytics at TRKKN, you will play a crucial role in helping clients achieve digital maturity. You will lead the implementation, configuration, and auditing of Google Analytics 4 (GA4) and Google Tag Manager (GTM) setups for various client portfolios, focusing on large e-commerce and hospitality clients. Your responsibilities will include designing and executing comprehensive measurement strategies, collaborating with clients' teams to define KPIs, developing data visualization dashboards, troubleshooting tracking issues, performing advanced analysis, and potentially contributing to server-side GTM implementations and data integration projects.

We are looking for individuals who hold a Bachelor's degree in a related field, have a minimum of 2 years of hands-on experience implementing and managing Google Analytics and Google Tag Manager, and are experienced in working with large-scale e-commerce and hospitality clients. A strong understanding of web analytics principles, proficiency in debugging and troubleshooting using developer tools, familiarity with data visualization tools like Looker Studio and BigQuery, exceptional communication skills, and the ability to manage multiple projects in a fast-paced consulting environment are essential qualities we seek in candidates.

At TRKKN, we offer a competitive salary, knowledge-sharing opportunities, company events, personal development plans, leadership training, and a hybrid work model. We empower you to take responsibility and provide an environment with tremendous growth potential within the company. If you identify with our values and want to be a part of our success story, we look forward to having a personal chat with you! :)

Posted 1 week ago


1.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Associate Manager - Data IntegrationOps, you will play a crucial role in supporting and managing data integration and operations programs within our data organization. Your responsibilities will involve maintaining and optimizing data integration workflows, ensuring data reliability, and supporting operational excellence. To succeed in this position, you will need a solid understanding of enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support.

Your primary duties will include assisting in the management of Data IntegrationOps programs, aligning them with business objectives, data governance standards, and enterprise data strategies. You will also be involved in monitoring and enhancing data integration platforms through real-time monitoring, automated alerting, and self-healing capabilities to improve uptime and system performance. Additionally, you will help develop and enforce data integration governance models, operational frameworks, and execution roadmaps to ensure smooth data delivery across the organization.

Collaboration with cross-functional teams will be essential to optimize data movement across cloud and on-premises platforms, ensuring data availability, accuracy, and security. You will also contribute to promoting a data-first culture by aligning with PepsiCo's Data & Analytics program and supporting global data engineering efforts across sectors. Continuous-improvement initiatives to enhance the reliability, scalability, and efficiency of data integration processes will also be part of your responsibilities.

Furthermore, you will support data pipelines using ETL/ELT tools such as Informatica IICS, PowerCenter, DDH, SAP BW, and Azure Data Factory under the guidance of senior team members. Developing API-driven data integration solutions using REST APIs and Kafka, deploying and managing cloud-based data platforms like Azure Data Services, AWS Redshift, and Snowflake, and participating in implementing DevOps practices using tools like Terraform, GitOps, Kubernetes, and Jenkins will also be part of your role.

Your qualifications should include at least 9 years of technology work experience in a large-scale, global organization, preferably in the CPG (Consumer Packaged Goods) industry, along with 4+ years of experience in Data Integration, Data Operations, and Analytics and experience working in cross-functional IT organizations. Leadership or management experience supporting technical teams and hands-on experience in monitoring and supporting SAP BW processes are also required.

In summary, as an Associate Manager - Data IntegrationOps, you will be responsible for supporting and managing data integration and operations programs, collaborating with cross-functional teams, and ensuring the efficiency and reliability of data integration processes. Your expertise in enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support will be key to your success in this role.
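The "REST APIs and Kafka" integration pattern mentioned above can be sketched in a few lines: poll an HTTP endpoint and publish records to a Kafka topic. This uses the confluent-kafka client; the endpoint URL, topic, and broker address are placeholder assumptions.

```python
# REST -> Kafka integration sketch; all endpoints and names are
# hypothetical placeholders, not PepsiCo systems.
import json

import requests
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder broker

def delivery_report(err, msg):
    """Log per-message delivery results reported by the broker."""
    if err is not None:
        print(f"Delivery failed: {err}")

# Pull a page of records from a (hypothetical) source API.
records = requests.get("https://api.example.com/v1/shipments", timeout=30).json()

for rec in records:
    producer.produce(
        topic="shipments.raw",
        key=str(rec["id"]),
        value=json.dumps(rec),
        on_delivery=delivery_report,
    )

producer.flush()  # block until all buffered messages are delivered
```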

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You are a seasoned Data Engineer with expertise in SQL, Python, and AWS, responsible for designing and managing data solutions. Your role involves:
- Understanding and analyzing client business requirements
- Recommending modern data tools
- Developing and maintaining data pipelines and ETL processes
- Creating and optimizing data models for client reporting and analytics
- Ensuring seamless data integration and visualization
- Communicating with clients for updates and issue resolution
- Staying updated on industry best practices and emerging technologies

You should have 3-5 years of experience in data engineering or analytics, proficiency in SQL and Python for data manipulation and analysis, and knowledge of PySpark. Experience with data warehouse platforms like Redshift and Google BigQuery, familiarity with AWS services such as S3, Glue, and Athena, proficiency in Airflow, and familiarity with event-tracking platforms like GA or Amplitude are expected, along with strong problem-solving skills, adaptability, excellent communication skills, proactive client engagement, and the ability to collaborate effectively with team members and clients.

In return, you will receive a competitive salary with a bonus, employee discounts across all brands, medical and health insurance, a collaborative work environment, and a good-vibes work culture.
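For the AWS side of this stack, querying S3 data through Athena is a common task; here is a hedged boto3 sketch. The region, database, output bucket, and table are invented placeholders.

```python
# Athena query via boto3 -- Athena is asynchronous, so poll for status.
# All names (region, database, bucket) are hypothetical.
import time

import boto3

athena = boto3.client("athena", region_name="ap-south-1")

exec_id = athena.start_query_execution(
    QueryString="SELECT event_name, COUNT(*) AS n FROM events GROUP BY event_name",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=exec_id)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=exec_id)["ResultSet"]["Rows"]
    for row in rows[1:]:  # the first row is the header
        print([col.get("VarCharValue") for col in row["Data"]])
```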

Posted 1 week ago


6.0 - 11.0 years

16 - 27 Lacs

Chennai

Hybrid

Role & Responsibilities:
- Bachelor's or Master's degree in Information Systems, Computer Science, Business, or a related field.
- 6+ years of experience in business systems architecture, enterprise applications management, or IT consulting.

Technical Expertise:
- Deep understanding of ERP, CRM, CLM, HRIS, and financial systems (e.g., Salesforce, NetSuite, Oracle, Bob, Ironclad).
- Strong hands-on experience with data architecture and analytics platforms, including:
  - Google BigQuery (data modeling, table design)
  - dbt for data transformation
  - Fivetran or similar ETL/ELT tools for ingestion
  - Tableau (or equivalent BI tools) for building interactive dashboards and reports
- Experience with API integration, Workato, and automation tools.
- Proficiency in cloud platforms (AWS, Azure, or GCP) and SaaS solutions.

Business Acumen & Leadership:
- Strong analytical skills and the ability to translate business needs into technical solutions.
- Proven ability to lead cross-functional teams and drive digital transformation initiatives.

Communication & Problem-Solving:
Note from the hiring team: interviews for this role have been moving slowly, and candidates have not been meeting some of the technical expertise requirements, especially a deep understanding of CRMs, ERPs, CLMs, iPaaS, and internal systems overall (experience typically seems limited to one system, e.g., ERP). The same applies to overall enterprise application management experience: most candidates have system implementation rather than system management experience (which is acceptable, but still tends to be limited to a single system).

Posted 1 week ago


5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior Business Intelligence (BI) Analyst at SolarWinds, you will play a pivotal role in driving data-driven decision-making throughout the organization. Your strategic mindset and expertise in BI tools, data visualization, and advanced analytics will be crucial in transforming raw data into actionable insights that enhance business performance and operational efficiency.

Your responsibilities will include developing, maintaining, and optimizing BI dashboards and reports to support business decision-making. You will extract, analyze, and interpret complex datasets from multiple sources to identify trends and opportunities. Collaborating with cross-functional teams, you will define business intelligence requirements and deliver insightful solutions. Presenting key findings and recommendations to senior leadership and stakeholders will be a key aspect of your role. Ensuring data accuracy, consistency, and governance by implementing best practices in data management will be essential. You will also conduct advanced analytics to drive strategic initiatives and mentor junior BI analysts to enhance the overall team capability.

To excel in this role, you should hold a Bachelor's degree in Business Analytics, Computer Science, or a related field, along with at least 5 years of experience in business intelligence, data analysis, or a similar role. Proficiency in BI tools such as Tableau and Power BI and in SQL for querying and data manipulation is required. Experience with ETL processes, data warehousing, and database management is important, with expertise in Tableau preferred. An understanding of Google BigQuery and experience with cloud platforms like AWS and Azure will be beneficial.

If you are a collaborative, accountable, and empathetic individual who thrives in a fast-paced environment and believes in the power of teamwork to drive lasting growth, then SolarWinds is the place for you. Join us in our mission to accelerate business transformation with simple, powerful, and secure solutions, grow your career with us, and make a meaningful impact in a people-first company. Please note that all applications are treated in accordance with the SolarWinds Privacy Notice.

Posted 1 week ago


6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As an ETL Data Engineer with 6-9 years of experience, you will collaborate with analysts and data architects to develop and test ETL pipelines using SQL and Python in Google BigQuery. Your primary role will involve performing the related data quality checks, implementing validation frameworks, and optimizing BigQuery queries for performance and cost-efficiency. You must be available for full-time engagement and possess the following technical skills:
- SQL (advanced level): a strong command of complex SQL logic, including window functions, CTEs, and pivot/unpivot, plus proficiency in stored procedure and SQL script development. Experience writing maintainable SQL for transformations is crucial.
- Python for ETL: expertise in writing modular and reusable ETL logic in Python, along with familiarity with JSON manipulation and API consumption.
- Google BigQuery: hands-on development experience in the BigQuery environment, including an understanding of partitioning, clustering, and performance tuning.
- ETL pipeline development: experience developing ETL/ELT pipelines with data profiling, validation, quality and health checks, error handling, logging, and notifications.

Nice-to-have skills include experience with the Google BigQuery platform and knowledge of CI/CD practices for data workflows. This role is full-time and permanent, offering benefits such as cell phone reimbursement, health insurance, paid time off, and a provident fund. The work schedule is a day shift, and the work is performed in person in Noida, Pune, or Chennai.
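As an illustration of the window-function-plus-CTE SQL this role tests for, here is a small query run through the BigQuery Python client; the schema is invented for the example.

```python
# Window function + CTE example executed via the BigQuery client.
# Dataset, table, and columns are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
WITH ranked AS (
  SELECT
    customer_id,
    order_id,
    order_total,
    ROW_NUMBER() OVER (
      PARTITION BY customer_id ORDER BY order_ts DESC
    ) AS rn
  FROM `shop.orders`
)
SELECT customer_id, order_id, order_total
FROM ranked
WHERE rn = 1  -- most recent order per customer
"""

for row in client.query(sql).result():
    print(row.customer_id, row.order_id, row.order_total)
```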

Posted 1 week ago


1.0 - 5.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Analyst with 1+ years of experience in AdTech, you will be an integral part of our analytics team. Your primary role will involve analyzing large-scale advertising and digital media datasets to support business decisions. You will work with AdTech data such as ads.txt, programmatic delivery, campaign performance, and revenue metrics.

Your responsibilities will include designing, developing, and maintaining scalable data pipelines using GCP-native tools like Cloud Functions, Dataflow, and Composer. You will write and optimize complex SQL queries in BigQuery for data extraction and transformation, and build and maintain dashboards and reports in Looker Studio to visualize key performance indicators (KPIs) and campaign performance. Collaboration with cross-functional teams, including engineering, operations, product, and client teams, will be crucial as you gather requirements and deliver analytics solutions. Monitoring data integrity, identifying anomalies, and working on data quality improvements will also be part of your role.

To be successful in this role, you should have a minimum of 1 year of experience in a data analytics or business intelligence role. Hands-on experience with AdTech datasets, strong proficiency in SQL (especially with Google BigQuery), and experience building data pipelines using Google Cloud Platform (GCP) tools are essential, as are proficiency in Looker Studio, problem-solving skills, attention to detail, and excellent communication skills.

Preferred qualifications include experience with additional visualization tools such as Tableau, Power BI, or Looker (BI), exposure to data orchestration tools like Apache Airflow (via Cloud Composer), familiarity with Python for scripting or automation, and an understanding of cloud data architecture and AdTech integrations (e.g., DV360, Ad Manager, Google Ads).

Posted 2 weeks ago


1.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer, you will be responsible for building and maintaining scalable ETL/ELT data pipelines using Python and cloud-native tools. You will design and optimize data models and queries on Google BigQuery for analytical workloads, and develop, schedule, and monitor workflows using orchestration tools like Apache Airflow or Cloud Composer.

Your role will involve ingesting and integrating data from multiple structured and semi-structured sources such as MySQL, MongoDB, APIs, and cloud storage. Ensuring data integrity, security, and quality through validation, logging, and monitoring systems will be essential, as will collaborating with analysts and data consumers to understand requirements and deliver clean, usable datasets. You will also implement data governance, lineage tracking, and documentation as part of platform hygiene.

You should have 1 to 7 years of experience in data engineering or backend development. Strong experience with Google BigQuery and GCP (Google Cloud Platform) is a must-have for this role, and proficiency in Python for scripting, automation, and data manipulation is essential, as is a solid understanding of SQL and experience with relational databases like MySQL. Experience working with MongoDB and semi-structured data (e.g., JSON and nested formats), exposure to data warehousing, data modeling, and performance tuning, and familiarity with Git-based version control and CI/CD practices are also preferred.
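The semi-structured ingestion path described above (MongoDB into BigQuery) can be sketched briefly with pymongo and the BigQuery client. The connection URI, database, collection, and destination table are placeholder assumptions, as is the shape of the documents.

```python
# MongoDB -> BigQuery ingestion sketch; all names and the document
# shape are hypothetical placeholders.
from google.cloud import bigquery
from pymongo import MongoClient

mongo = MongoClient("mongodb://localhost:27017")  # placeholder URI
bq = bigquery.Client()

# Project only the fields the warehouse needs; _id comes along by default.
docs = [
    {
        "user_id": str(d["_id"]),
        "plan": d.get("plan"),
        "signup_ts": d["signup_ts"].isoformat(),  # assumes a datetime field
    }
    for d in mongo.app.users.find({}, {"plan": 1, "signup_ts": 1})
]

job = bq.load_table_from_json(
    docs,
    "raw.users_snapshot",
    job_config=bigquery.LoadJobConfig(
        write_disposition="WRITE_TRUNCATE",  # full refresh each run
        autodetect=True,
    ),
)
job.result()  # wait for the load job to finish
print(f"Loaded {job.output_rows} rows")
```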

Posted 2 weeks ago


6.0 - 10.0 years

0 Lacs

Haryana

On-site

You will be responsible for leading and managing the delivery of projects and for achieving project and team goals. Your tasks will include building and supporting data ingestion and processing pipelines, designing and maintaining machine learning infrastructure, and leading client engagement on technical projects. You will define project scopes, track progress, and allocate work to the team. It will be essential to stay updated on big data technologies and conduct pilots to design scalable data architecture. Collaboration with software engineering teams to drive multi-functional projects to completion will also be a key aspect of your role.

To excel in this position, we expect you to have a minimum of 6 years of experience in data engineering, with at least 2 years in a leadership role, plus experience working with global teams and remote clients. Hands-on experience building data pipelines across various infrastructures, knowledge of statistical and machine learning techniques, and the ability to integrate machine learning into data pipelines are essential. Proficiency in advanced SQL, data warehousing concepts, and data mart design is necessary, along with strong familiarity with modern data platform components like Spark and Python, and experience with data warehouses (e.g., Google BigQuery, Redshift, Snowflake) and data lakes (e.g., GCS, AWS S3). Experience setting up and maintaining data pipelines with AWS Glue, Azure Data Factory, and Google Dataflow, and with relational SQL and NoSQL databases, is also required, together with excellent problem-solving and communication skills.

Posted 2 weeks ago
