
732 BigQuery Jobs - Page 24

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 6.0 years

20 - 25 Lacs

Pune

Work from Office


Tableau Position
Mandatory Skills: Strong Tableau dashboard creation and SQL skills
Desired Skills: Experience with a data warehouse such as BigQuery; Ad Tech domain knowledge
Work Location: Pune
Experience: 5-6 years of Tableau dashboard creation
Job Responsibilities:
Understand dashboard requirements and create effective Tableau dashboards that meet the client's UX standards. Strong grounding in SQL and data warehouse concepts. Experience creating extracts and working with Tableau Server.
Data Understanding: Collaborate with business stakeholders to understand their data requirements and reporting needs. Analyze complex datasets to identify key insights and trends.
Data Preparation: Cleanse, transform, and prepare data for visualization. Use SQL queries to extract and manipulate data from various sources, such as databases, data warehouses, and cloud platforms.
Dashboard Creation: Design and develop visually appealing and interactive dashboards using Tableau's drag-and-drop interface. Create custom calculations, parameters, and filters to provide dynamic insights. Format dashboards to adhere to client branding and UX standards.
Data Storytelling: Present data in a clear and compelling manner, using charts, graphs, and other visualization techniques. Tailor dashboards to specific audiences, adjusting the level of detail and complexity.
Performance Optimization: Optimize dashboard performance to ensure fast load times and smooth interactions. Implement best practices for data extraction, calculation, and visualization.
Collaboration: Work closely with data analysts, data engineers, and business users to deliver timely and accurate insights. Provide technical support and training to end users.
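The SQL-plus-extract workflow this role describes can be sketched with the BigQuery Python client. This is a minimal illustration only: the project, dataset, and column names below are hypothetical, and the output CSV simply stands in for whatever source a Tableau extract is built from.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical example: aggregate ad-tech events into a slim, dashboard-ready
# result set that a Tableau extract or live connection can point at.
client = bigquery.Client(project="my-project")  # assumes default credentials

sql = """
    SELECT
        DATE(event_timestamp)         AS event_date,
        campaign_id,
        COUNT(*)                      AS impressions,
        COUNTIF(event_type = 'click') AS clicks
    FROM `my-project.adtech.events`
    WHERE DATE(event_timestamp) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY event_date, campaign_id
"""

# to_dataframe() needs the db-dtypes package; the frame can then be written
# out as the source for a Tableau extract.
df = client.query(sql).to_dataframe()
df.to_csv("campaign_daily.csv", index=False)
```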

Posted 1 month ago

Apply

3.0 - 5.0 years

20 - 25 Lacs

Pune

Work from Office


Mandatory skills: 1) SQL (expert) 2) Data warehouse / ETL background 3) dbt (intermediate); should be able to learn quickly.
Desirable skills: 1) BigQuery 2) GitHub
Work Location: Pune
Employment Type: Contract (6 months or more)
Notice Period: Immediate to 15 days
Experience: 3-4 years of relevant experience
Roles and Responsibilities: Understand client requirements and create models using dbt on BigQuery on GCP; develop new functionality and maintain existing functionality.
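For context, a dbt model targeting BigQuery is a SQL file with Jinja configuration, and a run is triggered from the dbt CLI. The sketch below shows the general shape only: the dataset, source, and column names are made up, and it assumes an existing dbt project configured with a BigQuery profile.

```python
import subprocess
from pathlib import Path

# Hypothetical dbt model for BigQuery, materialized as a date-partitioned table.
# The config() block and {{ source(...) }} follow standard dbt-bigquery usage;
# table and column names are illustrative only.
MODEL_SQL = """
{{ config(
    materialized='table',
    partition_by={'field': 'order_date', 'data_type': 'date'}
) }}

select
    order_id,
    customer_id,
    date(created_at) as order_date,
    sum(amount)      as order_amount
from {{ source('raw', 'orders') }}
group by order_id, customer_id, order_date
"""

# Drop the model into an existing dbt project and run just that model.
Path("models/marts/fct_orders.sql").write_text(MODEL_SQL)
subprocess.run(["dbt", "run", "--select", "fct_orders"], check=True)
```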

Posted 1 month ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Noida

Hybrid


Data Engineer (SaaS-Based) || 5-7 years || Noida || 3 PM-12 AM IST shift
Location: Noida (in-office/hybrid; client site if required)
Experience: 5-7 years
Type: Full-Time | Immediate Joiners Preferred
Shift: 3 PM to 12 AM IST
Client: Leading Canadian-based tech company
Good to have: GCP Certified Data Engineer
Overview of the role: As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable enough for the size and scope of the company. You will build custom pipelines as well as migrate on-prem data pipelines to the GCP stack. You will be part of a team tackling intricate problems by designing and deploying reliable and scalable solutions tailored to the company's data landscape.
Required Skills:
5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
Extensive experience in requirement discovery, analysis, and data pipeline solution design.
Design, build, and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others.
Build modular code for reusable pipelines and ingestion frameworks that simplify loading data into the data lake or data warehouse from multiple sources.
Work closely with analysts and business process owners to translate business requirements into technical solutions.
Coding experience in scripting languages (Python, SQL, PySpark).
Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, GCP Workflows, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM).
Exposure to Google Dataproc and Dataflow.
Maintain the highest levels of development practice, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code with repeatable quality and predictability.
Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
Experience with SAS/SQL Server/SSIS is an added advantage.
Qualifications:
Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
GCP Certified Data Engineer (preferred).
Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions to engineering teams and business audiences.
Job Type: Full-time
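A typical ingestion step this role describes, loading files landed in Cloud Storage into BigQuery, can be sketched with the BigQuery client library. Bucket, dataset, and table names are hypothetical; a production framework would wrap this in configuration, retries, and logging.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

# Hypothetical target table and source path.
table_id = "my-project.analytics.orders"
uri = "gs://my-landing-bucket/orders/2024-06-01/*.parquet"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Kick off the load job and block until it finishes.
load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```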

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad

Work from Office


We're Hiring: Senior GCP Data Engineer (7+ Years Experience)
Location: Hyderabad (Work from Office - Mandatory)
Apply Now: sasidhar.m@technogenindia.com
Are you a passionate Data Engineer with deep expertise in Google Cloud Platform (GCP) and strong hands-on experience in data migration projects? Do you bring solid knowledge of Oracle to the table and thrive in a fast-paced, collaborative environment? TechnoGen India is looking for a Senior GCP Data Engineer to join our Hyderabad team. This is a full-time, on-site opportunity designed for professionals ready to take on challenging migration projects and deliver impactful solutions.
What We're Looking For:
7+ years of experience in Data Engineering
Strong expertise in GCP (BigQuery, Dataflow, Pub/Sub, etc.)
Proven experience in complex GCP migration projects
Solid Oracle background (data extraction, transformation, and optimization)
Ability to work full-time from our Hyderabad office
If you're ready to bring your skills to a growing team that values innovation and excellence, we want to hear from you!
Best Regards, Sasidhar M | Sr IT Recruiter | sasidhar.m@technogenindia.com | www.technogenindia.com

Posted 1 month ago

Apply

5.0 - 9.0 years

8 - 18 Lacs

Bengaluru

Remote


Company: Forbes Advisory
Location: Remote
Role: Database Engineer
Experience: 5+ years
Notice Period: Immediate joiners preferred, or a notice period of less than 60 days.
Major skill set: Python, SQL, OOP, AWS RDS, and Google BigQuery

Posted 1 month ago

Apply

4.0 - 6.0 years

4 - 7 Lacs

Pune

Work from Office


Job Summary
We are looking for a Data Quality Engineer who will safeguard the integrity of our cloud-native data assets. You will design and execute automated and manual data-quality checks across structured and semi-structured sources on Azure and GCP, validating that our data pipelines deliver accurate, complete, and consistent datasets for analytics, reporting, and AI initiatives.
Key Responsibilities
Define, build, and maintain data-quality frameworks that measure accuracy, completeness, timeliness, consistency, and validity of data ingested through ETL/ELT pipelines.
Develop automated tests using SQL, Python, or similar tools; supplement with targeted manual validation where required.
Collaborate with data engineers to embed data-quality gates into CI/CD pipelines on Azure Data Factory / Synapse / Fabric and GCP Dataflow / Cloud Composer.
Profile new data sources (structured and semi-structured: JSON, Parquet, Avro) to establish baselines, detect anomalies, and recommend cleansing or transformation rules.
Monitor data-quality KPIs and publish dashboards/alerts that surface issues to stakeholders in near real time.
Conduct root-cause analysis for data-quality defects, propose remediation strategies, and track resolution to closure.
Maintain comprehensive documentation of test cases, data-quality rules, lineage, and issue logs for audit and governance purposes.
Partner with data governance, security, and compliance teams to ensure adherence to regulatory requirements.
Must-Have Skills
4-6 years of experience in data quality, data testing, or data engineering roles within cloud environments.
Hands-on expertise with at least one major cloud data stack: Azure (Data Factory, Synapse, Databricks/Fabric) or GCP (BigQuery, Dataflow, Cloud Composer).
Strong SQL skills and proficiency in a scripting language such as Python for building automated validation routines.
Solid understanding of data-modeling concepts (dimensional, 3NF, data vault) and how they impact data-quality rules.
Experience testing semi-structured data formats (JSON, XML, Avro, Parquet) and streaming/near-real-time pipelines.
Excellent analytical and communication skills; able to translate complex data issues into clear, actionable insights for technical and business stakeholders.
Nice-to-Have Skills
Familiarity with BI/reporting tools (Power BI, Looker, Tableau) for surfacing data-quality metrics.
Preferred Certifications
Google Professional Data Engineer or Associate Cloud Engineer (GCP track), or Microsoft Certified: Azure Data Engineer Associate.
Education
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Mathematics, or a related field. Comparable professional experience will also be considered.
Why Join Us?
You will be the guardian of our data's trustworthiness, enabling decision-makers to rely on insights with confidence. If you are passionate about building automated, scalable data-quality solutions in a modern cloud environment, we'd love to meet you.
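To make the automated-check idea in this listing concrete, here is a minimal sketch of SQL-based validations run against BigQuery from Python. The table, columns, and thresholds are hypothetical; a real framework would log results and gate the pipeline rather than simply raising an error.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table under test.
TABLE = "my-project.sales.orders"

# Each rule is a query that should return zero "bad" rows.
RULES = {
    "order_id must not be null":
        f"SELECT COUNT(*) AS bad FROM `{TABLE}` WHERE order_id IS NULL",
    "amount must be non-negative":
        f"SELECT COUNT(*) AS bad FROM `{TABLE}` WHERE amount < 0",
    "order_date must not be in the future":
        f"SELECT COUNT(*) AS bad FROM `{TABLE}` WHERE order_date > CURRENT_DATE()",
}

failures = []
for name, sql in RULES.items():
    bad = list(client.query(sql).result())[0].bad
    if bad:
        failures.append(f"{name}: {bad} offending rows")

# In a CI/CD quality gate this would fail the pipeline step.
if failures:
    raise AssertionError("Data-quality checks failed:\n" + "\n".join(failures))
print("All data-quality checks passed.")
```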

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Pune

Work from Office


Job Summary
We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform, primarily BigQuery. You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads.
Key Responsibilities
Gather and analyze business requirements to translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.).
Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost.
Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency.
Establish and maintain data modeling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers.
Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt).
Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies.
Conduct impact assessments for schema changes and guide version-control processes for data models.
Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews.
Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations.
Must-Have Skills
6-8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery.
Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault.
Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies).
Strong understanding of ETL/ELT concepts and experience working with tools such as Dataflow, Cloud Composer, or dbt.
Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance.
Experience with data governance practices: data cataloging, lineage, and metadata management (e.g., Data Catalog).
Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions.
Good to Have
Experience working on Azure Cloud (Fabric, Synapse, Delta Lake).
Education
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
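As an illustration of the partitioning and clustering responsibilities above, the sketch below creates a date-partitioned, clustered BigQuery table with standard DDL issued through the Python client. The dataset, table, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical fact table: partition by day to prune scans on date filters,
# cluster by the most common filter/join keys to further reduce bytes read.
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.marts.fct_sales`
(
  sale_id     STRING,
  customer_id STRING,
  store_id    STRING,
  sale_date   DATE,
  amount      NUMERIC
)
PARTITION BY sale_date
CLUSTER BY customer_id, store_id
OPTIONS (
  require_partition_filter = TRUE  -- force queries to prune partitions
)
"""

client.query(ddl).result()
```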

Posted 1 month ago

Apply

4.0 - 9.0 years

8 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Required Skills: SQL, GCP (BigQuery, Composer, Dataflow), Big Data (Scala, Kafka)
You'll need to have:
Experience in Big Data technologies: GCP / Composer / BigQuery / Dataflow.
Understanding of business requirements and converting them into technical designs.
Working on data ingestion, preparation, and transformation.
Developing data streaming applications.
Debugging production failures and identifying solutions.
Working on ETL/ELT development.
Experience with data warehouse concepts and the data management life cycle.
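Orchestration on Composer typically looks like a small Airflow DAG scheduling BigQuery work. This is a hedged sketch, not the client's pipeline: the DAG id, schedule, and query are placeholders, and it assumes the Google provider package that ships with Cloud Composer.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical daily transformation: raw events -> curated daily aggregate.
with DAG(
    dag_id="daily_events_aggregate",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # 02:00 every day
    catchup=False,
) as dag:

    aggregate_events = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.curated.daily_events` AS
                    SELECT DATE(event_ts) AS event_date, event_type, COUNT(*) AS events
                    FROM `my-project.raw.events`
                    GROUP BY event_date, event_type
                """,
                "useLegacySql": False,
            }
        },
    )
```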

Posted 1 month ago

Apply

13.0 - 17.0 years

22 - 30 Lacs

Pune

Hybrid


Primary Skills: SQL (Data Analysis and Development)
Alternate Skills: Python, SharePoint, AWS, ETL, Telecom (especially the Fixed Network domain)
Location: Pune
Working Persona: Hybrid
Experience: 13 to 18 years
Core competencies, knowledge and experience:
Essential:
- Strong SQL experience
- Advanced level of SQL
- Excellent data interpretation skills
- Good knowledge of ETL and business intelligence; good understanding of a range of data manipulation and analysis techniques
- Working knowledge of large information technology development projects using methodologies and standards
- Excellent verbal, written and interpersonal communication skills, demonstrating the ability to communicate information technology concepts to non-technology personnel; should be able to interact with the business team and share ideas
- Strong analytical, problem solving and decision-making skills; aptitude to plan and organize work to deliver as agreed
- Ability to work under pressure to tight deadlines
- Hands-on experience working with large datasets
- Able to manage different stakeholders
Good to Have / Alternate Skills:
- Strong coding experience in Python
Experience:
- In-depth working experience in ETL
- Fixing problems in cooperation with internal and external partners (e.g. service owner, tech support team, IT Ops)
- Designing and implementing changes to the existing components of the data flow
- Developing and maintaining end-to-end data flows
- Maintaining data quality, resolving data consistency issues, and supporting essential business-critical processes
- Conducting preventative maintenance of the systems
- Driving system optimization and simplification
- Responsible for the performance of the data flow and optimization of data preparation in conjunction with the other technical teams

Posted 1 month ago

Apply

5.0 - 7.0 years

12 - 13 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


We are looking for an experienced Data Engineer with a strong background in data engineering, storage, and cloud technologies. The role involves designing, building, and optimizing scalable data pipelines, ETL/ELT workflows, and data models for efficient analytics and reporting. The ideal candidate must have strong SQL expertise, including complex joins, stored procedures, and certificate-auth-based queries. Experience with NoSQL databases such as Firestore, DynamoDB, or MongoDB is required, along with proficiency in data modeling and warehousing solutions like BigQuery (preferred), Redshift, or Snowflake. The candidate should have hands-on experience working with ETL/ELT pipelines using Airflow, dbt, Kafka, or Spark. Proficiency in scripting languages such as PySpark, Python, or Scala is essential. Strong hands-on experience with Google Cloud Platform (GCP) is a must. Additionally, experience with visualization tools such as Google Looker Studio, LookML, Power BI, or Tableau is preferred. Good-to-have skills include exposure to Master Data Management (MDM) systems and an interest in Web3 data and blockchain analytics.
Location: Remote; Hyderabad, Ahmedabad, Pune, Chennai, Kolkata.
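The advanced SQL this posting asks for usually means analytic (window) functions over warehouse tables. Here is a small hedged example against BigQuery, with made-up table and column names: it ranks each customer's orders by recency and keeps only the latest one.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical window-function query: latest order per customer.
sql = """
    SELECT customer_id, order_id, order_ts, amount
    FROM (
        SELECT
            customer_id,
            order_id,
            order_ts,
            amount,
            ROW_NUMBER() OVER (
                PARTITION BY customer_id
                ORDER BY order_ts DESC
            ) AS rn
        FROM `my-project.sales.orders`
    )
    WHERE rn = 1
"""

for row in client.query(sql).result():
    print(row.customer_id, row.order_id, row.amount)
```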

Posted 1 month ago

Apply

11.0 - 16.0 years

27 - 32 Lacs

Noida

Work from Office


Responsibilities:
- Collaborate with the sales team to understand customer challenges and business objectives, and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI and GenAI solutions.
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions.
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions.
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel.
- Stay up to date on the latest GCP offerings, trends, and best practices.
Experience:
- Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premise databases to scalable and cost-effective solutions on Google Cloud Platform (GCP).
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP.
- Strong experience in BI reporting tools (Looker, Power BI and Tableau).
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI and Gemini (GenAI).
- Strong knowledge and experience in providing solutions to process massive datasets in real time and in batch using cloud-native/open-source orchestration techniques.
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data.
- Strong knowledge and experience in best practices for data governance, security, and compliance.
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs.
- Strong analytical and problem-solving skills.
- Ability to work independently and as part of a team.

Posted 1 month ago

Apply

5.0 - 6.0 years

12 - 16 Lacs

Bengaluru

Work from Office


Role GCP Cloud Solutions Engineer (IoT & Development) As a GCP Cloud Solutions Engineer specializing in IoT and Development at Eximietas Design, you will be at the forefront of building and managing robust and scalable cloud infrastructure on the Google Cloud Platform. You will play a critical role in designing, deploying, and optimizing GCP services, with a particular focus on integrating and managing IoT devices and data, as well as supporting development workflows through automation and CI/CD pipelines. You will be responsible for ensuring the security, efficiency, and reliability of our cloud-based solutions. This role requires a strong understanding of GCP services, IoT principles, automation practices, and a passion for building innovative solutions. Key Responsibilities : - Provision, configure, and manage Google Cloud Platform resources, including Compute Engine VM instances, networking components, storage solutions, and security configurations. - Design and implement highly available and fault-tolerant GCP architectures. - Monitor and optimize the performance and cost-effectiveness of GCP resources. - Implement and manage security best practices within the GCP environment. - Design, deploy, and manage BigQuery data warehouses for large-scale data analysis. - Implement and manage Bigtable NoSQL databases for high-throughput, low-latency applications. - Optimize query performance and data storage within BigQuery and Bigtable. - Deploy, manage, and scale containerized applications using Google Kubernetes Engine (GKE). - Implement Kubernetes best practices for orchestration, scaling, and resilience. - Configure and manage networking, storage, and security within GKE clusters. - Design and implement solutions for integrating various IoT devices with the GCP infrastructure. - Utilize GCP IoT Core or other relevant services to manage device connectivity, security, and data ingestion. - Develop data processing pipelines to handle large volumes of IoT data for storage and analysis. - Ensure the security and integrity of IoT device data. - Design and implement CI/CD pipelines using tools like Cloud Build, Jenkins, GitLab CI/CD, or similar for automated application deployment and infrastructure provisioning. - Automate infrastructure provisioning and management tasks using Infrastructure-as-Code (IaC) tools such as Terraform or Cloud Deployment Manager. - Develop and maintain scripts for automation of routine operational tasks. - Implement comprehensive monitoring and logging solutions for GCP services and applications. - Proactively identify and troubleshoot performance bottlenecks and operational issues. - Participate in on-call rotations as needed to ensure system availability. - Collaborate effectively with development teams, data scientists, and other stakeholders to understand their requirements and provide cloud solutions. - Create and maintain clear and concise documentation for cloud infrastructure, configurations, and processes. - Stay up-to-date with the latest GCP services, features, and best practices, particularly in the areas of IoT and development. - Evaluate and recommend new technologies and approaches to improve our cloud infrastructure and processes. Skills & Qualifications : - Proven hands-on experience in deploying, managing, and optimizing core GCP services, including Compute Engine, VPC, Cloud Storage, IAM. - Deep understanding and practical experience with BigQuery for data warehousing and analysis. - Hands-on experience with Bigtable for NoSQL database solutions. 
- Strong experience in deploying, managing, and scaling applications using Kubernetes (GKE). - Experience in connecting and managing IoT devices with cloud platforms (preferably GCP IoT Core). - Understanding of IoT protocols (e.g., MQTT, CoAP) and data handling at scale. - Proficiency in implementing CI/CD pipelines using relevant tools (e.g., Cloud Build, Jenkins). - Strong experience with Infrastructure-as-Code (IaC) tools, preferably Terraform or Cloud Deployment Manager. - Scripting skills in languages such as Python, Bash, or Go for automation tasks. - Solid understanding of networking principles and GCP networking services (VPC, Firewall Rules, Load Balancing, Cloud DNS). - Knowledge of cloud security best practices and experience implementing security controls within GCP (IAM, Security Command Center). - Familiarity with Linux operating systems and the command-line interface. - Excellent analytical and problem-solving skills with the ability to diagnose and resolve complex technical issues. - Strong written and verbal communication skills with the ability to effectively communicate technical concepts to both technical and non-technical audiences. - Ability to work effectively in a collaborative team environment. - Bachelor's degree in Computer Science, Engineering, or a related field. - Minimum of 3-5 years of hands-on experience in managing and implementing solutions on the Google Cloud Platform. - GCP certifications (e.g., Google Cloud Certified - Professional Cloud Architect, Google Cloud Certified - Professional Cloud Engineer) are a significant plus. - Experience working in an Agile development environment is preferred.
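One slice of the IoT data-ingestion work described here, publishing device telemetry to Pub/Sub where a downstream pipeline can pick it up, can be sketched with the Pub/Sub client library. The topic name and payload fields are hypothetical; actual device connectivity (MQTT, authentication, gateways) is a separate concern.

```python
import json
from google.cloud import pubsub_v1

# Hypothetical project/topic where device telemetry lands before processing.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "device-telemetry")

reading = {
    "device_id": "sensor-042",
    "temperature_c": 21.7,
    "battery_pct": 88,
}

# Messages are opaque bytes; attributes can carry routing metadata.
future = publisher.publish(
    topic_path,
    data=json.dumps(reading).encode("utf-8"),
    device_id=reading["device_id"],
)
print("published message id:", future.result())
```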

Posted 1 month ago

Apply

6.0 - 10.0 years

6 - 11 Lacs

Bengaluru

Work from Office


Role: Senior Data Analyst
Experience: 6 to 10 years
Location: Bangalore, Pune, Hyderabad, Gurgaon, Noida
Notice: Immediate joiners only
About The Role: Data Analyst - EDA (Exploratory Data Analysis), communication, strong hands-on SQL, documentation experience, GCP experience, data pipeline experience.
Requirements
- 8+ years of experience in data mining, working with large relational databases and successfully using advanced data extraction and manipulation tools (for example, BigQuery, Teradata, etc.) on both structured and unstructured data.
- Excellent communication skills, both written and verbal; able to explain solutions and problems in a clear and concise manner.
- Experience in conducting business analysis to capture requirements from non-technical partners.
- Superb analytical and conceptual thinking skills; able not only to manipulate data but also to derive relevant interpretations from it.
- Proven knowledge of the data management lifecycle, including experience with data quality and metadata management.
- Hands-on experience in Computer Science, Statistics, Mathematics or Information Systems.
- Experience in cloud and GCP BigQuery, including but not limited to complex SQL querying.
- 1-2 years of experience/exposure in the following:
1. Experience with CI/CD release processes using GitLab, Jira, Confluence.
2. Familiarity with creating YAML files and understanding unstructured data such as JSON.
3. Experience with Looker Studio and Dataplex is a plus.
- Hands-on engineering experience is an asset.
- Exposure to Python, Java is nice to have.

Posted 1 month ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Mumbai

Work from Office


Job Summary
We are seeking a highly analytical and detail-oriented Data Specialist with deep expertise in SQL, Python, statistics, and automation. The ideal candidate will be responsible for designing robust data pipelines, analyzing large datasets, driving insights through statistical methods, and automating workflows to enhance data accessibility and business decision-making.
Key Responsibilities
- Write and optimize complex SQL queries for data extraction, transformation, and reporting.
- Develop and maintain Python scripts for data analysis, ETL processes, and automation tasks.
- Conduct statistical analysis to identify trends, anomalies, and actionable insights.
- Build and manage automated dashboards and data pipelines using tools such as Airflow, Pandas, or Apache Spark.
- Collaborate with cross-functional teams (product, engineering, business) to understand data needs and deliver scalable solutions.
- Implement data quality checks and validation procedures to ensure accuracy and consistency.
- Support machine learning model deployment and performance tracking (if applicable).
- Document data flows, models, and processes for internal knowledge sharing.
Key Requirements
- Strong proficiency in SQL (joins, CTEs, window functions, performance tuning).
- Solid experience with Python (data manipulation using Pandas, NumPy, scripting, and automation).
- Applied knowledge of statistics (hypothesis testing, regression, probability, distributions).
- Experience with data automation tools (Airflow, dbt, or equivalent).
- Familiarity with data visualization tools (Tableau, Power BI, or Plotly) is a plus.
- Understanding of data warehousing concepts (e.g., Snowflake, BigQuery, Redshift).
- Strong problem-solving skills and the ability to work independently.
Preferred Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- Exposure to cloud platforms like AWS, GCP, or Azure.
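As a small illustration of the SQL-plus-statistics combination this listing describes, the hedged sketch below pulls an experiment's results from a warehouse query into pandas and runs a two-sample t-test. The query, column names, and significance threshold are assumptions made for the example.

```python
import pandas as pd
from scipy import stats
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical A/B test table: one row per user with group label and metric.
df: pd.DataFrame = client.query("""
    SELECT experiment_group, revenue
    FROM `my-project.analytics.ab_test_results`
""").to_dataframe()

control = df.loc[df["experiment_group"] == "control", "revenue"]
variant = df.loc[df["experiment_group"] == "variant", "revenue"]

# Welch's t-test: does the variant change mean revenue per user?
t_stat, p_value = stats.ttest_ind(variant, control, equal_var=False)

print(f"mean control={control.mean():.2f}, variant={variant.mean():.2f}")
print(f"t={t_stat:.3f}, p={p_value:.4f}")
if p_value < 0.05:  # assumed significance threshold
    print("Difference is statistically significant at the 5% level.")
```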

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


Category: Technology
Shuru is a technology-consulting company that embeds senior product and engineering teams into fast-growing companies worldwide to accelerate growth and de-risk strategy. Our work is global, high-stakes, and unapologetically business-first.
Role Overview
You'll join a lean, senior-only business intelligence team as a Senior Data Analyst who will sit shoulder-to-shoulder with our clients, operating as their in-house analytics brain-trust. Your mandate: design the data questions worth asking, own the pipelines that answer them, and convert findings into clear, bottom-line actions. If you need daily direction, this isn't for you. If you see a vague brief as oxygen, read on.
Key Responsibilities
Frame the right questions: Translate ambiguous product or commercial goals into testable hypotheses, selecting the metrics that truly explain user behaviour and unit economics.
Own data end-to-end: Model, query, and transform data in SQL and dbt, pushing to cloud warehouses such as Snowflake/BigQuery, with zero babysitting.
Build self-service BI: Deliver dashboards in Metabase/Looker that non-technical stakeholders can tweak without coming back to you every week.
Tell unforgettable stories: Turn complex analyses into visuals and narratives that drive decisions in the C-suite and on the sprint board.
Guard the data moat: Champion data governance, privacy, and quality controls that scale across multiple client engagements.
Mentor & multiply: Level up engineers and product managers on analytical thinking, setting coding and insight standards for future analysts.
Requirements
Must-Have Skills & Experience
Minimum experience of 3 years.
Core Analytics: Expert SQL; comfort with Python or R for advanced analysis; solid grasp of statistical inference and experimentation.
Modern Data Stack: Hands-on with dbt, Snowflake/BigQuery/Redshift, and at least one orchestration tool (Airflow, Dagster, or similar).
BI & Visualisation: Proven delivery in Metabase, Looker, or Tableau (including performance tuning for big data models).
Product & Growth Metrics: Demonstrated ability to define retention, activation, and LTV/payback KPIs for SaaS or consumer-tech products.
Communication: Relentless clarity; you can defend an insight to both engineers and the CFO, and change course when the data disproves you.
Independence: History of thriving with "figure it out" briefs and distributed teams across time zones.
Bonus Points
Feature-flag experimentation at scale (e.g., Optimizely, LaunchDarkly).
Familiarity with privacy-enhancing tech (differential privacy, data clean rooms).
Benefits
Work on international projects: Execute with founders and execs from around the globe, stacking your playbook fast.
Regular team outings: We fund quarterly off-sites and virtual socials to keep the remote vibe human.
Collaborative & growth-oriented: Learn directly from CXOs, leads, and seasoned PMs; no silos, no artificial ceilings.
Competitive salary & benefits: Benchmarked around the 90th percentile for similar-stage firms, plus performance upside.

Posted 1 month ago

Apply

1.0 - 4.0 years

3 - 6 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


In the minute it takes you to read this job description, Bluecore has launched over 100,000 individually personalized marketing campaigns for our retail ecommerce customers!
Job Title: Product Support Engineer (PSE)
Location: Remote
About Us
At Bluecore, we are revolutionizing the digital marketing space. As a Product Support Engineer (PSE), you will help our customers optimize their use of our platform by resolving technical issues, setting up campaigns, and ensuring they get the most value from the tools and features we offer.
Who You Are
Pride yourself on a job well done: You take ownership of the task at hand, ensuring you deliver accurate and effective solutions every time. Customer support is a team effort, and you embrace feedback, actively listening to customers and colleagues alike.
Collaborative and empathetic: You put others first and commit to the right solution, not just your own. You enjoy collaborating with others, learning from them, and sharing your own knowledge in a way that benefits the team.
Disciplined curiosity: When something's unclear, you approach it head-on, asking the right questions and seeking to expand your technical knowledge. You're passionate about learning and improving, always curious to explore new technologies and share your insights.
Customer-focused: You understand the bigger picture of what matters to our customers and make sure to communicate clear solutions that address their needs. Every interaction is intentional and designed to build confidence toward solving their challenges.
Proactive and adaptable: You stay ahead of issues, identifying patterns in client problems, and work with internal teams to address them swiftly. You're comfortable working in a 24x7 shift culture to ensure that client issues are addressed around the clock.
What You'll Do
Client Support & Troubleshooting: Provide expert technical support for clients using Bluecore, BigQuery, Datadog, and other tools. Help them resolve issues, optimize campaigns, and maximize the platform's capabilities.
Campaign Management: Assist clients in configuring, optimizing, and troubleshooting email/SMS campaigns, including segmentation, automation, and reporting.
Problem Resolution: You'll quickly identify technical issues, solve them, and communicate the solution clearly to clients. Whether it's a data issue or a platform error, you'll ensure it's resolved efficiently.
Collaboration & Knowledge Sharing: Collaborate with Product, Engineering, and Technical Support teams to escalate and resolve complex issues. Share patterns, trends, and learnings with your team to help improve the overall customer experience.
Continuous Learning: Develop your technical skills through hands-on experience with our tools and contribute to our internal knowledge base. Share insights and best practices with the team.
Qualifications
1+ years in product support, technical support, or related roles (preferably SaaS, eCommerce, or digital marketing environments).
Hands-on experience with tools like Bluecore, BigQuery, Datadog, Looker, or similar platforms.
Strong technical troubleshooting skills in a customer-facing role.
Excellent written and verbal communication skills, with the ability to simplify complex technical issues for clients.
Customer-first attitude, ensuring every interaction is aligned with the customer's needs and provides a clear path to resolution.
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
Open to a 24x7 shift work culture.
Why Join Us
Work with cutting-edge tools and technologies in the eCommerce and digital marketing space.
Competitive salary and benefits.
Collaborative and innovative team culture where your contributions make a direct impact.
Growth opportunities for continued professional development and learning.
More About Us
Bluecore is a multi-channel personalization platform that gives retailers a competitive advantage in a digital-first world. Unlike systems built for mass marketing and a physical-first world, Bluecore unifies shopper and product data in a single platform and, using easy-to-deploy predictive models, activates welcomed one-to-one experiences at the speed and scale of digital. Through Bluecore's dynamic shopper and product matching, brands can personalize 100% of communications delivered to consumers through their shopping experiences, anywhere.
This comes to life in three core product lines:
Bluecore Communicate - a modern email service provider (ESP) + SMS
Bluecore Site - an onsite capture and personalization product
Bluecore Advertise - a paid media product
Bluecore is credited with increasing the lifetime value of shoppers and overall speed to marketing for more than 400 brands, including Express, Tommy Hilfiger, The North Face, Teleflora and Bass Pro Shops. We have been recognized as one of the Best Places to Work by Fortune, Crain's, Forbes and BuiltIn, as well as ranked on the Inc. 5000, the most prestigious ranking of the nation's fastest-growing private companies.
We are proud of the culture of flexibility, inclusivity and trust that we have built around our workforce. We are a remote-first organization with the option to potentially work in our New York headquarters on occasion moving forward. We love the opportunity to come together, but employees will always have the option of where they work best.
At Bluecore we believe in encouraging an inclusive environment in which employees feel encouraged to share their unique perspectives, demonstrate their strengths, and act authentically. We know that diverse teams are strong teams, and welcome those from all backgrounds and varying experiences. Bluecore is a proud equal opportunity employer. We are committed to fair hiring practices and to building a welcoming environment for all team members. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, disability, age, familial status or veteran status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role:
Job Title: DevOps Engineer, AS
Location: Bangalore, India
Role Description
Deutsche Bank has set for itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation as well as Corporate Sustainability. As Climate Change throws up new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products and various sustainability applications which will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about Climate Change and want to contribute to the greater good, leveraging their technology skillset in cloud / hybrid architecture.
As part of this role, we are seeking a highly skilled and experienced DevOps Engineer to join our growing team. In this role, you will play a pivotal role in managing and optimizing cloud infrastructure, facilitating continuous integration and delivery, and ensuring system reliability.
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for 35 yrs. and above
Your key responsibilities
Create, implement, and oversee scalable, secure, and cost-efficient cloud infrastructure on Google Cloud Platform (GCP).
Utilize Infrastructure as Code (IaC) methodologies with tools such as Terraform, Deployment Manager, or alternatives.
Implement robust security measures to ensure data access control and compliance with regulations. Adopt security best practices, establish IAM policies, and ensure adherence to both organizational and regulatory requirements.
Set up and manage Virtual Private Clouds (VPCs), subnets, firewalls, VPNs, and interconnects to facilitate secure cloud networking.
Establish continuous integration and continuous deployment (CI/CD) pipelines using Jenkins, GitHub Actions, or comparable tools for automated application deployments.
Implement monitoring and alerting solutions through Stackdriver (Cloud Operations), Prometheus, or other third-party applications.
Evaluate and optimize cloud expenditure by utilizing committed use discounts, autoscaling features, and resource rightsizing.
Manage and deploy containerized applications through Google Kubernetes Engine (GKE) and Cloud Run.
Deploy and manage GCP databases like Cloud SQL and BigQuery.
Your skills and experience
Minimum of 5+ years of experience in DevOps or similar roles with hands-on experience in GCP.
In-depth knowledge of Google Cloud services (e.g., GCE, GKE, Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud Storage) and the ability to architect, deploy, and manage cloud-native applications.
Proficient in using tools like Jenkins, GitLab, Terraform, Ansible, Docker, and Kubernetes.
Experience with Infrastructure as Code (IaC) tools like Terraform, CloudFormation, or GCP-native Deployment Manager.
Solid understanding of security protocols, IAM, networking, and compliance requirements within cloud environments.
Strong problem-solving skills and ability to troubleshoot cloud-based infrastructure.
Google Cloud certifications (e.g., Associate Cloud Engineer, Professional Cloud Architect, or Professional DevOps Engineer) are a plus.
How we'll support you
Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.
About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 month ago

Apply

7.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office


About The Role:
Job Title: Production Specialist, AVP
Location: Pune, India
Role Description
Our organization within Deutsche Bank is AFC Production Services. We are responsible for providing technical L2 application support for business applications. The AFC (Anti-Financial Crime) line of business has a current portfolio of 25+ applications. The organization is in the process of transforming itself using Google Cloud and many new technology offerings.
As an Assistant Vice President, your role will include hands-on production support and active involvement in technical issue resolution across multiple applications. You will also work as an application lead and will be responsible for the technical and operational processes for all applications you support.
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel.
You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for 35 yrs. and above
Your key responsibilities
Provide technical support by handling and consulting on BAU, incidents, emails and alerts for the respective applications.
Perform post-mortem and root cause analysis using ITIL standards of Incident Management, Service Request fulfillment, Change Management, Knowledge Management, and Problem Management.
Manage the regional L2 team and vendor teams supporting the application; ensure the team is up to speed and picks up the support duties.
Build up technical subject matter expertise on the applications being supported, including business flows, application architecture, and hardware configuration.
Define and track KPIs, SLAs and operational metrics to measure and improve application stability and performance.
Conduct real-time monitoring to ensure application SLAs are achieved and maximum application availability (uptime) using an array of monitoring tools.
Build and maintain effective and productive relationships with stakeholders in business, development, infrastructure, and third-party systems / data providers and vendors.
Assist in the process to approve application code releases as well as tasks assigned to support.
Keep key stakeholders informed using communication templates.
Approach support with a proactive attitude, a desire to seek root cause, in-depth analysis, and strive to reduce inefficiencies and manual effort.
Mentor and guide junior team members, fostering technical upskilling and knowledge sharing.
Provide strategic input into disaster recovery planning, failover strategies and business continuity procedures.
Collaborate and deliver on initiatives, and install these initiatives to drive stability in the environment.
Perform reviews of all open production items with the development team and push for updates and resolutions to outstanding tasks and recurring issues.
Drive service resilience by implementing SRE (site reliability engineering) principles, ensuring proactive monitoring, automation and operational efficiency.
Ensure regulatory and compliance adherence, managing audits, access reviews, and security controls in line with organizational policies.
The candidate will have to work in shifts as part of a rota covering APAC and EMEA hours between 07:00 IST and 09:00 PM IST (2 shifts). In the event of major outages or issues we may ask for flexibility to help provide appropriate cover. Weekend on-call coverage needs to be provided on a rotational/need basis.
Your skills and experience
9-15 years of experience in providing hands-on IT application support.
Experience in managing vendor teams providing 24x7 support.
Team lead role experience preferred; experience in an investment bank or financial institution.
Bachelor's degree from an accredited college or university with a concentration in Computer Science or an IT-related discipline (or equivalent work experience/diploma/certification).
ITIL v3 foundation certification or higher preferred.
Knowledgeable in cloud products like Google Cloud Platform (GCP) and hybrid applications.
Strong understanding of ITIL / SRE / DevOps best practices for supporting a production environment.
Understanding of KPIs, SLOs, SLAs and SLIs.
Monitoring tools: knowledge of Elastic Search, Control-M, Grafana, Geneos, OpenShift, Prometheus, Google Cloud Monitoring, Airflow, Splunk.
Working knowledge of creating dashboards and reports for senior management.
Red Hat Enterprise Linux (RHEL): professional skill in searching logs, process commands, starting/stopping processes, and using OS commands to aid in tasks needed to resolve or investigate issues. Shell scripting knowledge is a plus.
Understanding of database concepts and exposure to working with Oracle, MS SQL, BigQuery, etc. databases.
Ability to work across countries, regions, and time zones with a broad range of cultures and technical capability.
Skills That Will Help You Excel
Strong written and oral communication skills, including the ability to communicate technical information to a non-technical audience, and good analytical and problem-solving skills.
Proven experience in leading L2 support teams, including managing vendor teams and offshore resources.
Able to train, coach, and mentor, and know where each technique is best applied.
Experience with GCP or another public cloud provider to build applications.
Experience in an investment bank, financial institution or large corporation using enterprise hardware and software.
Knowledge of Actimize, Mantas, and case management software is good to have.
Working knowledge of Big Data Hadoop / Secure Data Lake is a plus.
Prior experience in automation projects is great to have.
Exposure to Python, shell, Ansible or other scripting languages for automation and process improvement.
Strong stakeholder management skills, ensuring seamless coordination between business, development, and infrastructure teams.
Ability to manage high-pressure issues, coordinating across teams to drive swift resolution.
Strong negotiation skills with interface teams to drive process improvements and efficiency gains.
How we'll support you
Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.

Posted 1 month ago

Apply

3.0 - 7.0 years

8 - 12 Lacs

Pune

Work from Office


About The Role:
Job Title: Product and Change Specialist, VP
Location: Pune, India
Role Description
Our organization within Deutsche Bank is AFC Production Services. We are responsible for providing technical L2 application support for business applications. The AFC (Anti-Financial Crime) line of business has a current portfolio of 25+ applications. The organization is in the process of transforming itself using Google Cloud and many new technology offerings.
As a Vice President, your role will include hands-on production support and active involvement in technical issue resolution across multiple applications. You will also work as an application lead and will be responsible for the technical and operational processes for all applications you support.
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel.
You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for 35 yrs. and above
Your key responsibilities
Lead and drive production support strategy, ensuring alignment with business objectives and SRE/RTB transformation goals.
Provide thought leadership in implementing ITIL principles to enhance automation, monitoring and operational efficiency.
Manage the regional L2 team and vendor teams supporting the application; ensure the team is up to speed and picks up the support duties.
Guide technical subject matter experts on the applications being supported, including business flows, application architecture, and hardware configuration.
Own, define and track KPIs, SLAs, dashboards and operational metrics to measure and improve application stability and performance.
Build and maintain effective and productive relationships with stakeholders in business, development, infrastructure, and third-party systems / data providers and vendors.
Foster a culture of continuous learning, proactive monitoring, and incident prevention.
Establish governance frameworks for production support operations, ensuring effective tracking and reporting of incidents, problems and changes.
Mentor and guide AVPs, fostering technical upskilling and knowledge sharing.
Provide strategic input into disaster recovery planning, failover strategies and business continuity procedures.
Collaborate and deliver on initiatives, and install these initiatives to drive stability in the environment.
Perform reviews of all open production items with the development team and push for updates and resolutions to outstanding tasks and recurring issues.
Evaluate and implement emerging technologies to enhance production support capabilities.
Ensure regulatory and compliance adherence, managing audits, access reviews, and security controls in line with organizational policies.
Drive programs and projects for the RTB function across domains.
Lead application onboarding for all new applications coming into the RTB remit to ensure safe and timely transition.
Develop executive-level reporting on production health, risk, and stability metrics for senior leadership.
The candidate will have to work in shifts as part of a rota covering APAC and EMEA hours between 07:00 IST and 09:00 PM IST (2 shifts). In the event of major outages or issues we may ask for flexibility to help provide appropriate cover. Weekend on-call coverage needs to be provided on a rotational/need basis.
Your skills and experience
13-20+ years of experience in providing hands-on IT application support.
Experience in managing vendor teams providing 24x7 support.
VP or head-of-domain role experience preferred; experience in an investment bank, financial institution or the managed services industry.
Bachelor's degree from an accredited college or university with a concentration in Computer Science or an IT-related discipline (or equivalent work experience/diploma/certification).
ITIL v3 foundation certification or higher preferred.
Knowledgeable in cloud products like Google Cloud Platform (GCP), AWS and hybrid applications.
Understanding of SII and audit concepts and the ability to drive audit calls.
Strong understanding of ITIL / DevOps best practices for supporting a production environment.
Monitoring tools: knowledge of Control-M, Grafana, Geneos, Google Cloud Monitoring.
Understanding of database concepts and exposure to working with Oracle, MS SQL, BigQuery, etc. databases.
Ability to work across countries, regions, and time zones with a broad range of cultures and technical capability.
Skills That Will Help You Excel
Strong written and oral communication skills, including the ability to communicate technical information to a non-technical audience, and good analytical and problem-solving skills.
Proven experience in leading and managing large L2/L3 support teams across multiple geographies, including managing vendor teams and offshore resources.
Able to train, coach, and mentor, and know where each technique is best applied.
Experience with GCP or another public cloud provider to build applications.
Experience in an investment bank, financial institution or large corporation using enterprise hardware and software.
Knowledge of Actimize, Mantas, and case management software is good to have.
Prior experience in automation projects is great to have.
Budget and resource planning experience, optimizing operational costs and workforce efficiency.
Strong stakeholder management skills, ensuring seamless coordination between business, development, and infrastructure teams.
Ability to manage high-pressure issues, coordinating across teams to drive swift resolution.
Strong negotiation skills with interface teams to drive process improvements and efficiency gains.
How we'll support you Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs

Posted 1 month ago

Apply

8.0 - 12.0 years

9 - 14 Lacs

Pune

Work from Office


Design, develop, and maintain high-performance data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Composer (Airflow).
Design and develop Looker dashboards with appropriate security provisioning and drill-down capabilities.
Ensure data security, lineage, quality, and compliance across GCP data ecosystems through IAM, audit logging, data encryption, and schema management.
Monitor, troubleshoot, and optimize pipeline and warehouse performance using GCP-native tools such as Cloud Monitoring, Cloud Logging, and BigQuery Optimizer.
Write SQL queries, dbt models, or Dataflow pipelines to transform raw data into analytics-ready datasets.
Develop and optimize SQL queries and data transformation scripts for data warehousing and reporting purposes.
Lead proof-of-concepts (POCs) and best-practice implementations for modern data architecture, including data lakes and cloud-native data warehouses.
Ensure data quality, governance, and security best practices across all layers of the data stack.
Write clean, maintainable, and efficient code following best practices.

Requirements
Data Engineering:
8-12 years of experience in data engineering, with at least 3-5 years of hands-on experience specifically in Google Cloud Platform (GCP) and BI tools like Looker.
BigQuery (data modeling, optimization, security); advanced SQL proficiency with complex data transformations, windowing functions, and analytical querying.
Ability to design and develop modular, maintainable SQL models using dbt best practices.
Basic to intermediate knowledge of Python for scripting and automation.
Exposure to ETL and batch scheduling/orchestration solutions.
Strong understanding of data architecture patterns: data lakes, cloud-native data warehouses, event-driven architectures.
Experience with version control systems like Git and branching strategies.

Looker:
Hands-on experience in Looker covering design, development, configuration/setup, dashboarding and reporting techniques.
Experience building and maintaining LookML models, Explores, PDTs, and semantic layers.
Understanding of security provisioning and access controls, performance tuning of dashboards and reports on large datasets, and building drill-down capabilities.
Proven ability to design scalable, user-friendly dashboards and self-service analytics environments.
Expertise in optimizing Looker performance: materialized views, query tuning, aggregate tables.
Strong command of Row-Level Security, Access Filters, and permission sets in Looker to support enterprise-grade data governance.

General:
Experience with Agile delivery methodologies (e.g. Scrum, Kanban).
Demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment.
Conduct regular workshops, demos and stakeholder reviews to showcase data solutions and capture feedback.
Excellent communication and collaboration skills.
Collaborate with development teams to streamline the software delivery process and improve system reliability.
Mentor and upskill junior engineers and analysts on GCP tools, Looker modeling best practices, and advanced visualization techniques.
Ability to translate business objectives into data solutions with a focus on delivering measurable business value.
Flexible to work in shifts and provide on-call support, owning the smooth operation of applications and systems in a production environment.
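To make the SQL-transformation side of this role concrete, here is a minimal sketch of a BigQuery window-function transformation submitted through the google-cloud-bigquery Python client. The project, dataset, table and column names are hypothetical placeholders, not taken from the posting.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

# Window-function transformation: rank each customer's orders within a day.
sql = """
CREATE OR REPLACE TABLE `example_project.analytics.daily_order_rank` AS
SELECT
  customer_id,
  order_id,
  order_ts,
  order_amount,
  ROW_NUMBER() OVER (
    PARTITION BY customer_id, DATE(order_ts)
    ORDER BY order_ts DESC
  ) AS order_rank_in_day
FROM `example_project.raw_sales.orders`
WHERE order_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
"""

job = client.query(sql)  # submit the query job to BigQuery
job.result()             # block until the transformation finishes
print("daily_order_rank rebuilt")
```

In a dbt-based setup the same SELECT would typically live in a model file and be materialized as a table, with scheduling handled by Composer or dbt Cloud rather than an ad-hoc script.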

Posted 1 month ago

Apply

5.0 - 8.0 years

15 - 30 Lacs

Gurugram, Chennai

Work from Office


Key Responsibilities:
Lead the design and development of scalable data pipelines using PySpark and ETL frameworks on Google Cloud Platform (GCP).
Own end-to-end data architecture and solutions, ensuring high availability, performance, and reliability.
Collaborate with data scientists, analysts, and stakeholders to understand data needs and deliver actionable insights.
Optimize complex SQL queries and support advanced data transformations.
Ensure best practices in data governance, data quality, and security.
Mentor junior engineers and contribute to team capability development.

Requirements:
8+ years of experience in data engineering roles.
Strong expertise in GCP data services (BigQuery, Dataflow, Pub/Sub, Composer, etc.).
Hands-on experience with PySpark and building ETL pipelines at scale.
Proficiency in SQL with the ability to write and optimize complex queries.
Solid understanding of data modeling, warehousing, and performance tuning.
Experience with CI/CD pipelines, version control, and infrastructure-as-code is a plus.
Excellent problem-solving and communication skills.

Preferred Qualifications:
GCP certification (e.g., Professional Data Engineer).
Experience with Airflow, Kubernetes, or Terraform.
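As a rough illustration of the PySpark-on-GCP pipelines this role calls for, the sketch below reads raw CSV files from Cloud Storage, applies a simple transformation, and writes to BigQuery. Bucket, project and column names are hypothetical, and the spark-bigquery connector (bundled on Dataproc) is assumed to be available.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV files landed in a Cloud Storage bucket (hypothetical path).
raw = (spark.read
       .option("header", True)
       .csv("gs://example-raw-bucket/orders/*.csv"))

# Transform: drop rows without an amount, cast types, stamp the load date.
cleaned = (raw
           .filter(F.col("order_amount").isNotNull())
           .withColumn("order_amount", F.col("order_amount").cast("double"))
           .withColumn("load_date", F.current_date()))

# Load: write to BigQuery through the spark-bigquery connector.
(cleaned.write
 .format("bigquery")
 .option("table", "example_project.analytics.orders_clean")
 .option("temporaryGcsBucket", "example-temp-bucket")  # staging bucket required by the connector
 .mode("overwrite")
 .save())
```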

Posted 1 month ago

Apply

12.0 - 22.0 years

25 - 40 Lacs

Bangalore Rural, Bengaluru

Work from Office


Role & responsibilities

Requirements:
Data Modeling (Conceptual, Logical, Physical) - Minimum 5 years
Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL) - Minimum 5 years
Cloud Platforms (AWS, Azure, GCP) - Minimum 3 years
ETL Tools (Informatica, Talend, Apache NiFi) - Minimum 3 years
Big Data Technologies (Hadoop, Spark, Kafka) - Minimum 5 years
Data Governance & Compliance (GDPR, HIPAA) - Minimum 3 years
Master Data Management (MDM) - Minimum 3 years
Data Warehousing (Snowflake, Redshift, BigQuery) - Minimum 3 years
API Integration & Data Pipelines - Good to have
Performance Tuning & Optimization - Minimum 3 years
Business Intelligence (Power BI, Tableau) - Minimum 3 years

Job Description:
We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/Databricks/. Experience and deep knowledge of at least one of these 3 platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights.

Key Responsibilities:
1. Data Governance & Management
Establish and maintain a data usage hierarchy to ensure structured data access.
Define data policies, standards, and governance frameworks to ensure consistency and compliance.
Implement data quality management practices to improve accuracy, completeness, and reliability.
Oversee metadata and Master Data Management (MDM) to enable seamless data integration across platforms.
2. Data Architecture & Migration
Lead the migration of data systems from legacy infrastructure to Microsoft Fabric.
Design scalable, high-performance data architectures that support business intelligence and analytics.
Collaborate with IT and engineering teams to ensure efficient data pipeline development.
3. Advanced Analytics & Machine Learning
Identify and define use cases for advanced analytics that align with business objectives.
Design and develop machine learning models to drive data-driven decision-making.
Work with data scientists to operationalize ML models and ensure real-world applicability.

Required Qualifications:
Proven experience as a Data Architect or in a similar data management and analytics role.
Strong knowledge of data governance frameworks, data quality management, and metadata management.
Hands-on experience with Microsoft Fabric and data migration from legacy systems.
Expertise in advanced analytics, machine learning models, and AI-driven insights.
Familiarity with data modeling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP).
Strong communication skills with the ability to translate complex data concepts into business insights.

Preferred candidate profile: Immediate joiner.
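As one small illustration of the data quality management responsibilities above, the sketch below codifies two rule-based checks (null rate and duplicate business keys) against a warehouse table. The table, columns and pass criteria are hypothetical, and BigQuery is assumed as the warehouse only because it appears in the posting's skill list.

```python
from google.cloud import bigquery

client = bigquery.Client()
TABLE = "example_project.mdm.customer_master"  # hypothetical master-data table

# Two rule-based checks a governance framework might codify:
# a null-rate check on the business key and a duplicate-key check.
checks = {
    "customer_id_null_rate": f"SELECT COUNTIF(customer_id IS NULL) / COUNT(*) FROM `{TABLE}`",
    "customer_id_duplicates": f"""
        SELECT COUNT(*) FROM (
          SELECT customer_id FROM `{TABLE}`
          GROUP BY customer_id
          HAVING COUNT(*) > 1
        )""",
}

for name, sql in checks.items():
    value = list(client.query(sql).result())[0][0]
    status = "PASS" if (value or 0) == 0 else "FAIL"
    print(f"{name}: {value} -> {status}")
```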

Posted 1 month ago

Apply

6.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office


Required Skills & Qualifications:
Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
Experience: 7+ years of experience in data engineering, with at least 2 years working with GCP.
Technical Skills:
Proficiency in GCP services: BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Cloud Functions.
Strong programming skills in Python, SQL and PySpark, and familiarity with Java/Scala.
Experience with orchestration tools like Apache Airflow.
Knowledge of ETL/ELT processes and tools.
Experience with data modeling and designing data warehouses in BigQuery.
Familiarity with CI/CD pipelines and version control systems like Git.
Understanding of data governance, security, and compliance.
Soft Skills:
Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities.
Ability to work in a fast-paced environment and manage multiple priorities.
Preferred Qualifications:
Certifications: GCP Professional Data Engineer or GCP Professional Cloud Architect certification.
Domain Knowledge: Experience in the finance, e-commerce, or healthcare domains is a plus.
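Since the posting highlights Apache Airflow for orchestration alongside BigQuery, here is a minimal, hypothetical DAG sketch showing a daily ELT step. The project, dataset, schedule and SQL are placeholders, and the Google provider package for Airflow is assumed to be installed.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Daily ELT step: aggregate yesterday's raw orders into a reporting table.
with DAG(
    dag_id="daily_sales_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # 02:00 every day
    catchup=False,
) as dag:
    load_daily_sales = BigQueryInsertJobOperator(
        task_id="load_daily_sales",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `example_project.analytics.daily_sales`
                    SELECT DATE(order_ts) AS sale_date, SUM(order_amount) AS total_amount
                    FROM `example_project.raw.orders`
                    WHERE DATE(order_ts) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
                    GROUP BY sale_date
                """,
                "useLegacySql": False,
            }
        },
    )
```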

Posted 1 month ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid


Salary: 15 to 30 LPA
Experience: 3 to 8 years
Location: Gurgaon (Hybrid)
Notice period: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 1 month ago

Apply

4.0 - 8.0 years

10 - 19 Lacs

Chennai

Hybrid


Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.

Position Description:
We are currently seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience in leading and implementing GCP data projects, preferably having implemented a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a Materials Management platform that exchanges data with multiple modern and legacy applications across Product Development, Manufacturing, Finance, Purchasing, N-tier Supply Chain, and Supplier Collaboration.

Responsibilities:
Design and implement data-centric solutions on Google Cloud Platform (GCP) using GCP tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, and Bigtable.
Build ETL pipelines to ingest data from heterogeneous sources into our system.
Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data.
Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements and infrastructure.

Skills Required:
GCP data engineering, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
4+ years of professional experience in data engineering, data product development and software product launches.
3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using:
Data warehouses like Google BigQuery.
Workflow orchestration tools like Airflow.
Relational database management systems like MySQL, PostgreSQL, and SQL Server.
Real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.

Education Required: Any bachelor's degree.
Candidates should be willing to take a GCP assessment (1-hour online test).
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Narmadha
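To illustrate the kind of streaming ingestion this posting describes (Pub/Sub into BigQuery via Dataflow), here is a minimal Apache Beam sketch in Python. The subscription, destination table and schema are hypothetical placeholders.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Streaming mode; in practice you would also pass --runner=DataflowRunner,
# --project, --region and --temp_location when launching the job.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (p
     | "ReadEvents" >> beam.io.ReadFromPubSub(
           subscription="projects/example-project/subscriptions/orders-sub")
     | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
     | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
           "example-project:raw.orders_events",
           schema="order_id:STRING,customer_id:STRING,order_amount:FLOAT,order_ts:TIMESTAMP",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
```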

Posted 1 month ago

Apply