
170 Dataproc Jobs - Page 6

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

4 - 9 years

6 - 12 Lacs

Hyderabad

Work from Office

Naukri logo

About The Role:
Ab Initio skills: graph development, Ab Initio standard environment parameters, GDE (PDL, MFS concepts), EME basics, SDLC, data analysis.
Database: proficient in SQL; expert in DB load/unload utilities; relevant experience in Oracle, DB2, Teradata (preferred).
UNIX: shell scripting (must); Unix utilities such as sed, awk, Perl, Python.
Scheduling knowledge: Control-M, Autosys, Maestro, TWS, ESP.
Project profile: at least 2-3 source systems, multiple targets, simple business transformations on daily and monthly schedules. Expected to produce LLDs, work with testers and the PMO, develop graphs and schedules, and provide third-level support.
Should have hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc.
Experience in finance, ideally with capital markets products. Requires experience in the development and support of complex frameworks that handle multiple data ingestion patterns, e.g. messaging files, hierarchical/polymorphic XML structures, conformance of data to a canonical model, and curation and distribution of data to QA resources.
Data modeling experience creating CDMs, LDMs, and PDMs using tools like ERwin, PowerDesigner, or MagicDraw.
Detailed knowledge of capital markets, including derivatives products (IRS, CDS, options, structured products) and fixed income products.
Knowledge of Jenkins and CI/CD concepts.
Knowledge of scheduling tools like Autosys and Control Center.
Demonstrated understanding of how Ab Initio applications and systems interact with the underlying hardware ecosystem.
Experience working in an agile project development lifecycle.
Strong in-depth knowledge of databases and database concepts; DB2 knowledge is a plus.
Primary Skills: Ab Initio graphs. Secondary Skills: SQL.
Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management.

Posted 3 months ago

Apply

7 - 9 years

9 - 11 Lacs

Bengaluru

Work from Office


Overview
We're looking for a motivated Team Lead to manage the transformation of data pipelines from Hadoop to BigQuery. You should have strong experience in Google Cloud Platform (GCP), BigQuery, and Dataproc, as well as proficiency in Linux/Unix environments. Experience with PySpark, Scala or Java, shell scripting, Apache Airflow, and unit testing is essential. Knowledge of Kafka and GCP Looker is optional.
Responsibilities
Mandatory Skills:
- GCP Data Engineer certification.
- Experience in pipeline migration from Hadoop to BigQuery.
- Proficiency with GCP, BigQuery, and Dataproc.
- Strong skills in Linux/Unix environments.
- Knowledge of PySpark, Scala, or Java.
- Experience with SQL, shell scripting, and Apache Airflow.
- Strong understanding of unit testing principles.
Optional Skills:
- Familiarity with Kafka.
- Knowledge of GCP Looker.
Requirements
- Bachelor's degree (in any field).
- 7-9 years of work experience.
- Experience working in multi-channel delivery projects is desirable.
- Strong problem-solving and analytical skills.
- Excellent communication and leadership abilities.
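Pipeline migration from Hadoop to BigQuery, as described above, typically begins with translating Hive schemas into BigQuery DDL. A minimal illustrative sketch follows; the type map, table, and column names are assumptions for illustration, not taken from the listing.

```python
# Sketch: translate a Hive column schema into a BigQuery CREATE TABLE statement.
# The type map is a common (partial) Hive -> BigQuery mapping; extend as needed.
HIVE_TO_BQ = {
    "string": "STRING",
    "int": "INT64",
    "bigint": "INT64",
    "double": "FLOAT64",
    "decimal": "NUMERIC",
    "timestamp": "TIMESTAMP",
    "boolean": "BOOL",
}

def bq_ddl(table, columns):
    """Build a BigQuery DDL statement from (name, hive_type) pairs."""
    cols = ",\n  ".join(
        f"{name} {HIVE_TO_BQ[hive_type.lower()]}" for name, hive_type in columns
    )
    return f"CREATE TABLE IF NOT EXISTS {table} (\n  {cols}\n);"

print(bq_ddl("analytics.orders", [("order_id", "bigint"), ("amount", "double")]))
```

In a real migration the mapping would also need to cover complex types (arrays, structs, maps) and partitioning clauses, which Hive and BigQuery express differently.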

Posted 3 months ago

Apply

6 - 10 years

8 - 12 Lacs

Mumbai

Hybrid


6+ years of experience writing, debugging, and troubleshooting code in mainstream Core Java. Experience with cloud technology: GCP, AWS, or Azure (added advantage). Design and build scalable and reliable data pipelines using Google Cloud services.
Required Candidate Profile: Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Performance monitoring using Google Data Studio or similar tools.

Posted 3 months ago

Apply

6 - 10 years

8 - 12 Lacs

Bangalore Rural

Hybrid


6+ years of experience writing, debugging, and troubleshooting code in mainstream Core Java. Experience with cloud technology: GCP, AWS, or Azure (added advantage). Design and build scalable and reliable data pipelines using Google Cloud services.
Required Candidate Profile: Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Performance monitoring using Google Data Studio or similar tools.

Posted 3 months ago

Apply

6 - 10 years

8 - 12 Lacs

Pune

Hybrid


6+ years of experience writing, debugging, and troubleshooting code in mainstream Core Java. Experience with cloud technology: GCP, AWS, or Azure (added advantage). Design and build scalable and reliable data pipelines using Google Cloud services.
Required Candidate Profile: Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Performance monitoring using Google Data Studio or similar tools.

Posted 3 months ago

Apply

3 - 7 years

5 - 9 Lacs

Pune

Work from Office


Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner Data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective we focus on the mainframe, but we also build solutions on-premises and in the cloud, with RESTful services and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area.
Your key responsibilities
- Implement the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain.
- Support the migration of current functionality to Google Cloud.
- Ensure the stability of the application landscape and support software releases.
- Support L3 topics and application governance.
- Code as part of an agile team in the CTM area (Java, Scala, Spring Boot).
Your skills and experience
- Experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably for Big Data and GCP technologies.
- Strong understanding of the Data Mesh approach and integration patterns.
- Understanding of party data and integration with product data.
- Architectural skills for big data solutions, especially interface architecture, allowing a fast start.
- Experience in at least: Spark, Java, Scala, Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting.
- Knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes.
- Ability to work well in teams as well as independently, constructively, and in a target-oriented way.
- Good English skills, communicating professionally as well as informally in small talk with the team.
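The Spark-on-Dataproc migration work described above typically involves submitting jobs to a cluster with the `gcloud` CLI. A minimal sketch of composing such an invocation in Python follows; the cluster, region, jar, and class names are hypothetical placeholders, not from the listing.

```python
# Sketch: build a `gcloud dataproc jobs submit spark` invocation as an argv list.
# All resource names below are illustrative placeholders.
def dataproc_submit_cmd(cluster, region, jar, main_class, job_args=()):
    """Return the argv list for submitting a Spark job to a Dataproc cluster."""
    cmd = [
        "gcloud", "dataproc", "jobs", "submit", "spark",
        f"--cluster={cluster}",
        f"--region={region}",
        f"--jars={jar}",
        f"--class={main_class}",
    ]
    if job_args:
        cmd += ["--", *job_args]  # arguments after `--` go to the Spark job itself
    return cmd

print(" ".join(dataproc_submit_cmd(
    "partner-data-cluster", "europe-west3",
    "gs://partner-data/jobs/migration.jar", "com.example.MigrationJob",
    ("--date=2024-01-01",),
)))
```

In practice the argv list would be handed to `subprocess.run`, or the equivalent call made through the `google-cloud-dataproc` client library.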

Posted 3 months ago

Apply

10 - 14 years

25 - 30 Lacs

Pune

Work from Office


Role Description
The Business Architect defines the technical solution design of specific IT platforms and provides guidance to squad members in order to design, build, test, and deliver high-quality software solutions. A key element in this context is the translation of functional and non-functional business requirements into an appropriate technical solution design, leveraging best practices and consistent design patterns. The Business Architect collaborates closely with Product Owners, Chapter Leads, and squad members to ensure consistent adherence to the agreed-upon application design, and is responsible for maintaining appropriate technical design documentation. The Business Architect ensures that the architectures and designs of solutions conform to the principles, blueprints, standards, and patterns established by Enterprise Architecture; in this context the Business Architect collaborates closely with the respective Solution Architect to ensure architecture compliance. The Business Architect also actively contributes to the definition and enrichment of design patterns and standards, with the aim of leveraging them across squads and tribes.
Your key responsibilities
- Define the technical architecture of IT solutions in line with functional and non-functional requirements, following consistent design patterns and best practices.
- Ensure that the solution design is in sync with WM target architecture blueprints and principles, as well as with overarching DB architecture and security standards.
- Create appropriate technical design documentation and ensure it is kept up to date.
- Provide guidance to squad members to design, build, test, and deliver high-quality software solutions in line with business requirements.
- Take responsibility for all aspects of the solution architecture (maintainability, scalability, effective integration with other solutions, usage of shared solutions and components where possible, optimization of resource consumption, etc.), with the objective of striking the appropriate balance between business needs and total cost of ownership.
- Collaborate closely with Enterprise Architecture to ensure architecture compliance, and make sure design options are discussed in a timely manner to allow sufficient time for deliberate decision making.
- Present architecture proposals to relevant forums, along with enterprise architects at different levels, and drive the process to gain the necessary architecture approvals.
- Collaborate with relevant technology stakeholders within other squads and across tribes to ensure cross-squad and cross-tribe solution architecture synchronization and alignment.
- Contribute to the definition and enrichment of appropriate design patterns and standards that can be leveraged across WM squads and tribes.
- Serve as a counsel to designers and developers and carry out reviews of software designs and high-level/detailed design documentation provided by other squad members.
- Lead technical discussions with CCO, Data Factory, Central Data Quality and Compliance, end-to-end, and control functions for technical queries; contribute to peer-level solution architecture reviews, e.g. within a respective chapter.
Your skills and experience
- Ability and experience in defining high-level and low-level technical solution designs for complex initiatives; very good analytical skills and the ability to oversee and structure complex tasks.
- Hands-on skills with various Google Cloud components such as storage buckets, BigQuery, Dataproc, Cloud Composer, and Cloud Functions, along with PySpark and Scala, are essential.
- Good to have: experience in Cloud SQL, Dataflow, Java, and Unix. Experience implementing a Google Cloud based solution is essential.
- Persuasive power and persistence in driving adherence to the solution design within the squad.
- Ability to apply appropriate architectural patterns considering the relevant functional and non-functional requirements.
- Proven ability to balance business demands and IT capabilities in terms of standardization, reducing risk, and increasing IT flexibility.
- Comfortable working in an open, highly collaborative team; ability to work in an agile and dynamic environment and to build up knowledge of new technologies/solutions in an effective and timely manner.
- Ability to communicate effectively with other technology stakeholders.
- Feedback: seeks feedback from others, provides feedback to others in support of their development, and is open and honest while dealing constructively with criticism.
- Inclusive leadership: values individuals and embraces diversity by integrating differences and promoting diversity and inclusion across teams and functions.
- Coaching: understands and anticipates people's needs, skills, and abilities in order to coach, motivate, and empower them for success.
- Broad architecture knowledge and application design skills and, depending on the specific squad requirements, in-depth expertise in specific architecture domains (e.g. service and integration architecture, web and mobile front-end architecture, data architecture, security architecture, infrastructure architecture) and related technology stacks and design patterns.
- Experience establishing thought leadership in solution architecture practices, and the ability to lead design and development teams in defining, building, and delivering first-class software solutions.
- Familiarity with current and emerging technologies, tools, frameworks, and design patterns.
- Experience in effectively collaborating across multiple teams and geographies.
- Ability to appropriately consider other dimensions (e.g. financials, risk, time to market) on top of the architecture drivers in order to propose balanced and feasible architecture solutions.
Experience / Qualifications
- 10+ years of relevant experience as a technology manager within the IT industry; experience in the financial/banking industry preferred.
- Minimum 8 years' experience supporting the Oracle platform in a mid-size to large corporate environment.
- Preferably from a banking/wealth management background.
- Must have experience working in an agile organization.

Posted 3 months ago

Apply

15 - 20 years

17 - 22 Lacs

Pune

Work from Office


The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals.
- Building reliability and resiliency into solutions, with appropriate testing and reviewing throughout the delivery lifecycle.
- Ensuring maintainability and reusability of engineering solutions.
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow.
- Reviewing engineering plans and quality to drive re-use and improve engineering capability.
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the bank.
Your Key Responsibilities
The candidate is expected to:
- Act as a hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities.
- Champion engineering best practices and guide/mentor the team to achieve high performance.
- Work closely with business stakeholders, the Tribe Lead, the Product Owner, and the Lead Architect to successfully deliver the business outcomes.
- Acquire functional knowledge of the business capability being digitized/re-engineered.
- Demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success.
Your Skills & Experience
- Minimum 15 years of IT industry experience in full-stack development.
- Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, and ReactJS.
- Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience in Kubernetes and the OpenShift container platform.
- Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization.
- Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience working on public cloud: GCP preferred; AWS or Azure.
- Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
- Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture: efficient systems that can handle large-scale operation.
- Experience leading teams and mentoring developers.
- Focus on quality: experience with TDD, BDD, stress, and contract tests.
- Proficient in working with APIs (Application Programming Interfaces) and familiar with data formats like JSON, XML, YAML, Parquet, etc.
Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS.
Advantageous: prior experience in the banking/finance domain; experience with hybrid cloud solutions, preferably using GCP; experience with product development.
How we'll support you:
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
Our values define the working environment we strive to create: diverse, supportive, and welcoming of different views. We embrace a culture reflecting a variety of perspectives, insights, and backgrounds to drive innovation. We build talented and diverse teams to drive business results and encourage our people to develop to their full potential. Talk to us about flexible work arrangements and other initiatives we offer. We promote good working relationships and encourage high standards of conduct and work performance.
We welcome applications from talented people from all cultures, countries, races, genders, sexual orientations, disabilities, beliefs and generations and are committed to providing a working environment free from harassment, discrimination and retaliation. Visit to discover more about the culture of Deutsche Bank including Diversity, Equity & Inclusion, Leadership, Learning, Future of Work and more besides.

Posted 3 months ago

Apply

15 - 20 years

20 - 35 Lacs

Pune

Work from Office


Job Title: Lead Engineer (RYR#2025)
Role Description
The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals.
- Building reliability and resiliency into solutions, with appropriate testing and reviewing throughout the delivery lifecycle.
- Ensuring maintainability and reusability of engineering solutions.
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow.
- Reviewing engineering plans and quality to drive re-use and improve engineering capability.
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the bank.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy.
Your Key Responsibilities
The candidate is expected to:
- Act as a hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities.
- Champion engineering best practices and guide/mentor the team to achieve high performance.
- Work closely with business stakeholders, the Tribe Lead, the Product Owner, and the Lead Architect to successfully deliver the business outcomes.
- Acquire functional knowledge of the business capability being digitized/re-engineered.
- Demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success.
Your Skills & Experience
- Minimum 15 years of IT industry experience in full-stack development.
- Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, and ReactJS.
- Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience in Kubernetes and the OpenShift container platform.
- Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization.
- Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience working on public cloud: GCP preferred; AWS or Azure.
- Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
- Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture: efficient systems that can handle large-scale operation.
- Experience leading teams and mentoring developers.
- Focus on quality: experience with TDD, BDD, stress, and contract tests.
- Proficient in working with APIs (Application Programming Interfaces) and familiar with data formats like JSON, XML, YAML, Parquet, etc.
Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS.
Advantageous: prior experience in the banking/finance domain; experience with hybrid cloud solutions, preferably using GCP; experience with product development.
How we'll support you:
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

Posted 3 months ago

Apply

3 - 7 years

11 - 18 Lacs

Pune, Bengaluru, Hyderabad

Work from Office


Role: GCP Data Engineer
Client: MNC (full time)
Position: Permanent
Exp: 3.5-6 years
Location: PAN India
NP: Immediate/Serving/30 days
Work mode: Hybrid/WFO
Mandatory Skills: GCP, BigQuery, Dataflow, Dataplex, Pub/Sub, Python & SQL
JD: Ensure successful initiation, planning, execution, control, and completion of the project by guiding team members on technical aspects and conducting reviews of technical documents and artefacts. Lead project development, production support, and maintenance activities. Fill in timesheets and ensure they are completed, as is the invoicing process, on or before the deadline. Lead the customer interface for the project on an everyday basis, proactively addressing any issues before they are escalated. Create functional and technical specification documents. Track open tickets/incidents in the queue, allocate tickets to resources, and ensure that tickets are closed within the deadlines. Ensure analysts adhere to SLAs/KPIs/OLAs. Ensure that everyone on the delivery team, including yourself, is constantly thinking of ways to do things faster, better, or more economically. Lead and ensure the project is in compliance with software quality processes and within timelines. Review functional and technical specification documents. Serve as the single point of contact for the team to the project stakeholders. Promote teamwork; motivate, mentor, and develop subordinates.
Kindly fill in the details below and share an updated CV to mansoor@burgeonits.com:
- Name as per Aadhaar card
- Mobile no.
- Alternate no.
- Email id
- Alternate email
- Date of birth
- PAN card no. (for client upload; mandatory)
- Total exp & relevant exp
- Current company
- Payroll company (name), if any
- Notice period (if serving NP, mention last working day)
- CCTC & ECTC
- Any offers (yes/no); if yes, how much and what joining date
- Current location & preferred location
- Happy to relocate (yes/no)
- Available interview time slots

Posted 3 months ago

Apply

7 - 12 years

0 Lacs

Bengaluru, Noida

Work from Office


Role & responsibilities
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Google Cloud Data Services
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the entire application development process and ensuring its successful implementation. This role requires strong leadership skills and expertise in Google Cloud Data Services.
Roles & Responsibilities:
- Lead the team in designing, building, and configuring applications according to project requirements.
- Act as the primary point of contact for all application-related matters, providing guidance and support to team members.
- Collaborate with multiple teams to gather requirements, define project scope, and ensure timely delivery of applications.
- Manage the team's performance and make decisions regarding resource allocation and task assignments.
- Engage with stakeholders to understand their needs and provide solutions to problems for both the immediate team and across multiple teams.
- Contribute to key decisions related to application design, architecture, and functionality.
- Provide technical expertise and guidance to team members, ensuring the successful implementation of applications.
- Stay updated with the latest trends and advancements in application development and recommend innovative solutions to improve processes and efficiency.
Professional & Technical Skills:
- Proficiency in Google Cloud Data Services.
- Strong understanding of application development principles and best practices.
- Experience in designing and implementing scalable and secure applications.
- Hands-on experience with Google Cloud Platform services such as Cloud Storage, BigQuery, and Dataflow.
- Knowledge of programming languages such as Java, Python, or Go.
- Familiarity with DevOps practices and tools for continuous integration and deployment.
- Experience working with Agile methodologies and tools like Jira or Trello.
- Excellent problem-solving and analytical skills.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google Cloud Data Services.
- 15 years of full-time education is required.

Posted 3 months ago

Apply

13 - 20 years

30 - 40 Lacs

Chennai, Hyderabad

Work from Office


Title: GCP Data Engineer
Experience: 14+ years
Location: Chennai/Hyderabad
Skillset: GCP, BigQuery, Dataflow, Terraform, Airflow, Cloud SQL, Cloud Storage, Cloud Composer, Pub/Sub
If interested, kindly drop a CV to sharmeelasri26@gmail.com

Posted 3 months ago

Apply

6 - 11 years

15 - 30 Lacs

Bengaluru

Work from Office


Role: GCP Data Engineer
Location: Bangalore
Experience: 6-12 years
Mode: Work from office
Job Description: We are seeking a talented GCP Data Engineer to join our team and help us design and implement robust data pipelines and analytics solutions on Google Cloud Platform (GCP). The ideal candidate will have strong expertise in BigQuery, Dataflow, Cloud Composer, and Dataproc, along with experience in AI/ML tools such as Google Vertex AI or Dialogflow.
Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows using Dataflow, Cloud Composer, and Dataproc.
- Develop optimized queries and manage large-scale datasets using BigQuery.
- Collaborate with cross-functional teams to gather requirements and translate business needs into scalable data solutions.
- Implement best practices for data engineering, including version control, CI/CD pipelines, and data governance.
- Work on AI/ML use cases, leveraging Google Vertex AI or Dialogflow to create intelligent solutions.
- Perform data transformations, aggregations, and ETL processes to prepare data for analytics and reporting.
- Monitor and troubleshoot data workflows to ensure reliability, scalability, and performance.
- Document technical processes and provide guidance to junior team members.
Qualifications:
- Experience: 3-5+ years of professional experience in GCP data engineering or related fields.
- Proficiency in BigQuery, Dataflow, Cloud Composer, and Dataproc.
- Exposure to Google Vertex AI, Dialogflow, or other AI/ML platforms.
- Strong programming skills in Python and SQL, and familiarity with Terraform for GCP infrastructure.
- Experience with distributed data processing frameworks like Apache Spark is a plus.
- Knowledge of data security, governance, and best practices for cloud platforms.
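The "optimized queries" and governance practices mentioned above often include making BigQuery loads idempotent via MERGE statements generated per batch. A hedged sketch of such a generator follows; the dataset, table, and column names are hypothetical, and a production version would also handle partition pruning and type casting.

```python
# Sketch: generate an idempotent BigQuery MERGE that upserts a staging table
# into a target table. Table and column names are illustrative placeholders.
def merge_sql(target, staging, key, cols):
    """Build a MERGE statement keyed on `key`, updating/inserting `cols`."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    src_list = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE `{target}` t USING `{staging}` s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )

print(merge_sql("dw.customers", "stg.customers", "customer_id", ["name", "email"]))
```

Because MERGE matches on the key, re-running the same batch does not create duplicates, which is what makes retry-safe scheduling (e.g. from Cloud Composer) practical.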

Posted 3 months ago

Apply

3 - 8 years

8 - 18 Lacs

Chennai, Hyderabad

Work from Office


Title: Associate/Senior/Lead Data Engineer
Experience: 3 to 8 years
Location: Chennai/Hyderabad
Required Skills: GCP, BigQuery, Python/Hadoop, Teradata/Dataproc, Airflow
Regards, Sharmeela (Sharmeela.s@saaconsulting.co.in)

Posted 3 months ago

Apply

12 - 17 years

35 - 60 Lacs

Chennai, Bengaluru

Hybrid


At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability, and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo. ZoomInfo is a rapidly growing data-driven company, and as such, we understand the importance of a comprehensive and solid data solution to support decision making in our organization. Our vision is to have a consistent, democratized, and accessible single source of truth for all company data analytics and reporting. Our goal is to improve decision-making processes by having the right information available when it is needed. As a Principal Software Engineer on our Data Platform infrastructure team, you'll have a key role in building and designing the strategy of our Enterprise Data Engineering group.
What You'll do:
- Design and build a highly scalable data platform to support data pipelines for diversified and complex data flows.
- Track and identify relevant new technologies in the market and push their implementation into our pipelines through research and POC activities.
- Deliver scalable, reliable, and reusable data solutions.
- Lead, build, and continuously improve our data gathering, modeling, and reporting capabilities and self-service data platforms.
- Work closely with Data Engineers, Data Analysts, Data Scientists, Product Owners, and Domain Experts to identify data needs.
- Develop processes and tools to monitor, analyze, maintain, and improve data operation, performance, and usability.
What you bring:
- Relevant Bachelor's degree or equivalent software engineering background.
- 12+ years of experience as an infrastructure / data platform / big data software engineer.
- Experience with AWS/GCP cloud services such as GCS/S3, Lambda/Cloud Functions, EMR/Dataproc, Glue/Dataflow, Athena.
- IaC design and hands-on experience.
- Familiarity designing CI/CD pipelines with Jenkins, GitHub Actions, or similar tools.
- Experience in designing, building, and maintaining enterprise systems in a big data environment on public cloud.
- Strong SQL abilities and hands-on experience with SQL, performing analysis and performance optimizations.
- Hands-on experience in Python or an equivalent programming language.
- Experience administering data warehouse solutions (BigQuery/Redshift/Snowflake).
- Experience with data modeling, data catalog concepts, data formats, and data pipeline/ETL design, implementation, and maintenance.
- Experience with Airflow and dbt is an advantage.
- Experience with Kubernetes using GKE or EKS is an advantage.
- Experience with development practices such as Agile and TDD is an advantage.

Posted 3 months ago

Apply

10 - 19 years

22 - 30 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office


Location: Chennai/Bangalore/Noida/Hyderabad. Skills: Hive, Python, Java or SQL, Hadoop, Spark, ETL, GCP (BigQuery, Cloud SQL, Dataflow, Dataproc, Cloud Build, Cloud Run, Cloud Functions, Pub/Sub, Cloud Composer), data lake, multi-cloud (e.g., Google Cloud, AWS, Azure).

Posted 3 months ago

Apply

7 - 12 years

9 - 14 Lacs

Gurgaon

Work from Office


Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have hands-on Python and SQL experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and interface directly with the customer.
Required education: Bachelor's degree. Preferred education: Master's degree.
Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience:
Intuitive individual with an ability to manage change and proven time management.
Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
Maintains up-to-date technical knowledge by attending educational workshops and reviewing publications.
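Much of a BigQuery technical analyst's day is GROUP BY aggregation; the same logic can be sketched in plain Python to show what the SQL computes. The table and column names below are hypothetical, purely for illustration:

```python
from collections import defaultdict

# Hypothetical event rows, standing in for a BigQuery table.
events = [
    {"country": "IN", "revenue": 120.0},
    {"country": "IN", "revenue": 80.0},
    {"country": "US", "revenue": 200.0},
]

# Equivalent of: SELECT country, SUM(revenue) AS total FROM events GROUP BY country
def group_by_sum(rows, key, value):
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

print(group_by_sum(events, "country", "revenue"))
# {'IN': 200.0, 'US': 200.0}
```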

Posted 3 months ago

Apply

3 - 6 years

6 - 13 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office


Role & Responsibilities: Expertise in GCP (BigQuery, Dataproc), SQL/HQL, and Linux/Unix. Experience in at least one programming language and unit testing principles. Knowledge of Hadoop-to-BigQuery migration, Apache Airflow, Kafka, and shell scripting is a plus. Proficiency in Python/PySpark, Scala, or Java and familiarity with GCP Looker are desirable. Ability to develop and optimize data pipelines in cloud-based environments.
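Hadoop-to-BigQuery migration work usually starts by translating Hive column types into BigQuery schema fields. The sketch below shows an illustrative subset of such a type map with unit-test-style checks (the posting emphasises unit testing principles); a real migration tool would cover more types (DECIMAL, MAP, STRUCT, ...):

```python
# Illustrative (incomplete) Hive -> BigQuery type map, an assumption for this sketch.
HIVE_TO_BQ = {
    "tinyint": "INT64",
    "smallint": "INT64",
    "int": "INT64",
    "bigint": "INT64",
    "float": "FLOAT64",
    "double": "FLOAT64",
    "string": "STRING",
    "boolean": "BOOL",
    "timestamp": "TIMESTAMP",
}

def translate_column(name, hive_type):
    """Translate one Hive column definition to a BigQuery schema field dict."""
    hive_type = hive_type.lower()
    if hive_type.startswith("array<") and hive_type.endswith(">"):
        # Hive ARRAY<T> maps to a REPEATED field of the element type in BigQuery.
        inner = HIVE_TO_BQ[hive_type[6:-1]]
        return {"name": name, "type": inner, "mode": "REPEATED"}
    return {"name": name, "type": HIVE_TO_BQ[hive_type], "mode": "NULLABLE"}

# Unit-test style checks.
assert translate_column("id", "bigint") == {"name": "id", "type": "INT64", "mode": "NULLABLE"}
assert translate_column("tags", "array<string>") == {"name": "tags", "type": "STRING", "mode": "REPEATED"}
```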

Posted 3 months ago

Apply

3 - 6 years

6 - 16 Lacs

Chennai, Bengaluru

Work from Office


Role & responsibilities: Working knowledge of Google Cloud Platform (GCP), specifically BigQuery and Dataproc. Hands-on experience with a query language (SQL/HQL). Experience working in Linux/Unix environments. Proficiency in at least one programming language (Python, PySpark, Scala, or Java). Solid understanding of unit testing principles. Preferred candidate profile: Exposure to pipeline migration from Hadoop to BigQuery. Familiarity with Apache Airflow for workflow automation. Knowledge of shell scripting. Experience with Kafka for real-time data streaming. Understanding of GCP Looker for data visualization.
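The "workflow automation" Airflow provides boils down to running tasks in dependency order. The stdlib `graphlib` module can sketch that scheduling idea without Airflow itself; the task names below are made up for illustration:

```python
from graphlib import TopologicalSorter

# Hypothetical DAG: extract feeds a transform and a quality check,
# and load waits on both - the ordering Airflow would enforce.
deps = {
    "transform": {"extract"},
    "quality_check": {"extract"},
    "load": {"transform", "quality_check"},
}

# static_order() yields tasks so every task appears after its dependencies.
order = list(TopologicalSorter(deps).static_order())
```

Real Airflow adds retries, scheduling intervals, and operators on top of exactly this dependency resolution.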

Posted 3 months ago

Apply

3 - 7 years

11 - 15 Lacs

Mumbai

Work from Office


A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.
Job Description - Grade Specific
A senior leadership role that entails the oversight of multiple teams or a substantial team of data platform engineers, the management of intricate data infrastructure projects, and the making of strategic decisions that shape technological direction within the realm of data platform engineering. Key responsibilities encompass:
Strategic Leadership: Leading multiple data platform engineering teams, steering substantial projects, and setting the strategic course for data platform development and operations.
Complex Project Management: Supervising the execution of intricate data infrastructure projects, ensuring alignment with client objectives and the delivery of value.
Technical and Strategic Decision-Making: Making well-informed decisions concerning data platform architecture, tools, and processes, balancing technical considerations with broader business goals.
Influencing Technical Direction: Utilising profound technical expertise in data platform engineering to influence the direction of the team and the client, driving enhancements in data platform technologies and processes.
Innovation and Contribution to the Discipline: Serving as innovators and influencers within the field of data platform engineering, contributing to the advancement of the discipline through thought leadership and the sharing of knowledge.
Leadership and Mentorship: Offering mentorship and guidance to both managers and technical personnel, cultivating a culture of excellence and innovation within the domain of data platform engineering.

Posted 3 months ago

Apply

3 - 7 years

11 - 15 Lacs

Mumbai

Work from Office

Naukri logo

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
Job Description
A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.
Job Description - Grade Specific
A senior leadership role that entails the oversight of multiple teams or a substantial team of data platform engineers, the management of intricate data infrastructure projects, and the making of strategic decisions that shape technological direction within the realm of data platform engineering. Key responsibilities encompass:
Strategic Leadership: Leading multiple data platform engineering teams, steering substantial projects, and setting the strategic course for data platform development and operations.
Complex Project Management: Supervising the execution of intricate data infrastructure projects, ensuring alignment with client objectives and the delivery of value.
Technical and Strategic Decision-Making: Making well-informed decisions concerning data platform architecture, tools, and processes, balancing technical considerations with broader business goals.
Influencing Technical Direction: Utilising profound technical expertise in data platform engineering to influence the direction of the team and the client, driving enhancements in data platform technologies and processes.
Innovation and Contribution to the Discipline: Serving as innovators and influencers within the field of data platform engineering, contributing to the advancement of the discipline through thought leadership and the sharing of knowledge.
Leadership and Mentorship: Offering mentorship and guidance to both managers and technical personnel, cultivating a culture of excellence and innovation within the domain of data platform engineering.

Posted 3 months ago

Apply

8 - 13 years

19 - 32 Lacs

Chennai, Hyderabad, Noida

Work from Office


Location: Chennai/Bangalore/Noida/Hyderabad. Skills: Hive, Python, Java or SQL, Hadoop, Spark, ETL, GCP (BigQuery, Cloud SQL, Dataflow, Dataproc, Cloud Build, Cloud Run, Cloud Functions, Pub/Sub, Cloud Composer), data lake, multi-cloud (e.g., Google Cloud, AWS, Azure).

Posted 3 months ago

Apply

7 - 11 years

4 - 8 Lacs

Bengaluru

Work from Office


Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must have skills: Google Cloud Data Services
Good to have skills: Apache Spark, Google Cloud Dataflow
Minimum 7.5 years of experience is required.
Educational Qualification: Bachelor of Engineering
Key Responsibilities:
1. Strong knowledge of GCP services such as Cloud Storage, BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub, Airflow, DAGs, etc.
2. Experience in data and analytics, including cloud technologies.
3. Experience in the Finance/Revenue domain is an added advantage.
4. Experience with GCP migration activities is an added advantage.
5. Experience in SDLC with emphasis on specifying, building, and testing mission-critical business applications.
Technical Experience:
1. Should have worked on Hadoop/big data projects with good SQL experience (Hive, BigQuery).
2. Should be comfortable with Git and Jenkins (CI/CD).
3. Should be good in Python/Hadoop/Spark.
4. Strong knowledge of GCP services, especially BigQuery and data warehouse concepts.
5. Designing, implementing, and maintaining data infrastructure and pipelines on Google Cloud Platform (GCP).
Professional Attributes:
1. Strong analytical and interpersonal communication skills.
2. Must possess impeccable communication skills, both verbal and written.
3. Proficient in identifying, analyzing, and solving problems.
4. Client-facing experience.
Additional Info: Level flex; location - Bengaluru and Gurugram ACN facilities only.
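Since the role combines Python with Hadoop/Spark, the map/reduce model those frameworks distribute across a cluster can be sketched with nothing but the standard library; the input lines here are invented for illustration:

```python
from functools import reduce
from itertools import chain

lines = ["spark makes big data simple", "big data big results"]

# Map phase: emit one (word, 1) pair per token, as a Spark flatMap/map would.
pairs = chain.from_iterable(((w, 1) for w in line.split()) for line in lines)

# Reduce phase: sum counts per key - the reduceByKey step in Spark.
def merge(counts, pair):
    word, n = pair
    counts[word] = counts.get(word, 0) + n
    return counts

word_counts = reduce(merge, pairs, {})
```

Spark's value is running these two phases in parallel across partitions; the per-record logic is exactly this simple.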

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Gurgaon

Work from Office

Naukri logo

Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must have skills: Network Infrastructures
Good to have skills: Cisco Identity Services Engine (ISE)
Minimum 5 years of experience is required.
Educational Qualification: Bachelor's degree in information technology, software engineering, computer science, or a related field.
Summary: As a Cloud Services Engineer, you will be responsible for ensuring that Cloud orchestration and automation capability operates within target SLAs with minimal downtime. Your typical day will involve acting as a liaison between the client and Accenture operations teams for support and escalations, communicating service delivery health to all stakeholders, and holding performance meetings to share performance and consumption data and trends.
Roles & Responsibilities:
Act as a liaison between the client and Accenture operations teams for support and escalations.
Communicate service delivery health to all stakeholders and explain any performance issues or risks.
Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime.
Hold performance meetings to share performance and consumption data and trends.
Professional & Technical Skills:
Primary skill: Network Infrastructures.
Good to have: Cisco Identity Services Engine (ISE).
Experience in Cloud orchestration and automation.
Experience in managing and monitoring Cloud infrastructure.
Experience in troubleshooting and resolving Cloud infrastructure issues.
Additional Information: The candidate should have a minimum of 5 years of experience in Network Infrastructures. This position is based at our Gurugram office.
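"Operating based on target SLAs with minimal downtime" reduces to a simple availability calculation over a reporting window. A small stdlib-only sketch follows; the 99.9% target and the downtime figures are illustrative assumptions, not numbers from the posting:

```python
def availability_pct(total_minutes, downtime_minutes):
    """Availability as a percentage of a reporting window."""
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes

def meets_sla(total_minutes, downtime_minutes, target_pct):
    """True when measured availability reaches the contractual target."""
    return availability_pct(total_minutes, downtime_minutes) >= target_pct

# A 30-day month has 43,200 minutes; a 99.9% SLA allows about 43.2 minutes of downtime.
month = 30 * 24 * 60
within_budget = meets_sla(month, 40, 99.9)   # 40 min downtime stays inside the budget
breached = not meets_sla(month, 50, 99.9)    # 50 min downtime breaches a 99.9% target
```

This is the arithmetic behind the "error budget" reported in the performance meetings the role describes.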

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Gurgaon

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Python (Programming Language)
Good to have skills: Google Cloud SQL, Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education
Python Data Engineer
About The Role: As a Big Data Engineer, you will be responsible for designing and developing data pipelines using Python and Dataproc/Dataflow/Composer. Your role involves performing moderate to complex data transformations and derivations. Key responsibilities and qualifications:
Duties & Responsibilities:
Design and develop data pipelines in Python.
Load data from disparate sources and preprocess it.
Provide data engineering expertise to multiple teams across the organization.
Lead efforts to document source-to-target mappings and design logical data models.
Optimize data performance and ease of use.
Collaborate with other designers and architects within Business Intelligence and IT.
Qualifications:
Relevant education (Bachelor's or Master's degree) in Computer Science, Engineering, Statistics, or a related field.
Proficiency in SQL and Python.
Familiarity with distributed systems and data architecture.
Experience working in cloud environments (e.g., GCP).
Strong grasp of algorithmic concepts in computer science.
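The source-to-target mapping documents this role maintains can double as executable pipeline logic when expressed as data. A minimal stdlib-only sketch, with entirely hypothetical column names and transformations:

```python
# Hypothetical source-to-target mapping: target column -> (source column, transformation).
MAPPING = {
    "customer_id": ("cust_no", int),        # cast the source string to an integer key
    "full_name":   ("name", str.strip),     # trim stray whitespace
    "country":     ("ctry_cd", str.upper),  # normalise country codes to upper case
}

def apply_mapping(source_row, mapping):
    """Produce a target row from a source row by applying the mapping spec."""
    return {tgt: fn(source_row[src]) for tgt, (src, fn) in mapping.items()}

row = {"cust_no": "42", "name": "  Ada Lovelace ", "ctry_cd": "in"}
target = apply_mapping(row, MAPPING)
```

Keeping the mapping as a declarative table means the documentation and the transformation code cannot drift apart.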

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies