3.0 - 8.0 years
9 - 19 Lacs
Chennai
Work from Office
Role & responsibilities:
- Develop, enhance and maintain data integration workflows to process data from different sources
- Strong development experience building data pipelines on Google Cloud Platform services
- GCP experience: BigQuery, Cloud Storage, Cloud Dataproc, Cloud Dataflow
- Strong SQL knowledge of BigQuery or any other SQL database
- Self-starter, able to leverage the resources available to quickly learn and try out new concepts and technologies
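The core task named above, integrating data from different sources into one workflow, can be sketched in plain Python. This is a conceptual illustration only, not tied to any GCP service; the source names and fields are invented for the example.

```python
import csv
import io

# Two hypothetical sources describing the same customers in different shapes.
crm_csv = "customer_id,name\n1,Asha\n2,Ravi\n"
billing_csv = "cust,total_spend\n1,1200\n2,450\n3,80\n"

def load(text, key_field):
    """Parse CSV text into a dict of rows keyed by the given field."""
    return {row[key_field]: row for row in csv.DictReader(io.StringIO(text))}

def integrate(crm, billing):
    """Left-join billing records onto CRM records on the customer key."""
    merged = []
    for cid, row in crm.items():
        spend = billing.get(cid, {}).get("total_spend")
        merged.append({"customer_id": cid, "name": row["name"], "total_spend": spend})
    return merged

records = integrate(load(crm_csv, "customer_id"), load(billing_csv, "cust"))
print(records)
```

In a real pipeline the `load` step would read from Cloud Storage or a database and the result would land in BigQuery, but the join-on-a-shared-key shape stays the same.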
Posted Date not available
2.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!

IBM's Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. This encompasses Cloud Advisory, Architecture, Cloud Native Development, Application Portfolio Migration, Modernization and Rationalization, as well as Cloud Operations. Cloud Services supports all public/private/hybrid cloud deployments: IBM Bluemix/IBM Cloud/Red Hat/AWS/Azure/Google and client private environments. Cloud Services has the best cloud developer, architect, complex SI, SysOps and delivery talent, delivered through our GEO CIC Factory model.

As a member of our Cloud Practice you will be responsible for defining and implementing application cloud migration, modernisation and rationalisation solutions for clients across all sectors. You will support mobilisation, help to lead the quality of our programmes and services, liaise with clients, and provide consulting services, including:
- Creating cloud migration strategies: defining the delivery architecture, creating migration plans, designing orchestration plans and more
- Assisting in creating and executing migration run books
- Evaluating source (physical, virtual and cloud) and target workloads

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc and Cloud Functions
- Cloud data engineers with GCP PDE certification and working experience with GCP
- Experience in logging and monitoring of GCP services; experience in Terraform and infrastructure automation
- Expertise in the Python coding language
- Develops, supports and maintains data engineering solutions on the Google Cloud ecosystem

Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices
- Troubleshoot and debug issues and deploy applications to the cloud platform
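The publish/subscribe pattern behind the Pub/Sub pipelines this listing asks for can be illustrated with Python's standard library alone. This is a minimal in-process sketch of the pattern, not the google-cloud-pubsub client API, and the subscriber roles are hypothetical.

```python
import queue

class Topic:
    """Minimal stand-in for a pub/sub topic: publishers push messages,
    and every subscriber's queue receives each message (fan-out)."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self):
        q = queue.Queue()
        self.subscribers.append(q)
        return q

    def publish(self, message):
        for q in self.subscribers:
            q.put(message)

topic = Topic()
ingest = topic.subscribe()   # e.g. a Dataflow-style ingestion consumer
audit = topic.subscribe()    # e.g. an audit/logging consumer

topic.publish({"event": "order_created", "id": 42})

# Both subscribers see the published message.
m1, m2 = ingest.get(), audit.get()
print(m1["id"], m2["event"])
```

The managed service adds durability, acknowledgements and horizontal scaling on top of this basic fan-out shape.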
Posted Date not available
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative and able to respond to critical situations. Ability to analyse data for functional business requirements and front-face the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Up-to-date technical knowledge from attending educational workshops and reviewing publications
Posted Date not available
3.0 - 8.0 years
7 - 17 Lacs
Chennai
Work from Office
Role & responsibilities:
- 5+ years of relevant work experience and increasing responsibility
- Develop, enhance and maintain data integration workflows to process data from different sources
- Strong development experience building data pipelines on Google Cloud Platform services
- GCP experience: BigQuery, Cloud Storage, Cloud Dataproc, Cloud Dataflow
- Strong SQL knowledge of BigQuery or any other SQL database
- Experience in writing scripts in PySpark/Python
- Ability to work independently in a semi-structured environment where applications often have frequently changing or vaguely defined requirements
- Knowledge of Astronomer Airflow
- Well versed in Agile solution development methodologies
- Self-starter, able to leverage the resources available to quickly learn and try out new concepts and technologies
- Organizational, analytical and problem-solving skills
- Strong interpersonal, communication and presentation skills
Posted Date not available
3.0 - 8.0 years
8 - 18 Lacs
Chennai
Work from Office
Role & responsibilities:
- 5+ years of relevant work experience and increasing responsibility
- Develop, enhance and maintain data integration workflows to process data from different sources
- Strong development experience building data pipelines on Google Cloud Platform services
- GCP experience: BigQuery, Cloud Storage, Cloud Dataproc, Cloud Dataflow
- Strong SQL knowledge of BigQuery or any other SQL database
- Experience in writing scripts in PySpark/Python
- Ability to work independently in a semi-structured environment where applications often have frequently changing or vaguely defined requirements
- Knowledge of Astronomer Airflow
- Well versed in Agile solution development methodologies
- Self-starter, able to leverage the resources available to quickly learn and try out new concepts and technologies
- Organizational, analytical and problem-solving skills
- Strong interpersonal, communication and presentation skills
Posted Date not available
2.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and solving them within the defined SLAs
- Continuous learning and technology integration: being eager to learn new technologies and implementing them in feature development

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc and Cloud Functions
- Cloud data engineers with GCP PDE certification and working experience with GCP
- Experience in logging and monitoring of GCP services; experience in Terraform and infrastructure automation
- Expertise in the Python coding language
- Develops, supports and maintains data engineering solutions on the Google Cloud ecosystem

Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices
- Troubleshoot and debug issues and deploy applications to the cloud platform
Posted Date not available
5.0 - 7.0 years
13 - 17 Lacs
Kochi
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative and able to respond to critical situations. Ability to analyse data for functional business requirements and front-face the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Up-to-date technical knowledge from attending educational workshops and reviewing publications
Posted Date not available
3.0 - 7.0 years
11 - 15 Lacs
Mumbai
Work from Office
A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.

Job Description - Grade Specific: A senior leadership role that entails the oversight of multiple teams or a substantial team of data platform engineers, the management of intricate data infrastructure projects, and the making of strategic decisions that shape technological direction within the realm of data platform engineering. Key responsibilities encompass:
- Strategic leadership: leading multiple data platform engineering teams, steering substantial projects, and setting the strategic course for data platform development and operations
- Complex project management: supervising the execution of intricate data infrastructure projects, ensuring alignment with client objectives and the delivery of value
- Technical and strategic decision-making: making well-informed decisions concerning data platform architecture, tools, and processes, balancing technical considerations with broader business goals
- Influencing technical direction: utilising profound technical expertise in data platform engineering to influence the direction of the team and the client, driving enhancements in data platform technologies and processes
- Innovation and contribution to the discipline: serving as an innovator and influencer within the field of data platform engineering, contributing to the advancement of the discipline through thought leadership and the sharing of knowledge
- Leadership and mentorship: offering mentorship and guidance to both managers and technical personnel, cultivating a culture of excellence and innovation within the domain of data platform engineering
Posted Date not available
15.0 - 20.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Adobe Experience Manager (AEM) UI Development, Adobe Analytics, Java
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to meet specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing user interfaces, and ensuring that applications function seamlessly to enhance user experience and meet organizational goals. You will also participate in testing and troubleshooting to ensure high-quality deliverables, while continuously seeking opportunities for improvement in application performance and user satisfaction.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation/contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Assist in the documentation of application specifications and user guides
- Engage in code reviews to ensure adherence to best practices and standards

Professional & Technical Skills:
- Must-have skills: proficiency in Adobe Experience Manager (AEM) UI Development, Java, Adobe Analytics, Google Analytics, SQL, GCP, BigQuery and Spark
- Good-to-have skills: experience with front-end technologies such as HTML, CSS, and JavaScript
- Strong understanding of content management systems and their implementation
- Experience in integrating third-party services and APIs
- Familiarity with agile development methodologies and tools
- Working knowledge of Adobe Marketing Suite, campaign tracking and Adobe Analytics
- Strong programming knowledge of BigQuery, Cloud SQL, Dataproc and PySpark
- Understanding of and experience with UNIX/Shell/Perl/Python scripting

Additional Information:
- The candidate should have a minimum of 3 years of experience in Adobe Experience Manager (AEM) UI Development
- This position is based in Chennai
- A 15 years full time education is required
Posted Date not available
8.0 - 13.0 years
35 - 50 Lacs
Bengaluru, Delhi / NCR
Work from Office
Experience: 8+ years. Skill: Data Solutions Architect, Google Cloud Platform. Architect and design end-to-end data solutions leveraging a wide array of GCP services, including BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud Composer and Data Catalog.
Posted Date not available
15.0 - 20.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Adobe Experience Manager (AEM) UI Development, Adobe Analytics, Java
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to meet specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing user interfaces, and ensuring that applications function seamlessly to enhance user experience and operational efficiency. You will also participate in testing and troubleshooting to ensure that the applications meet the highest standards of quality and performance.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation/contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Assist in the documentation of application specifications and user guides
- Engage in continuous learning to stay updated with the latest technologies and best practices

Professional & Technical Skills:
- Must-have skills: proficiency in Adobe Experience Manager (AEM) UI Development, Java, Adobe Analytics, Google Analytics, SQL, GCP, BigQuery and Spark
- Good-to-have skills: experience with front-end technologies such as HTML, CSS, and JavaScript
- Strong understanding of content management systems and their implementation
- Experience in integrating third-party services and APIs
- Familiarity with agile development methodologies and tools
- Working knowledge of Adobe Marketing Suite, campaign tracking and Adobe Analytics
- Strong programming knowledge of BigQuery, Cloud SQL, Dataproc and PySpark
- Understanding of and experience with UNIX/Shell/Perl/Python scripting

Additional Information:
- The candidate should have a minimum of 3 years of experience in Adobe Experience Manager (AEM) UI Development
- This position is based at our Chennai office
- A 15 years full time education is required
Posted Date not available
15.0 - 20.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Adobe Experience Manager (AEM) UI Development, Adobe Analytics, Java
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing user interfaces, and ensuring that applications function seamlessly. You will also engage in problem-solving activities, providing innovative solutions to enhance application performance and user experience, while maintaining a focus on quality and efficiency throughout the development process.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Mentor junior team members to enhance their skills and knowledge
- Continuously evaluate and improve application performance and user experience

Professional & Technical Skills:
- Must-have skills: proficiency in Adobe Experience Manager (AEM) UI Development, Java, Adobe Analytics, Google Analytics, SQL, GCP, BigQuery and Spark
- Good-to-have skills: experience with front-end technologies such as HTML, CSS, and JavaScript
- Strong understanding of content management systems and their implementation
- Experience in integrating third-party services and APIs
- Familiarity with agile development methodologies and tools
- Working knowledge of Adobe Marketing Suite, campaign tracking and Adobe Analytics
- Strong programming knowledge of BigQuery, Cloud SQL, Dataproc and PySpark
- Understanding of and experience with UNIX/Shell/Perl/Python scripting

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Adobe Experience Manager (AEM) UI Development
- This position is based at our Chennai office
- A 15 years full time education is required
Posted Date not available
3.0 - 7.0 years
11 - 15 Lacs
Bengaluru
Work from Office
A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.

Job Description - Grade Specific: A strong grasp of the principles and practices associated with data platform engineering, particularly within cloud environments, and demonstrated proficiency in specific technical areas related to cloud-based data infrastructure, automation, and scalability. Key responsibilities encompass:
- Community engagement: actively participating in the professional data platform engineering community, sharing insights, and staying up to date with the latest trends and best practices
- Project contributions: making substantial contributions to client delivery, particularly in the design, construction, and maintenance of cloud-based data platforms and infrastructure
- Technical expertise: demonstrating a sound understanding of data platform engineering principles and knowledge in areas such as cloud data storage solutions (e.g., AWS S3, Azure Data Lake), data processing frameworks (e.g., Apache Spark), and data orchestration tools
- Independent work and initiative: taking ownership of independent tasks, displaying initiative and problem-solving skills when confronted with intricate data platform engineering challenges
- Emerging leadership: commencing leadership roles, which may encompass mentoring junior engineers, leading smaller project teams, or taking the lead on specific aspects of data platform projects
Posted Date not available
8.0 - 13.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Skill: Extensive experience with Google data products (Cloud Data Fusion, BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Dataprep, etc.). Expertise in Cloud Data Fusion, BigQuery and Dataproc. Experience in MDM, metadata management, data quality and data lineage tools. End-to-end data engineering and lifecycle management (including non-functional requirements and operations). Experience with SQL and NoSQL modern data stores. End-to-end solution design skills: prototyping, usability testing and data visualization literacy. Excellent knowledge of the software development life cycle.
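The data quality tooling named above ultimately reduces to rule checks over records. A minimal, tool-agnostic sketch in plain Python (the field names and rules are hypothetical):

```python
def check_quality(rows, required_fields):
    """Return a report of rows failing basic completeness checks
    (missing or empty required fields) -- the simplest data quality rule."""
    failures = []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            failures.append({"row": i, "missing": missing})
    return failures

rows = [
    {"id": "1", "email": "a@x.com"},
    {"id": "2", "email": ""},      # empty value: should be flagged
    {"email": "c@x.com"},          # missing id: should be flagged
]
issues = check_quality(rows, ["id", "email"])
print(issues)
```

Products like Cloud Data Fusion express the same idea declaratively, adding lineage tracking and scheduling around rule sets of this kind.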
Posted Date not available
7.0 - 12.0 years
6 - 9 Lacs
Hyderabad
Work from Office
Understanding of Spark core concepts: RDDs, DataFrames, Datasets, Spark SQL and Spark Streaming. Experience with Spark optimization techniques. Deep knowledge of Delta Lake features such as time travel, schema evolution and data partitioning. Ability to design and implement data pipelines using Spark with Delta Lake as the data storage layer. Proficiency in Python/Scala/Java for Spark development and integration with ETL processes. Knowledge of data ingestion techniques from various sources (flat files, CSV, API, database). Understanding of data quality best practices and data validation techniques.

Other skills: understanding of data warehouse concepts and data modelling techniques. Expertise in Git for code management. Familiarity with CI/CD pipelines and containerization technologies. Nice to have: experience using data integration tools like DataStage/Prophecy/Informatica/Ab Initio.
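Data partitioning, one of the Delta Lake features listed above, can be illustrated conceptually in plain Python. This mimics the hive-style partition layout (e.g. `country=IN/` directories) only; it is not the actual Spark or Delta implementation, and the column names are invented.

```python
from collections import defaultdict

def partition_by(rows, key):
    """Group rows into partitions by a column value, analogous to how
    Spark/Delta lay files out under key=value partition directories."""
    parts = defaultdict(list)
    for row in rows:
        parts[row[key]].append(row)
    return dict(parts)

rows = [
    {"order": 1, "country": "IN"},
    {"order": 2, "country": "US"},
    {"order": 3, "country": "IN"},
]
parts = partition_by(rows, "country")

# Partition pruning: a query filtered on country touches only one group,
# which is why choosing the partition column well matters for performance.
print(sorted(parts), [r["order"] for r in parts["IN"]])
```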
Posted Date not available
6.0 - 11.0 years
6 - 9 Lacs
Hyderabad
Work from Office
At least 8+ years of experience in any of the ETL tools (Prophecy, DataStage 11.5/11.7, Pentaho, etc.). At least 3 years of experience in PySpark with GCP (Airflow, Dataproc, BigQuery), capable of configuring data pipelines. Strong experience in writing complex SQL queries to perform data analysis on databases such as SQL Server, Oracle and Hive. Technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools). Ability to work independently on specialized assignments within the context of project deliverables. Take ownership of providing solutions and tools that iteratively increase engineering efficiency. Designs should help embed standard processes, systems and operational models into the BAU approach for end-to-end execution of data pipelines. Proven problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details and apply sound business and technical domain knowledge. Communicate openly and honestly; advanced oral, written and visual communication and presentation skills, with the ability to communicate efficiently at a global level being paramount. Ability to deliver materials of the highest quality to management against tight deadlines. Ability to work effectively under pressure with competing and rapidly changing priorities.
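The "complex SQL queries for data analysis" requirement can be demonstrated end to end with Python's built-in sqlite3. The table and values here are made up for illustration; the GROUP BY / HAVING pattern looks essentially the same in BigQuery, Hive or Oracle.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'south', 120.0), (2, 'south', 80.0),
        (3, 'north', 300.0), (4, 'north', 40.0), (5, 'east', 10.0);
""")

# Aggregate per region, keep only regions above a revenue threshold,
# and order by revenue descending.
query = """
    SELECT region, COUNT(*) AS n, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
    HAVING SUM(amount) > 100
    ORDER BY revenue DESC
"""
result = conn.execute(query).fetchall()
print(result)
```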
Posted Date not available
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
Your Role: IT experience with a minimum of 5+ years in creating data warehouses, data lakes, ETL/ELT and data pipelines on cloud. Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP; Life Sciences domain experience preferred. Experience with cloud storage, cloud database, cloud data warehousing and data lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS and S3. Experience in using cloud data integration services for structured, semi-structured and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow and Dataproc. Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment versus performance and scaling.

Your Profile: Able to contribute to making architectural choices using various cloud services and solution methodologies. Expertise in programming using Python. Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud. Must understand networking, security, design principles and best practices in cloud. Knowledge of IoT and real-time streaming would be an added advantage. Lead architectural/technical discussions with the client. Excellent communication and presentation skills.
Posted Date not available
3.0 - 8.0 years
15 - 25 Lacs
Hyderabad, Chennai
Hybrid
Job Title: GCP Data Engineer (BigQuery, Airflow, SQL, Python, dbt)
Experience Required: 3+ years
Location: Chennai / Hyderabad (preferred; the 2nd round will be face-to-face)
Notice Period: Immediate joiners preferred / candidates with a 30-day notice period (serving notice welcome)
Employment Type: Full-time

Job Description: We are looking for a skilled GCP Data Engineer with strong hands-on experience in BigQuery, Airflow, SQL, Python, and dbt to work on high-impact data engineering projects.

Key Responsibilities:
- Design, develop, and optimize data pipelines on GCP
- Work with BigQuery for data warehousing and analytics
- Orchestrate workflows using Airflow
- Develop and maintain data transformation scripts using Python and dbt
- Collaborate with analytics and business teams to deliver data solutions
- Ensure best practices in performance optimization, data quality, and security

Required Skills & Experience:
- Minimum 3 years of experience as a Data Engineer
- Hands-on experience with Google Cloud Platform services
- Strong SQL skills
- Experience with Airflow for job scheduling/orchestration
- Expertise in Python scripting for data processing
- Experience with dbt for data transformation
- Strong problem-solving and communication skills

Interview Process: 3 technical rounds; the 2nd round will be face-to-face at the Chennai or Hyderabad office.

How to Apply: Interested candidates (Chennai / Hyderabad profiles preferred) can share their CV to ngongadala@randomtrees.com with the subject line "GCP Data Engineer – Chennai/Hyderabad".
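Airflow orchestration, required above, is at heart a DAG of tasks executed in dependency order. Python's standard-library graphlib shows the idea; the task names are hypothetical and this is not the Airflow API itself.

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on -- the same dependency
# structure Airflow expresses with `extract >> transform >> load`.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

# A scheduler must run tasks in an order where every dependency
# finishes first; topological sorting produces exactly that order.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Airflow adds scheduling, retries, and distributed execution on top, but the execution-order guarantee comes from this same topological ordering.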
Posted Date not available
5.0 - 8.0 years
2 - 6 Lacs
Mumbai
Work from Office
Graph Data Engineer required for a complex supply chain project.

Key required skills:
- Graph data modelling: experience with graph data models (LPG, RDF) and a graph query language (Cypher); exposure to various graph data modelling techniques
- Experience with Neo4j Aura and optimizing complex queries
- Experience with the GCP stack: BigQuery, GCS, Dataproc
- Experience in PySpark and Spark SQL is desirable
- Experience in exposing graph data to visualisation tools such as NeoDash, Tableau and Power BI

The Expertise You Have:
- Bachelor's or Master's degree in a technology-related field (e.g. Engineering, Computer Science)
- Demonstrable experience implementing data solutions in the graph DB space
- Hands-on experience with graph databases (Neo4j preferred, or any other)
- Experience tuning graph databases
- Understanding of graph data model paradigms (LPG, RDF) and graph query languages; hands-on experience with Cypher is required
- Solid understanding of graph data modelling, graph schema development and graph data design
- Relational database experience; hands-on SQL experience is required

Desirable (optional) skills: data ingestion technologies (ETL/ELT), messaging/streaming technologies (GCP Data Fusion, Kinesis/Kafka), API and in-memory technologies. Understanding of developing highly scalable distributed systems using open-source technologies. Experience with supply chain data is desirable but not essential.

Location: Pune, Mumbai, Chennai, Bangalore, Hyderabad
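The labeled property graph (LPG) model this role centres on can be sketched with plain Python structures. The node and relationship names below are invented, and the Cypher quoted in the comment is only the rough Neo4j equivalent of the traversal.

```python
# In an LPG, nodes carry labels and properties, and relationships are
# typed and directed -- the defining traits of the Neo4j data model.
nodes = {
    "s1": {"label": "Supplier", "props": {"name": "Acme"}},
    "p1": {"label": "Part", "props": {"name": "bolt"}},
    "p2": {"label": "Part", "props": {"name": "nut"}},
}
rels = [("s1", "SUPPLIES", "p1"), ("s1", "SUPPLIES", "p2")]

def neighbours(node_id, rel_type):
    """Traverse outgoing relationships of one type; roughly what
    MATCH (s:Supplier)-[:SUPPLIES]->(p:Part) RETURN p.name does in Cypher."""
    return [nodes[dst]["props"]["name"]
            for src, typ, dst in rels
            if src == node_id and typ == rel_type]

print(neighbours("s1", "SUPPLIES"))
```

A graph database makes exactly this kind of multi-hop traversal an indexed, first-class operation instead of a chain of relational joins, which is why supply chain questions (who supplies the parts of the parts?) map to it naturally.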
Posted Date not available
2.0 - 4.0 years
4 - 8 Lacs
Mumbai
Work from Office
About The Role
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery, Microsoft SQL Server, GitHub, Google Cloud Data Services
Good to have skills: No technology specialization
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Secondary Project Role: Analytics and Modeler. Analyze and model client, market and key performance data; use analytical tools and techniques to develop business insights and improve decision-making.

Roles & Responsibilities:
1. Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX)
2. Proven track record of delivering data integration and data warehousing solutions
3. Strong hands-on SQL (No FLEX)
4. Experience with data integration and migration projects
5. Proficient in BigQuery SQL (No FLEX)
6. Understanding of cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer and Kubernetes
7. Experience in cloud solutions, mainly data platform services; GCP certifications
8. Experience in shell scripting, Python (No FLEX), Oracle, SQL

Professional & Technical Skills:
1. Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage is preferred
2. Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (No FLEX)
3. Proficiency with tools to automate Azure DevOps CI/CD pipelines, e.g. Control-M, GitHub, JIRA, Confluence
4. Open mindset and the ability to quickly adopt new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment

Professional Attributes:
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision, or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills

Educational Qualification: 15 years of full-time education
Additional Information: Candidate should be ready for Shift B and to work as an individual contributor
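The extract-transform-load flow this role centres on can be sketched in a few lines of standard-library Python. Here sqlite3 stands in for the actual warehouse (BigQuery in the posting), and the table, column names and filter rule are invented for the illustration.

```python
# Minimal ETL sketch: extract from a source, apply a transformation rule,
# load into a warehouse table. sqlite3 is a stand-in for BigQuery; the
# schema and the ">= 100 events" rule are illustrative assumptions.
import sqlite3

def extract():
    # In practice this would read from source systems or GCS; here, inline rows.
    return [("2024-01-01", "orders", 120), ("2024-01-02", "orders", 95)]

def transform(rows):
    # Example quality rule: keep only days with at least 100 events.
    return [r for r in rows if r[2] >= 100]

def load(rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE daily_counts (day TEXT, source TEXT, n INTEGER)")
    conn.executemany("INSERT INTO daily_counts VALUES (?, ?, ?)", rows)
    return conn

conn = load(transform(extract()))
print(conn.execute("SELECT COUNT(*) FROM daily_counts").fetchone()[0])  # 1
```

A production pipeline would replace each stage with managed services (Pub/Sub or GCS for extract, Dataflow for transform, BigQuery for load, Composer for orchestration), but the three-stage shape is the same.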
4.0 - 9.0 years
10 - 20 Lacs
bengaluru
Work from Office
Job Title: GCP Data Engineer
Work Mode: Onsite, 4 days a week
Base Location: Bangalore
Experience Required: 4+ years

Job Summary:
We are looking for a GCP Data Engineer with strong expertise in BigQuery and hands-on experience building scalable data pipelines and analytical solutions on Google Cloud Platform. The ideal candidate will have a solid background in data modeling, ETL/ELT processes, and performance optimization, with exposure to other GCP data services such as Dataflow, Pub/Sub, and Dataproc.

Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL processes using BigQuery as the primary data warehouse.
- Implement data models, partitioning, and clustering strategies for high-performance analytics.
- Work with Dataflow, Pub/Sub, and Composer (Airflow) for real-time and batch processing pipelines.
- Collaborate with cross-functional teams to integrate data solutions into business workflows.
- Ensure data quality, governance, and security standards are met.
- Perform performance tuning and cost optimization for BigQuery and related GCP services.
- Troubleshoot and resolve production data issues, ensuring reliability and scalability.

Required Skills:
- 4+ years of experience in data engineering with a strong focus on BigQuery.
- Proficiency in SQL and experience with Python for data processing.
- Hands-on experience with GCP services: Dataflow, Pub/Sub, Dataproc, Composer.
- Strong understanding of data modeling, partitioning, and performance optimization.
- Knowledge of CI/CD pipelines and version control tools (Git).

Preferred Skills:
- Experience with streaming data pipelines and real-time analytics.
- Exposure to Terraform or other Infrastructure as Code (IaC) tools.
- Familiarity with data lake architectures and hybrid data solutions.
- Basic understanding of ML pipelines on GCP (e.g., Vertex AI) is a plus.

Tech Stack:
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Dataproc, Composer)
- Programming: Python, SQL
- Orchestration: Airflow (Composer)
- Version Control / CI/CD: Git, Cloud Build, Jenkins

Focus Areas: BigQuery optimization, scalable data pipelines, GCP ecosystem integration
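The partitioning and clustering strategy this role calls for is typically expressed in BigQuery DDL. Below is a hedged illustration held in a Python string; the dataset, table and column names are invented for the example, but the PARTITION BY / CLUSTER BY / OPTIONS syntax is standard BigQuery.

```python
# BigQuery DDL sketch of a date-partitioned, clustered table.
# Dataset/table/column names are illustrative assumptions.
ddl = """
CREATE TABLE my_dataset.events (
  event_ts TIMESTAMP,
  customer_id STRING,
  event_type STRING
)
PARTITION BY DATE(event_ts)         -- lets the engine prune scanned bytes per query
CLUSTER BY customer_id, event_type  -- co-locates rows on frequently filtered keys
OPTIONS (partition_expiration_days = 90);
"""
print("PARTITION BY" in ddl and "CLUSTER BY" in ddl)  # True
```

Queries that filter on `DATE(event_ts)` then scan only the matching partitions, which is the main lever for both the performance tuning and the cost optimization mentioned in the responsibilities.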
15.0 - 20.0 years
35 - 40 Lacs
pune
Work from Office
Job Title: Lead Engineer
Location: Pune, India

Role Description:
The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the bank

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your Key Responsibilities:
The candidate is expected to:
- Act as a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities
- Champion engineering best practices and guide/mentor the team to achieve high performance
- Work closely with business stakeholders, the Tribe Lead, the Product Owner and the Lead Architect to successfully deliver the business outcomes
- Acquire functional knowledge of the business capability being digitized/re-engineered
- Demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success

Your Skills & Experience:
- Minimum 15 years of IT industry experience in full-stack development
- Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
- Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience with Kubernetes and the OpenShift container platform
- Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization
- Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience working on public cloud (GCP preferred; AWS or Azure)
- Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
- Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation
- Experience leading teams and mentoring developers
- Focus on quality: experience with TDD, BDD, stress and contract tests
- Proficient in working with APIs (Application Programming Interfaces) and familiar with data formats like JSON, XML, YAML, Parquet, etc.

Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS

Advantageous:
- Prior experience in the Banking/Finance domain
- Experience with hybrid cloud solutions, preferably using GCP
- Experience with product development

How we'll support you:
About us and our teams: please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.
5.0 - 7.0 years
7 - 9 Lacs
gurugram
Work from Office
Skills:
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations
- Ability to analyse data for functional business requirements and to interface directly with the customer

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Up-to-date technical knowledge gained by attending educational workshops and reviewing publications