
73 Cloud Composer Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

12 - 22 Lacs

Chennai

Work from Office

Role & Responsibilities

Job Summary: As a GCP Data Engineer, you will be responsible for developing, optimizing, and maintaining data pipelines and infrastructure. Your expertise in SQL and Python will be instrumental in managing and transforming data, and your familiarity with cloud technologies will be an asset as we explore opportunities to enhance our data engineering processes.

Job Description:
- Building scalable data pipelines: Design, implement, and maintain end-to-end data pipelines to efficiently extract, transform, and load (ETL) data from diverse sources. Ensure data pipelines are reliable, scalable, and performance-oriented.
- SQL expertise: Write and optimize complex SQL queries for data extraction, transformation, and reporting. Collaborate with analysts and data scientists to provide structured data for analysis.
- Cloud platform experience: Utilize cloud services to enhance data processing and storage capabilities, and work towards integrating new tools into the data ecosystem.
- Documentation and collaboration: Document data pipelines, procedures, and best practices to facilitate knowledge sharing. Collaborate closely with cross-functional teams to understand data requirements and deliver solutions.

Required skills:
- 4+ years of experience with SQL and Python; 4+ years with GCP BigQuery, DataFlow, GCS, and Dataproc.
- 4+ years of experience building data pipelines from scratch in a highly distributed and fault-tolerant manner.
- Comfortable with a broad array of relational and non-relational databases.
- Proven track record of building applications in a data-focused role (cloud and traditional data warehouse).
- Experience with CloudSQL, Cloud Functions, Pub/Sub, Cloud Composer, etc.
- Inquisitive, proactive, and interested in learning new tools and techniques.
- Familiarity with big data and machine learning tools and platforms; comfortable with open-source technologies including Apache Spark, Hadoop, and Kafka.
- Strong oral, written, and interpersonal communication skills.
- Comfortable working in a dynamic environment where problems are not always well-defined.

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
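For context, a minimal sketch of the kind of Cloud Composer (Airflow) pipeline this role describes, loading files from GCS into BigQuery and then transforming them, might look like the following; every DAG, bucket, dataset, and table name here is a hypothetical placeholder, not part of the posting.

```python
"""Minimal Cloud Composer (Airflow) DAG sketch: stage a daily CSV drop from GCS
into a BigQuery staging table, then transform it into a curated table.
All names are placeholders."""
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_ingest",          # hypothetical pipeline name
    schedule_interval="0 2 * * *",        # run every day at 02:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Stage raw files from a GCS landing bucket into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_to_staging",
        bucket="example-landing-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="example_project.staging.sales_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staged data into a curated reporting table.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_curated",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `example_project.curated.sales`
                    SELECT CAST(order_id AS INT64) AS order_id,
                           DATE(order_ts) AS order_date,
                           amount
                    FROM `example_project.staging.sales_raw`
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

Splitting the load and transform into separate tasks keeps retries cheap and leaves the staging table easy to inspect when a run fails.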

Posted 1 day ago

Apply

10.0 - 14.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Role Overview: You are required to work as a GCP Data Architect with 12+ years of total experience, of which at least 10 years should be relevant to this engagement. Your primary responsibilities will include maintaining architecture principles, guidelines, and standards, data warehousing, programming in Python/Java, and working with Big Data, Data Analytics, and GCP services. You will design and implement solutions across technology domains related to Google Cloud Platform data components such as BigQuery, BigTable, CloudSQL, Dataproc, Dataflow, Data Fusion, etc.

Key Responsibilities:
- Maintain architecture principles, guidelines, and standards
- Work on data warehousing projects
- Program in Python and Java for various data-related tasks
- Utilize Big Data technologies for data processing and analysis
- Implement solutions using GCP services such as BigQuery, BigTable, CloudSQL, Dataproc, Dataflow, Data Fusion, etc.

Qualifications Required:
- Strong experience in Big Data, including data modeling, design, architecting, and solutioning
- Proficiency in programming languages such as SQL, Python, and R/Scala
- Good Python skills, with experience in data visualization tools such as Google Data Studio or Power BI
- Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, Agile development, DevOps, data engineering, and ETL data processing
- Migration experience of a production Hadoop cluster to Google Cloud will be an added advantage

Additional Company Details: The company is looking for individuals who are experts in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, CloudSQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc. Relevant certifications such as Google Professional Cloud Architect are preferred.

Posted 5 days ago

Apply

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join Us
At Vodafone, we're not just shaping the future of connectivity for our customers - we're shaping the future for everyone who joins our team. When you work with us, you're part of a global mission to connect people, solve complex challenges, and create a sustainable and more inclusive world. If you want to grow your career whilst finding the perfect balance between work and life, Vodafone offers the opportunities to help you belong and make a real impact.

What you'll do
- Conduct end-to-end impact assessments across all subject areas for new demands.
- Create and maintain comprehensive data architecture documentation, including data models, flow diagrams, and technical specifications.
- Design and implement data pipelines integrating multiple sources, ensuring consistency and quality.
- Collaborate with business stakeholders to align data strategies with organisational goals.
- Support software migration and perform production checks.
- Govern the application of architecture principles within projects.
- Manage database refresh and decommissioning programmes while maintaining service availability.
- Ensure correct database configuration and documentation of infrastructure changes.
- Support third-level supplier engineering teams in root cause analysis and remediation.
- Propose system enhancements and innovative solutions.

Who you are
You are a detail-oriented and collaborative professional with a strong foundation in data architecture and cloud technologies. You possess excellent communication skills and are comfortable working with both technical and non-technical stakeholders. You are passionate about creating scalable data solutions and contributing to a culture of continuous improvement.

What skills you need
- Strong knowledge of Teradata systems and related products.
- Proficiency in SQL and data modelling concepts.
- Experience with GCP tools including Cloud Composer, BigQuery, Pub/Sub, and Cloud Functions.
- Proven ability to communicate complex data concepts effectively.
- Experience in IT infrastructure management environments.
- Ability to influence stakeholders and drive customer satisfaction.

What skills you will learn
- Advanced cloud architecture and data governance practices.
- Cross-functional collaboration and stakeholder engagement.
- Innovation in data pipeline design and optimisation.
- Exposure to global BI projects and scalable data solutions.
- Enhanced leadership and decision-making capabilities.

Not a perfect fit?
Worried that you don't meet all the desired criteria exactly? At Vodafone we are passionate about empowering people and creating a workplace where everyone can thrive, whatever their personal or professional background. If you're excited about this role but your experience doesn't align exactly with every part of the job description, we encourage you to still apply, as you may be the right candidate for this role or another opportunity.

What's in it for you

Who we are
We are a leading international Telco, serving millions of customers. At Vodafone, we believe that connectivity is a force for good. If we use it for the things that really matter, it can improve people's lives and the world around us. Through our technology we empower people, connecting everyone regardless of who they are or where they live, and we protect the planet whilst helping our customers do the same. Belonging at Vodafone isn't a concept: it's lived, breathed, and cultivated through everything we do. You'll be part of a global and diverse community, with many different minds, abilities, backgrounds and cultures.
We're committed to increasing diversity, ensuring equal representation, and making Vodafone a place where everyone feels safe, valued and included. If you require any reasonable adjustments or have an accessibility request as part of your recruitment journey, for example, extended time or breaks in between online assessments, please refer to for guidance. Together we can.

Posted 5 days ago

Apply

1.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a GCP Data Engineer, you will play a crucial role in the development, optimization, and maintenance of data pipelines and infrastructure. Your proficiency in SQL and Python will be pivotal in the management and transformation of data. Moreover, your familiarity with cloud technologies will be highly beneficial as we strive to improve our data engineering processes.

You will be responsible for building scalable data pipelines. This involves designing, implementing, and maintaining end-to-end data pipelines to efficiently extract, transform, and load (ETL) data from various sources. It is essential to ensure that these data pipelines are reliable, scalable, and performance-oriented.

Your expertise in SQL will be put to use as you write and optimize complex SQL queries for data extraction, transformation, and reporting purposes. Collaboration with analysts and data scientists will be necessary to provide structured data for analysis. Experience with cloud platforms, particularly GCP services such as BigQuery, DataFlow, GCS, and Postgres, will be valuable. Leveraging cloud services to enhance data processing and storage capabilities, as well as integrating tools into the data ecosystem, will be part of your responsibilities.

Documenting data pipelines, procedures, and best practices will be essential for knowledge sharing within the team. You will collaborate closely with cross-functional teams to understand data requirements and deliver effective solutions.

The ideal candidate for this role should have at least 3 years of experience with SQL and Python, along with a minimum of 1 year of experience with GCP services like BigQuery, DataFlow, GCS, and Postgres. Additionally, 2+ years of experience in building data pipelines from scratch in a highly distributed and fault-tolerant manner is required. Comfort with a variety of relational and non-relational databases is essential. Proven experience in building applications in a data-focused role, both in Cloud and Traditional Data Warehouse environments, is preferred. Familiarity with CloudSQL, Cloud Functions, Pub/Sub, Cloud Composer, and a willingness to learn new tools and techniques are desired qualities. Furthermore, being comfortable with big data and machine learning tools and platforms, including open-source technologies like Apache Spark, Hadoop, and Kafka, will be advantageous. Strong oral, written, and interpersonal communication skills are crucial for effective collaboration in a dynamic environment with undefined problems.

If you are an inquisitive, proactive individual with a passion for data engineering and a desire to continuously learn and grow, we invite you to join our team in Chennai, Tamil Nadu, India.

Posted 6 days ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are looking for an experienced GCP Data Engineer with 5-10 years of experience in Google Cloud Platform (GCP) services and Big Data analytics solutions. This is an exciting opportunity for professionals passionate about designing and implementing scalable data engineering solutions while working on advanced cloud-based projects.

Key Responsibilities:
- Design, build, and optimize data pipelines using BigQuery, Dataflow, DataProc, and Cloud Composer.
- Implement Big Data analytics solutions leveraging Hadoop, Hive, and Spark on GCP.
- Write efficient queries and scripts with SQL, PySpark, and Python for large-scale data processing.
- Work with orchestration and messaging tools such as Airflow, Kafka, Git, and Jenkins, and manage CI/CD pipelines.
- Collaborate with cross-functional teams, bringing strong problem-solving skills and communication abilities.
- Apply Agile methodologies in project delivery and leverage data visualization tools (Tableau/MicroStrategy) for business insights.

Must-Have Skills:
- Hands-on experience in GCP data engineering (BigQuery, Dataflow, DataProc, PySpark, Python, Cloud Composer).
- Proficiency with Tableau or MicroStrategy (MSTR) for reporting and dashboards.
- Strong foundation in data modeling, optimization, and pipeline orchestration.
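As a rough illustration of the BigQuery/Dataproc/PySpark stack this posting lists, a minimal Dataproc-style PySpark job might look like the sketch below; all paths and table names are assumed placeholders, and the spark-bigquery connector is presumed available on the cluster image.

```python
"""Minimal PySpark sketch: read Parquet from a GCS data lake, aggregate,
and write the result to BigQuery via the spark-bigquery connector.
All buckets, paths and table names are placeholders."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Read raw event data from a GCS data lake path.
orders = spark.read.parquet("gs://example-datalake/orders/dt=2024-01-01/")

# Aggregate to one row per customer per day.
daily = (
    orders.groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
          .agg(F.sum("amount").alias("total_amount"),
               F.count("*").alias("order_count"))
)

# Write the result to BigQuery; temporaryGcsBucket is required by the
# connector's indirect write method.
(daily.write.format("bigquery")
      .option("table", "example_project.analytics.daily_orders")
      .option("temporaryGcsBucket", "example-tmp-bucket")
      .mode("overwrite")
      .save())

spark.stop()
```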

Posted 6 days ago

Apply

4.0 - 9.0 years

10 - 14 Lacs

Pune

Hybrid

Job Description

Technical Skills - top skills for this position:
- Google Cloud Platform (Composer, BigQuery, Airflow, DataProc, Dataflow, GCS)
- Data warehousing knowledge
- Hands-on experience with the Python language and SQL databases
- Analytical technical skills to predict the consequences of configuration changes (impact analysis), identify root causes that are not obvious, and understand the business requirements
- Excellent communication with different stakeholders (business, technical, project)
- Good understanding of the overall Big Data and Data Science ecosystem
- Experience with building and deploying containers as services using Swarm/Kubernetes
- Good understanding of container concepts such as building lean and secure images
- Understanding of modern DevOps pipelines
- Experience with streaming data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources)

Good to have: Professional Data Engineer or Associate Data Engineer certification

Roles and Responsibilities:
- Design, build, and manage Big Data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc
- Performance tuning and analysis of Spark, Apache Beam (Dataflow), or similar distributed computing tools and applications on Google Cloud
- Good understanding of Google Cloud concepts, environments, and utilities to design cloud-optimal solutions for machine learning applications
- Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming, or similar technologies
- Manage the development life cycle for agile software development projects
- Convert proofs of concept into industrialized Machine Learning models (MLOps)
- Provide solutions to complex problems; deliver customer-oriented solutions in a timely, collaborative manner
- Proactive thinking, planning, and understanding of dependencies
- Develop and implement robust solutions in test and production environments
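Since the posting calls out stream pipelines with Pub/Sub and Apache Beam (Dataflow), here is a minimal, hedged sketch of that pattern; the project, subscription, table, and schema are hypothetical, and a real deployment would run on the DataflowRunner.

```python
"""Minimal Apache Beam streaming sketch: read JSON messages from Pub/Sub and
append them to BigQuery. Project, subscription and table names are placeholders."""
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/clickstream-sub")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="example_project:analytics.clickstream",
            schema="user_id:STRING,page:STRING,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

Keeping parsing in its own step makes it straightforward to add dead-lettering for malformed messages later.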

Posted 1 week ago

Apply

10.0 - 12.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

Position Title: Lead Infrastructure Engineer - Integration
Function/Group: Digital and Technology
Location: Mumbai
Shift Timing: Regular
Role Reports to: D&T Manager - Integration
Remote/Hybrid/In-Office: Hybrid

ABOUT GENERAL MILLS
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we've been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. For more details check out

General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization, delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). For more details check out

We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

JOB OVERVIEW
Function Overview
The Digital and Technology team at General Mills stands as the largest and foremost unit, dedicated to exploring the latest trends and innovations in technology while leading the adoption of cutting-edge technologies across the organization. Collaborating closely with global business teams, the focus is on understanding business models and identifying opportunities to leverage technology for increased efficiency and disruption. The team's expertise spans a wide range of areas, including AI/ML, Data Science, IoT, NLP, Cloud, Infrastructure, RPA and Automation, Digital Transformation, Cyber Security, Blockchain, SAP S4 HANA and Enterprise Architecture. The MillsWorks initiative embodies an agile@scale delivery model, where business and technology teams operate cohesively in pods with a unified mission to deliver value for the company. Employees working on significant technology projects are recognized as Digital Transformation change agents. The team places a strong emphasis on service partnerships and employee engagement with a commitment to advancing equity and supporting communities. In fostering an inclusive culture, the team values individuals passionate about learning and growing with technology, exemplified by the Work with Heart philosophy, emphasizing results over facetime. Those intrigued by the prospect of contributing to the digital transformation journey of a Fortune 500 company are encouraged to explore more details about the function through the provided

Purpose of the role
We have an exciting opportunity for Lead Infrastructure Engineers to work with General Mills' various Advanced Digital Transformation teams and partner to achieve required business outcomes by transforming, simplifying, integrating and managing our services in the cloud.
We are seeking a highly skilled MuleSoft Platform & Automation Engineer with expertise in managing and automating the MuleSoft ecosystem. The ideal candidate should have strong experience in MuleSoft platform administration, automation, and CI/CD, along with exposure to Mule development. Additional knowledge of Cloud Composer (Airflow), GCP, Terraform, Confluent Kafka and scripting is a plus. Primary responsibilities include administration and automation excellence, consultation, optimization and implementation.

KEY ACCOUNTABILITIES
- Manage and automate MuleSoft Anypoint Platform configurations, including API management, security, deployments, monitoring and governance.
- Troubleshoot production issues and provide root cause analysis.
- Set up and support highly available Integration Platform infrastructure (IaaS/PaaS).
- Automate LastMile Security certificate management and renewal processes.
- Implement CI/CD pipelines to streamline MuleSoft application deployments.
- Develop self-healing and proactive monitoring solutions for MuleSoft applications and APIs.
- Work with developers to triage production bugs.
- Manage a queue of cases and work with users and other support teams to troubleshoot production issues.
- Provide 24/7 on-call production support once every month on a rotational basis.
- Integrate GitHub with MuleSoft Design Centre and automate code deployment and rollback mechanisms.
- Implement infrastructure automation using Terraform for MuleSoft environments.
- Ensure high availability and fault tolerance of MuleSoft applications through proper configurations and failover strategies.
- Support the enterprise platform with rotational on-call support.

MINIMUM QUALIFICATIONS
- Education: full-time graduation from an accredited university (Mandatory - Note: this is the minimum education criterion and cannot be altered).
- 10+ years of experience in the IT industry, with a minimum of 8+ years of administration or operations experience in the Mule area.
- Strong experience with Runtime Fabric and the hybrid deployment model.
- Expertise in deployment strategies, Mule clustering, Mule Gateway, MUnit, and MuleSoft MMC.
- Experience troubleshooting/managing runtime servers; experience with Mule connectors (standard/custom).
- Provide technical consultation on MuleSoft platform best practices, security, and governance.
- Considerable knowledge of API development using the Mule platform.
- Strong experience with Anypoint Platform, CloudHub, RTF and API Management.
- Strong understanding and experience with security implementations (e.g., SSL/mutual SSL, SAML, OAuth).
- Hands-on experience with MuleSoft API Gateway, RTF, Anypoint Monitoring, and Security.
- Experience in monitoring and alerting for MuleSoft APIs and applications.
- Strong knowledge of CI/CD tools such as GitHub and GitHub Actions.
- Experience in Infrastructure as Code (IaC) using Terraform.
- Excellent troubleshooting skills in dynamic environments.
- Familiarity with Agile methodologies and modern software engineering principles.
- Strong problem-solving skills and ability to work in a fast-paced environment.

PREFERRED QUALIFICATIONS
- Strong aptitude to learn and passion for problem solving.
- Excellent communication skills in coordinating with different stakeholders.
- Good to have knowledge of Kafka, Google Cloud Platform (GCP) and its services such as IAM, Apigee, Composer, Compute, Storage, etc.
- Nice to have working knowledge of Service Oriented Architecture (SOA) and associated concepts such as XML Schemas, WS specifications, RESTful APIs, SOAP, service mediation/ESB, digital certificates, messaging, etc.
- Good to have knowledge of Python, Shell scripting, or other automation languages.
- Nice to have exposure to Kubernetes and container orchestration.
- Strong knowledge of infrastructure components, including networking, storage, and compute in cloud environments.
- Workflow management: Google Cloud Composer and Airflow jobs.
- Cloud: Google Cloud Platform (GCP).
- Experience managing Google Cloud Composer and Tidal Scheduler from a platform/infrastructure perspective.
- Terraform knowledge.
- Hands-on experience with monitoring and logging tools for performance and issue resolution.

Posted 1 week ago

Apply

7.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.
- Experienced in Google Cloud development and well-versed with BigQuery.
- Adopts best technologies and methodologies in release management and quality control.
- Cooperate with IT and Finance Operations in daily activities.
- Influence key stakeholders to achieve the best desired outcome.
- Responsible for translating detailed designs into robust, scalable and reusable solutions that deliver an exceptional user experience, and for communicating the design and key design decisions to related parties.
- Carry out detailed technical analysis of projects, changes and implementations to production.
- Work on multiple activities such as design, development and testing.

Requirements
To be successful in this role, you should meet the following requirements:
- A minimum of 7+ years of IT software development experience in GCP BigQuery, Python, Shell programming, Cloud Composer, Airflow, etc. Hands-on BigQuery and Python is a must; DevOps hands-on experience will be an added advantage.
- A track record of delivering change on architecting, implementing, and supporting enterprise-grade technical solutions, preferably in a financial institution.
- Strong experience in data analysis and developing code.
- Strong experience in GCP DevOps tools that can help in CI/CD implementation.
- A basic understanding of finance knowledge such as accounting is preferred.
- Strong work ethic and team player, experienced working with Scrum/Agile development methodologies.
- Real passion for elegance and efficiency in software engineering, always striving for continuous improvement via automated processes.
- Knowledge of FTP and Big Data will be an added advantage.
- Knowledge of Agile and Scrum methodologies would be an added advantage.
- Hands-on experience with Git repos and efficient branch management skills.
- Good communication skills in English, both written and verbal.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by - HSBC Software Development India
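To ground the hands-on BigQuery and Python requirement, a small illustrative sketch of querying BigQuery from Python with query parameters follows; the project, dataset, and column names are assumptions, not HSBC specifics.

```python
"""Sketch of BigQuery development in Python: run a parameterised query and
iterate over the result. Names are placeholders; credentials come from the
runtime environment."""
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT account_id, SUM(amount) AS month_total
    FROM `example-project.finance.postings`
    WHERE posting_date BETWEEN @start_date AND @end_date
    GROUP BY account_id
    ORDER BY month_total DESC
    LIMIT 20
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
        bigquery.ScalarQueryParameter("end_date", "DATE", "2024-01-31"),
    ]
)

# Submit the query job and wait for the results.
for row in client.query(sql, job_config=job_config).result():
    print(f"{row.account_id}: {row.month_total}")
```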

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will work on analysing and manipulating large datasets, supporting the enterprise by activating data assets to support enabling platforms and analytics in GCP. You will be responsible for designing the transformation and modernization on GCP. Experience with large-scale solutions and operationalizing data warehouses, data lakes and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and 3rd-party technologies for deployment on the Google Cloud Platform.

Responsibilities
- Develop technical solutions for Data Engineering and work between 1 PM and 10 PM IST to enable more overlap time with European and North American counterparts. This role will work closely with teams in the US as well as Europe to ensure a robust, integrated migration aligned with Global Data Engineering patterns and standards.
- Design and deploy data pipelines with automated data lineage.
- Develop reusable Data Engineering patterns.
- Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, DataFlow, Pub/Sub, BigTable, Data Fusion, DataProc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Ensure timely migration of the Ford Credit Europe (FCE) Teradata warehouse to GCP and enable Teradata platform decommissioning by end 2025, with a strong focus on ensuring continued, robust, and accurate regulatory reporting capability.

Position Opportunities
The Data Engineer role within FC Data Engineering supports the following opportunities for successful individuals:
- Key player in a high-priority program to unlock the potential of Data Engineering products and services and secure operational resilience for Ford Credit Europe.
- Explore and implement leading-edge technologies, tooling and software development best practices.
- Experience of managing data warehousing and product delivery within a financially regulated environment.
- Experience of collaborative development practices within an open-plan, team-designed environment.
- Experience of working with third-party suppliers / supplier management.
- Continued personal and professional development, with support and encouragement for further certification.

Qualifications
Essential:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding of key GCP services, especially those related to data processing (batch/real time) leveraging Terraform, BigQuery, Dataflow, DataFusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global and diverse team.
- Experience developing with microservice architecture on a container orchestration framework.
- Designing pipelines and architectures for data processing.
- Strong evidence of self-motivation to continuously develop own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, coordination, organizational and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Desired:
- Professional certification in GCP (e.g., Professional Data Engineer).
- Data engineering or development experience gained in a regulated, financial environment.
- Experience with Teradata to GCP migrations is a plus.
- Strong expertise in SQL and experience with programming languages such as Python, Java, and/or Apache Beam.
- Experience of coaching and mentoring Data Engineers.
- Experience with data security, governance, and compliance best practices in the cloud.
- An understanding of current architecture standards and digital platform services strategy.
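As an illustration of the Pub/Sub messaging building block named in these responsibilities, the sketch below publishes and consumes a message with the google-cloud-pubsub client; the project, topic, subscription, and payload are hypothetical examples only.

```python
"""Illustrative Pub/Sub sketch: publish an event and consume it with a
streaming pull subscriber. All identifiers are placeholders."""
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "example-project"

# --- Publisher side -------------------------------------------------------
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, "loan-events")

payload = json.dumps({"loan_id": "L-1001", "status": "APPROVED"}).encode("utf-8")
future = publisher.publish(topic_path, payload, source="origination-service")
print("Published message id:", future.result())

# --- Subscriber side ------------------------------------------------------
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, "loan-events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    event = json.loads(message.data.decode("utf-8"))
    print("Received:", event, "attributes:", dict(message.attributes))
    message.ack()  # acknowledge so the message is not redelivered

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds, then stop
except TimeoutError:
    streaming_pull.cancel()
```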

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a qualified candidate for this role, you should have in-depth expertise in Google Cloud Platform (GCP) services such as Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, and Google Cloud Storage (GCS). Additionally, proficiency in DataFlow and Java is a must for this position. Experience with Kafka would be considered a plus. Your responsibilities will include working with these technologies to design, develop, and maintain scalable and efficient data processing systems. If you meet these requirements and are eager to work in a dynamic and innovative environment, we look forward to reviewing your application.

Posted 1 week ago

Apply

8.0 - 10.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

Position Title: Sr Infrastructure Engineer - Integration
Function/Group: Digital and Technology
Location: Mumbai
Shift Timing: Regular
Role Reports to: D&T Manager - Integration
Remote/Hybrid/In-Office: Hybrid

ABOUT GENERAL MILLS
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we've been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. For more details check out

General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization, delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). For more details check out

We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

JOB OVERVIEW
Function Overview
The Digital and Technology team at General Mills stands as the largest and foremost unit, dedicated to exploring the latest trends and innovations in technology while leading the adoption of cutting-edge technologies across the organization. Collaborating closely with global business teams, the focus is on understanding business models and identifying opportunities to leverage technology for increased efficiency and disruption. The team's expertise spans a wide range of areas, including AI/ML, Data Science, IoT, NLP, Cloud, Infrastructure, RPA and Automation, Digital Transformation, Cyber Security, Blockchain, SAP S4 HANA and Enterprise Architecture. The MillsWorks initiative embodies an agile@scale delivery model, where business and technology teams operate cohesively in pods with a unified mission to deliver value for the company. Employees working on significant technology projects are recognized as Digital Transformation change agents. The team places a strong emphasis on service partnerships and employee engagement with a commitment to advancing equity and supporting communities. In fostering an inclusive culture, the team values individuals passionate about learning and growing with technology, exemplified by the Work with Heart philosophy, emphasizing results over facetime. Those intrigued by the prospect of contributing to the digital transformation journey of a Fortune 500 company are encouraged to explore more details about the function through the provided

Purpose of the role
We have an exciting opportunity for Sr. Infrastructure Engineers to work with General Mills' various Advanced Digital Transformation teams and partner to achieve required business outcomes by transforming, simplifying, integrating and managing our services in the cloud.
We are seeking a highly skilled MuleSoft Platform & Automation Engineer with expertise in managing and automating the MuleSoft ecosystem. The ideal candidate should have strong experience in MuleSoft platform administration, automation, and CI/CD, along with exposure to Mule development. Additional knowledge of Cloud Composer (Airflow), GCP, Terraform, Confluent Kafka and scripting is a plus. Primary responsibilities include administration and automation excellence, consultation, optimization and implementation.

KEY ACCOUNTABILITIES
- Manage and automate MuleSoft Anypoint Platform configurations, including API management, security, deployments, monitoring and governance.
- Troubleshoot production issues and provide root cause analysis.
- Set up and support highly available Integration Platform infrastructure (IaaS/PaaS).
- Automate LastMile Security certificate management and renewal processes.
- Implement CI/CD pipelines to streamline MuleSoft application deployments.
- Develop self-healing and proactive monitoring solutions for MuleSoft applications and APIs.
- Work with developers to triage production bugs.
- Manage a queue of cases and work with users and other support teams to troubleshoot production issues.
- Provide 24/7 on-call production support once every month on a rotational basis.
- Integrate GitHub with MuleSoft Design Centre and automate code deployment and rollback mechanisms.
- Implement infrastructure automation using Terraform for MuleSoft environments.
- Ensure high availability and fault tolerance of MuleSoft applications through proper configurations and failover strategies.
- Support the enterprise platform with rotational on-call support.

MINIMUM QUALIFICATIONS
- Education: full-time graduation from an accredited university (Mandatory - Note: this is the minimum education criterion and cannot be altered).
- 10+ years of experience in the IT industry, with a minimum of 8+ years of administration or operations experience in the Mule area.
- Strong experience with Runtime Fabric and the hybrid deployment model.
- Expertise in deployment strategies, Mule clustering, Mule Gateway, MUnit, and MuleSoft MMC.
- Experience troubleshooting/managing runtime servers; experience with Mule connectors (standard/custom).
- Provide technical consultation on MuleSoft platform best practices, security, and governance.
- Considerable knowledge of API development using the Mule platform.
- Strong experience with Anypoint Platform, CloudHub, RTF and API Management.
- Strong understanding and experience with security implementations (e.g., SSL/mutual SSL, SAML, OAuth).
- Hands-on experience with MuleSoft API Gateway, RTF, Anypoint Monitoring, and Security.
- Experience in monitoring and alerting for MuleSoft APIs and applications.
- Strong knowledge of CI/CD tools such as GitHub and GitHub Actions.
- Experience in Infrastructure as Code (IaC) using Terraform.
- Excellent troubleshooting skills in dynamic environments.
- Familiarity with Agile methodologies and modern software engineering principles.
- Strong problem-solving skills and ability to work in a fast-paced environment.

PREFERRED QUALIFICATIONS
- Strong aptitude to learn and passion for problem solving.
- Excellent communication skills in coordinating with different stakeholders.
- Good to have knowledge of Kafka, Google Cloud Platform (GCP) and its services such as IAM, Apigee, Composer, Compute, Storage, etc.
- Nice to have working knowledge of Service Oriented Architecture (SOA) and associated concepts such as XML Schemas, WS specifications, RESTful APIs, SOAP, service mediation/ESB, digital certificates, messaging, etc.
- Good to have knowledge of Python, Shell scripting, or other automation languages.
- Nice to have exposure to Kubernetes and container orchestration.
- Strong knowledge of infrastructure components, including networking, storage, and compute in cloud environments.
- Workflow management: Google Cloud Composer and Airflow jobs.
- Cloud: Google Cloud Platform (GCP).
- Experience managing Google Cloud Composer and Tidal Scheduler from a platform/infrastructure perspective.
- Terraform knowledge.
- Hands-on experience with monitoring and logging tools for performance and issue resolution.

Posted 1 week ago

Apply

5.0 - 7.0 years

20 - 25 Lacs

Chennai

Work from Office

Position Description: Representing the Ford Credit (FC) Data Engineering organization as a Google Cloud Platform (GCP) Data Engineer specializing in migration and transformation, you will be a developer on a global team building a complex data warehouse in the Google Cloud Platform. This role involves designing, implementing, and optimizing data pipelines, ensuring data integrity during migration, and leveraging GCP services to enhance data transformation processes for scalability and efficiency.

This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will work on analyzing and manipulating large datasets, supporting the enterprise by activating data assets to support enabling platforms and analytics in GCP. You will be responsible for designing the transformation and modernization on GCP. Experience with large-scale solutions and operationalizing data warehouses, data lakes and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and 3rd-party technologies for deployment on the Google Cloud Platform.

Experience Required:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding and experience of key GCP services, especially those related to data processing (batch/real time) leveraging Terraform, BigQuery, Dataflow, DataFusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner.
- Experience developing with microservice architecture on a container orchestration framework.
- Designing pipelines and architectures for data processing.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global and diverse team.
- Strong evidence of self-motivation to continuously develop own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, coordination, organizational and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

You have over 10 years of experience in data engineering, specializing in cloud-based solutions. Your role involves designing solutions, reviewing the team's work, and providing guidance. Proficiency in Google Cloud Platform (GCP) and its various data services such as BigQuery, DBT & Streaming, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer is essential. Your track record should demonstrate your ability to create scalable data pipelines and architectures. Experience with ETL tools and processes, and with implementing ETL processes to transfer data to GCP warehouses like BigQuery, is required.

Your technical skills should include proficiency in DBT & Streaming, Dataflow, Cloud Storage, Cloud Composer, BigQuery, Cloud SQL, Cloud Firestore, Cloud Bigtable, Airflow, and GCP Cloud Data Catalog. You must be adept at SQL, database design, and optimization. Strong programming skills in Python, Java, or other relevant languages are necessary. Familiarity with data modeling, data warehousing, big data processing frameworks, data visualization tools like Looker and Data Studio, and machine learning workflows and tools will be advantageous.

In addition to technical expertise, soft skills are crucial for this role. You should possess strong analytical and problem-solving skills, excellent communication and collaboration abilities, and the capacity to work efficiently in a fast-paced environment while managing multiple tasks. Leadership skills are key, as you will be guiding and mentoring team members, providing assistance, and breaking down milestones into technical tasks. Estimation skills are also essential for successful project management.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Google Cloud DevOps Engineer specializing in Terraform and CI/CD pipelines, you will play a crucial role in provisioning GCP resources based on architectural designs that align with business objectives. Your responsibilities will include monitoring resource availability and usage metrics to provide guidelines for cost and performance optimization. You will be expected to assist IT and business users in resolving GCP service-related issues and provide guidance on cluster automation and migration approaches. Additionally, your role will involve provisioning GCP resources for data engineering and data science projects, including automated data ingestion, migration, and transformation.

Key Responsibilities:
- Building complex CI/CD pipelines for cloud-native PaaS services on Google Cloud Platform
- Developing deployment pipelines using GitHub CI (Actions)
- Creating Terraform scripts for infrastructure deployment
- Working on deployment and troubleshooting of Docker, GKE, OpenShift, and Cloud Run
- Utilizing Cloud Build, Cloud Composer, and Dataflow for various tasks
- Configuring software for monitoring with AppDynamics and Stackdriver logging in GCP
- Setting up dashboards using tools like Splunk, Kibana, Prometheus, and Grafana

Skills, Experience, and Qualifications:
- Total of 5+ years of experience in DevOps, with a minimum of 4 years in Google Cloud and GitHub CI
- Strong background in microservices/APIs and DevOps tools like GitHub CI, TeamCity, Jenkins, and Helm
- Proficiency in deployment and testing strategies for applications on Google Cloud Platform
- Experience in defining development, test, release, update, and support processes for DevOps operations
- Familiarity with Java and knowledge of Kafka, ZooKeeper, Hazelcast, and Pub/Sub
- Understanding of cloud networking, security, identity access, and compute runtime
- Proficiency in managing databases such as Oracle, Cloud SQL, and Cloud Spanner
- Excellent troubleshooting skills and working knowledge of open-source technologies
- Awareness of Agile principles and experience in Agile/Scrum environments
- Comfortable with Agile team management tools like JIRA and Confluence
- Strong communication skills and the ability to work in self-organized teams
- Certification as a Google Professional Cloud DevOps Engineer is desirable

If you possess a proactive attitude, a strong team spirit, and a desire to continuously improve and innovate in the field of DevOps on Google Cloud Platform, we welcome you to join our dynamic team.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer at Aptiv, you will play a crucial role in designing, developing, and implementing a cost-effective, scalable, reusable, and secured ingestion framework. Your primary responsibility will be to work closely with business leaders, stakeholders, and source system Subject Matter Experts (SMEs) to understand and define the business needs, translate them into technical specifications, and ingest data into Google Cloud Platform, specifically BigQuery. You will be involved in designing and implementing processes for data ingestion, transformation, storage, analysis, modeling, reporting, monitoring, availability, governance, and security of high volumes of structured and unstructured data.

Your role will involve developing and deploying high-throughput data pipelines using the latest Google Cloud Platform (GCP) technologies, serving as a specialist in data engineering and GCP data technologies, and engaging with clients to understand their requirements and translate them into technical data solutions. You will also be responsible for analyzing business requirements, creating source-to-target mappings, enhancing ingestion frameworks, and transforming data according to business rules. Additionally, you will develop capabilities to support enterprise-wide data cataloging, design data solutions with a focus on security and privacy, and utilize Agile and DataOps methodologies in project delivery.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Data & Analytics, or a similar relevant field, along with at least 4 years of hands-on IT experience in a similar role. You should possess proven expertise in SQL, including subqueries, aggregations, functions, triggers, indexes, and database optimization, as well as deep experience working with various Google data products such as BigQuery, Dataproc, Data Catalog, Dataflow, and Cloud SQL, among others. Experience in tools like Qlik Replicate, Spark, and Kafka is also required. Strong communication skills, the ability to work with globally distributed teams, and knowledge of statistical methods and data modeling are essential for this role. Experience with designing and creating Tableau, Qlik, or Power BI dashboards, as well as knowledge of Alteryx and Informatica Data Quality, will be beneficial.

Aptiv provides an inclusive work environment where individuals can grow and develop, irrespective of gender, ethnicity, or beliefs. Safety is a core value at Aptiv, aiming for a world with zero fatalities, zero injuries, and zero accidents. The company offers a competitive health insurance package to support the physical and mental health of its employees. Additionally, Aptiv provides benefits such as personal holidays, healthcare, pension, tax saver scheme, free onsite breakfast, discounted corporate gym membership, and access to transportation options at the Grand Canal Dock location.

If you are passionate about data engineering, GCP technologies, and driving value creation through data analytics, Aptiv offers a challenging and rewarding opportunity to grow and make a meaningful impact in a dynamic and innovative environment.
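For a concrete feel of the ingestion work described above, here is a minimal, assumed example of loading Parquet files from a GCS landing zone into BigQuery with the Python client; the bucket, dataset, and table identifiers are placeholders.

```python
"""Sketch of one ingestion building block: load Parquet files from a GCS
landing zone into a BigQuery table with an explicit write disposition.
All names are placeholders."""
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

table_id = "example-project.raw_zone.supplier_invoices"
source_uri = "gs://example-landing/supplier_invoices/2024/01/*.parquet"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Kick off the load job and block until it finishes.
load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```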

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a GCP Data Engineer specialized in Data Migration & Transformation, you will be responsible for designing and constructing robust, scalable data pipelines and architectures on Google Cloud Platform (GCP), particularly focusing on BigQuery. Your primary tasks will involve migrating and transforming large-scale data systems and datasets to GCP while emphasizing performance, scalability, and reliability. It will be crucial for you to automate data lineage extraction and ensure data integrity across various systems and platforms.

Collaborating closely with architects and stakeholders, you will play a key role in implementing GCP-native and 3rd-party tools for data ingestion, integration, and transformation. Additionally, your role will include the development and optimization of complex SQL queries in BigQuery for data analysis and transformation. You will be expected to operationalize data pipelines using tools such as Apache Airflow (Cloud Composer), DataFlow, and Pub/Sub, enabling machine learning capabilities through well-structured, ML-friendly data pipelines. Participation in Agile processes and contributing to technical design discussions, code reviews, and documentation will be integral parts of your responsibilities.

Your background should include at least 5 years of experience in Data Warehousing, Data Engineering, or similar roles, with a minimum of 2 years of hands-on experience working with GCP BigQuery. Proficiency in Python, SQL, Apache Airflow, and various GCP services including BigQuery, DataFlow, Cloud Composer, Pub/Sub, and Cloud Functions is essential. You should possess experience in data pipeline automation, data modeling, and building reusable data products. A solid understanding of data lineage, metadata integration, and data cataloging, preferably using tools like GCP Data Catalog and Informatica EDC, will be beneficial. Demonstrated ability to analyze complex datasets, derive actionable insights, and build/deploy analytics platforms on cloud environments, preferably GCP, is required.

Preferred skills for this role include strong analytical and problem-solving capabilities, exposure to machine learning pipeline architecture and model deployment workflows, excellent communication skills, and the ability to collaborate effectively with cross-functional teams. Familiarity with Agile methodologies and DevOps best practices, a self-driven and innovative mindset, and experience in documenting complex data engineering systems and developing test plans will be advantageous for this position.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: GCP Data Engineer
Experience: 5-9 years
Notice period: 15 days or less
Interview Mode: first round virtual, second round face to face (mandatory)
Location: Bangalore

Job Description

Data Ingestion, Storage, Processing and Migration
- Acquire, cleanse, and ingest structured and unstructured data on the cloud platforms (in batch or real time) from internal and external data sources.
- Combine data from disparate sources into a single, unified, authoritative view of data (e.g., a data lake).
- Create, maintain and provide test data to support fully automated testing.
- Enable and support data movement from one system/service to another system/service.

Reporting
- Design, develop and maintain high-performance LookML models that provide comprehensive data visibility across business functions.
- Build interactive dashboards and data visualizations that tell compelling stories and drive decision making.
- Stay up to date with the latest Looker features and best practices, sharing your knowledge with the team.

Skills & Software Requirements:
- GCP data services (BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage, Looker, LookML)
- Programming languages, e.g., Python, Java, SQL

Posted 2 weeks ago

Apply

9.0 - 11.0 years

0 Lacs

pune, maharashtra, india

On-site

Data Engineer (ETL, Python, SQL, GCP)

Position Overview
Job Title: Data Engineer (ETL, Python, SQL, GCP)
Corporate Title: AVP
Location: Pune, India

Role Description
The Engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills are developed through addressing enhancements and fixes to products; reliability and resiliency are built into solutions through early testing, peer reviews and automation of the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, should be able to work in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is required as part of the buildout of the Compliance Tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory commitments and mandated monitors.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your key responsibilities
- Design, develop and maintain data pipelines using Python and SQL on GCP.
- Experience in agile methodologies and in ELT/ETL data movement and data processing.
- Work with Cloud Composer to manage and process batch jobs efficiently, and develop and optimize complex SQL queries for data analysis, extraction and transformation.
- Develop and deploy Google Cloud services using Terraform.
- Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
- Ensure team collaboration using Jira, Confluence and other tools.
- Write advanced SQL and Python scripts.
- Certification as a Professional Google Cloud Data Engineer will be an added advantage.
- Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, Cloud Run and Dataproc, as well as GKE.
- Experience in GitHub and GitHub Actions, with experience in CI/CD; proficient in Terraform.

Your skills and experience
- More than 9 years of coding experience in reputed organizations
- Hands-on experience in Bitbucket and CI/CD pipelines
- Proficient in Python, SQL and Terraform
- Basic understanding of on-prem and GCP data security
- Hands-on development experience on large ETL/big data systems
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions such as consistency, completeness, accuracy, lineage, etc.
- Hands-on business and systems knowledge gained in a regulatory delivery environment.
- Banking experience with regulatory and cross-product knowledge.
- Passionate about test-driven development.
- Prior experience with release management tasks and responsibilities.
Data visualization experience in tableau is good to have. How we'll support you Training and development to help you excel in your career. Coaching and support from experts in your team A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About us and our teams Please visit our company website for further information: We strive for a in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
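For candidates less familiar with Cloud Composer, a rough sketch of the kind of scheduled BigQuery pipeline this role describes is shown below. This is illustrative only and not part of the posting; the project, dataset, and table names are hypothetical placeholders.

```python
# Minimal sketch of a Cloud Composer (Airflow) DAG that runs a daily
# BigQuery transformation. Project/dataset/table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_compliance_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # run at 02:00 every day
    catchup=False,
) as dag:
    transform = BigQueryInsertJobOperator(
        task_id="transform_trades",
        configuration={
            "query": {
                "query": """
                    SELECT trade_id, trade_date, notional
                    FROM `my-project.raw.trades`
                    WHERE trade_date = '{{ ds }}'
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "curated",
                    "tableId": "trades_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```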

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

We are urgently hiring a Senior BigQuery Developer (Google Cloud Platform) with 5-8 years of experience, based in Hyderabad. In this role, you will be responsible for designing, developing, and maintaining robust, scalable data pipelines and advanced analytics solutions using BigQuery and other GCP-native services.

Your primary focus will be on designing, developing, and optimizing BigQuery data warehouses and data marts to support analytical and business intelligence workloads. You will implement data modeling and best practices for partitioning, clustering, and table design in BigQuery. Integration of BigQuery with tools such as Dataform, Airflow, Cloud Composer, or dbt for orchestration and version control will be essential. Ensuring compliance with security, privacy, and governance policies related to cloud-based data solutions is a critical aspect of the role, as is monitoring and troubleshooting data pipelines and scheduled queries for accuracy and performance. Staying up to date with evolving BigQuery features and GCP best practices is expected.

As part of the benefits, you will receive a competitive salary package along with medical insurance. The role provides exposure to numerous domains and projects, with professional training and certifications sponsored by the company. Clear and defined career paths for professional development and exposure to the latest technologies are assured. The hiring/selection process involves one HR interview followed by one technical interview.

FIS Clouds (www.fisclouds.com) is a global leader in digital technology and transformation solutions for enterprises, with offices in India, the US, the UK, and Jakarta, Indonesia. FIS Clouds strongly believes in Agility, Speed, and Quality, and applies constant innovation to solve customer challenges and enhance business outcomes. The company specializes in Cloud Technologies, including Public Cloud, Private Cloud, Multi-Cloud, Hybrid Cloud, DevOps, Java, Data Analytics, and Cloud Automation.

Note: The salary package is not a limiting factor for the right candidate. Your performance will determine the package you earn.
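As a rough illustration of the partitioning and clustering work mentioned above (not part of the posting), creating a partitioned, clustered BigQuery table from Python might look like the sketch below; the project, dataset, table, and column names are hypothetical.

```python
# Sketch: create a date-partitioned, clustered BigQuery table via DDL.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

ddl = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.orders` (
    order_id STRING,
    order_ts TIMESTAMP,
    region   STRING,
    amount   NUMERIC
)
PARTITION BY DATE(order_ts)   -- prune scans to the relevant days
CLUSTER BY region, order_id   -- co-locate rows for common filter columns
"""

client.query(ddl).result()  # blocks until the DDL job completes
```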

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Data Engineer, you will be responsible for designing and developing scalable data engineering solutions using Google Cloud Platform (GCP) and PySpark. Your main focus will be on optimizing Spark jobs for performance, scalability, and efficient resource utilization. You will also be involved in developing, maintaining, and enhancing ETL pipelines using BigQuery, Apache Airflow, and Cloud Composer. Collaborating with data scientists, analysts, and DevOps teams to translate business requirements into technical solutions will be a key aspect of your role.

Ensuring data integrity and security by implementing data governance, compliance, and security best practices will be crucial. Monitoring production workloads, troubleshooting performance issues, and implementing enhancements will also be part of your responsibilities. You will be expected to implement and enforce coding standards, best practices, and performance tuning strategies. Additionally, supporting migration activities from on-premises data warehouses to GCP-based solutions will be part of your duties. Mentoring junior developers and contributing to knowledge-sharing within the team will be essential. Staying up to date with emerging cloud technologies, tools, and best practices in the data engineering ecosystem is also a key requirement.

The ideal candidate should have + years of total experience in data engineering with + years of hands-on experience with Google Cloud Platform (GCP), including BigQuery, Apache Airflow, and Cloud Composer. Strong expertise in developing and optimizing large-scale data processing solutions using PySpark and Python is necessary. In-depth knowledge of SQL for data transformation and performance optimization is also required. Proficiency in big data technologies such as Hadoop, HDFS, Hive, and YARN is essential. Experience with distributed computing principles, data partitioning, and fault tolerance is preferred. Hands-on experience with CI/CD pipelines, version control (Git), and automation tools is a plus. Strong problem-solving, analytical, and troubleshooting skills are important for this role. Experience working in Agile/Scrum environments is beneficial. Excellent communication and collaboration skills are necessary to work with offshore and onshore teams.

In this role, you will have the opportunity to work in a collaborative environment where your contributions create value. You will develop innovative solutions, build lasting relationships with colleagues and clients, and have access to global capabilities to bring your ideas to life. Your career will evolve in a company built to grow and last, supported by leaders who care about your well-being and provide opportunities for skill development and growth. Join us, one of the world's largest IT and management consulting companies, and become a part of a team dedicated to making a difference in the technology industry.
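For context on the PySpark work described above, a sketch of a batch job that reads Parquet from Cloud Storage, aggregates it, and writes to BigQuery is shown below. This is illustrative only: the bucket and table names are hypothetical, and it assumes the spark-bigquery connector is available on the cluster.

```python
# Sketch of a PySpark batch job: read Parquet from GCS, aggregate, and write
# to BigQuery. Bucket/table names are hypothetical; assumes the spark-bigquery
# connector is installed on the Dataproc cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_events_rollup").getOrCreate()

events = spark.read.parquet("gs://my-bucket/raw/events/dt=2024-01-01/")

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

(
    daily.write.format("bigquery")
    .option("table", "my-project.analytics.daily_events")
    .option("temporaryGcsBucket", "my-temp-bucket")  # staging bucket for the connector
    .mode("overwrite")
    .save()
)

spark.stop()
```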

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

You will be responsible for designing and implementing cloud-native and hybrid solutions using GCP services such as Compute Engine, Kubernetes (GKE), Cloud Functions, BigQuery, Pub/Sub, Cloud SQL, and Cloud Storage. Additionally, you will define cloud adoption strategies, migration plans, and best practices for performance, security, and scalability. You will also be required to implement and manage Terraform, Cloud Deployment Manager, or Ansible for automated infrastructure provisioning.

The ideal candidate has expertise as a GCP data architect with network-domain skills across GCP (Dataproc, Cloud Composer, Dataflow, BigQuery), Python, and Spark/PySpark, and hands-on experience in the network domain, specifically in 4G, 5G, LTE, and RAN technologies. Knowledge and work experience in these areas are preferred. You should also be well-versed in ETL architecture and data pipeline management.

This is a full-time position with a day shift schedule from Monday to Friday. The work location is remote, and the application deadline is 15/04/2025.
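As an illustration of the Pub/Sub integration mentioned above (a sketch only, not part of the posting; the project, topic, and event fields are hypothetical), publishing events from Python looks roughly like this:

```python
# Sketch: publish a JSON event to a Pub/Sub topic from Python.
# Project, topic, and event fields are hypothetical placeholders.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "network-events")

event = {"cell_id": "LTE-0042", "kpi": "rrc_setup_success_rate", "value": 0.987}

# publish() returns a future; result() blocks until the server acknowledges.
future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print("Published message id:", future.result())
```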

Posted 2 weeks ago

Apply

11.0 - 16.0 years

0 Lacs

karnataka

On-site

It is exciting to be part of a company where individuals genuinely believe in the purpose of their work. The commitment to infuse passion and customer-centricity into the business is unwavering. Fractal stands out as a key player in the field of Artificial Intelligence. The core mission of Fractal is to drive every human decision within the enterprise by integrating AI, engineering, and design to support the most esteemed Fortune 500 companies globally. Recognized as one of India's top workplaces by The Great Place to Work Institute, Fractal is at the forefront of innovation in Cloud, Data, and AI technologies, fueling digital transformation across enterprises at an unprecedented pace.

At Fractal, we empower enterprises to leverage the potential of data on the cloud through a spectrum of services such as Architecture consulting, Data Platforms, Business-Tech platforms, Marketplaces, Data governance, MDM, DataOps, and AI/MLOps. Additionally, we employ AI engineering methodologies to enhance each facet of our offerings.

As part of the team, your responsibilities will include evaluating the existing technological landscape and formulating a progressive, short-term, and long-term technology strategic vision. You will actively contribute to the creation and dissemination of best practices, technical content, and innovative reference architectures. Collaborating with data engineers and data scientists, you will play a pivotal role in architecting solutions and frameworks. Moreover, your role will involve ensuring the seamless delivery of services, products, and solutions to our clientele.

To excel in this role, you should possess a wealth of experience as an Architect with a strong background in Google Cloud Platform and a genuine interest in leveraging cutting-edge technologies to address business challenges. The ideal candidate will have 11 to 16 years of experience in Data Engineering & Cloud Native technologies, particularly Google Cloud Platform, encompassing big data, analytics, and AI/ML domains. Proficiency in tools such as BigQuery, Cloud Composer, Dataflow, Cloud Storage, AI Platform/Vertex AI, Dataproc, and GCP IaaS is essential. A solid understanding of Data Engineering, Data Management, and Data Governance is required, along with experience in leading end-to-end Data Engineering and/or Analytics projects. Knowledge of programming languages like Python and Java, coupled with a grasp of technology best practices and development lifecycles such as agile, CI/CD, DevOps, and MLOps, is highly valued.

Demonstrable expertise in technical architecture leadership, secure platform development, and creation of future-proof global solutions using GCP services is crucial. Excellent communication and influencing skills are imperative, allowing adaptability to diverse audiences. Desirable skills include experience in container technology like Docker and Kubernetes, DevOps on GCP, and a Professional Cloud Architect certification from Google Cloud. If you are drawn to dynamic growth opportunities and enjoy collaborating with energetic, high-achieving individuals, your career journey with us promises to be fulfilling and rewarding.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As a Data Engineer at our company, you will be responsible for designing, developing, and maintaining scalable and efficient data pipelines using GCP services such as Dataflow, Cloud Composer (Airflow), and Pub/Sub. Your role will involve designing and implementing robust data models optimized for analytical and operational workloads within GCP data warehousing solutions like BigQuery. You will also be tasked with developing and implementing ETL processes to ingest, cleanse, transform, and load data from various sources into our data warehouse and other data stores on GCP.

Furthermore, you will play a key role in building and managing data warehousing solutions on GCP, ensuring data integrity, performance, and scalability. Collaboration with data scientists and analysts to understand their data requirements and provide them with clean, transformed, and readily accessible data will be an essential aspect of your responsibilities. Monitoring and troubleshooting data pipelines and data warehousing systems to ensure data quality and availability will also be part of your duties.

In addition, you will implement data quality checks and validation processes to maintain the accuracy and reliability of data. Optimizing data processing and storage for performance and cost-efficiency on the GCP platform will be crucial. Staying up-to-date with the latest GCP data services and best practices in data engineering and analytics is essential. You will also contribute to the documentation of data pipelines, data models, and ETL processes and work collaboratively with cross-functional teams to understand data needs and deliver effective data solutions. A sketch of a typical Dataflow-style pipeline follows the qualifications below.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience in data engineering principles and practices.
- Solid understanding of data modeling concepts and techniques (e.g., relational, dimensional).
- Hands-on experience in designing, developing, and implementing ETL processes.
- Knowledge of data warehousing concepts, architectures, and best practices.
- Familiarity with data analytics principles and the data lifecycle.
- Strong problem-solving and analytical skills with the ability to troubleshoot data-related issues.
- Excellent communication and collaboration skills.

Preferred Qualifications (A Plus):
- Hands-on experience working with Google Cloud Platform (GCP) data services such as BigQuery, Dataflow, Cloud Composer (Airflow), Pub/Sub, Cloud Storage, etc.
- Experience with scripting languages such as Python or SQL for data manipulation and automation.
- Familiarity with data visualization tools (e.g., Looker, Tableau).
- Experience with data governance and data security principles.
- Knowledge of DevOps practices and tools for data infrastructure automation.
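To make the Dataflow piece above concrete, here is a small Apache Beam batch pipeline sketch that cleans CSV records and loads them into BigQuery. It is illustrative only; the bucket, project, and table names are hypothetical.

```python
# Sketch of an Apache Beam pipeline (runnable on Dataflow): read CSV lines
# from GCS, drop blank records, parse them, and write rows to BigQuery.
# Bucket, project, and table names are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line):
    """Turn 'order_id,region,amount' into a BigQuery row dict."""
    order_id, region, amount = line.split(",")
    return {"order_id": order_id, "region": region, "amount": float(amount)}


def run():
    options = PipelineOptions()  # pass --runner=DataflowRunner etc. on the CLI
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadCsv" >> beam.io.ReadFromText(
                "gs://my-bucket/raw/orders.csv", skip_header_lines=1)
            | "DropBlank" >> beam.Filter(lambda line: line.strip())
            | "Parse" >> beam.Map(parse_line)
            | "WriteBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.orders",
                schema="order_id:STRING,region:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```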

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a Consultant Delivery (Data Engineer) at Worldline, you will be an integral part of the Data Management team, contributing to a significant Move to Cloud (M2C) project. Your primary focus will be migrating our data infrastructure to the cloud and enhancing our data pipelines for improved performance and scalability. You will have the opportunity to work on a critical initiative that plays a key role in the organization's digital transformation.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, along with a minimum of 5 years of experience as a Data Engineer. Your expertise should include a strong emphasis on cloud-based solutions, particularly within the Google Cloud Platform (GCP) ecosystem.

Key technical skills and qualifications required for this role include (a small illustrative sketch of this kind of migration work follows the list):
- Proficiency in version control systems and CI/CD pipelines.
- In-depth knowledge of GCP services such as Dataproc, Dataflow, Cloud Functions, Workflows, Cloud Composer, and BigQuery.
- Extensive experience with ETL tools, specifically dbt Labs, and a deep understanding of ETL best practices.
- Demonstrated ability to design and optimize data pipelines, architectures, and datasets from various data sources.
- Strong proficiency in SQL and Python, including experience with Spark.
- Exceptional analytical and problem-solving abilities to translate complex requirements into technical solutions.

Additionally, possessing relevant certifications in Google Cloud Platform or other data engineering credentials would be advantageous. Experience in migrating data from on-premises data warehouses to cloud-based solutions, working with large-scale datasets, and executing complex data transformations are desirable skills for this role. Effective communication and interpersonal skills are also crucial for successful collaboration within a team environment.

Joining Worldline as a Data Engineer presents a unique opportunity to work on a forward-thinking project that leverages cutting-edge technologies and fosters a culture of diversity and inclusion. If you are looking to make a meaningful impact in a dynamic and innovative environment, we encourage you to consider this exciting opportunity with us.
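As a small illustration of the migration/load work described above (a sketch only; the bucket, project, dataset, and table names are hypothetical), loading exported Parquet files from Cloud Storage into BigQuery with the Python client could look like this:

```python
# Sketch: load Parquet files from GCS into a BigQuery staging table as part of
# a migration. Bucket, project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-migration-bucket/exports/customers/*.parquet",
    "my-project.staging.customers",
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete

table = client.get_table("my-project.staging.customers")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```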

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

faridabad

Work from Office

Job Summary
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. (A short illustrative SQL/Python sketch follows the skills lists below.)

Key Responsibilities
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.

Skills: gcp, erwin, dbt, sql, data modeling, dbeaver, bigquery, query optimization, dataflow, cloud storage, snowflake, erwin data modeler, data pipelines, data transformation, datamodeler
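To give a flavor of the SQL and Python work above, here is a sketch of running a window-function query against Snowflake from Python. It is illustrative only; the account, credentials, database, and table names are hypothetical placeholders.

```python
# Sketch: connect to Snowflake from Python and run a window-function query.
# Account, credentials, database, and table names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # hypothetical account identifier
    user="my_user",
    password="my_password",   # in practice, read from a secrets manager
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="SALES",
)

query = """
    SELECT
        customer_id,
        order_id,
        order_ts,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY order_ts DESC
        ) AS recency_rank
    FROM orders
    QUALIFY recency_rank = 1  -- keep only each customer's latest order
"""

try:
    cur = conn.cursor()
    for customer_id, order_id, order_ts, _ in cur.execute(query):
        print(customer_id, order_id, order_ts)
finally:
    conn.close()
```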

Posted 2 weeks ago

Apply