Jobs
Interviews

55 Cloud Orchestration Jobs - Page 3

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 9.0 years

16 - 20 Lacs

Pune

Work from Office

About The Role

Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP)
Location: Pune, India
Corporate Title: Assistant Vice President

Role Description:
The senior engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals, with awareness of the bank's key engineering principles. Root cause analysis skills are developed through addressing enhancements and fixes to products, building reliability and resiliency into solutions through early testing, peer reviews, and automation of the delivery life cycle. The successful candidate should be able to work independently on medium to large projects with strict deadlines, operate in a cross-application, mixed technical environment, and demonstrate a solid hands-on development track record within an agile methodology. The role involves working alongside a geographically dispersed team. The position is part of the buildout of the Compliance Tech internal development team in India; the overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regulatory portfolio, addressing various regulatory commitments and mandated monitors.

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for ages 35 and above

Your key responsibilities:
- Analyze data sets; design and code stable, scalable data ingestion workflows, integrating them into existing workflows
- Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution
- Work as a senior developer building analytics algorithms on top of ingested data
- Work as a senior developer for various data sourcing efforts in Hadoop and GCP
- Own unit testing, UAT, deployment, end-user sign-off, and production go-live
- Ensure new code is tested at both the unit and system level; design, develop, and peer-review new code and functionality
- Operate as a member of an agile scrum team
- Apply root cause analysis skills to identify bugs and the causes of failures
- Support production support and release management teams in their tasks

Your skills and experience:
- More than 10 years of coding experience in reputed organizations
- Hands-on experience with Bitbucket and CI/CD pipelines
- Proficient in Hadoop, Python, Spark, SQL, Unix, and Hive
- Basic understanding of on-prem and GCP data security
- Hands-on development experience on large ETL/big data systems; GCP a big plus
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions such as consistency, completeness, accuracy, and lineage
- Hands-on business and systems knowledge gained in a regulatory delivery environment
- Banking experience, including regulatory and cross-product knowledge
- Passionate about test-driven development
- Prior experience with release management tasks and responsibilities
- Data visualization experience in Tableau is good to have

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
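The posting above asks for a basic understanding of data quality dimensions such as completeness and consistency. As a minimal illustration of what checks along those dimensions can look like (field names and values below are invented, not from the posting):

```python
# Illustrative data-quality checks over a batch of ingested records.
# All field names and sample values are hypothetical.

def completeness(records, field):
    """Fraction of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def consistency(records, field, allowed):
    """Fraction of records whose `field` value falls in an allowed set."""
    if not records:
        return 0.0
    ok = sum(1 for r in records if r.get(field) in allowed)
    return ok / len(records)

batch = [
    {"trade_id": "T1", "currency": "EUR"},
    {"trade_id": "T2", "currency": "XXX"},
    {"trade_id": "", "currency": "USD"},
]

print(round(completeness(batch, "trade_id"), 2))  # 0.67
print(round(consistency(batch, "currency", {"EUR", "USD", "GBP"}), 2))  # 0.67
```

In a real ingestion workflow, metrics like these would be computed per batch and compared against thresholds before data is released downstream.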

Posted Date not available

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google Dataproc, Apache Spark
Good-to-have skills: Apache Airflow
Minimum 5 year(s) of experience is required
Educational Qualification: Minimum 15 years of full-time education

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google Dataproc. Your typical day will involve working with Apache Spark and collaborating with cross-functional teams to deliver impactful data-driven solutions.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google Dataproc
- Collaborate with cross-functional teams to deliver impactful data-driven solutions
- Utilize Apache Spark for data processing and analysis
- Develop and maintain technical documentation for applications

Professional & Technical Skills:
- Strong experience in Apache Spark and Java for Spark
- Strong experience with multiple database models (SQL, NoSQL, OLTP, and OLAP)
- Strong experience with data streaming architecture (Kafka, Spark, Airflow)
- Strong knowledge of cloud data platforms and technologies such as GCS, BigQuery, Cloud Composer, Dataproc, and other cloud-native offerings
- Knowledge of Infrastructure as Code (IaC) and associated tools (Terraform, Ansible, etc.)
- Experience pulling data from a variety of source types, including mainframe (EBCDIC), fixed-length and delimited files, and databases (SQL, NoSQL, time-series)
- Experience performing analysis with large datasets in a cloud-based environment, preferably with an understanding of Google Cloud Platform (GCP)
- Comfortable communicating with various stakeholders (technical and non-technical)
- GCP Data Engineer Certification is nice to have

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google Dataproc and Apache Spark
- The ideal candidate will possess a strong educational background in software engineering or a related field
- This position is based at our Mumbai office

Qualification: Minimum 15 years of full-time education
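One of the skills listed above is pulling data from fixed-length files of the kind produced by mainframe feeds. As a minimal sketch of that technique (the record layout below is invented; real EBCDIC feeds would also need a decode step, e.g. via an EBCDIC codec, before slicing):

```python
# Minimal sketch of parsing fixed-length records.
# The layout (field name, start offset, end offset) is hypothetical.

LAYOUT = [("account", 0, 8), ("name", 8, 18), ("balance", 18, 26)]

def parse_fixed(line):
    """Slice one fixed-length text record into a dict per LAYOUT."""
    rec = {field: line[start:end].strip() for field, start, end in LAYOUT}
    rec["balance"] = int(rec["balance"])  # stored as zero-padded digits
    return rec

row = parse_fixed("00012345Jane Doe  00009900")
print(row)  # {'account': '00012345', 'name': 'Jane Doe', 'balance': 9900}
```

At scale, the same slicing logic would typically run inside a Spark job mapped over each line of the input file rather than in plain Python.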

Posted Date not available

Apply

3.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure the Cloud orchestration and automation capability operates to target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must-have skills: SUSE Linux Administration
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary:
As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders, explain any performance issues or risks, and ensure the Cloud orchestration and automation capability operates to target SLAs with minimal downtime, holding performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Proactively identify and address potential issues in Cloud services
- Collaborate with cross-functional teams to optimize Cloud orchestration processes
- Develop and implement strategies to enhance Cloud automation capabilities
- Analyze performance data to identify trends and areas for improvement
- Provide technical guidance and support to junior team members

Professional & Technical Skills:
- Must-have: Proficiency in SUSE Linux Administration
- Strong understanding of Cloud orchestration and automation
- Experience in managing and troubleshooting Cloud services
- Knowledge of scripting languages for automation tasks
- Hands-on experience with monitoring and alerting tools
- Good to have: Experience with DevOps practices

Additional Information:
- The candidate should have a minimum of 3 years of experience in SUSE Linux Administration
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualification: 15 years of full-time education
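The role above centers on keeping services within target SLAs and using monitoring/alerting tools. As a toy sketch of the kind of check such tooling automates (the 99.9% target and minute counts are invented for illustration):

```python
# Hypothetical SLA check: compare measured uptime against a target SLA.

def sla_breached(total_minutes, downtime_minutes, target=0.999):
    """Return True if measured uptime falls below the target fraction."""
    uptime = (total_minutes - downtime_minutes) / total_minutes
    return uptime < target

# A 30-day month is 43200 minutes; a 99.9% SLA allows about 43 minutes down.
print(sla_breached(43200, 40))  # False (within the downtime budget)
print(sla_breached(43200, 50))  # True (SLA breached)
```

In practice a monitoring stack would compute this continuously from probe data and fire an alert before the budget is exhausted, rather than after the fact.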

Posted Date not available

Apply

3.0 - 8.0 years

15 - 25 Lacs

Hyderabad, Chennai

Hybrid

Job Title: GCP Data Engineer (BigQuery, Airflow, SQL, Python, dbt)
Experience Required: 3+ years
Location: Chennai / Hyderabad (preferred; the 2nd round will be face-to-face)
Notice Period: Immediate joiners preferred; candidates with a 30-day notice period (including those serving notice) are welcome
Employment Type: Full-time

Job Description:
We are looking for a skilled GCP Data Engineer with strong hands-on experience in BigQuery, Airflow, SQL, Python, and dbt to work on high-impact data engineering projects.

Key Responsibilities:
- Design, develop, and optimize data pipelines on GCP
- Work with BigQuery for data warehousing and analytics
- Orchestrate workflows using Airflow
- Develop and maintain data transformation scripts using Python and dbt
- Collaborate with analytics and business teams to deliver data solutions
- Ensure best practices in performance optimization, data quality, and security

Required Skills & Experience:
- Minimum 3 years of experience as a Data Engineer
- Hands-on experience with Google Cloud Platform services
- Strong SQL skills
- Experience with Airflow for job scheduling/orchestration
- Expertise in Python scripting for data processing
- Experience with dbt for data transformation
- Strong problem-solving and communication skills

Interview Process:
- 3 technical rounds
- The 2nd round will be face-to-face at the Chennai or Hyderabad office

How to Apply:
Interested candidates (Chennai / Hyderabad profiles preferred) can share their CV to ngongadala@randomtrees.com with the subject line "GCP Data Engineer – Chennai/Hyderabad".
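The orchestration skill this posting asks for (Airflow) boils down to running tasks in dependency order. The core idea can be sketched in plain Python with a topological sort; the task names below are invented, and a real Airflow pipeline would declare them with `airflow.DAG` and operator classes instead:

```python
# Conceptual sketch of DAG-based workflow orchestration, the model
# Airflow uses: tasks run only after their upstream dependencies finish.
from graphlib import TopologicalSorter

# downstream task -> set of upstream tasks it waits on (hypothetical names)
deps = {
    "transform_dbt": {"extract_source"},
    "load_bigquery": {"transform_dbt"},
    "data_quality_check": {"load_bigquery"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)
# ['extract_source', 'transform_dbt', 'load_bigquery', 'data_quality_check']
```

Airflow adds scheduling, retries, and monitoring on top of this ordering, but the dependency graph is the same concept.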

Posted Date not available

Apply

10.0 - 15.0 years

5 - 15 Lacs

Bengaluru

Work from Office

A GCP Data Engineering Architect designs, implements, and manages scalable data solutions on Google Cloud Platform (GCP). They are responsible for defining data architecture, ensuring data quality and security, and optimizing data pipelines for use cases including data warehousing, big data processing, and real-time analytics. The role involves collaborating with stakeholders, mentoring junior engineers, and staying up to date with the latest GCP technologies.

Skills Required:
- 12-16 years of experience in IT or professional services, including IT delivery or large-scale IT data engineering projects
- Data Pipeline Development: Building and optimizing data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Cloud Composer, and Cloud Run
- Designing and Architecting Data Solutions: Developing end-to-end data solutions on GCP, including data models, storage strategies, and data ingestion, processing, and consumption frameworks
- Data Security and Governance: Implementing data security frameworks, establishing data governance policies, and ensuring compliance with data quality and privacy standards
- Data Warehousing and Big Data: Solid understanding of data warehousing concepts, big data processing frameworks, and ETL/ELT processes
- Mentoring and Leadership: Providing technical guidance, mentoring junior team members, and contributing to the overall data engineering strategy
- Staying Updated: Keeping abreast of the latest GCP services, data architecture trends, and best practices
- Expert knowledge of SQL development

Required Skills:
Key skills: data engineering architecture; GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Cloud Composer, and Cloud Run; technical mentoring
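Among the pipelines this role covers, real-time analytics (e.g. Dataflow reading from Pub/Sub) typically means windowed aggregation over event streams. As a conceptual sketch of fixed-window counting, with invented timestamps and a hypothetical 60-second window:

```python
# Conceptual sketch of fixed-window event-time aggregation, the kind of
# operation a streaming pipeline performs. Events and window size are
# hypothetical; a real pipeline would also handle late and out-of-order data.
from collections import defaultdict

def window_counts(events, window_secs=60):
    """Count (timestamp, key) events per fixed event-time window."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (42, "click"), (61, "view"), (119, "click")]
print(window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

Frameworks like Dataflow generalize this with watermarks and triggers, but the window-assignment step is the same idea.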

Posted Date not available

Apply
Page 3 of 3

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies