Jobs
Interviews

214 Dataproc Jobs - Page 9

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7 - 12 years

13 - 17 Lacs

Gurugram

Work from Office

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have hands-on Python and SQL experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data against functional business requirements and work directly with customers.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience: An intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills, contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
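For context on the BigQuery-plus-Python combination this posting centres on, here is a minimal sketch using the google-cloud-bigquery client library. The project, dataset, and table names are illustrative assumptions, not details from the posting.

    # Minimal BigQuery query from Python (google-cloud-bigquery).
    # Project/dataset/table names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")

    sql = """
        SELECT status, COUNT(*) AS n
        FROM `my-gcp-project.analytics.orders`
        GROUP BY status
        ORDER BY n DESC
    """

    # result() blocks until the query job finishes, then yields rows.
    for row in client.query(sql).result():
        print(f"{row.status}: {row.n}")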

Posted 2 months ago

Apply

1 - 3 years

3 - 6 Lacs

Bengaluru

Work from Office

Skill required: Record To Report - Invoice Processing
Designation: Record to Report Ops Associate
Qualifications: BCom/MCom
Years of Experience: 1 to 3 years
Language Ability: English (Domestic) - Expert

What would you do? You will be aligned with our Finance Operations vertical and will help us determine financial outcomes by collecting operational data/reports, conducting analysis, and reconciling transactions: posting journal entries, preparing balance sheet reconciliations, reviewing entries and reconciliations, preparing cash forecasting statements, supporting month-end closing, preparing reports, and supporting audits. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice processing workflow, improving efficiency and reducing the likelihood of errors.

What are we looking for? Google Cloud SQL; adaptable and flexible; ability to perform under pressure; problem-solving skills; agility for quick learning; commitment to quality.

Roles and Responsibilities: In this role you are required to solve routine problems, largely through precedent and referral to general guidelines. Your expected interactions are within your own team and with your direct supervisor. You will be provided detailed to moderate levels of instruction on daily work tasks and detailed instruction on new assignments. The decisions that you make will impact your own work. You will be an individual contributor as part of a team, with a predetermined, focused scope of work. Please note that this role may require you to work in rotational shifts.

Qualification: BCom, MCom

Posted 2 months ago

Apply

7 - 10 years

16 - 21 Lacs

Mumbai

Work from Office

Position Overview: The Google Cloud Data Engineering Lead role is ideal for an experienced Google Cloud Data Engineer who will drive the design, development, and optimization of data solutions on the Google Cloud Platform (GCP). The role requires the candidate to lead a team of data engineers and collaborate with data scientists, analysts, and business stakeholders to enable scalable, secure, and high-performance data pipelines and analytics platforms.

Key Responsibilities:
- Lead and manage a team of data engineers delivering end-to-end data pipelines and platforms on GCP.
- Design and implement robust, scalable, and secure data architectures using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Develop and maintain batch and real-time ETL/ELT workflows using tools such as Apache Beam, Dataflow, or Composer (Airflow).
- Collaborate with data scientists, analysts, and application teams to gather requirements and ensure data availability and quality.
- Define and enforce data engineering best practices including version control, testing, code reviews, and documentation.
- Drive automation and infrastructure-as-code approaches using Terraform or Deployment Manager for provisioning GCP resources.
- Implement and monitor data quality, lineage, and governance frameworks across the data platform.
- Optimize query performance and storage strategies, particularly within BigQuery and other GCP analytics tools.
- Mentor team members and contribute to the growth of technical capabilities across the organization.

Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience: 7+ years of experience in data engineering, including 3+ years working with GCP data services. Proven leadership experience in managing and mentoring data engineering teams.
- Skills: Expert-level understanding of BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub. Strong SQL and Python skills for data processing and orchestration. Experience with workflow orchestration tools (Airflow/Composer). Hands-on experience with CI/CD, Git, and infrastructure-as-code tools (e.g., Terraform). Familiarity with data security, governance, and compliance practices in cloud environments.
- Certifications: GCP Professional Data Engineer certification.
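The batch/streaming ETL workflows this role covers often reduce to an Apache Beam pipeline run on Dataflow. Below is a minimal sketch, assuming a Pub/Sub subscription carrying JSON messages and a BigQuery destination table; all resource names and the schema are illustrative, not from the posting.

    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Streaming pipeline: Pub/Sub (JSON) -> parse -> BigQuery.
    # Subscription and table names are hypothetical placeholders;
    # pass --runner=DataflowRunner and project/region flags to run on Dataflow.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "Write" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )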

Posted 2 months ago

Apply

7 - 11 years

20 - 30 Lacs

Chennai, Bengaluru

Work from Office

Role & responsibilities:
- 6+ years of hands-on experience in software development.
- Strong in algorithms and data structures; knowledge of object-oriented design, design patterns, and multi-threaded programming.
- Strong troubleshooting, debugging, and analytical skills.
- Google Cloud development experience.

Mandatory skills:
- Google Cloud BigQuery and Dataproc
- SQL skills (BigQuery, Hive, and Spark)
- Spark job debugging and tuning
- Python, Java, PySpark
- Good architecture skills to understand end-to-end flows/data pipelines

Secondary skills:
- Client stakeholder management
- Certification in relevant skills

Responsibilities:
- The candidate must have a complete understanding of, and hands-on experience in, design, coding, and testing.
- Design, implement, and support multi-tier software applications; document and test systems; modify as necessary.
- Work independently and with other engineers and Ops teams; design and develop strategic and tactical processes, and create solutions that meet business requirements.
- Demonstrate clear thought processes in solving business and technical problems.
- Design and code should be easy to maintain, available, performant, and reusable across a sub-system or feature. Code may persist for the lifetime of a software version.
- Own the delivery of very high-quality code that is thoroughly tested and supported by unit tests written in Golang for flawless execution.
- Can be relied on to deliver features and sub-systems on time and to requirements.
- Works well within a team and contributes effectively to the success of those they interact with regularly.
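The Spark-tuning and BigQuery skills above typically meet in a PySpark job submitted to a Dataproc cluster. A minimal sketch follows, assuming the spark-bigquery connector is available on the cluster (recent Dataproc images ship it); the table and bucket names are placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # PySpark job for Dataproc: read a BigQuery table via the
    # spark-bigquery connector, aggregate, and write Parquet to GCS.
    # All resource names are hypothetical.
    spark = SparkSession.builder.appName("orders-by-status").getOrCreate()

    orders = (
        spark.read.format("bigquery")
        .option("table", "my-project.analytics.orders")
        .load()
    )

    summary = orders.groupBy("status").agg(F.count("*").alias("n"))

    summary.write.mode("overwrite").parquet("gs://my-bucket/output/orders_by_status")

Such a job would be submitted with something like: gcloud dataproc jobs submit pyspark job.py --cluster=my-cluster --region=us-central1 (cluster name and region assumed).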

Posted 2 months ago

Apply

4 - 7 years

6 - 16 Lacs

Bengaluru

Work from Office

Senior Software Engineer - Google Cloud Platform (GCP)
Location: Bangalore, India

Why Join Fossil Group? At Fossil Group, we are part of an international team that dares to dream, disrupt, and deliver innovative watches, jewelry, and leather goods to the world. We're committed to long-term value creation, driven by technology and our core values: Authenticity, Grit, Curiosity, Humor, and Impact. If you are a forward-thinker who thrives in a diverse, global setting, we want to hear from you.

Make an Impact (Job Summary + Responsibilities): We are looking for a Senior Software Engineer - GCP to join our growing Cloud & Data Engineering team at Fossil Group. This role involves building scalable cloud-native data pipelines using Google Cloud Platform services, with a focus on Dataflow, Dataproc, and BigQuery, and strong development skills in Java, Python, and SQL. You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs.

What you will do in this role:
- Design, develop, deploy, and maintain data pipelines and services using GCP technologies including Dataflow, Dataproc, BigQuery, Composer, and others.
- Translate blueprinting documents and business requirements into scalable and maintainable GCP configurations and solutions.
- Develop and enhance cloud-based batch/streaming jobs using Java or Python.
- Collaborate with global architects and cross-functional teams to define solutions and execute projects across development and testing environments.
- Perform unit testing and integration testing, and resolve issues arising during the QA lifecycle.
- Work closely with internal stakeholders to gather requirements, present technical solutions, and provide end-to-end delivery.
- Own and manage project timelines, priorities, and documentation.
- Continuously improve processes and stay current with GCP advancements and big data technologies.

Who You Are (Requirements):
- Bachelor's degree in Computer Science or a related field.
- 4-7 years of experience as a DB/SQL Developer or Java/Python Developer with strong SQL capabilities.
- Hands-on experience with GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, and Data Fusion.
- Excellent command of SQL, with the ability to write complex queries and perform advanced data transformation.
- Strong programming skills in Java and/or Python, specifically for building cloud-native data pipelines.
- Experience with relational and NoSQL databases.
- Ability to understand and translate business requirements into functional specifications.
- Familiarity with BI dashboards and Google Data Studio is a plus.
- Strong problem-solving, communication, and collaboration skills.
- Self-directed, with a growth mindset and eagerness to upskill in emerging GCP technologies.
- Comfortable leading meetings, gathering requirements, and managing stakeholder communication across regions.

What We Offer:
- Comprehensive Benefits: Includes health and well-being services.
- Paid Parental Leave & Return to Work Program: Support for new parents and caregivers with paid leave and a flexible phase-back schedule.
- Generous Paid Time Off: Includes Sick Time, Personal Days, and Summer Flex Fridays.
- Employee Discounts: Save on Fossil merchandise.

EEO Statement: At Fossil, we believe our differences not only make us stronger as a team, but also help us create better products and a richer community.
We are an Equal Employment Opportunity Employer dedicated to a policy of non-discrimination in all employment practices without regard to age, disability, gender identity or expression, marital status, pregnancy, race, religion, sexual orientation, or any other protected characteristic.

Posted 2 months ago

Apply

5 - 9 years

10 - 20 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid

Mandatory skills: GCS, Composer, BigQuery, Azure, Azure Databricks, ADLS.
Experience: 5-10 years.
Good-to-have skills: knowledge of CDP (customer data platform); airline domain knowledge.

Responsibilities:
- Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks leveraging open-source tools and data processing frameworks.
- Hands-on with technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, the Hadoop platform, Hive, and Airflow.
- Experience in GCP Cloud Composer, BigQuery, and Dataproc.
- Offer system support as part of a support rotation with other team members.
- Operationalize open-source data-analytic tools for enterprise use.
- Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification.
- Understand and follow the company development lifecycle to develop, deploy, and deliver the solutions.

Minimum Qualifications:
- Bachelor's degree in Computer Science, CIS, or a related field
- 5-7 years of IT experience in software engineering or a related field
- Experience on projects involving the implementation of software development life cycles (SDLC)
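Orchestration on Cloud Composer, as asked for here, is plain Airflow code. Below is a minimal sketch of a daily DAG that runs a BigQuery query through the Google provider's operator; the DAG ID, dataset, and SQL are illustrative assumptions.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    # Daily DAG for Cloud Composer: materialize a summary table in BigQuery.
    # DAG id, dataset, and SQL are hypothetical placeholders.
    with DAG(
        dag_id="daily_orders_summary",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        summarize = BigQueryInsertJobOperator(
            task_id="summarize_orders",
            configuration={
                "query": {
                    "query": """
                        CREATE OR REPLACE TABLE analytics.orders_daily AS
                        SELECT DATE(ts) AS day, COUNT(*) AS n
                        FROM analytics.orders
                        GROUP BY day
                    """,
                    "useLegacySql": False,
                }
            },
        )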

Posted 2 months ago

Apply

2 - 6 years

8 - 12 Lacs

Pune

Work from Office

About The Role: Job Title: Senior Engineer - Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AVP
Location: Pune, India

Role Description: The senior engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills are developed through addressing enhancements and fixes to products, building reliability and resiliency into solutions through early testing, peer reviews, and automating the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, work in a cross-application, mixed technical environment, and demonstrate a solid hands-on development track record within an agile methodology. The role demands working alongside a geographically dispersed team. The position is part of the buildout of the Compliance tech internal development team in India, which will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory commitments and mandated monitors.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities:
- Analyze data sets; design and code stable and scalable data ingestion workflows, integrating them into existing workflows (see the sketch below).
- Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
- Work as a senior developer developing analytics algorithms on top of ingested data.
- Work as a senior developer for various data sourcing in Hadoop and GCP.
- Own unit testing, UAT deployment, end-user sign-off, and prod go-live.
- Ensure new code is tested at both unit and system level; design, develop, and peer review new code and functionality.
- Operate as a member of an agile scrum team.
- Apply root cause analysis skills to identify bugs and issues behind failures.
- Support the Prod support and release management teams in their tasks.

Your skills and experience:
- 10+ years of coding experience in reputed organizations
- Hands-on experience with Bitbucket and CI/CD pipelines
- Proficient in Hadoop, Python, Spark, SQL, Unix, and Hive
- Basic understanding of on-prem and GCP data security
- Hands-on development experience on large ETL/big data systems; GCP being a big plus
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions like consistency, completeness, accuracy, lineage, etc.
- Hands-on business and systems knowledge gained in a regulatory delivery environment.
- Desired: banking experience with regulatory and cross-product knowledge; passion for test-driven development; prior experience with release management tasks and responsibilities; data visualization experience is good to have.
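The Hadoop/Hive-to-GCP ingestion work described above often starts as a Spark job with Hive support enabled. A minimal sketch under those assumptions follows; the Hive table and GCS bucket names are placeholders.

    from pyspark.sql import SparkSession

    # Ingestion sketch: read a Hive table, select one day's data,
    # and land it on GCS as Parquet. Table/bucket names are hypothetical.
    spark = (
        SparkSession.builder.appName("trades-ingest")
        .enableHiveSupport()
        .getOrCreate()
    )

    trades = spark.sql(
        "SELECT * FROM compliance.trades WHERE trade_date = '2024-01-01'"
    )

    trades.write.mode("overwrite").parquet("gs://my-landing-bucket/trades/2024-01-01")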
How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Posted 2 months ago

Apply

4 - 9 years

14 - 19 Lacs

Pune

Work from Office

About The Role: We are looking for a passionate and self-motivated Technology Leader to join our team in the Accounting domain. As part of a diverse, multi-disciplinary global team, you will collaborate with other disciplines to shape technology strategy, drive engineering excellence, and deliver business outcomes.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
* Best in class leave policy
* Gender neutral parental leaves
* 100% reimbursement under childcare assistance benefit (gender neutral)
* Sponsorship for industry-relevant certifications and education
* Employee Assistance Program for you and your family members
* Comprehensive hospitalization insurance for you and your dependents
* Accident and term life insurance
* Complimentary health screening for those 35 yrs. and above

This role is responsible for the design and implementation of high-quality technology solutions. The candidate should have demonstrated technical expertise and excellent problem-solving skills. The candidate is expected to:
- be a hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities
- champion engineering best practices and guide/mentor the team to achieve high performance
- work closely with business stakeholders, the Tribe Lead, the Product Owner, and the Lead Architect to successfully deliver the business outcomes
- acquire functional knowledge of the business capability being digitized/re-engineered
- demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success
- focus on upskilling people, team building, and career development
- keep up to date with industry trends and developments.

Your Skills & Experience:
- Minimum 15 years of IT industry experience in full stack development
- Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
- Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience in Kubernetes and the OpenShift container platform
- Experience with databases: Oracle, PostgreSQL, MongoDB, Redis/Hazelcast; should understand data modeling, normalization, and performance optimization
- Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience working on public cloud: GCP preferred, AWS or Azure
- Knowledge of various distributed/multi-tiered architecture styles: micro-services, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, Git Actions, etc.
- Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation
- Experience leading teams and mentoring developers
- Focus on quality: experience with TDD, BDD, stress and contract tests
- Proficient in working with APIs (Application Programming Interfaces) and data formats like JSON, XML, YAML, Parquet, etc.

Advantageous:
* Prior experience in the Banking/Finance domain
* Experience with hybrid cloud solutions, preferably using GCP
* Experience with product development

How we'll support you:
* Training and development to help you excel in your career
* Coaching and support from experts in your team
* A culture of continuous learning to aid progression
* A range of flexible benefits that you can tailor to suit your needs

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

11 - 21 years

25 - 40 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing Services Private Limited!! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 3 - 20 yrs
Location: Pan India

Job Description:
Skills: GCP, BigQuery, Cloud Composer, Cloud Data Fusion, Python, SQL
- 5-20 years of overall experience, mainly in the data engineering space
- 2+ years of hands-on experience in GCP cloud data implementation
- Experience working in client-facing roles in a technical capacity as an architect; must have implementation experience on a GCP-based cloud data project/program as solution architect
- Proficiency with the Google Cloud Architecture Framework in a data context
- Expert knowledge and experience of the core GCP data stack, including BigQuery, Dataproc, Dataflow, Cloud Composer, etc.
- Exposure to the overall Google tech stack of Looker/Vertex AI/Dataplex, etc.
- Expert-level knowledge of Spark; extensive hands-on experience working with data using SQL and Python
- Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms (both cloud and on-premise)
- Excellent communication skills with the ability to clearly present ideas, concepts, and solutions

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in or you can reach me @ 8939853050.

With Regards,
Sankar G
Sr. Executive - IT Recruitment

Posted 2 months ago

Apply

5 - 10 years

9 - 13 Lacs

Chennai

Work from Office

Overview: GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Responsibilities: GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Requirements: GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata

Posted 2 months ago

Apply

2 - 6 years

7 - 11 Lacs

Hyderabad

Work from Office

As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and solving them per the defined SLAs.
- Continuous learning and technology integration: being eager to learn new technologies and implementing them in feature development.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise:
- GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions
- Cloud data engineers with GCP PDE certification and working experience with GCP
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions (see the sketch below)
- Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation
- Expertise in the Python coding language
- Develops, supports, and maintains data engineering solutions on the Google Cloud ecosystem

Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools.
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices.
- Troubleshoot and debug issues, and deploy applications to the cloud platform.
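An end-to-end pipeline of the kind listed above can be prototyped without Dataflow at all: a Pub/Sub pull subscriber that streams rows into BigQuery. A minimal sketch using google-cloud-pubsub and google-cloud-bigquery; every resource name below is an illustrative assumption.

    import json
    from google.cloud import bigquery, pubsub_v1

    # Pull JSON messages from Pub/Sub and stream them into BigQuery.
    # Project/subscription/table IDs are hypothetical placeholders.
    PROJECT = "my-gcp-project"
    TABLE = "my-gcp-project.analytics.events"

    bq = bigquery.Client(project=PROJECT)
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(PROJECT, "events-sub")

    def handle(message):
        row = json.loads(message.data.decode("utf-8"))
        errors = bq.insert_rows_json(TABLE, [row])  # streaming insert
        if errors:
            message.nack()  # let Pub/Sub redeliver on failure
        else:
            message.ack()

    future = subscriber.subscribe(sub_path, callback=handle)
    future.result()  # blocks; interrupt to stop the subscriber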

Posted 2 months ago

Apply

3 - 6 years

9 - 13 Lacs

Bengaluru

Work from Office

Location: Tower 02, Manyata Embassy Business Park, Racenahali & Nagawara Villages, Outer Ring Rd, Bangalore 540065
Time type: Full time
Posted on: Posted 5 Days Ago
Job requisition ID: R0000388711

About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about TII: At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

Team Overview: Every time a guest enters a Target store or browses Target.com or the app, they experience the impact of Target's investments in technology and innovation. We're the technologists behind one of the most loved retail brands, delivering joy to millions of our guests, team members, and communities. Join our global in-house technology team of more than 5,000 engineers, data scientists, architects, and product managers striving to make Target the most convenient, safe, and joyful place to shop. We use agile practices, leverage open-source software to adapt and build best-in-class technology for our team members and guests, and we do so with a focus on diversity and inclusion, experimentation, and continuous learning. At Target, we are gearing up for exponential growth and continuously expanding our guest experience. To support this expansion, Data Engineering is building robust warehouses and enhancing existing datasets to meet business needs across the enterprise. We are looking for talented individuals who are passionate about innovative technology and data warehousing and are eager to contribute to data engineering.

Position Overview:
- Assess client needs and convert business requirements into a business intelligence (BI) solutions roadmap relating to complex issues involving long-term or multi-work streams.
- Analyze technical issues and questions, identifying data needs and delivery mechanisms.
- Implement data structures using best practices in data modeling, ETL/ELT processes, Spark, Scala, SQL, database, and OLAP technologies.
- Manage the overall development cycle, driving best practices and ensuring development of high-quality code for common assets and framework components.
- Develop test-driven solutions, provide technical guidance, and contribute heavily to a team of high-caliber data engineers by developing test-driven solutions and BI applications that can be deployed quickly and in an automated fashion.
- Manage and execute against agile plans and set deadlines based on client, business, and technical requirements.
- Drive resolution of technology roadblocks including code, infrastructure, build, deployment, and operations.
- Ensure all code adheres to development and security standards.

About you:
- 4-year degree or equivalent experience
- 5+ years of software development experience, preferably in data engineering/Hadoop development (Hive, Spark, etc.)
- Hands-on experience in object-oriented or functional programming such as Scala/Java/Python
- Knowledge of or experience with a variety of database technologies (Postgres, Cassandra, SQL Server)
- Knowledge of data integration design using API and streaming technologies (Kafka) as well as ETL and other data integration patterns
- Experience with cloud platforms like Google Cloud, AWS, or Azure; hands-on experience with BigQuery is an added advantage
- Good understanding of distributed storage (HDFS, Google Cloud Storage, Amazon S3) and processing (Spark, Google Dataproc, Amazon EMR, or Databricks)
- Experience with a CI/CD toolchain (Drone, Jenkins, Vela, Kubernetes) a plus
- Familiarity with data warehousing concepts and technologies; maintains technical knowledge within areas of expertise
- Constant learner and team player who enjoys solving tech challenges with a global team
- Hands-on experience building complex data pipelines and flow optimizations
- Able to understand the data, draw insights, make recommendations, and identify any data quality issues upfront
- Experience with test-driven development and software test automation
- Follows best coding practices and engineering guidelines as prescribed
- Strong written and verbal communication skills with the ability to present complex technical information clearly and concisely to a variety of audiences
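The streaming-integration skills this posting names (Kafka plus Spark) commonly combine in a Structured Streaming job. A minimal sketch, assuming a local Kafka broker and a JSON-encoded topic; the broker address, topic, and schema are placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StringType, StructField, StructType, TimestampType

    # Structured Streaming: consume a Kafka topic, parse JSON, print to console.
    # Broker, topic, and schema are hypothetical; requires the
    # spark-sql-kafka package on the Spark classpath.
    spark = SparkSession.builder.appName("kafka-events").getOrCreate()

    schema = StructType([
        StructField("event_id", StringType()),
        StructField("ts", TimestampType()),
    ])

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")
        .option("subscribe", "events")
        .load()
        .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    query = events.writeStream.format("console").outputMode("append").start()
    query.awaitTermination()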

Posted 3 months ago

Apply

5 - 10 years

9 - 19 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Google BigQuery
Location: Pan India

Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.

Key Responsibilities: Analyze and model client market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
1. Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX)
2. Proven track record of delivering data integration and data warehousing solutions
3. Strong SQL and hands-on proficiency in the BigQuery SQL language; experience in shell scripting and Python (No FLEX)
4. Experience with data integration and migration projects; Oracle SQL

Technical Experience: Google BigQuery
1. Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of various data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage
2. Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (No FLEX)
3. Proficiency with tools to automate AzDO CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence

Posted 3 months ago

Apply

8 - 13 years

12 - 20 Lacs

Bengaluru

Hybrid

Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.
Must have skills: Google Cloud Platform Architecture
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Any bachelor's degree

Summary: As a Cloud Platform Engineer, you will be responsible for designing, building, testing, and deploying cloud application solutions that integrate cloud and non-cloud infrastructure. You will deploy infrastructure and platform environments and create proofs of architecture to test architecture viability, security, and performance.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the implementation of cloud solutions
- Optimize cloud infrastructure for performance and cost-efficiency
- Troubleshoot and resolve technical issues

Professional & Technical Skills:
- Must-have skills: proficiency in Google Cloud Platform Architecture
- Strong understanding of cloud architecture principles
- Experience with DevOps practices
- Experience with Google Cloud SQL
- Hands-on experience in cloud deployment and management
- Knowledge of security best practices in cloud environments

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google Cloud Platform Architecture
- This position is based at our Bengaluru office
- A bachelor's degree is required

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies