
288 Cloud Storage Jobs - Page 4

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office

What the Role Offers:
- Design and develop new product features using microservices, Golang, file systems, and the cloud.
- Work on cloud storage platforms such as AWS, Azure, and Google Cloud.
- Analyze technical requirements, design applications, identify new technologies, and integrate solutions into the existing product.
- Take responsibility for successful completion of code deliverables within the project.
- Develop prototypes and give live code demos.
- Work with the agile team to ensure the information required for testing is provided.
- Conduct peer code reviews to ensure adherence to patterns and standards.
- Promote a practice of learning by being available to mentor other developers.
- Participate in user story grooming and estimation sessions (t-shirt sizing) to refine the product backlog.
- Develop software in an agile development environment.
- Be involved in the various phases of the SDLC, secure coding, and reviews.

What you need to succeed:
- Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent.
- Experience in full stack development with Golang and JavaScript frameworks, and knowledge of web technologies.
- Experience in Docker, microservices, web development, and deployment.
- Experience working with cloud storage on AWS, Azure, and Google Cloud.
- Knowledge of databases, LDAP, and security.
- Knowledge of Linux or Windows operating systems.
- Experience with security tools and static analysis tools; knowledge of threat modelling.
- Excellent problem-solving and analytical-thinking skills.
- Good interpersonal, written, and verbal communication skills.
- A strong desire to develop your career through self-learning and scheduled personal development time.

ONE LAST THING: You are persistent and inquisitive; you have to understand why things are happening the way they are. You are determined to understand cyber attack techniques at a very detailed level. You are a self-starter who can work with minimal management, yet you have the strong collaboration and interpersonal skills to work with professionals from other information security fields. You're a creative thinker who wants to answer the question "Why?". Your workstation is a pyramid of monitors that you can't take your eyes off at the risk of missing something. You have a desire to learn new technologies, and your sense of humor, passion, and enthusiasm shine through in everything you do.
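For readers comparing listings, here is a minimal, hedged sketch of the multi-cloud object storage work this role describes: one small interface over AWS S3 and Google Cloud Storage uploads. The role itself centers on Golang; Python is used for consistency across the examples on this page, and all bucket and object names are placeholders.

```python
# Hypothetical sketch: one interface over two cloud object stores.
# Bucket/object names are placeholders; credentials come from the
# environment (AWS_* variables, GOOGLE_APPLICATION_CREDENTIALS).
import boto3                      # AWS SDK for Python
from google.cloud import storage  # GCS client library

def upload_to_s3(bucket: str, key: str, data: bytes) -> None:
    """Upload bytes to an S3 object."""
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=data)

def upload_to_gcs(bucket: str, key: str, data: bytes) -> None:
    """Upload bytes to a GCS object."""
    storage.Client().bucket(bucket).blob(key).upload_from_string(data)

UPLOADERS = {"aws": upload_to_s3, "gcp": upload_to_gcs}

if __name__ == "__main__":
    UPLOADERS["gcp"]("example-bucket", "reports/demo.txt", b"hello")
```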

Posted 3 weeks ago

Apply

6.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

">MLOps Engineer 6-10 Years Bengaluru ML About the Role We are seeking a highly experienced and innovative Senior Machine Learning Engineer to join our AI/ML team. In this role, you will lead the design, development, deployment, and monitoring of scalable machine learning solutions using GCP Vertex AI , MLflow , and other modern ML tools. You ll work closely with data scientists, engineers, and product teams to bring intelligent systems into production that drive real business impact. Key Responsibilities Design, develop, and deploy end-to-end machine learning models in production environments using GCP Vertex AI Manage the full ML lifecycle including data preprocessing, model training, evaluation, deployment, and monitoring Implement and maintain MLflow pipelines for experiment tracking, model versioning, and reproducibility Collaborate with cross-functional teams to understand business requirements and translate them into ML solutions Optimize model performance and scalability using best practices in MLOps and cloud-native architecture Develop reusable components and frameworks to accelerate ML development and deployment Monitor deployed models for drift, performance degradation, and retraining needs Ensure compliance with data governance, security, and privacy standards Required Skills & Qualifications 6+ years of experience in machine learning engineering or applied data science Strong proficiency in Python , SQL , and ML libraries such as scikit-learn , TensorFlow , or PyTorch Hands-on experience with GCP Vertex AI for model training, deployment, and pipeline orchestration Deep understanding of MLflow for experiment tracking, model registry, and lifecycle management Solid grasp of MLOps principles and tools (e.g., CI/CD for ML, Docker, Kubernetes) Experience with cloud data platforms (e.g., BigQuery, Cloud Storage) and distributed computing Strong problem-solving skills and ability to work independently in a fast-paced environment Excellent communication skills and ability to explain complex ML concepts to non-technical stakeholders Preferred Qualifications Experience with other cloud platforms (AWS SageMaker, Azure ML) is a plus Familiarity with feature stores, model monitoring tools, and data versioning systems Contributions to open-source ML projects or publications in ML conferences

Posted 3 weeks ago

Apply

7.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

Job Title: Senior Engineer | Location: Pune, India | Corporate Title: AVP

Role Description: Investment Banking is a technology-centric business, with an increasing move to real-time processing and an increasing appetite from customers for integrated systems and access to supporting data. This means that technology is more important than ever for the business. The IB CARE Platform aims to increase the productivity of both Google Cloud and on-prem application development by providing a frictionless build and deployment platform that offers service and data reusability. The platform provides the chassis and standard components of an application, ensuring reliability, usability, and safety, and gives on-demand access to the services needed to build, host, and manage applications on the cloud/on-prem. In addition to technology services, the platform aims to have compliance baked in, enforcing controls/security and reducing application team involvement in SDLC and ORR controls, enabling teams to focus more on application development and release to production faster. We are looking for a platform engineer to join a global team working across all aspects of the platform, from GCP/on-prem infrastructure and application deployment through to the development of CARE-based services. Deutsche Bank is one of the few banks with the scale and network to compete aggressively in this space, and the breadth of investment in this area is unmatched by our peers. Joining the team is a unique opportunity to help build a platform to support some of our most mission-critical processing systems.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your Key Responsibilities: As a CARE platform engineer you will be working across the board on activities to build and support the platform and to liaise with tenants. Key responsibility areas:
- Manage and monitor cloud computing systems and provide technical support to ensure the systems' efficiency and security.
- Work with platform leads and platform engineers at a technical level.
- Liaise with tenants regarding onboarding and provide platform expertise.
- Contribute to the platform offering as part of sprint deliverables.
- Support the production platform as part of the wider team.

Your skills and experience:
- Understanding of GCP and services such as GKE, IAM, identity services, and Cloud SQL.
- Kubernetes/service mesh configuration.
- Experience in IaC tooling such as Terraform.
- Proficiency in SDLC / DevOps best practices.
- GitHub experience, including Git workflow.
- Exposure to modern deployment tooling, such as ArgoCD, desirable.
- Programming experience (such as Java/Python) desirable.
- A strong team player, comfortable in a cross-cultural and diverse operating environment.
- Results-oriented, with the ability to deliver under tight timelines.
- Ability to successfully resolve conflicts in a globally matrixed organization.
- Excellent communication and collaboration skills.
- Comfortable navigating ambiguity to extract meaningful risk insights.

How we'll support you / About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people.
Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

16 - 20 Lacs

Pune

Work from Office

Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AS | Location: Pune, India

Role Description: The engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root-cause-analysis skills are developed by addressing enhancements and fixes to products; build reliability and resiliency into solutions through early testing, peer reviews, and automation of the delivery life cycle. The successful candidate should be able to work independently on medium to large projects with strict deadlines, in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working within an agile methodology. The role demands working alongside a geographically dispersed team. The position is part of the buildout of the Compliance Tech internal development team in India, which will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory commitments and mandate monitors.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Analyze data sets; design and code stable, scalable data ingestion workflows, integrating them into existing workflows.
- Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
- Hands-on sourcing of various data in Hadoop and GCP.
- Ensure new code is tested at both unit and system level; design, develop, and peer-review new code and functionality.
- Operate as a member of an agile scrum team.
- Apply root-cause-analysis skills to identify bugs and issues behind failures.
- Support production support and release management teams in their tasks.

Your skills and experience:
- 7+ years of coding experience in reputed organizations.
- Hands-on experience with Bitbucket and CI/CD pipelines.
- Proficient in Hadoop, Python, Spark, SQL, Unix, and Hive.
- Basic understanding of on-prem and GCP data security.
- Hands-on development experience on large ETL/big data systems; GCP is a big plus.
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions such as consistency, completeness, accuracy, and lineage.
- Hands-on business and systems knowledge gained in a regulatory delivery environment; banking, regulatory, and cross-product knowledge.
- Passionate about test-driven development.

How we'll support you: ...
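The following is a minimal, hypothetical PySpark sketch of the Spark-on-GCP ingestion work this listing names; the GCS paths, column names, and cleansing rules are invented for illustration.

```python
# Hypothetical PySpark ingestion sketch; paths and schema are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("compliance-etl").getOrCreate()

# Ingest raw CSV from Cloud Storage (the GCS connector must be on the
# classpath, e.g. when running on Dataproc).
raw = spark.read.option("header", True).csv("gs://example-bucket/trades/")

# Basic cleansing and enrichment.
clean = (
    raw.dropDuplicates(["trade_id"])
       .withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
       .filter(F.col("notional").cast("double") > 0)
)

# Land the curated data partitioned by date for downstream consumers.
clean.write.mode("overwrite").partitionBy("trade_date").parquet(
    "gs://example-bucket/curated/trades/"
)
```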

Posted 3 weeks ago

Apply

7.0 - 12.0 years

45 - 55 Lacs

Bengaluru

Work from Office

Job Title: Lead Solution Architect, VP | Location: Bangalore, India

Role Description: We are seeking a highly skilled and experienced Solution Architect to join the CM Tech team owning cRDS, the firmwide golden reference data source. As a Solution Architect, you will play a pivotal role in shaping the future of CM Tech architecture, leading the development of innovative technical solutions and contributing to the strategic direction of the application. You will be responsible for defining, documenting, and implementing the overall architecture of cRDS and other client onboarding applications, ensuring its scalability, performance, and security while aligning with business requirements and industry best practices.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Define the CMT contribution to RCP solutions; scope solutions to existing and new CMT components.
- Capture and document assumptions made in lieu of requirements/information, for the PO to risk-accept.
- Define high-level data entities and functional decomposition.
- Support component guardians in aligning component roadmaps to product strategy and initiative demand.
- Work with the CTO function to define and document.
- Outline and define CMT and non-CMT component interactions and interaction contracts for refinement by engineering teams.
- Identify problems and opportunities, form the business case, and propose solutions.
- Define phased transitions from current state to target state.
- Ensure non-functional requirements are considered, including projections, and that authentication and authorisation are considered.
- Ensure the solution design is suitable for build estimation and for deriving groomable Jiras.
- Provide guardrails on what requirements a component should and should not cover; act as a point of escalation.
- Hands-on software development; knowledge of solution design and architecting.
- Experience in Agile and Scrum delivery; contribute towards good software design; participate in daily stand-up meetings.
- Strong communication with stakeholders; articulate issues and risks to management in a timely manner.
- Train and mentor junior team members to bring them up to speed.

Your skills and experience (must have; strong technical knowledge required):
- 7+ years of experience in designing and implementing complex enterprise-scale applications.
- Proven experience in designing and implementing microservices architectures.
- Deep understanding of distributed systems and cloud-native technologies.
- Experience with architectural patterns such as event-driven architectures, API gateways, and message queues.
- Strong understanding of Java core concepts, design patterns, and best practices.
- Experience with the Spring Boot framework, including dependency injection, Spring Data, and Spring Security.
- Hands-on experience with a BPM tool (Camunda preferred), including process modeling, workflow automation, and integration with backend systems.
- Experience with Google Cloud Platform, including services like Cloud Run, Cloud SQL, and Cloud Storage, desirable.
- Experience with containerization technologies like Docker and Kubernetes.
- Strong SQL knowledge and experience with advanced database concepts, including relational database design, query optimization, and transaction management.
- Experience with version control systems like Git and collaborative development tools like Jira and Confluence.
- Excellent communication and presentation skills, with the ability to effectively convey complex technical concepts to both technical and non-technical audiences.
- Strong problem-solving skills, with the ability to analyze complex business problems and propose innovative technical solutions.
- Experience collaborating with stakeholders, understanding their needs, and translating them into technical solutions.
- Technical leadership skills and experience mentoring junior engineers.

Nice to have:
- Experience with cloud technologies such as Docker, Kubernetes, OpenShift, Azure, AWS, GCP.
- Additional languages such as Kotlin, Scala, and Python.
- Experience with big data / streaming technologies.
- Experience with end-to-end design and delivery of solutions.
- Experience with UI frameworks like Angular or React.
- RDBMS/Oracle design, development, and tuning.
- Sun/Oracle or architecture-specific certifications.

How we'll support you / About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 weeks ago

Apply

0.0 - 3.0 years

6 - 8 Lacs

Noida

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
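Since this listing (and the two identical ones that follow) pairs Python with SQL CTEs and window functions, here is a hedged sketch of both together: a BigQuery query using a CTE and ROW_NUMBER(), executed from Python. The project, dataset, and table names are placeholders.

```python
# Hypothetical sketch: CTE + window function executed via the BigQuery
# client. All identifiers below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

QUERY = """
WITH ranked AS (                       -- CTE
  SELECT
    customer_id,
    order_total,
    ROW_NUMBER() OVER (                -- window function
      PARTITION BY customer_id ORDER BY order_total DESC
    ) AS rn
  FROM `example-project.sales.orders`
)
SELECT customer_id, order_total
FROM ranked
WHERE rn = 1                           -- largest order per customer
"""

for row in client.query(QUERY).result():
    print(row.customer_id, row.order_total)
```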

Posted 3 weeks ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

7.0 - 12.0 years

25 - 27 Lacs

Hyderabad

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai, Chennai, Bengaluru

Work from Office

Your role:
- Develop and implement Generative AI / AI solutions on Google Cloud Platform.
- Work with cross-functional teams to design and deliver AI-powered products and services.
- Develop, version, and execute Python code; deploy models as endpoints in the dev environment.
- Solid understanding of Python.
- Experience with deep learning frameworks such as TensorFlow, PyTorch, or JAX.
- Experience with natural language processing (NLP) and machine learning (ML).
- Experience with Cloud Storage, Compute Engine, Vertex AI, Cloud Functions, Pub/Sub, etc.
- Hands-on experience with Generative AI support in Vertex AI, specifically with Generative AI models like Gemini and Vertex AI Search.

Your profile:
- 4-6+ years of relevant experience in AI development.
- Experience with Google Cloud Platform, specifically delivering an AI solution on the Vertex AI platform.
- Experience in developing and deploying AI solutions.

What you'll love about working here: You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, and you will get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini: you are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band, and get to participate in internal sports events, yoga challenges, or marathons. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders, or create solutions to overcome societal and environmental challenges.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.

Location: Mumbai, Bengaluru, Chennai, Pune, Hyderabad
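As a hedged sketch of the Vertex AI Generative AI work described above, the snippet below calls a Gemini model through the google-cloud-aiplatform SDK; the project, region, and model ID are assumptions and should be checked against currently available models.

```python
# Hypothetical Vertex AI Generative AI sketch; project, location, and
# model name are placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="example-project", location="us-central1")

model = GenerativeModel("gemini-1.5-flash")  # assumed model ID
response = model.generate_content(
    "Summarize the benefits of retrieval-augmented generation in two sentences."
)
print(response.text)
```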

Posted 3 weeks ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Chennai

Hybrid

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Hope you are doing well! This is Jogeshwari from the Getronics Talent Acquisition team. We have multiple opportunities for GCP Data Engineers; please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to jogeshwari.k@getronics.com.

Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 6+ years in IT and a minimum of 4+ years in GCP data engineering
Location: Chennai

Skills required:
- GCP Data Engineer, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- 6+ years of professional experience in data engineering, data product development, and software product launches.
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Jogeshwari
Senior Specialist

Posted 3 weeks ago

Apply

3.0 - 6.0 years

25 - 30 Lacs

Mumbai

Work from Office

JD for Data Engineer: We are looking for an experienced, results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate should be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt. In this role you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

- 5-8+ years of experience as a data engineer, with extensive development using Snowflake or similar data warehouse technology.
- Strong technical expertise with dbt, Snowflake, PySpark, Apache Airflow, and AWS.
- Strong hands-on experience designing and building robust ELT pipelines using dbt on Snowflake, including ingestion from relational databases, cloud storage, flat files, and APIs.
- Enhance dbt/Snowflake workflows through performance optimization techniques such as clustering, partitioning, query profiling, and efficient SQL design.
- Hands-on experience with SQL and Snowflake database design.
- Hands-on experience with AWS, Airflow, and Git.
- Great analytical and problem-solving skills.
- Degree in Computer Science, IT, or a similar field; a Master's is a plus.
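To make the Snowflake transformation work above concrete, here is a minimal, hypothetical sketch using the snowflake-connector-python client. In the role as described, logic like this would normally live in a dbt model; every connection parameter and table name here is a placeholder.

```python
# Hypothetical Snowflake transformation step; all identifiers and
# credentials below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="***",          # use a secrets manager in real pipelines
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

with conn.cursor() as cur:
    cur.execute(
        """
        CREATE OR REPLACE TABLE curated_orders AS
        SELECT order_id, customer_id, SUM(amount) AS total_amount
        FROM raw_orders
        GROUP BY order_id, customer_id
        """
    )
conn.close()
```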

Posted 3 weeks ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

GCP Engineer: The GCP developer should have expertise in components such as Cloud Scheduler, Dataflow, BigQuery, Pub/Sub, and Cloud SQL.
- Good understanding of the GCP cloud environment/services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects.
- Knowledge of Java / Java frameworks; has leveraged or worked with technology areas like Spring Boot, Spring Batch, and Spring Cloud.
- Experience with API and microservice design principles, and has leveraged them in actual project implementations for integration.
- Deep understanding of architecture and design patterns.
- Knowledge of implementing event-driven architecture, data integration, event-streaming architecture, and API-driven architecture.
- Well versed in DevOps principles, with working experience in Docker/containerization.
- Experience in solutioning and execution of IaaS-, PaaS-, and SaaS-based deployments.
- Conceptual thinking to create 'out of the box' solutions.
- Good communication skills; able to work with both the customer and the development team to deliver an outcome.

Mandatory skills: App-Cloud-Google. Experience: 5-8 years.
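As a hedged illustration of the event-driven, Pub/Sub-based integration this listing emphasizes, the sketch below publishes and consumes one message with the google-cloud-pubsub client. The listing is Java-leaning; Python is used for consistency with the other examples on this page, and all project/topic/subscription IDs are placeholders.

```python
# Hypothetical Pub/Sub publish/subscribe sketch; IDs are placeholders.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT = "example-project"
TOPIC = "orders"
SUBSCRIPTION = "orders-worker"

# Publish an event.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)
publisher.publish(topic_path, b'{"order_id": 42}').result()

# Consume events.
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    print("received:", message.data)
    message.ack()

future = subscriber.subscribe(sub_path, callback=handle)
try:
    future.result(timeout=30)  # block briefly for demo purposes
except TimeoutError:
    future.cancel()
```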

Posted 3 weeks ago

Apply

8.0 - 10.0 years

14 - 18 Lacs

Chennai

Work from Office

GCP Architect: A seasoned architect with a minimum of 12+ years of experience designing medium- to large-scale application-to-application integrations leveraging APIs, APIMs, ESBs, and product-based hybrid implementations.
- Good understanding of the GCP cloud environment/services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects.
- Experience/exposure with OpenShift and PCF on GCP, and DevSecOps, is an added advantage.
- Ability to make critical solution design decisions.
- Knowledge of Java / Java frameworks; has leveraged or worked with technology areas like Spring Boot, Spring Batch, and Spring Cloud.
- Experience with API and microservice design principles, and has leveraged them in actual project implementations for integration.
- Deep understanding of architecture and design patterns.
- Knowledge of implementing event-driven architecture, data integration, event-streaming architecture, and API-driven architecture.
- Understanding of designing an integration platform to meet NFR requirements.
- Has implemented design patterns such as integrating with multiple COTS applications and with multiple databases (SQL-based as well as NoSQL).
- Has worked with multiple teams to gather integration requirements, create integration specification documents, map specifications, write high-level and detailed designs, and guide the technical team through design and implementation.
- Well versed in DevOps principles, with working experience in Docker/containerization.
- Experience in solutioning and execution of IaaS-, PaaS-, and SaaS-based deployments.
- Conceptual thinking to create 'out of the box' solutions.
- Good communication skills; able to work with both the customer and the development team to deliver an outcome.

Mandatory skills: App-Cloud-Google. Experience: 8-10 years.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

15 - 20 Lacs

Hyderabad

Remote

As a GCP Engineer - Data Engineer, you will be responsible for building and maintaining data pipelines, managing data storage solutions, and ensuring efficient data processing using Google Cloud Platform. You will work closely with data scientists and analysts to support data-driven decision-making within the organization.

Responsibilities:
- Design, develop, and maintain data pipelines on GCP.
- Implement data storage solutions and optimize data processing workflows.
- Ensure data quality and integrity throughout the data lifecycle.
- Collaborate with data scientists and analysts to understand data requirements.
- Monitor and maintain the health of the data infrastructure.
- Troubleshoot and resolve data-related issues.
- Stay updated with the latest GCP features and best practices.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer with expertise in GCP.
- Strong understanding of data warehousing concepts and ETL processes.
- Experience with BigQuery, Dataflow, and other GCP data services.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.

Skills: Google Cloud Platform (GCP), BigQuery, Dataflow, SQL, Python, ETL processes, data warehousing, data modeling, Apache Beam, Cloud Storage.
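Since the listing names Apache Beam and Dataflow, here is a minimal, hypothetical Beam pipeline; it runs locally on the DirectRunner, and the same code could be submitted to Dataflow with the appropriate pipeline options. The file paths and CSV schema are invented.

```python
# Hypothetical Apache Beam pipeline; paths and schema are placeholders.
import apache_beam as beam

def parse_csv(line: str):
    """Split a 'user_id,amount' line into a (key, value) pair."""
    user_id, amount = line.split(",")
    return user_id, float(amount)

with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events.csv")
        | "Parse" >> beam.Map(parse_csv)
        | "SumPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/totals")
    )
```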

Posted 3 weeks ago

Apply

4.0 - 9.0 years

15 - 20 Lacs

Mumbai

Work from Office

Project Role: Solution Architect
Project Role Description: Translate client requirements into differentiated, deliverable solutions using in-depth knowledge of a technology, function, or platform. Collaborate with the Sales Pursuit and Delivery Teams to develop a winnable and deliverable solution that underpins the client value proposition and business case.
Must-have skills: Cloud Infrastructure
Good-to-have skills: Solution Architecture
Minimum 12 year(s) of experience is required.
Educational Qualification: Minimum BE/BTech from a reputed university

Summary: As a Solution Architect, you will translate client requirements into differentiated, deliverable solutions using in-depth knowledge of a technology, function, or platform, collaborating with the Sales Pursuit and Delivery Teams to develop a winnable and deliverable solution that underpins the client value proposition and business case. You will design and deploy on-premises and public cloud storage and backup infrastructure for large-scale technology projects such as data lakes, digital platforms, and other core business and supporting applications, and design and deploy data replication and disaster recovery solutions.

Infrastructure Architect Roles & Responsibilities:
- SPOC for infrastructure design and deployment (storage, backup, and data replication) for any designated projects.
- Take full accountability for the design of the infrastructure domain, including network connectivity to various entities such as on-premises data centers and partner networks.
- Take ownership of infrastructure design-related issues and challenges, and drive for solutions working with various internal teams and third-party solution providers such as OEMs and technology partners.
- Define and develop high-level operating procedures for seamless operations of the project.
- Support the transition of projects from deployment to operations.
- Anchor the design and implementation of on-premises and cloud infrastructure in terms of storage, backup, and data replication components.
- Be the SPOC for all infrastructure initiatives in the existing project; navigate the client's landscape to upsell new initiatives in the infrastructure space, or pave the way for upselling value-driven initiatives for the client.
- Lead the teams across various infrastructure towers such as storage, hyper-converged infrastructure, and public cloud storage infrastructure, and drive upskilling and cross-skilling to rationalize resources across the towers and across clients.
- Introduce innovative solutions such as automation to increase productivity and improve service delivery quality.
- Participate in architecture and design review and approval forums to ensure infrastructure design principles are adhered to for any changes in the existing landscape or any new initiatives being rolled out.
- Participate in client account planning and discussions to ensure infrastructure-level initiatives are accounted for and issues are escalated to the right leaders for resolution.
- Build strong relationships with all client stakeholders and Accenture project teams for effective collaboration and outcomes.

Professional & Technical Skills (must have):
- Hands-on architecture and design skills for storage and backup infrastructure solutions in the cloud and on-premises, such as hyper-converged infrastructure, storage area networks, network-attached storage, and data replication.
- Certified Architect (Professional) from OEMs such as Dell EMC, NetApp, or Commvault.
- Strong communication skills; ability to drive discussions and ideas in the client's senior leadership forums.
- Problem-solving skills.

Good to have:
- Infrastructure consulting and presales skills.

Additional Information:
- Total IT experience of minimum 15 years.
- Minimum 4 years of experience in design and deployment of public cloud storage and backup infrastructure (at least one of AWS, Azure, GCP, and OCI).
- Minimum 8 years of experience in design and deployment of on-premises infrastructure (virtualized and hyper-converged infra, storage area network, network-attached storage, backup and recovery solutions).
- This position is based at our Mumbai office.
- A minimum BE/BTech from a reputed university is required.

Qualification: Minimum BE/BTech from a reputed university

Posted 3 weeks ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Pune

Work from Office

Job Description / Key Responsibilities:

About the role: The candidate will have strong frontend development expertise, a solid understanding of cloud platforms (especially Google Cloud Platform, GCP), and familiarity with the card payments or financial domains. You will work closely with cross-functional teams to build modern, scalable, and performant web applications.

- Design, develop, and maintain user-facing features using modern frontend technologies (React, Angular, Vue, etc.).
- Ensure high performance, scalability, and responsiveness of UI components.
- Collaborate with backend developers, product managers, and UX designers to implement features and improve user experience.
- Integrate with cloud-based services, especially on GCP (Firebase, Cloud Functions, App Engine, etc.).
- Write clean, reusable, and maintainable code following best practices.
- Ensure cross-browser and device compatibility.
- Optimize applications for

Posted 3 weeks ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Job Summary: We are seeking a highly experienced Senior Mainframe Storage Consultant with a minimum of 15 years of hands-on experience managing complex storage environments within large mainframe shops. The ideal candidate will have a proven track record in executing storage migrations between IBM mainframe, Hitachi, and Fujitsu platforms, and possess deep expertise in storage technologies such as DS8K, TS7700, TDMF, and GDPS. This role demands exceptional project management skills and a strong understanding of cloud storage solutions.

Responsibilities:
- Provide strategic storage consulting services to clients, assessing their storage infrastructure and recommending optimization strategies.
- Develop comprehensive storage migration plans, including data mapping, cutover strategies, and risk assessment.
- Lead and execute complex storage migration projects, ensuring minimal downtime and data loss.
- Perform in-depth analysis of storage environments to identify performance bottlenecks and recommend solutions.
- Design and implement storage solutions that meet business requirements and comply with industry standards.
- Install, configure, and maintain storage hardware and software, including DS8K, TS7700, TDMF, and GDPS.
- Provide technical leadership and mentorship to team members.
- Stay up to date with emerging storage technologies and industry trends.
- Collaborate with cross-functional teams to ensure successful project delivery.

Required education: Bachelor's degree

Required technical and professional expertise:
- Minimum of 10+ years of experience in mainframe storage administration and consulting.
- Proven track record in managing complex storage migrations between IBM mainframe, Hitachi, and Fujitsu platforms.
- In-depth knowledge of DS8K, TS7700, TDMF, and GDPS storage systems.
- Hands-on experience transforming from ISV to IBM products is highly recommended.
- Strong understanding of mainframe operating systems (z/OS).
- Excellent project management and organizational skills.
- Strong analytical and problem-solving abilities.
- Exceptional communication and interpersonal skills.
- Ability to work independently and as part of a team.

Preferred technical and professional experience:
- Certifications in IBM storage technologies.
- Experience with cloud storage platforms and migration strategies.
- Knowledge of storage automation and scripting tools.
- Understanding of ITIL frameworks and methodologies.

Posted 4 weeks ago

Apply

8.0 - 13.0 years

10 - 17 Lacs

Pune

Work from Office

The IBM Storage Protect Support team (Spectrum Protect, erstwhile TSM) supports complex integrated storage products end to end, including Spectrum Protect, Spectrum Protect Plus, and Copy Data Management. This position involves working remotely with our IBM customers, which include some of the world's top research, automotive, banking, healthcare, and technology providers. Candidates must be able to assist with operating systems (AIX, Linux, Unix, Windows), SAN, network protocols, clouds, and storage devices. They will work in a virtual environment with colleagues around the globe and will be exposed to many different types of technologies.

Responsibilities must include, but are not limited to:
- Provide remote troubleshooting and analysis assistance for usage and configuration questions.
- Review diagnostic information to assist in isolating a problem's cause (which may include assistance interpreting traces and dumps).
- Identify known defects and fixes to resolve problems.
- Develop best-practice articles and support utilities to improve support quality and productivity.
- Respond to escalated customer calls, complaints, and queries.
- The job requires a flexible schedule to ensure 24x7 support operations and weekend on-call coverage, including extending/taking shifts to cover North America working hours.

Required education: Bachelor's degree
Preferred education: Bachelor's degree

Required technical and professional expertise:
- 8-15 years of experience with data protection or storage software as an administrator or solution architect, or with client-server technologies.
- Debugging and analysis are performed via telephone as well as electronically, so candidates must possess strong customer-interaction skills and be able to clearly articulate solutions and options.
- Must be familiar with, and able to interpret, complex software problems that span multiple client and server platforms, including UNIX, Linux, AIX, and Windows.
- A focus on storage area networks (SAN), network protocols, cloud, and storage devices is preferred; hands-on experience with storage virtualization is a plus.
- Candidates must be flexible in schedule and availability; second shift and weekend scheduling will be required.

Preferred technical and professional experience:
- Excellent communication skills, both verbal and written.
- At least 5-10 years of in-depth experience with Spectrum Protect (Storage Protect) or competing data protection products.
- Working knowledge of Red Hat, OpenShift, or Ansible administration preferred.
- Good at networking and troubleshooting; cloud certification is an added advantage.
- Knowledge of object storage and cloud storage preferred.

Posted 4 weeks ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA | Experience: 5 to 8 years | Location: Gurgaon (Hybrid) | Notice: Immediate to 30 days

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
- Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases.
- Troubleshoot issues related to data processing workflows and provide timely resolutions.

Desired Candidate Profile:
- 5-9 years of experience in data engineering, with expertise in GCP and BigQuery data engineering.
- Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, etc.
- Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.
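As a hedged sketch of the Airflow-orchestrated pipelines this profile asks for, the skeleton DAG below wires an extract task ahead of a load task; the DAG ID, schedule, and task bodies are placeholders.

```python
# Hypothetical Airflow DAG skeleton; IDs and task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def load():
    print("load curated data into BigQuery")

with DAG(
    dag_id="example_gcp_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract must finish before load starts
```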

Posted 4 weeks ago

Apply

3.0 - 8.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings! We have permanent opportunities for GCP Data Engineers in Chennai.

Experience required: 3 years and above
Location: Chennai (ELCOT - Sholinganallur)
Work mode: Hybrid
Skills required: GCP Data Engineer, advanced SQL, ETL data pipelines, BigQuery, Dataflow, Bigtable, Data Fusion, Cloud Spanner, Python, Java, JavaScript.

If interested, kindly share the below details along with your updated CV to Narmadha.baskar@getronics.com.

Regards,
Narmadha
Getronics Recruitment team

Posted 4 weeks ago

Apply

6.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Hybrid

Looking for a Storage Engineer with hands-on experience in Dell EMC SAN & NAS storage, NetApp, Synology, HP, and Hitachi, covering end-to-end storage operations. Windows Server, Linux, and VMware experience is required; AWS or Azure cloud storage experience is preferred.

Posted 4 weeks ago

Apply

5.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Educational qualification: Bachelor of Engineering, BTech, BCA, BSc, MTech, or MCA
Service Line: Application Development and Maintenance

Responsibilities (a day in the life of an Infoscion): As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design in your areas of expertise. You will plan configuration activities, configure the product per the design, conduct conference room pilots, and assist in resolving queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof-of-Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives, with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional responsibilities (preferred locations: Bangalore, Hyderabad, Chennai, Pune):
- 3 to 5 years of experience: purely hands-on and expert in the skill, able to deliver without any support.
- 5 to 9 years of experience: design knowledge, estimation techniques, leading and guiding the team on the technical solution.
- 9 to 13 years of experience: architecture, solutioning, and (optionally) proposals.

Technical expectations:
- Containerization and microservice development on AWS/Azure/GCP is preferred.
- In-depth knowledge of design issues and best practices; solid understanding of object-oriented programming.
- Familiarity with various design and architectural patterns and the software development process.
- Implementing automated testing platforms and unit tests.
- Strong experience in building and developing applications using technologies like Python.
- Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs and microservices.
- Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP.
- Good understanding of application development design patterns.

Technical and professional skills: primary skill: Python; secondary skills: AWS/Azure/GCP.
Preferred skills: Technology - Machine Learning - Python.
Generic skills: Technology - Cloud Platform - AWS App Development; Technology - Cloud Platform - Azure Development & Solution Architecting; Technology - Cloud Platform - GCP DevOps.
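To illustrate the "cloud-ready RESTful API" expectation above, here is a minimal, hypothetical FastAPI microservice with a health endpoint suitable for container probes; the route names and model fields are invented.

```python
# Hypothetical cloud-ready REST microservice sketch using FastAPI;
# endpoints and fields are illustrative placeholders.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="example-service")

class Item(BaseModel):
    name: str
    quantity: int

@app.get("/healthz")
def health() -> dict:
    """Liveness probe endpoint for container orchestrators."""
    return {"status": "ok"}

@app.post("/items")
def create_item(item: Item) -> dict:
    """Accept an item; a real service would persist it."""
    return {"received": item.model_dump()}  # pydantic v2

# Run locally with: uvicorn app:app --port 8080
```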

Posted 4 weeks ago

Apply

5.0 - 7.0 years

19 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

We are seeking a mid-level GCP Data Engineer with 4+ years of experience in ETL, data warehousing, and data engineering. The ideal candidate will have hands-on experience with GCP tools, solid data analysis skills, and a strong understanding of data warehousing principles.

Qualifications:
- 4+ years of experience in ETL & data warehousing.
- Excellent leadership and communication skills.
- Experience developing data engineering solutions with Airflow, GCP BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, etc., and building solution automations in any of the above ETL tools.
- Has executed at least 2 GCP cloud data warehousing projects and at least 2 projects using Agile/SAFe methodology.
- Mid-level experience in PySpark and Teradata.
- Working experience with DevOps tools like GitHub, Jenkins, cloud-native tooling, etc.; with semi-structured data formats like JSON, Parquet, and/or XML; and writing complex SQL queries for data analysis and extraction.
- In-depth understanding of data warehousing, data analysis, data profiling, data quality, and data mapping.

Education: B.Tech./B.E. in Computer Science or a related field.
Certifications: Google Cloud Professional Data Engineer certification.

Roles and Responsibilities:
- Analyze the different source systems, profile data, and understand, document, and fix data quality issues.
- Gather requirements and business process knowledge in order to transform the data in a way geared to the needs of end users.
- Write complex SQL to extract and format source data for ETL/data pipelines.
- Create design documents, source-to-target mapping documents, and any supporting documents needed for deployment/migration.
- Design, develop, and test ETL/data pipelines, and design and build the metadata-based frameworks needed for them.
- Write unit test cases, execute unit testing, and document unit test results.
- Deploy ETL/data pipelines, using DevOps tools to version, push/pull code, and deploy across environments.
- Support the team during troubleshooting and debugging of defects, bug fixes, business requests, environment migrations, and other ad-hoc requests; do production support, enhancements, and bug fixes.
- Work with business and technology stakeholders to communicate EDW incidents/problems and manage their expectations, leveraging ITIL concepts to circumvent incidents, manage problems, and document knowledge.
- Perform data cleaning, transformation, and validation to ensure accuracy and consistency across various data sources.
- Stay current on industry best practices and emerging technologies in data analysis and cloud computing, particularly within the GCP ecosystem.
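As a small, hypothetical example of the "write unit test cases" responsibility above, the pytest sketch below tests an invented record-normalization helper; the function and its fields are illustrative only.

```python
# Hypothetical unit tests for a small, invented transformation step.
import pytest

def normalize_record(rec: dict) -> dict:
    """Trim/uppercase the customer field and coerce amount to float."""
    return {
        "customer": rec["customer"].strip().upper(),
        "amount": float(rec["amount"]),
    }

def test_normalize_record_trims_and_casts():
    out = normalize_record({"customer": "  acme ", "amount": "12.50"})
    assert out == {"customer": "ACME", "amount": 12.5}

def test_normalize_record_rejects_bad_amount():
    with pytest.raises(ValueError):
        normalize_record({"customer": "acme", "amount": "n/a"})
```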

Posted 4 weeks ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings from Getronics!

- Solid experience designing, building, and maintaining cloud-based data platforms and infrastructure.
- Deep proficiency in GCP cloud services, including significant experience with BigQuery, Cloud Storage, Dataproc, Apigee, Cloud Run, Google Kubernetes Engine (GKE), Postgres, Artifact Registry, Secret Manager, and access management (IAM).
- Hands-on experience implementing and managing CI/CD pipelines using tools like Tekton and potentially Astronomer.
- Strong experience with job scheduling and workflow orchestration using Airflow.
- Proficiency with version control systems, specifically Git.
- Strong programming skills in Python.
- Expertise in SQL and experience with relational databases like SQL Server, MySQL, and PostgreSQL.
- Experience with, or knowledge of, data visualization tools like Power BI.
- Familiarity with code quality and security scanning tools such as FOSSA and SonarQube.
- Foundational knowledge of artificial intelligence and machine learning concepts and workflows.
- Strong problem-solving skills and the ability to troubleshoot complex distributed systems.
- Strong communication and collaboration skills.
- Knowledge of other cloud providers (AWS, Azure).

Skills required: GCP, BigQuery, AI/ML

Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 4+ years in IT and a minimum of 3+ years in GCP data engineering/AI-ML
Location: Chennai (ELCOT - Sholinganallur)
Work mode: Hybrid

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Thanks,
Durga.

Posted 1 month ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.

Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 8+ years in IT and a minimum of 4+ years in GCP data engineering
Location: Chennai (ELCOT - Sholinganallur)
Work mode: Hybrid

Position description: We are currently seeking a seasoned GCP Cloud Data Engineer with 4+ years of experience leading/implementing GCP data projects, preferably having implemented a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a materials management platform that would exchange data with multiple applications, modern and legacy, across product development, manufacturing, finance, purchasing, n-tier supply chain, and supplier collaboration.
• Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
• Build ETL pipelines to ingest data from heterogeneous sources into our system.
• Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data.
• Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
• Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements.
• Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
• Implement security measures and data governance policies to ensure the integrity and confidentiality of data.
• Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.

Skills required:
- GCP Data Engineer, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- 8+ years of professional experience in data engineering, data product development, and software product launches.
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.

Education required: Any Bachelor's degree

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Thanks,
Durga.
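As a hedged sketch of one ingestion step from the pipeline described above, the snippet below loads CSV files from Cloud Storage into BigQuery using the official client; the bucket URI and table ID are placeholders.

```python
# Hypothetical GCS-to-BigQuery load step; URIs and table IDs are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the files
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/materials/*.csv",
    "example-project.supply_chain.materials_raw",
    job_config=job_config,
)
load_job.result()  # wait for the load to complete

table = client.get_table("example-project.supply_chain.materials_raw")
print(f"Loaded {table.num_rows} rows")
```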

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
