
288 Cloud Storage Jobs

JobPe aggregates results for easy access, but applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Technology Service Specialist, AVP at our Pune location, you will be an integral part of the Technology, Data, and Innovation (TDI) Private Bank team. In this role, you will provide 2nd Level Application Support for business applications used in branches, by mobile sales, or via the internet. Your expertise in Incident Management and Problem Management will be crucial in ensuring the stability of these applications. Partnerdata, the central client reference data system in Germany, is a core banking system that integrates many banking processes and applications through numerous interfaces. With the recent migration to Google Cloud (GCP), you will be involved in operating and further developing applications and functionalities on the cloud platform. Your focus will also extend to regulatory topics surrounding partner/client relationships. We are seeking individuals who can contribute to this contemporary and emerging Cloud application area.

Key Responsibilities:
- Ensure optimum service level to supported business lines
- Oversee resolution of incidents and problems within the team
- Assist in managing business stakeholder relationships
- Define and manage OLAs with relevant stakeholders
- Monitor team performance, adherence to processes, and alignment with business SLAs
- Manage escalations and work with relevant functions to resolve issues quickly
- Identify areas for improvement and implement best practices in your area of expertise
- Mentor and coach Production Management Analysts within the team
- Fulfill Service Requests, communicate with the Service Desk function, and participate in major incident calls
- Document tasks, incidents, problems, changes, and knowledge bases
- Improve monitoring of applications and implement automation of tasks

Skills and Experience:
- Service Operations Specialist experience in a global operations context
- Extensive experience supporting complex application and infrastructure domains
- Ability to manage and mentor Service Operations teams
- Strong ITIL/best-practice service context knowledge
- Proficiency in interface technologies, communication protocols, and ITSM tools
- Bachelor's degree in IT or a Computer Science-related discipline
- ITIL certification and experience with the ITSM tool ServiceNow preferred
- Knowledge of the banking domain and regulatory topics
- Experience with databases like BigQuery and an understanding of Big Data and GCP technologies
- Proficiency in tools like GitHub, Terraform, Cloud SQL, Cloud Storage, Dataproc, and Dataflow
- Architectural skills for big data solutions and interface architecture

Area-Specific Tasks/Responsibilities:
- Handle Incident/Problem Management and Service Request Fulfilment
- Analyze and resolve incidents escalated from 1st Level Support
- Support the resolution of high-impact incidents and escalate when necessary
- Provide solutions for open problems and support service transition for new projects/applications

Joining our team, you will receive training, development opportunities, coaching from experts, and a culture of continuous learning to support your career progression. We value diversity and promote a positive, fair, and inclusive work environment at Deutsche Bank Group. Visit our company website for more information.

Posted 1 day ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

As a Principal Engineer at our company, you will collaborate with cross-functional teams including development, product management, and sales engineering to establish the solution architecture for data security and data management services along with their features and capabilities. It will be your responsibility to stay up to date with emerging trends in data security and data management platforms, applications, technologies, tools, and APIs to enhance and refine the solution architecture for both existing and new products. You will also play a crucial role in mentoring, guiding, and training engineers on data management, data security, and the principles of high-quality development.

Your expertise and hands-on experience should include designing and implementing SaaS software at scale, proficiency in Python or Golang, deep knowledge of and experience working with cloud storage and data management solutions, and building storage systems for data protection at scale. Any prior experience in developing data protection products would be an added advantage. Familiarity with SaaS platforms like AWS and Azure is also beneficial.

As a strong technical leader with excellent communication skills, you are expected to collaborate effectively with diverse technical teams to fulfill the solution architecture requirements of the product and drive architecture, design, and implementation to achieve timely business outcomes. Desirable skills for this role include excellent written and verbal communication, familiarity with Agile methodologies such as Scrum, and experience with cloud technologies, preferably AWS.

With over 10 years of industry experience in building software products for enterprises, you are required to hold a BTech / B.E. / M.E. / MTech (Computer Science) or equivalent degree, with an advanced degree in computer science being a must. Your role will involve working closely with various teams to define solution architecture, assess emerging trends, drive the integration of data management applications, tools, and best practices into the tech stack, and mentor engineers on data security, data management, and high-quality development principles.

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

Okta is The World's Identity Company. We free everyone to safely use any technology anywhere, on any device or app. Our Workforce and Customer Identity Clouds enable secure yet flexible access, authentication, and automation that transforms how people move through the digital world, putting Identity at the heart of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box - we're looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We're building a world where Identity belongs to you.

The Proposal Coordinator position at Okta involves supporting the writing, editing, and coordination aspects of proposal development. The individual in this role will assist Proposal Managers in answering all aspects of RFx, maintaining the Q&A repository, and crafting executive summaries. This position requires strong writing and communication skills, attention to detail, and the ability to contribute creative ideas. The Proposal Coordinator will collaborate with Proposal Managers and subject matter experts, researching and writing for a dynamic product. This role also includes supporting responses to RFPs, RFQs, and RFIs, and maintaining and updating proposal documents and the RFP data repository. It is important for the Proposal Coordinator to be able to work in USA time zones.

As a member of the Presales Team, the Proposal Coordinator will play a key role in planning, researching, developing, and editing complex content for proposals in response to customer solicitations. Responsibilities include writing and editing executive summaries, customer-specific requirements, and initial drafts/first passes on RFx. The Proposal Coordinator will help team members adhere to processes, templates, standards, and methodologies to successfully bid projects in a timely manner, as well as maintain the collateral and data repository for professional, consistent, and fast turnaround of RFx and detailed technical questionnaires. The ability to work cross-functionally and track RFx responses in Salesforce is essential.

Requirements for this role include a BS/BA degree or equivalent relevant experience, along with at least 2 years of related experience, preferably in the proposal field and/or technical writing. The ideal candidate should be a self-starter, humble, a multitasker, a great teammate, and able to work well under pressure. Excellent communication, interpersonal, and organizational skills are necessary, as are strong writing, grammar, and spelling abilities. Proficiency with Google Workspace, MS Office Suite, and cloud storage is required. The Proposal Coordinator should have the ability to engage with senior technical and upper management, lead cross-functional teams, and have experience with a technical or software-related product suite.

As a Full-Time Okta employee, you can look forward to amazing benefits, making a social impact, and being part of an organization that fosters Diversity, Equity, Inclusion, and Belonging. Okta provides a dynamic work environment with the best tools, technology, and benefits to empower employees to work productively in a setting that suits their unique needs. If you are ready to find your place at Okta, visit https://www.okta.com/company/careers/. Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Privacy Policy at https://www.okta.com/privacy-policy/.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Data Engineer with 5+ years of experience, you will be responsible for designing, developing, and maintaining scalable data pipelines using Google Cloud Dataproc and Dataflow tools. Your primary focus will be on processing and analyzing large datasets while ensuring data integrity and accessibility.

Your role will require a Bachelor's degree in Computer Science, Information Technology, or a related field. Along with your academic background, you should have a strong technical skill set, including proficiency in Google Cloud Dataflow and Dataproc, and a solid understanding of SQL and data modeling concepts. Experience with tools like BigQuery, Cloud Storage, and other GCP services will be essential for this position. Additionally, familiarity with programming languages like Python or Java will be advantageous.

In addition to your technical expertise, soft skills are equally important for success in this role. You should possess excellent problem-solving abilities, strong communication skills, and a collaborative mindset to work effectively within a team environment.

If you are passionate about leveraging GCP tools to process and analyze data, and you meet the mandatory skills criteria of GCP Dataproc and Dataflow, we encourage you to share your resume with us at gkarthik@softpathtech.com or careers@softpathtech.com. Join our team and contribute to building efficient and reliable data solutions with cutting-edge technologies.
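As an aside for candidates new to the stack: the Dataflow work described here is essentially chains of map, filter, and combine-per-key transforms. The sketch below is a dependency-free Python illustration of that pattern only; the record shape and field names are invented for the example, not taken from the posting.

```python
from collections import defaultdict

# Hypothetical raw records, standing in for rows read from Cloud Storage.
RAW_EVENTS = [
    {"user": "a", "bytes": 120, "ok": True},
    {"user": "b", "bytes": 300, "ok": False},
    {"user": "a", "bytes": 80, "ok": True},
]

def parse(record):
    """Map step: normalize a raw record into a (key, value, flag) tuple."""
    return (record["user"], record["bytes"], record["ok"])

def total_bytes_per_user(events):
    """Filter out failed events, then group and sum per user --
    the same shape as a Dataflow Map/Filter/CombinePerKey chain."""
    totals = defaultdict(int)
    for user, nbytes, ok in map(parse, events):
        if not ok:               # filter step
            continue
        totals[user] += nbytes   # combine-per-key step
    return dict(totals)

print(total_bytes_per_user(RAW_EVENTS))  # {'a': 200}
```

In a real Dataflow job the same three stages would be expressed as pipeline transforms and run distributed; the logic per element is unchanged.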

Posted 2 days ago

Apply

10.0 - 15.0 years

15 - 25 Lacs

Chennai, Bengaluru

Hybrid

Role Overview

We're seeking a highly seasoned Solution Architect with deep expertise in Google Cloud Platform (GCP) and a proven track record in designing data and AI infrastructure tailored to AdTech use cases. You'll be pivotal in building scalable, performant, and privacy-compliant systems to support real-time bidding, campaign analytics, customer segmentation, and AI-driven personalization.

Key Responsibilities
- Architect and lead GCP-native solutions for AdTech: real-time bidding (RTB/OpenRTB), campaign analytics, lookalike modeling, and audience segmentation.
- Design high-throughput data pipelines, event-driven architectures, and unified audience data lakes leveraging GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Cloud Composer (Airflow), Dataplex, Vertex AI / AutoML, Cloud Functions, Cloud Run, GKE, Looker, and Apigee.
- Collaborate with ad ops, marketing, and product stakeholders to translate business goals into architecture roadmaps; lead discovery workshops, solution assessments, and architecture reviews in presales and delivery cycles.
- Integrate with third-party AdTech/MarTech platforms including DSPs, SSPs, CDPs, DMPs, ad exchanges, identity graphs, and consent/identity resolution systems (e.g., LiveRamp, The Trade Desk, Google Ads Data Hub).
- Ensure architecture aligns with GDPR, CCPA, IAB TCF, and data privacy regulations; support consent management, anonymization, encryption, and access controls.
- Lead multidisciplinary technical teams (Data Engineering, MLOps, Analytics); enforce best practices in data governance, CI/CD, and MLOps (via Cloud Build, Terraform, Kubeflow/Vertex AI pipelines).
- Mentor engineers, run architecture reviews, and define governance, cost optimization, security strategy, and system observability.
- Conduct hands-on prototyping and PoCs to validate AI/ML capabilities, with rapid experimentation before full-scale implementation.

Tech Stack Expertise & Qualifications
- 15+ years in technical architecture, consulting, or senior engineering roles (preferably with the majority in data & analytics); at least 5 years hands-on with GCP architectures.
- In-depth knowledge of and hands-on experience with:
  - GCP data and analytics stack: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Composer, Dataplex, Cloud Storage
  - AI/ML on GCP: Vertex AI, AI Platform, AutoML, model deployment, inference pipelines
  - Compute frameworks: Cloud Functions, Cloud Run, GKE, Apigee
  - Business intelligence and visualization: Looker
  - Infrastructure as code: Terraform; CI/CD pipelines: Cloud Build, Git-based workflows
- Skilled in Python and SQL; familiarity with Java or Scala is a plus.
- Experience designing event-driven architectures, streaming data pipelines, microservices, and API-based integrations.
- Proven AdTech domain expertise: programmatic advertising, RTB/OpenRTB, identity resolution, cookieless frameworks, DMP/CDP data flows.
- Proven experience with data governance, encryption, IAM, PII anonymization, and privacy-enhancing tech.
- Strong ability to code prototypes or PoCs to solve client challenges quickly, with high-quality architectural foundations.
- Excellent communication skills; able to clearly present complex designs to both technical and non-technical audiences.
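For context on the event-driven architectures this role calls for: Pub/Sub decouples publishers from subscribers via named topics. The sketch below mimics that contract in-process with plain Python; the class, topic name, and event fields are all hypothetical stand-ins, not the Google Cloud client API.

```python
from collections import defaultdict
from typing import Callable

class LocalBus:
    """In-process stand-in for a Pub/Sub topic: publishers push events,
    and every handler subscribed to that topic receives them."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]):
        self._subs[topic].append(handler)

    def publish(self, topic: str, event: dict):
        # Fan out to all subscribers; a real broker does this durably
        # and asynchronously, with acks and redelivery.
        for handler in self._subs[topic]:
            handler(event)

bus = LocalBus()
seen = []
bus.subscribe("bid-requests", lambda e: seen.append(e["campaign"]))
bus.publish("bid-requests", {"campaign": "spring-sale", "bid": 0.42})
print(seen)  # ['spring-sale']
```

The design point is that the publisher never references the subscriber, which is what lets new consumers (analytics, audit, ML feature capture) attach without touching bidding code.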

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

As a Data Warehouse (DWH) professional with relevant experience in Google Cloud Platform (GCP), you will be responsible for developing and implementing robust data architectures. This includes designing data lakes, data warehouses, and data marts by utilizing GCP services such as BigQuery, Dataflow, Dataproc, and Cloud Storage. Your role will involve designing and implementing data models that meet business requirements while ensuring data integrity, consistency, and accessibility.

Your deep understanding of GCP services and best practices for data warehousing, data analytics, and machine learning will be crucial in this role. You will also be tasked with planning and executing data migration strategies from on-premises or other cloud environments to GCP. Optimizing data pipelines and query performance to facilitate efficient data processing and analysis will be a key focus area.

Additionally, your proven experience in managing teams and project delivery will be essential for success in this position. Collaborating closely with stakeholders to understand their requirements and deliver effective solutions will be a significant part of your responsibilities. Any experience with Looker will be considered advantageous for this role.
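A note on the data-mart modeling mentioned above: marts are commonly star schemas, where queries join a fact table to dimension tables. Here is a minimal Python sketch of that join; the table contents and column names are invented for illustration, not from the posting.

```python
# Hypothetical star-schema fragment: a sales fact table keyed by
# product_id, joined to a product dimension -- the pattern a
# BigQuery data mart typically materializes as a SQL join.
product_dim = {
    1: {"name": "widget", "category": "hardware"},
    2: {"name": "ebook", "category": "digital"},
}
sales_fact = [
    {"product_id": 1, "amount": 10.0},
    {"product_id": 2, "amount": 4.5},
    {"product_id": 1, "amount": 2.5},
]

def revenue_by_category(facts, dim):
    """Join each fact row to its dimension row, then aggregate."""
    out = {}
    for row in facts:
        cat = dim[row["product_id"]]["category"]
        out[cat] = out.get(cat, 0.0) + row["amount"]
    return out

print(revenue_by_category(sales_fact, product_dim))
# {'hardware': 12.5, 'digital': 4.5}
```

Keeping descriptive attributes in the dimension and only keys and measures in the fact table is what keeps the fact table narrow and the aggregation cheap.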

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

noida, uttar pradesh

On-site

At TELUS Digital, you will play a crucial role in enabling customer experience innovation by fostering spirited teamwork, embracing agile thinking, and embodying a caring culture that prioritizes customers. As the global arm of TELUS Corporation, a leading telecommunications service provider in Canada, we specialize in delivering contact center and business process outsourcing solutions to major corporations across various sectors such as consumer electronics, finance, telecommunications, and utilities. With our extensive global call center capabilities, we offer secure infrastructure, competitive pricing, skilled resources, and exceptional customer service, all supported by TELUS, our multi-billion-dollar parent company.

In this role, you will leverage your expertise in Data Engineering, backed by a minimum of 4 years of industry experience, to drive the success of our projects. Proficiency in Google Cloud Platform (GCP) services including Dataflow, BigQuery, Cloud Storage, and Pub/Sub is essential for effectively managing data pipelines and ETL processes. Your strong command of the Python programming language will be instrumental in performing data processing tasks efficiently.

You will be responsible for optimizing data pipeline architectures, enhancing performance, and ensuring reliability through your software engineering skills. Your ability to troubleshoot and resolve complex pipeline issues, automate repetitive tasks, and monitor data pipelines for efficiency and reliability will be critical in maintaining operational excellence. Additionally, your familiarity with SQL, relational databases, and version control systems like Git will be beneficial in streamlining data management processes.

As part of the team, you will collaborate closely with stakeholders to analyze, test, and enhance the reliability of GCP data pipelines, Informatica ETL workflows, MDM, and Control-M jobs. Your commitment to continuous improvement, SLA adherence, and post-incident reviews will drive the evolution of our data pipeline systems. Excellent communication, problem-solving, and analytical skills are essential for effectively documenting processes, providing insights, and ensuring seamless operations. This role offers a dynamic environment where you will work in a 24x7 shift, contributing to the success of our global operations and making a meaningful impact on customer experience.
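On the automation and reliability duties described above: a recurring building block is retrying transient pipeline failures with exponential backoff before escalating. A small, self-contained Python sketch of the pattern follows; the task, attempt count, and delay values are illustrative assumptions, not anything specified in the posting.

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=0.01):
    """Retry a flaky pipeline step with exponential backoff.
    Raises the last error only after max_attempts failures."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface for incident handling
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulated transient failure: fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")
    return "ok"

print(run_with_retries(flaky))  # 'ok' on the third attempt
```

In production the sleep would be longer and jittered, and the final failure would page or open an incident rather than just raise.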

Posted 3 days ago

Apply

1.0 - 5.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a Data Engineer at Synoptek, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines on the Google Cloud Platform (GCP). You will leverage your hands-on experience with GCP services such as BigQuery, Jitterbit, Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage to build efficient data processing solutions. Collaborating with cross-functional teams, you will translate their data needs into technical requirements, ensuring data quality, integrity, and security throughout the data lifecycle.

Your role will involve developing and optimizing ETL/ELT processes to extract, transform, and load data from various sources into data warehouses and data lakes. Additionally, you will build and maintain data models and schemas to support business intelligence and analytics, while troubleshooting data quality issues and performance bottlenecks.

To excel in this position, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with 3 to 4 years of experience as a Data Engineer focusing on GCP. Proficiency in Python, SQL, and BigQuery is essential, as is hands-on experience with data ingestion, transformation, and loading tools like Jitterbit and Apache Beam. A strong understanding of data warehousing and data lake concepts, coupled with experience in data modeling and schema design, will be beneficial.

The ideal candidate will exhibit excellent problem-solving and analytical skills, working both independently and collaboratively with internal and external teams. Familiarity with acquiring and managing data from various sources, as well as the ability to identify trends in complex datasets and propose business solutions, are key attributes for success in this role.

At Synoptek, we value employees who embody our core DNA behaviors, including clarity, integrity, innovation, accountability, and a results-focused mindset. We encourage continuous learning, adaptation, and growth in a fast-paced environment, promoting a culture of teamwork, flexibility, respect, and collaboration. If you have a passion for data engineering, a drive for excellence, and a commitment to delivering impactful results, we invite you to join our dynamic team at Synoptek. Work hard, play hard, and let's achieve superior outcomes together.
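One property the ELT duties above usually demand is idempotent loading, so that a replayed batch does not duplicate rows. A minimal Python sketch of a keyed upsert illustrating that property; the table shape and column names are hypothetical, invented for the example.

```python
def upsert(target: dict, batch: list, key: str):
    """Merge a batch of source rows into the target table keyed by `key`.
    Re-running the same batch leaves the target unchanged -- the
    idempotency property ELT loads aim for (cf. SQL MERGE)."""
    for row in batch:
        target[row[key]] = row  # insert or replace by key
    return target

warehouse = {}
batch = [{"id": 1, "status": "new"}, {"id": 2, "status": "new"}]
upsert(warehouse, batch, "id")
upsert(warehouse, batch, "id")   # replaying the batch is a no-op
print(len(warehouse))  # 2
```

In BigQuery the same guarantee is typically obtained with a MERGE statement keyed on a business key plus a load timestamp.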

Posted 3 days ago

Apply

6.0 - 8.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Job Summary

Synechron is seeking a highly skilled and proactive Data Engineer to join our dynamic data analytics team. In this role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and solutions on the Google Cloud Platform (GCP). With your expertise, you'll enable data-driven decision-making, contribute to strategic business initiatives, and ensure robust data infrastructure. This position offers an opportunity to work in a collaborative environment with a focus on innovative technologies and continuous growth.

Software Requirements

Required:
- Proficiency in Data Engineering tools and frameworks such as Hive, Apache Spark, and Python (version 3.x)
- Extensive experience working with Google Cloud Platform (GCP) offerings including Dataflow, BigQuery, Cloud Storage, and Pub/Sub
- Familiarity with Git, Jira, and Confluence for version control and collaboration

Preferred:
- Experience with additional GCP services like Dataproc, Data Studio, or Cloud Composer
- Exposure to other programming languages such as Java or Scala
- Knowledge of data security best practices and tools

Overall Responsibilities
- Design, develop, and optimize scalable data pipelines on GCP to support analytics and reporting needs
- Collaborate with cross-functional teams to translate business requirements into technical solutions
- Build and maintain data models, ensuring data quality, integrity, and security
- Participate actively in code reviews, adhering to best practices and standards
- Develop automated and efficient data workflows to improve system performance
- Stay updated with emerging data engineering trends and continuously improve technical skills
- Provide technical guidance and support to team members, fostering a collaborative environment
- Ensure timely delivery of deliverables aligned with project milestones

Technical Skills (By Category)
- Programming Languages: Essential: Python (required); Preferred: Java, Scala
- Data Management & Databases: Experience with Hive, BigQuery, and relational databases; knowledge of data warehousing concepts and SQL proficiency
- Cloud Technologies: Extensive hands-on experience with GCP services including Dataflow, BigQuery, Cloud Storage, Pub/Sub, and Composer; ability to build and optimize data pipelines leveraging GCP offerings
- Frameworks & Libraries: Spark (PySpark preferred); Hadoop ecosystem experience is advantageous
- Development Tools & Methodologies: Agile/Scrum methodologies, version control with Git, project tracking via Jira, documentation on Confluence
- Security Protocols: Understanding of data security, privacy, and compliance standards

Experience Requirements
- Minimum of 6-8 years in data or software engineering roles with a focus on data pipeline development
- Proven experience in designing and implementing data solutions on cloud platforms, particularly GCP
- Prior experience working in agile teams, participating in code reviews, and delivering end-to-end data projects
- Experience working with cross-disciplinary teams and understanding varied stakeholder requirements
- Exposure to industry best practices for data security, governance, and quality assurance is desired

Day-to-Day Activities
- Attend daily stand-up meetings and contribute to project planning sessions
- Collaborate with business analysts, data scientists, and other stakeholders to understand data needs
- Develop, test, and deploy scalable data pipelines, ensuring efficiency and reliability
- Perform regular code reviews, provide constructive feedback, and uphold coding standards
- Document technical solutions and maintain clear records of data workflows
- Troubleshoot and resolve technical issues in data processing environments
- Participate in continuous learning initiatives to stay abreast of technological developments
- Support team members by sharing knowledge and resolving technical challenges

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Relevant professional certifications in GCP (such as Google Cloud Professional Data Engineer) are preferred but not mandatory
- Demonstrable experience in data engineering and cloud technologies

Professional Competencies
- Strong analytical and problem-solving skills, with a focus on outcome-driven solutions
- Excellent communication and interpersonal skills to effectively collaborate within teams and with stakeholders
- Ability to work independently with minimal supervision and manage multiple priorities effectively
- Adaptability to evolving technologies and project requirements
- Demonstrated initiative in driving tasks forward and a continuous-improvement mindset
- Strong organizational skills with a focus on quality and attention to detail

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, more successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
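Regarding the data-quality responsibilities listed above: a typical first line of defense is a validation gate that splits incoming rows into accepted and rejected sets before loading. A dependency-free Python sketch of such a gate follows; the column names and rules are invented for the example, not from the posting.

```python
def validate(rows, required, not_null=()):
    """Split rows into (good, bad) based on required columns and
    null checks -- the kind of quality gate placed before a load."""
    good, bad = [], []
    for row in rows:
        missing = [c for c in required if c not in row]
        nulls = [c for c in not_null if row.get(c) is None]
        (bad if missing or nulls else good).append(row)
    return good, bad

rows = [
    {"id": 1, "ts": "2024-01-01"},      # passes
    {"id": None, "ts": "2024-01-02"},   # rejected: null id
    {"ts": "2024-01-03"},               # rejected: missing id
]
good, bad = validate(rows, required=("id", "ts"), not_null=("id",))
print(len(good), len(bad))  # 1 2
```

Rejected rows are usually written to a quarantine table with a reason code rather than discarded, so they can be repaired and replayed.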

Posted 3 days ago

Apply

1.0 - 3.0 years

1 - 5 Lacs

Kolkata

Work from Office

Project Role: Infra Tech Support Practitioner
Project Role Description: Provide ongoing technical support and maintenance of production and development systems and software products (both remote and onsite) and for configured services running on various platforms (operating within a defined operating model and processes). Provide hardware/software support and implement technology at the operating-system level across all server and network areas, and for particular software solutions/vendors/brands. Work includes L1 and L2 (basic and intermediate) troubleshooting.
Must have skills: NetApp Data Management & Cloud Storage Solutions
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Infra Tech Support Practitioner, you will engage in the ongoing technical support and maintenance of production and development systems and software products. Your typical day will involve addressing technical issues, collaborating with various teams, and ensuring that configured services operate smoothly across multiple platforms. You will be responsible for troubleshooting hardware and software problems, implementing technology solutions, and providing support for both remote and onsite operations. Your role will require you to work within defined operating models and processes, ensuring that all systems function optimally and meet the needs of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their technical skills.
- Monitor system performance and proactively identify areas for improvement.

Professional & Technical Skills:
- Must-have skills: Proficiency in NetApp Data Management & Cloud Storage Solutions.
- Strong understanding of cloud storage architectures and data management best practices.
- Experience with troubleshooting and resolving issues related to server and network environments.
- Familiarity with operating-system-level support across various platforms.
- Ability to implement and maintain software solutions from different vendors.

Additional Information:
- The candidate should have a minimum of 5 years of experience in NetApp Data Management & Cloud Storage Solutions.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Posted 3 days ago

Apply

5.0 - 10.0 years

0 Lacs

maharashtra

On-site

You are a highly skilled and motivated Lead Data Scientist / Machine Learning Engineer sought to join a team pivotal in the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns effectively. Your role will focus on data engineering, the ML model lifecycle, and cloud-native technologies.

You will be responsible for designing, building, and maintaining scalable ELT pipelines, ensuring high data quality, integrity, and governance. Additionally, you will develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization. Experimenting with different algorithms and leveraging various models will be crucial in driving insights and recommendations. Furthermore, you will deploy and monitor ML models in production and implement CI/CD pipelines for seamless updates and retraining. You will work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives. Translating complex model insights into actionable business recommendations and presenting findings to stakeholders will also be part of your responsibilities.

Qualifications & Skills:

Educational Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field.
- Certifications in Google Cloud (Professional Data Engineer, ML Engineer) are a plus.

Must-Have Skills:
- Experience: 5-10 years with the mentioned skill set and relevant hands-on experience.
- Data Engineering: Experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer).
- ML Model Development: Strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP.
- Programming: Proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing.
- Cloud & Infrastructure: Expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms.
- MLOps & Deployment: Hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools).
- Data Warehousing & Real-time Processing: Strong knowledge of modern data platforms for batch and streaming data processing.

Nice-to-Have Skills:
- Experience with Graph ML, reinforcement learning, or causal inference modeling.
- Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards.
- Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies.
- Experience with distributed computing frameworks (Spark, Dask, Ray).

Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
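On the time-series forecasting skill listed above: before reaching for heavier models, practitioners commonly establish a naive baseline such as a trailing moving average and require candidate models to beat it. A minimal Python sketch of that baseline; the series values are invented for illustration.

```python
def moving_average_forecast(series, window=3):
    """Forecast the next point as the mean of the last `window`
    observations -- a baseline to beat with real models."""
    if len(series) < window:
        raise ValueError("need at least `window` observations")
    return sum(series[-window:]) / window

# Hypothetical daily campaign clicks.
daily_clicks = [100, 120, 110, 130, 125]
forecast = moving_average_forecast(daily_clicks)
print(round(forecast, 2))  # mean of (110, 130, 125) = 121.67
```

If a trained model cannot outperform this on a held-out window, the extra pipeline and serving complexity is not yet paying for itself.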

Posted 5 days ago

Apply

2.0 - 6.0 years

0 Lacs

kolkata, west bengal

On-site

As a Solution Architect & Technical Lead at RebusCode, you will play a crucial role in driving the design and architecture of our Big Data Analytics solutions within the Market Research industry. Your responsibilities will include providing technical leadership, ensuring governance, documenting solutions, and sharing knowledge effectively. Moreover, you will be actively involved in project management and ensuring timely delivery of projects. To excel in this role, you should have a minimum of 5 years of experience in software development, out of which at least 2 years should be in architecture or technical leadership positions. A proven track record of delivering enterprise-grade, cloud-native SaaS applications on Azure and/or GCP is essential for this role. Your technical skills should encompass a wide range of areas including Cloud & Infrastructure (Azure App Services, Functions, Kubernetes; GKE, Cloud Functions; Service Bus, Pub/Sub; Blob Storage, Cloud Storage; Key Vault, Secret Manager; CDN), Development Stack (C#/.NET 6/7/8, ASP.NET Core Web API, Docker, container orchestration), Data & Integration (SQL Server, Oracle, Cosmos DB, Spanner, BigQuery, ETL patterns, message-based integration), CI/CD & IaC (Azure DevOps, Cloud Build, GitHub Actions; ARM/Bicep, Terraform; container registries, automated testing), Security & Compliance (TLS/SSL certificate management, API gateway policies, encryption standards), and Monitoring & Performance (Azure Application Insights, Log Analytics, Stackdriver, performance profiling, load testing tools). Nice-to-have qualifications include certifications such as Azure Solutions Architect Expert, Google Professional Cloud Architect, PMP or PMI-ACP. Familiarity with front-end frameworks like Angular and React, as well as API client SDK generation, would be an added advantage. Prior experience in building low-code/no-code integration platforms or automation engines is also beneficial. 
Exposure to alternative clouds like AWS or on-prem virtualization platforms like VMware and OpenShift will be a plus. Join us at RebusCode, where you will have the opportunity to work on cutting-edge Big Data Analytics solutions and contribute to the growth and success of our market research offerings.,

Posted 5 days ago

Apply

6.0 - 10.0 years

0 Lacs

noida, uttar pradesh

On-site

As a member of the Customer Success Services (CSS) organization at Oracle, your primary focus will be delivering post-sales support and solutions to Oracle customers while advocating for customer needs. Your role involves managing and supporting customer environments in the Oracle Cloud Infrastructure (OCI), ensuring optimal performance, availability, and security. You will be responsible for resolving technical issues, performing system monitoring, and collaborating with internal teams to implement best practices. Additionally, you will engage with customers to understand their requirements, provide training, and deliver exceptional customer service. This position demands strong problem-solving skills, technical proficiency in OCI, and a commitment to enhancing customer satisfaction. Serving as the primary point of contact for customers, you will facilitate customer relationships with Support and provide advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. As an individual contributor at Career Level IC4, your responsibilities will include: - Managing and supporting customer environments in OCI cloud. - Designing well-architected cloud deployment designs in OCI following best practices principles and guidelines. - Implementing automated configuration management and infrastructure provisioning. - Communicating with corporate customers through various channels regarding technical problems in Oracle Cloud products. - Managing customer escalations and ensuring timely delivery of high-quality resolutions with a focus on root cause analysis and prevention. - Developing and implementing proactive support strategies to reduce incidents, increase availability, and accelerate deployments. - Utilizing Oracle resources to advise and consult on the use of Oracle products to prevent future problems. - Educating customers and guiding them through the problem-solving process. 
- Following Oracle diagnostic methodology and procedures while handling and documenting technical issues in compliance with Support processes, policies, and SLAs. - Collaborating on cross-team and cross-product technical issues by working with resources from other groups. - Researching and staying updated on product features, new releases, functionalities, and related technologies to maintain product expertise. The ideal candidate for this role should possess: - 6+ years of relevant Cloud IaaS & PaaS experience, preferably in OCI, and effective communication skills. - 6+ years of overall experience in any domain, preferably database, system, or network administration. - Experience in Cloud Database Services such as OCI VMDB, ExaCC, ExaCS, ADB, ADW, MySQL, and NoSQL DB. - Proven experience in implementing, monitoring, and maintaining Cloud solutions (AWS, Azure, OCI). - Proficiency in Cloud Compute, IAM and IDCS, Networking, Storage, Security, Observability and Management, IaC tools, and other relevant areas. - Ability to understand business requirements and map them to proposed solutions/enhancements. - Skills in driving performance issues and resolving complex architecture problems. - Knowledge of OIC management of Oracle integrations and exposure to multiple clouds such as AWS, Azure, and GCP. - Relevant certifications such as OCI Architect Associate, OCI Architect/Operational Professional Certification, AWS Professional Architect, or Azure Cloud Architect. Oracle, a global leader in cloud solutions, leverages cutting-edge technology to address current challenges and partners with industry leaders across sectors. With a legacy of over 40 years, Oracle operates with integrity and a commitment to innovation. We value diversity and inclusion in our workforce, fostering equal opportunities for all employees.
At Oracle, we prioritize work-life balance, offer competitive benefits, flexible medical, life insurance, and retirement options, and encourage community engagement through volunteer programs. We are dedicated to inclusivity, ensuring that individuals with disabilities are included in all stages of the employment process. If you require accessibility assistance or accommodation due to a disability, please reach out to us at accommodation-request_mb@oracle.com or call +1 888 404 2494 in the United States.,

Posted 5 days ago

Apply

12.0 - 16.0 years

0 Lacs

karnataka

On-site

About KPMG in India KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. We are seeking an experienced and highly skilled Senior Google Cloud Analytics & Vertex AI Specialist for the position of Associate Director with 12-15 years of experience, specifically focusing on Google Vertex AI. The ideal candidate will have a deep understanding of Google Cloud Platform (GCP) and extensive hands-on experience with Google Cloud analytics services and Vertex AI. The role involves leading projects, designing scalable data solutions, driving the adoption of AI and machine learning practices within the organization, and supporting pre-sales activities. A minimum of 2 years of hands-on experience with Vertex AI is required. Key Responsibilities: - Architect and Implement: Design and implement end-to-end data analytics solutions using Google Cloud services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage. - Vertex AI Development: Develop, train, and deploy machine learning models using Vertex AI. Utilize Vertex AI's integrated tools for model monitoring, versioning, and CI/CD pipelines. Implement custom machine learning pipelines using Vertex AI Pipelines.
Utilize Vertex AI Feature Store for feature management and Vertex AI Model Registry for model tracking. - Data Integration: Integrate data from various sources, ensuring data quality and consistency across different systems. - Performance Optimization: Optimize data pipelines and analytics processes for maximum efficiency and performance. - Leadership and Mentorship: Lead and mentor a team of data engineers and data scientists, providing guidance and support on best practices in GCP and AI/ML. - Collaboration: Work closely with stakeholders to understand business requirements and translate them into technical solutions. - Innovation: Stay updated with the latest trends and advancements in Google Cloud services and AI technologies, advocating for their adoption when beneficial. - Pre-Sales Support: Collaborate cross-functionally to understand client requirements, design tailored solutions, prepare and deliver technical presentations and product demonstrations, and assist in proposal and RFP responses. - Project Delivery: Manage and oversee the delivery of data analytics and AI/ML projects, ensuring timely and within budget completion while coordinating with cross-functional teams. Qualifications: - Experience: 12-15 years in data engineering, data analytics, and AI/ML with a focus on Google Cloud Platform. - Technical Skills: Proficient in Google Cloud services (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Vertex AI), strong programming skills in Python and SQL, experience with machine learning frameworks (TensorFlow, PyTorch), data visualization tools (Looker, Data Studio). - Pre-Sales and Delivery Skills: Experience in supporting pre-sales activities, managing and delivering complex data analytics and AI/ML projects. - Certifications: Google Cloud Professional Data Engineer or Professional Machine Learning Engineer certification is a plus. - Soft Skills: Excellent problem-solving, communication, and leadership skills. Qualifications: - B.E./B.Tech/Post Graduate,

Posted 5 days ago

Apply

6.0 - 10.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Data Pipeline Architect at our company, you will be responsible for designing, developing, and maintaining optimal data pipeline architecture. You will monitor incidents, perform root cause analysis, and implement appropriate actions to ensure smooth operations. Additionally, you will troubleshoot issues related to abnormal job execution and data corruption, and automate jobs, notifications, and reports for efficiency. Your role will also involve optimizing existing queries, reverse engineering for data research and analysis, and calculating the impact of issues on downstream processes for effective communication. You will support failures, address data quality issues, and ensure the overall health of the environment. Maintaining ingestion and pipeline runbooks, portfolio summaries, and DBAR will be part of your responsibilities. Furthermore, you will enable the infrastructure changes, enhancements, and updates roadmap, and build the infrastructure for optimal extraction, transformation, and loading of data from various sources using big data technologies, Python, or web-based APIs. Conducting and participating in code reviews with peers, ensuring effective communication, and understanding requirements will be essential in this role. To qualify for this position, you should hold a Bachelor's degree in Engineering/Computer Science or a related quantitative field. You must have a minimum of 8 years of programming experience with Python and SQL, as well as hands-on experience with GCP, BigQuery, Dataflow, Data Warehousing, Apache Beam, and Cloud Storage. Experience with massively parallel processing systems like Spark or Hadoop, source code control systems (Git), and CI/CD processes is required. Involvement in designing, prototyping, and delivering software solutions within the big data ecosystem, developing generative AI models, and ensuring code quality through reviews are key aspects of this role.
Experience with Agile development methodologies, improving data governance and quality, and increasing data reliability are also important. Joining our team at EXL Analytics offers you the opportunity to work in a dynamic and innovative environment alongside experienced professionals. You will gain insights into various business domains, develop teamwork and time-management skills, and receive training in analytics tools and techniques. Our mentoring program and growth opportunities ensure that you have the support and guidance needed to excel in your career. Sky is the limit for our team members, and the experiences gained at EXL Analytics pave the way for personal and professional development within our company and beyond.,
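One concrete instance of the job automation and failure handling this listing describes is retrying a transient pipeline step with exponential backoff. The sketch below is hypothetical — the step function, delays, and error type are invented for illustration:

```python
import time

# Hypothetical sketch: retrying a flaky pipeline step with exponential
# backoff, one building block behind automated jobs and notifications.
def run_with_retries(step, max_attempts=4, base_delay=0.01):
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except RuntimeError:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # 0.01s, 0.02s, 0.04s

calls = {"n": 0}
def flaky_step():  # fails twice, then succeeds
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "loaded"

print(run_with_retries(flaky_step))  # succeeds on the third attempt
```

In a real pipeline the final failure would typically also trigger a notification rather than just re-raising.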

Posted 5 days ago

Apply

5.0 - 10.0 years

8 - 9 Lacs

Noida

Work from Office

Job Description: We are seeking a highly skilled Ericsson Cloud SME to join our Managed Services Cloud Operations team. The ideal candidate will have strong expertise in Ericsson Cloud platforms, OpenStack/Kubernetes, networking, cloud storage, and automation. The role involves hands-on management of Ericsson NFVI components, including CEE 9, CEE 10, CNIS, ECCD, SDI, vCIC, Fuel, and RTE, with a strong focus on networking, security, and automation in a mission-critical, 24x7 environment. What you will do: Ericsson Cloud Infrastructure Management: Operate and optimize Ericsson NFVI components, including CEE 9, CEE 10, CNIS, ECCD, SDI, vCIC, Fuel, and RTE. Ensure seamless performance, scalability, and availability of the Ericsson Cloud Execution Environment (CEE). Troubleshoot complex issues in Ericsson SDN (ODL), OpenStack, and Kubernetes environments. Cloud & Networking Expertise: Manage networking for Juniper, Pluribus, and Extreme SLX switches/routers. Configure and troubleshoot F5 and A10 load balancers, and Juniper and FortiGate firewalls. Work with Pluribus Networks and Ericsson SDN (ODL) for advanced SDN-based cloud networking. Kubernetes & OpenStack Operations: Deploy and manage Kubernetes (RedHat OpenShift, Mirantis Kubernetes Engine) and OpenStack-based cloud environments. Handle containerized workloads with Docker, Kubernetes, and RedHat OpenShift. Optimize cloud platforms for high availability, scalability, and security. The Skills you Bring: Bachelor's degree in Computer Science, IT, Telecommunications, or equivalent experience. 5+ years of hands-on experience in Ericsson Cloud, NFVI, SDN, OpenStack, and Kubernetes operations. Strong expertise in Ericsson Cloud Execution Environment (CEE 9, CEE 10), CNIS, ECCD, SDI, vCIC, Fuel, and RTE. In-depth knowledge of Juniper, Pluribus, and Extreme SLX switches/routers; F5/A10 load balancers; and Juniper/FortiGate firewalls.
Experience with Ericsson SDN (ODL) and Kubernetes (RedHat OpenShift, Mirantis Kubernetes Engine). Hands-on expertise in cloud storage (Nexenta, Ceph) and hyperscale cloud hardware (Ericsson HDS8000, BSP8100, Dell, HPE). Strong scripting knowledge in Bash, Python, and Ansible for automation. Primary country and city: India (IN) || Bangalore Req ID: 770555
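Much of the Bash/Python automation this role calls for amounts to parsing operational output and flagging what needs attention. A hedged, self-contained Python sketch — the status dump and node names are invented, not real CEE or Kubernetes output:

```python
# Illustrative ops automation: parse an invented node-status dump and
# report the nodes that are not in the Ready state.
status_dump = """\
node-1 Ready
node-2 NotReady
node-3 Ready
node-4 NotReady"""

def unhealthy_nodes(dump):
    bad = []
    for line in dump.splitlines():
        name, state = line.split()
        if state != "Ready":
            bad.append(name)
    return bad

print(unhealthy_nodes(status_dump))  # ['node-2', 'node-4']
```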

Posted 5 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Noida

Work from Office

Join our Team. Job Description: We are seeking a highly skilled Ericsson Cloud SME to join our Managed Services Cloud Operations team. The ideal candidate will have strong expertise in Ericsson Cloud platforms, OpenStack/Kubernetes, networking, cloud storage, and automation. The role involves hands-on management of Ericsson NFVI components, including CEE 9, CEE 10, CNIS, ECCD, SDI, vCIC, Fuel, and RTE, with a strong focus on networking, security, and automation in a mission-critical, 24x7 environment. What you will do: Ericsson Cloud Infrastructure Management: Operate and optimize Ericsson NFVI components, including CEE 9, CEE 10, CNIS, ECCD, SDI, vCIC, Fuel, and RTE. Ensure seamless performance, scalability, and availability of the Ericsson Cloud Execution Environment (CEE). Troubleshoot complex issues in Ericsson SDN (ODL), OpenStack, and Kubernetes environments. Cloud & Networking Expertise: Manage networking for Juniper, Pluribus, and Extreme SLX switches/routers. Configure and troubleshoot F5 and A10 load balancers, and Juniper and FortiGate firewalls. Work with Pluribus Networks and Ericsson SDN (ODL) for advanced SDN-based cloud networking. Kubernetes & OpenStack Operations: Deploy and manage Kubernetes (RedHat OpenShift, Mirantis Kubernetes Engine) and OpenStack-based cloud environments. Handle containerized workloads with Docker, Kubernetes, and RedHat OpenShift. Optimize cloud platforms for high availability, scalability, and security. The Skills you Bring: Bachelor's degree in Computer Science, IT, Telecommunications, or equivalent experience. 5+ years of hands-on experience in Ericsson Cloud, NFVI, SDN, OpenStack, and Kubernetes operations. Strong expertise in Ericsson Cloud Execution Environment (CEE 9, CEE 10), CNIS, ECCD, SDI, vCIC, Fuel, and RTE. In-depth knowledge of Juniper, Pluribus, and Extreme SLX switches/routers; F5/A10 load balancers; and Juniper/FortiGate firewalls.
Experience with Ericsson SDN (ODL) and Kubernetes (RedHat OpenShift, Mirantis Kubernetes Engine). Hands-on expertise in cloud storage (Nexenta, Ceph) and hyperscale cloud hardware (Ericsson HDS8000, BSP8100, Dell, HPE). Strong scripting knowledge in Bash, Python, and Ansible for automation. Primary country and city: India (IN) || Bangalore Req ID: 770555

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). Your collaboration with development, operations, and security teams will ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and overcoming intricate infrastructure challenges, we welcome you to apply. Your responsibilities will include: - Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP utilizing Infrastructure as Code (IaC) tools like Terraform. - Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components. - Developing and maintaining CI/CD pipelines for automated deployment and release management using various tools. - Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence. - Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimization solutions. - Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments. - Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications. - Participating in on-call rotations for critical incident response and timely resolution of production issues. - Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures. - Keeping up-to-date with new GCP services, features, and industry best practices to propose and implement improvements. 
- Contributing to cost optimization efforts by identifying and implementing efficiencies in cloud resource utilization. We require you to have: - A Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field. - 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server. - 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications. - 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code. - Strong experience with JavaScript and a modern JavaScript framework, with VueJS preferred. - Proven leadership and mentoring skills with development teams. - Strong understanding of microservices architecture and serverless computing. - Experience with relational databases like SQL Server and PostgreSQL. - Excellent problem-solving, analytical, and communication skills, along with Agile/Scrum environment experience. What can make you stand out: - GCP Cloud Certification. - UI development experience with HTML, JavaScript, Angular, and Bootstrap. - Agile environment experience with Scrum, XP. - Relational database experience with SQL Server, PostgreSQL. - Proficiency in Atlassian tools like JIRA and Confluence, and in GitHub. - Working knowledge of Python and exceptional problem-solving and analytical abilities, along with strong teamwork skills.

Posted 6 days ago

Apply

5.0 - 13.0 years

0 Lacs

pune, maharashtra

On-site

You are a highly skilled and experienced Cloud Architect/Engineer with deep expertise in Google Cloud Platform (GCP). Your primary responsibility is to design, build, and manage scalable and reliable cloud infrastructure on GCP. You will leverage various GCP services such as Compute Engine, Cloud Run, BigQuery, Pub/Sub, Cloud Functions, Dataflow, Dataproc, IAM, and Cloud Storage to ensure high-performance cloud solutions. Your role also includes developing and maintaining CI/CD pipelines, automating infrastructure deployment using Infrastructure as Code (IaC) principles, and implementing best practices in cloud security, monitoring, performance tuning, and logging. Collaboration with cross-functional teams to deliver cloud solutions aligned with business objectives is essential. You should have 5+ years of hands-on experience in cloud architecture and engineering, with at least 3 years of practical experience on Google Cloud Platform (GCP). In-depth expertise in GCP services mentioned above is required. Strong understanding of networking, security, containerization (Docker, Kubernetes), and CI/CD pipelines is essential. Experience with monitoring, performance tuning, and logging in cloud environments is preferred. Familiarity with DevSecOps practices and tools such as HashiCorp Vault is a plus. Your role as a GCP Cloud Architect/Engineer will contribute to ensuring system reliability, backup, and disaster recovery strategies. This hybrid role is based out of Pune and requires a total of 10 to 13 years of relevant experience.,

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

punjab

On-site

As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle. You will also be designing and building ETL pipelines for data lake and data warehouse solutions on GCP. In this position, your expertise in GCP data and analytics services will be crucial. You will work with tools like Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, BigQuery, Cloud Data Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions. Additionally, you will utilize the cloud-native GCP CLI/gsutil for operations and scripting languages like Python and SQL to enhance data processing efficiencies. Furthermore, your experience with data governance practices, metadata management, data masking, and encryption will be essential. You will utilize GCP tools such as Cloud Data Catalog and GCP KMS tools to ensure data security and compliance. Overall, this role requires a strong foundation in GCP technologies and a proactive approach to data engineering challenges in a dynamic environment.
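Data masking of the kind mentioned above is, at its core, deterministic pseudonymization: replace an identifier with an irreversible token so joins still work but the raw value is gone. The stdlib-only sketch below is conceptual — the salt, field names, and token length are illustrative, and a production GCP setup would use Cloud DLP or KMS-managed keys rather than a literal salt:

```python
import hashlib

# Conceptual sketch of deterministic data masking (pseudonymization).
# Salt, field names, and token length are illustrative only.
def mask_value(value: str, salt: str = "demo-salt") -> str:
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return digest[:12]  # shortened token standing in for the raw value

row = {"customer_id": "C-1001", "email": "a@example.com", "spend": 42.5}
masked = {**row,
          "customer_id": mask_value(row["customer_id"]),
          "email": mask_value(row["email"])}
print(masked["spend"], masked["email"] != row["email"])
```

Because the same input always yields the same token, masked tables remain joinable on the pseudonymized keys.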

Posted 6 days ago

Apply

5.0 - 7.0 years

5 - 14 Lacs

Pune, Gurugram, Bengaluru

Work from Office

• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, GCP
• Building data pipelines for huge volumes of data
• Dataflow, Dataproc, and BigQuery
• Deep understanding of ETL concepts
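The object-oriented pipeline style the bullets describe can be sketched in plain Python — class names, fields, and the tax rate below are all hypothetical:

```python
# Plain-Python sketch of an object-oriented ETL pipeline: each step is a
# class with transform(), and steps compose into a pipeline.
class Step:
    def transform(self, rows):
        raise NotImplementedError

class DropNulls(Step):
    def __init__(self, key):
        self.key = key
    def transform(self, rows):
        return [r for r in rows if r.get(self.key) is not None]

class AddTax(Step):  # hypothetical enrichment step
    def __init__(self, rate):
        self.rate = rate
    def transform(self, rows):
        return [{**r, "total": round(r["amount"] * (1 + self.rate), 2)}
                for r in rows]

def run_pipeline(rows, steps):
    for step in steps:
        rows = step.transform(rows)
    return rows

rows = [{"amount": 100.0}, {"amount": None}, {"amount": 50.0}]
out = run_pipeline(rows, [DropNulls("amount"), AddTax(0.18)])
print(out)  # null row dropped; totals 118.0 and 59.0
```

The same step-composition shape maps naturally onto PySpark, where each transform operates on a DataFrame instead of a list of dicts.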

Posted 6 days ago

Apply

6.0 - 9.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Design, deploy, and manage scalable infrastructure on GCP. Develop automation scripts and tools using Python to streamline cloud operations and monitoring. Build and maintain CI/CD pipelines using tools like Jenkins, GitLab CI, or Cloud Build. Manage GCP services such as Compute Engine, Cloud Storage, Cloud Functions, GKE, IAM, and VPC. Implement infrastructure as code using Terraform or Deployment Manager. Monitor system performance, troubleshoot issues, and ensure security best practices. Collaborate with development teams to optimize deployment and release processes. DevOps, GCP, Python
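A minimal example of the Python ops tooling described above — flagging instances whose CPU exceeds an alert threshold. The metric samples and instance names are invented; real data would come from Cloud Monitoring:

```python
# Minimal monitoring helper: given invented per-instance CPU samples,
# return the instances breaching a threshold, sorted by name.
def find_breaches(samples, threshold):
    return sorted(name for name, cpu in samples.items() if cpu > threshold)

cpu = {"web-1": 42.0, "web-2": 91.5, "etl-1": 88.0}
print(find_breaches(cpu, threshold=85.0))  # ['etl-1', 'web-2']
```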

Posted 6 days ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

Responsibilities: A day in the life of an Infoscion
• As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight.
• You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and will assist in resolving any queries related to requirements and solution design.
• You will conduct solution/product demonstrations, POC/Proof of Technology workshops and prepare effort estimates which suit the customer budgetary requirements and are in line with the organization's financial guidelines.
• Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
• Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability
• Good knowledge of software configuration management systems
• Awareness of the latest technologies and industry trends
• Logical thinking and problem-solving skills, along with an ability to collaborate
• Understanding of the financial processes for various types of projects and the various pricing models available
• Ability to assess current processes, identify improvement areas and suggest technology solutions
• Knowledge of one or two industry domains
• Client interfacing skills
• Project and team management
Technical and Professional Requirements: Technology->Cloud Platform->GCP Data Analytics->Looker, Technology->Cloud Platform->GCP Database->Google BigQuery
Preferred Skills: Technology->Cloud Platform->Google Big Data, Technology->Cloud Platform->GCP Data Analytics

Posted 6 days ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Country*: India
Number of Openings*: 1
Approved ECMS RQ#: 533573
Duration of contract*: 6 Months
Total Yrs. of Experience*: 8+ years
Relevant Yrs. of experience*: 8+ years
Detailed JD (Roles and Responsibilities)*: We are looking for a seasoned GCP Engineer with 8-10 years of experience in cloud infrastructure and automation. The ideal candidate will hold a GCP Architecture Certification and possess deep expertise in Terraform, GitLab, shell scripting, and a wide range of GCP services including Compute Engine, Cloud Storage, Dataflow, BigQuery, and IAM. You will be responsible for designing, implementing, and maintaining scalable cloud solutions that meet business and technical requirements.
Key Responsibilities:
> Design and implement secure, scalable, and highly available cloud infrastructure on Google Cloud Platform.
> Automate infrastructure provisioning and configuration using Terraform.
> Manage CI/CD pipelines using GitLab for efficient deployment and integration.
> Develop and maintain shell scripts for automation and system management tasks.
> Utilize GCP services such as Compute Engine, Cloud Storage, Dataflow, BigQuery, and IAM to support data and application workflows.
> Ensure compliance with security policies and manage access controls using IAM.
> Monitor system performance and troubleshoot issues across cloud environments.
> Collaborate with cross-functional teams to understand requirements and deliver cloud-based solutions.
Required Skills & Qualifications:
> 8-12 years of experience in cloud engineering or infrastructure roles.
> GCP Architecture Certification is mandatory.
> Strong hands-on experience with Terraform and infrastructure-as-code practices.
> Proficiency in GitLab for version control and CI/CD.
> Solid experience in shell scripting for automation.
> Deep understanding of GCP services: Compute Engine, Cloud Storage, Dataflow, BigQuery, and IAM.
> Strong problem-solving skills and ability to work independently.
> Excellent communication and collaboration skills.
Mandatory skills*: SQL, SQL Server, BigQuery, SSIS
Desired skills*: Data Modelling, ETL
Domain*: Payments
Client name (for internal purpose only)*: NatWest
Approx. vendor billing rate (INR/Day)*: 10000 INR/Day
Work Location*: Chennai or Bangalore or Gurgaon
Background check process to be followed*: Yes
Before onboarding / After onboarding*: Before Onboarding
BGV Agency*: Any Nasscom approved
Mode of Interview (Telephonic/Face to Face/Skype Interview)*: Teams virtual followed by F2F
WFO / WFH / Hybrid: Hybrid
Any Certification (Mandatory): As virtual followed by A2A
Shift Time: Chennai or Bangalore or Gurgaon
Business travel required (Yes / No): No
Client BTP / SHTP: UK

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Storage Systems Administrator at Ameriprise India LLP, you will perform moderately difficult, independent assignments involving troubleshooting, problem diagnosis, and problem resolution across a range of technologies. Your role will involve implementing and configuring storage systems such as storage area networks (SANs), network-attached storage (NAS), and cloud storage solutions. You will collaborate actively with team members and external partners to resolve incidents and problems efficiently. Monitoring storage performance and capacity, optimizing storage resources for performance and cost-effectiveness, troubleshooting storage-related issues, and conducting root cause analysis will be key aspects of your responsibilities. You will also contribute to refining processes, policies, and best practices that enhance the performance and availability of these technologies, document troubleshooting guides, support manuals, and communication plans, and develop and implement data backup and disaster recovery plans.

In terms of continuous improvement, you will work with application teams to assess the impact of application changes on monitoring configurations and implement the necessary changes. Collaborating with your manager and team members, you will apply your experience, expertise, and data analysis skills to identify corrective actions that enhance efficiency, improve performance, and meet targets.

To excel in this role, you need a solid understanding of SAN and NAS concepts and protocols, along with operational knowledge of storage environments. A Pure Storage certification is essential, and AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified SysOps Administrator) would be advantageous. Experience in storage administration and configuration, as well as exposure to tools such as Microsoft Power BI, Tableau, or similar data manipulation tools, will be beneficial. Basic experience managing Windows and Linux systems, familiarity with the ITIL framework, and proficiency in MS Excel are also desired.

Join Ameriprise India LLP, a reputable financial planning company with a global presence and a focus on asset management, retirement planning, and insurance protection. Embrace an inclusive and collaborative culture where your contributions are valued, and work alongside talented individuals who share your dedication to excellence. If you are a motivated professional seeking an ethical company that values its employees, consider building your career at Ameriprise India LLP.

Please note that this is a full-time position based in the India Business Unit (AWMPO AWMP&S President's Office), with working hours from 4:45 pm to 1:15 am.

Responsibilities:

  • Perform troubleshooting, problem diagnosis, and resolution for various technologies.
  • Implement and configure storage systems, including SANs, NAS, and cloud solutions.
  • Collaborate with team members and external partners to address incidents and problems.
  • Monitor storage performance and capacity, optimize resources, and troubleshoot issues.
  • Contribute to refining processes, policies, and best practices for technology performance.
  • Document troubleshooting guides, support manuals, and communication plans.
  • Develop and implement data backup and disaster recovery plans.
  • Work with application teams to assess monitoring impacts of application changes.
  • Identify corrective actions to enhance efficiency and meet performance targets.

Required Qualifications:

  • Solid understanding of SAN and NAS concepts and protocols.
  • Pure Storage certification.
  • Ability to work collaboratively and communicate effectively.
  • AWS certifications are a plus.
  • Experience in storage administration and configuration.
  • Exposure to data manipulation tools such as Power BI or Tableau.
  • Basic experience managing Windows and Linux systems.
  • Familiarity with the ITIL framework and proficiency in MS Excel.
  • Understanding of monitoring tools and techniques for problem determination and prevention.

Posted 1 week ago

Apply

Exploring Cloud Storage Jobs in India

The cloud storage job market in India is rapidly growing as more companies are adopting cloud technology for their data storage needs. This has created numerous job opportunities for individuals with expertise in cloud storage solutions. In this article, we will explore the top hiring locations, average salary range, career path, related skills, and interview questions for job seekers interested in pursuing a career in cloud storage in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Mumbai
  4. Hyderabad
  5. Chennai

These cities are known for their thriving IT industries and have a high demand for cloud storage professionals.

Average Salary Range

The average salary range for cloud storage professionals in India varies based on experience and expertise. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

A typical career path in cloud storage may involve roles such as Cloud Engineer, Cloud Architect, Cloud Developer, and Cloud Solutions Architect. The progression often follows a path from Junior Developer to Senior Developer to Tech Lead, with opportunities to specialize in specific cloud platforms or technologies along the way.

Related Skills

In addition to expertise in cloud storage solutions, professionals in this field are often expected to have skills in areas such as:

  • DevOps
  • Networking
  • Security
  • Programming languages (e.g., Python, Java)
  • Database management
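To make the programming-languages expectation concrete, here is a minimal sketch in Python of the key-value semantics behind object storage. This is a hypothetical in-memory stand-in written for illustration, not a real cloud SDK; the class name, methods, and the MD5-based ETag are all assumptions modeled loosely on how services like S3 behave.

```python
import hashlib

class ObjectStore:
    """Toy in-memory object store illustrating object-storage semantics:
    a flat key namespace, whole-object put/get, and per-object metadata."""

    def __init__(self):
        self._objects = {}  # key -> (data bytes, metadata dict)

    def put(self, key: str, data: bytes) -> str:
        # Real services return an ETag; here we use an MD5 hex digest.
        etag = hashlib.md5(data).hexdigest()
        self._objects[key] = (data, {"etag": etag, "size": len(data)})
        return etag

    def get(self, key: str) -> bytes:
        return self._objects[key][0]

    def head(self, key: str) -> dict:
        # Metadata-only lookup, analogous to an HTTP HEAD request.
        return self._objects[key][1]

store = ObjectStore()
etag = store.put("reports/2024/q1.csv", b"region,revenue\nAPAC,42\n")
print(store.head("reports/2024/q1.csv")["size"])  # 23
```

Note that "folders" such as `reports/2024/` are just key prefixes, not directories, which is one of the classic differences between object storage and file storage that interviewers probe.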

Interview Questions

  • What is a virtual machine and how is it used in cloud computing? (basic)
  • Explain the difference between public, private, and hybrid clouds. (medium)
  • How do you ensure data security in a cloud storage environment? (medium)
  • What are the key benefits of using cloud storage over traditional on-premise storage solutions? (basic)
  • Can you explain the concept of scalability in cloud storage? (medium)
  • How do you handle data migration in a cloud storage environment? (medium)
  • What is the difference between SaaS, PaaS, and IaaS? (basic)
  • How do you monitor and optimize cloud storage performance? (medium)
  • Explain the concept of data redundancy in cloud storage. (basic)
  • How do you handle disaster recovery in a cloud storage environment? (medium)
  • What are the key challenges of cloud storage adoption for businesses? (medium)
  • How do you ensure compliance with data protection regulations in cloud storage? (medium)
  • What is the role of a CDN (Content Delivery Network) in cloud storage? (medium)
  • How do you handle cloud storage cost optimization for a company? (medium)
  • Explain the concept of multi-tenancy in cloud storage. (basic)
  • How do you ensure data integrity in a cloud storage environment? (medium)
  • What are the key considerations for data encryption in cloud storage? (medium)
  • How do you handle data synchronization in a multi-cloud environment? (advanced)
  • Can you explain the concept of serverless computing in cloud storage? (advanced)
  • How do you address the challenges of data sovereignty in a global cloud storage setup? (advanced)
  • What are the key differences between block, file, and object storage in the cloud? (medium)
  • How do you handle data archival and retrieval in a cloud storage environment? (medium)
  • Explain the concept of data deduplication in cloud storage. (medium)
  • How do you handle data access control and permissions in a cloud storage environment? (medium)
  • What are the key considerations for data backup and recovery in cloud storage? (medium)
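Several of the questions above (deduplication, redundancy, integrity) come down to content hashing. As a self-contained illustration, here is one way content-addressed chunk deduplication can be sketched in Python; the fixed-size chunks and SHA-256 digests are simplifying assumptions, since production systems typically use variable-size (content-defined) chunking.

```python
import hashlib

CHUNK_SIZE = 4  # tiny for demonstration; real systems use KB/MB chunks

def dedup_store(data: bytes, store: dict) -> list:
    """Split data into fixed-size chunks, keep each unique chunk once
    (keyed by its SHA-256 digest), and return the list of chunk keys."""
    keys = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicate chunks are stored only once
        keys.append(digest)
    return keys

def restore(keys: list, store: dict) -> bytes:
    # Reassemble the original data from its chunk references.
    return b"".join(store[k] for k in keys)

store = {}
keys = dedup_store(b"AAAABBBBAAAACCCC", store)
print(len(keys), len(store))  # 4 chunks referenced, only 3 stored
```

The saving comes from the repeated `AAAA` chunk being referenced twice but stored once; the same hashing also gives you an integrity check, since a corrupted chunk no longer matches its digest.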

Closing Remark

As you explore opportunities in cloud storage jobs in India, remember to continuously enhance your skills, stay updated on industry trends, and prepare confidently for interviews. With the right combination of expertise and preparation, you can build a successful career in this rapidly growing field. Good luck!
