
474 Cloud Storage Jobs

Set Up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

14.0 - 20.0 years

0 Lacs

Maharashtra

On-site

As a Senior Architect - Data & Cloud at our company, you will be responsible for architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets. You will need more than 15 years of experience in technical, solutioning, and analytical roles, with 5+ years specifically in building and managing data lakes, data warehouses, data integration, data migration, and business intelligence/artificial intelligence solutions on cloud platforms such as GCP, AWS, or Azure.

Key Responsibilities:
- Translate business requirements into functional and non-functional areas, defining boundaries in terms of availability, scalability, performance, security, and resilience.
- Architect and design scalable data warehouse solutions on cloud platforms such as BigQuery or Redshift.
- Work with various data integration and ETL technologies on the cloud, such as Spark, PySpark/Scala, Dataflow, Dataproc, and EMR.
- Apply deep knowledge of cloud and on-premise databases such as Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, and SQL Server.
- Bring exposure to NoSQL databases such as MongoDB, CouchDB, Cassandra, and graph databases.
- Use traditional ETL tools such as Informatica, DataStage, OWB, and Talend.
- Collaborate with internal and external stakeholders to design optimized data analytics solutions.
- Mentor young talent within the team and contribute to building assets and accelerators.

Qualifications Required:
- 14-20 years of relevant experience in the field.
- Strong understanding of cloud solutions for IaaS, PaaS, SaaS, containers, and microservices architecture and design.
- Experience with BI reporting and dashboarding tools such as Looker, Tableau, Power BI, SAP BO, Cognos, and Superset.
- Knowledge of security features and policies in cloud environments such as GCP, AWS, or Azure.
- Ability to compare products and tools across the Google, AWS, and Azure cloud technology stacks.

In this role, you will lead multiple data engagements on GCP for data lakes, data engineering, data migration, data warehousing, and business intelligence. You will interface with multiple stakeholders within IT and the business to understand data requirements and take complete responsibility for the successful delivery of projects. Additionally, you will have the opportunity to work in a high-growth startup environment, contribute to the digital transformation journey of customers, and collaborate with a diverse and proactive team of techies. Please note that flexible, remote working options are available to foster productivity and work-life balance.

Posted 1 day ago

Apply

5.0 - 8.0 years

27 - 42 Lacs

Bengaluru

Work from Office

Job Summary
NetApp is a cloud-led, data-centric software company that helps organizations put data to work in applications that elevate their business. We help organizations unlock the best of cloud technology. In this role you will develop a Cloud Orchestrator and help accelerate NetApp's leadership in Hybrid Cloud globally. We expect you to be an excellent coder who will take the lead in design and implementation per the requirements of a managed Cloud Service, and who can quickly learn the existing code and architecture.

Job Requirements
- Proficient with Golang; experience with Python, Java, or C# is an added advantage.
- Hands-on expertise in container-based technologies, preferably Kubernetes and Docker.
- Thorough understanding of, and extensive experience with, building orchestration on at least one of the major hyperscaler cloud providers (AWS, Microsoft Azure, Google Cloud Platform).
- Experienced with cloud service APIs (e.g., AWS, Azure, or GCP).
- Experience working with SMB, NFS, and internet security protocols.
- Expertise in REST API design and implementation.
- Thorough understanding of Linux or other Unix-like operating systems.
- Should own deliverables end to end: design, implementation, and test automation to ensure high-quality deliverables.
- Experience with CI build systems or automated testing.
- Understands design and architecture principles and best practices in the cloud, along with cloud health monitoring, capacity metering, and billing.
- Highly knowledgeable in infrastructure such as hypervisors and cloud storage, with experience in cloud services including databases, caching, object and block storage, scaling, load balancers, and networking.
- Highly motivated and self-driven.

Education
Minimum 5 years of experience; must be hands-on with coding. B.E/B.Tech or M.S in Computer Science or a related technical field.

Posted 2 days ago

Apply

2.0 - 7.0 years

0 - 9 Lacs

Ahmedabad

Work from Office

Join Perceptive Labs as our foundation engineer to build APIs, multi-tenant infra, and AI agents (Python/TS, Postgres, Redis, Temporal, LLMs). High ownership, startup pace, salary. Turn your vision into reality and learn while doing it.

Posted 2 days ago

Apply

8.0 - 13.0 years

15 - 22 Lacs

Hyderabad, Bengaluru

Hybrid

About the Role
The Cloud Architect is responsible for designing, implementing, and optimizing enterprise cloud solutions across AWS and hybrid environments. This role involves defining cloud strategy; ensuring scalability, security, and compliance; and driving innovation through cloud-native solutions. The Cloud Architect will also provide technical leadership to projects, mentor teams, and collaborate with stakeholders to align cloud initiatives with business objectives.

Key Responsibilities

Cloud Strategy & Architecture
- Design secure, scalable, and cost-optimized cloud architectures across AWS and hybrid environments.
- Define and implement cloud adoption strategies, governance models, and best practices.
- Evaluate emerging cloud technologies and recommend adoption to drive digital transformation.

Solution Design & Implementation
- Architect solutions for cloud infrastructure, networking, storage, and security to support enterprise workloads.
- Develop frameworks for automation, orchestration, and DevOps practices using tools such as Terraform, PowerShell, Power Automate, and CI/CD pipelines.
- Integrate Office 365, Azure AD, and other enterprise applications with cloud platforms.

Security, Compliance & Governance
- Design and implement cloud security controls (IAM, WAF, firewall, VPC, NSG, encryption, etc.) to meet compliance and regulatory requirements.
- Implement compliance frameworks, eDiscovery, data retention policies, and security monitoring solutions.
- Conduct vulnerability assessments and guide remediation strategies.

Performance & Optimization
- Establish monitoring frameworks to ensure high availability, disaster recovery, and performance optimization.
- Analyze usage trends, generate performance reports, and recommend improvements for efficiency and cost optimization.
- Design backup and restore strategies ensuring data resilience.

Collaboration & Leadership
- Provide technical leadership and guidance to engineering and support teams.
- Conduct knowledge-sharing sessions and training programs to improve adoption of cloud solutions.
- Collaborate with business stakeholders to align cloud initiatives with organizational goals.
- Support audits and ensure compliance with industry standards.

What You'll Need
- Strong expertise in AWS architecture (security, networking, automation, storage, and DevOps).
- Hands-on experience designing and managing cloud security (IAM, firewall, monitoring, encryption).
- In-depth knowledge of Azure AD, Active Directory, backup strategies, and hybrid cloud environments.
- Experience integrating Office 365 and enterprise applications with cloud infrastructure.
- Ability to design scalable, resilient, and secure solutions with a focus on automation and optimization.
- Excellent problem-solving skills and ability to troubleshoot complex cloud environments.
- Strong communication skills with the ability to engage stakeholders and mentor technical teams.

Posted 2 days ago

Apply

7.0 - 12.0 years

25 - 27 Lacs

Hyderabad

Work from Office

3+ years of experience as an engineer working in a GCP environment with its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.). (A minimal example of these SQL constructs appears below this listing.)

Keywords: dataproc, pyspark, data flow, kafka, cloud storage, terraform, oops, cloud spanner, hadoop, java, hive, spark, mapreduce, big data, gcp, aws, javascript, mysql, postgresql, sql server, oracle, bigtable, software development, sql*, python development*, python*, bigquery*, pandas*
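For illustration, here is a minimal sketch of the kind of SQL this listing asks about (a CTE plus a window function), submitted through the google-cloud-bigquery Python client. The dataset and table names (`sales.orders`) and its columns are hypothetical placeholders, not part of the posting.

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

# Hypothetical table: sales.orders(region STRING, order_ts TIMESTAMP, amount NUMERIC)
QUERY = """
WITH regional_totals AS (              -- CTE: aggregate per region and day
    SELECT region,
           DATE(order_ts) AS order_day,
           SUM(amount)    AS day_total
    FROM `sales.orders`
    GROUP BY region, order_day
)
SELECT region,
       order_day,
       day_total,
       -- Window function: running total within each region, ordered by day
       SUM(day_total) OVER (
           PARTITION BY region ORDER BY order_day
       ) AS running_total
FROM regional_totals
ORDER BY region, order_day
"""

def main() -> None:
    client = bigquery.Client()          # uses application-default credentials
    for row in client.query(QUERY).result():
        print(row.region, row.order_day, row.day_total, row.running_total)

if __name__ == "__main__":
    main()
```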

Posted 2 days ago

Apply

14.0 - 20.0 years

0 Lacs

Maharashtra

On-site

Role Overview:
As a Principal Architect - Data & Cloud at Quantiphi, you will leverage your extensive experience in technical, solutioning, and analytical roles to architect and design end-to-end data pipelines and data integration solutions for structured and unstructured data sources and targets. You will play a crucial role in building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on cloud platforms such as GCP, AWS, and Azure. Your expertise will be instrumental in designing scalable data warehouse solutions on BigQuery or Redshift and working with various data integration, storage, and pipeline tools on the cloud. Additionally, you will serve as a trusted technical advisor to customers, lead multiple data engagements on GCP, and contribute to the development of assets and accelerators.

Key Responsibilities:
- More than 15 years of experience in technical, solutioning, and analytical roles.
- 5+ years of experience in building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on cloud platforms such as GCP, AWS, and Azure.
- Understand business requirements, translate them into functional and non-functional areas, and define boundaries in terms of availability, scalability, performance, security, and resilience.
- Architect, design, and implement end-to-end data pipelines and data integration solutions for structured and unstructured data sources and targets.
- Work with distributed computing and enterprise environments such as Hadoop and cloud platforms.
- Proficiency in various data integration and ETL technologies on the cloud, such as Spark, PySpark/Scala, Dataflow, Dataproc, and EMR.
- Deep knowledge of cloud and on-premise databases such as Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, and SQL Server.
- Exposure to NoSQL databases such as MongoDB, CouchDB, Cassandra, and graph databases.
- Design scalable data warehouse solutions on the cloud with tools such as S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, Dataproc, Airflow, Composer, Spark SQL, Presto, and EMRFS.
- Experience with machine learning frameworks such as TensorFlow and PyTorch.
- Understand cloud solutions for IaaS, PaaS, SaaS, containers, and microservices architecture and design.
- Good understanding of BI reporting and dashboarding tools such as Looker, Tableau, Power BI, SAP BO, Cognos, and Superset.
- Knowledge of security features and policies in cloud environments such as GCP, AWS, and Azure.
- Work on business transformation projects for moving on-premise data solutions to cloud platforms.
- Serve as a trusted technical advisor to customers, delivering solutions for complex cloud and data-related technical challenges.
- Be a thought leader in architecture design and development of cloud data analytics solutions.
- Liaise with internal and external stakeholders to design optimized data analytics solutions.
- Collaborate with SMEs and solutions architects from leading cloud providers to present solutions to customers.
- Support Quantiphi sales and GTM teams from a technical perspective in building proposals and SOWs.
- Lead discovery and design workshops with potential customers globally.
- Design and deliver thought leadership webinars and tech talks with customers and partners.
- Identify areas for productization and feature enhancement for Quantiphi's product assets.

Qualifications Required:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 14-20 years of experience in technical, solutioning, and analytical roles.
- Strong expertise in building and managing data lakes, data warehouses, data integration, and business intelligence/artificial intelligence solutions on cloud platforms such as GCP, AWS, and Azure.
- Proficiency in various data integration and ETL technologies on the cloud, and in cloud and on-premise databases.
- Experience with cloud solutions for IaaS, PaaS, SaaS, containers, and microservices architecture and design.
- Knowledge of BI reporting and dashboarding tools and of security features in cloud environments.

Additional Company Details:
While technology is the heart of Quantiphi's business, the company attributes its success to its global and diverse culture built on transparency, diversity, integrity, learning, and growth. Working at Quantiphi gives you the opportunity to be part of a culture that encourages innovation, excellence, and personal growth, fostering a work environment where you can thrive both professionally and personally. Joining Quantiphi means being part of a dynamic team of tech enthusiasts dedicated to translating data into tangible business value for clients. Flexible remote working options are available to promote productivity and work-life balance.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Telangana

On-site

As a Data Engineer at our company, you will play a crucial role in managing and optimizing data processes. Your responsibilities will include:
- Designing and developing data pipelines using Python (a minimal pipeline sketch appears below this listing)
- Leveraging GCP services such as Dataflow, Dataproc, BigQuery, Cloud Storage, and Cloud Functions
- Implementing data warehousing concepts and technologies
- Performing data modeling and ETL processes
- Ensuring data quality and adhering to data governance principles

To excel in this role, you should possess the following qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5-7 years of experience in data engineering
- Proficiency in Python programming
- Extensive experience with GCP services
- Familiarity with data warehousing and ETL processes
- Strong understanding of SQL and database technologies
- Experience in data quality and governance
- Excellent problem-solving and analytical skills
- Strong communication and collaboration abilities
- Ability to work independently and in a team environment
- Familiarity with version control systems like Git

If you are looking to join a dynamic team and work on cutting-edge data projects, this position is perfect for you.
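As a sketch of the first bullet above, here is a minimal Apache Beam pipeline in Python of the kind that runs on Dataflow: it reads newline-delimited JSON from Cloud Storage, transforms each record, and writes to BigQuery. The bucket, table, and field names are hypothetical placeholders, not part of the posting.

```python
# pip install 'apache-beam[gcp]'
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(line: str) -> dict:
    """Parse one JSON line into a BigQuery row (hypothetical schema)."""
    event = json.loads(line)
    return {"user_id": event["user_id"], "amount": float(event["amount"])}

def run() -> None:
    # With runner="DataflowRunner" plus project/region/temp_location options,
    # the same pipeline executes on Dataflow instead of locally.
    options = PipelineOptions(runner="DirectRunner")
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.json")
            | "Parse" >> beam.Map(parse_event)
            # File-load writes may also need a GCS temp_location option.
            | "Write" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="user_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```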

Posted 3 days ago

Apply

8.0 - 13.0 years

10 - 17 Lacs

Pune

Work from Office

The IBM Storage Protect support team (formerly Spectrum Protect / TSM) supports complex integrated storage products end to end, including Spectrum Protect, Spectrum Protect Plus, and Copy Data Management. This position involves working remotely with our IBM customers, who include some of the world's top research, automotive, banking, health care, and technology providers. Candidates must be able to assist with operating systems (AIX, Linux, Unix, Windows), SAN, network protocols, clouds, and storage devices. They will work in a virtual environment with colleagues around the globe and will be exposed to many different types of technologies.

Responsibilities include, but are not limited to:
- Provide remote troubleshooting and analysis assistance for usage and configuration questions
- Review diagnostic information to assist in isolating a problem's cause (which could include assistance interpreting traces and dumps)
- Identify known defects and fixes to resolve problems
- Develop best-practice articles and support utilities to improve support quality and productivity
- Respond to escalated customer calls, complaints, and queries
The job requires a flexible schedule to ensure 24x7 support operations and weekend on-call coverage, including extending/taking shifts to cover North America working hours.

Required education: Bachelor's Degree
Preferred education: Bachelor's Degree

Required technical and professional expertise:
- At least 8-15 years working with data protection or storage software as an administrator or solution architect, or with client-server technologies.
- Debugging and analysis are performed via telephone as well as electronically, so candidates must possess strong customer interaction skills and be able to clearly articulate solutions and options.
- Must be familiar with, and able to interpret, complex software problems that span multiple client and server platforms including UNIX, Linux, AIX, and Windows.
- Focus on storage area networks (SAN), network protocols, cloud, and storage devices is preferred. Hands-on experience with storage virtualization is a plus.
- Candidates must be flexible in schedule and availability. Second-shift and weekend scheduling will be required.

Preferred technical and professional experience:
- Excellent communication skills, both verbal and written
- At least 5-10 years of in-depth experience with Spectrum Protect (Storage Protect) or competing products in the data protection domain
- Working knowledge of Red Hat, OpenShift, or Ansible administration preferred
- Strong networking and troubleshooting skills
- Cloud certification is an added advantage
- Knowledge of object storage and cloud storage preferred

Posted 3 days ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Chennai

Work from Office

Role Purpose
The purpose of the role is to resolve, maintain, and manage the client's software/hardware/network based on the service requests raised by end users, as per the defined SLAs, ensuring client satisfaction.

Responsibilities:
- Ensure timely response to all tickets raised by client end users
- Solution service requests while maintaining quality parameters
- Act as a custodian of the client's network/server/system/storage/platform/infrastructure and other equipment, tracking their proper functioning and upkeep
- Keep a check on the number of tickets raised (dial home/email/chat/IMS), ensuring the right solutioning within the defined resolution timeframe
- Perform root cause analysis of raised tickets and create an action plan to resolve the problem and ensure client satisfaction
- Provide acceptance and immediate resolution for high-priority tickets/service requests
- Install and configure software/hardware as per service requests
- 100% adherence to timeliness per the priority of each issue, to manage client expectations and ensure zero escalations
- Provide application/user access as per client requirements and requests to ensure timely solutioning
- Track all tickets from acceptance to resolution within the resolution time defined by the customer
- Maintain timely backups of important data/logs and management resources to ensure the solution is of acceptable quality and maintains client satisfaction
- Coordinate with the on-site team for complex problem resolution and ensure timely client servicing
- Review the logs gathered by chat bots and ensure all service requests/issues are resolved in a timely manner

Performance parameter and measure:
1. 100% adherence to SLA/timelines, measured by: multiple cases of red time; zero customer escalations; client appreciation emails

Mandatory Skills: Cloud Storage. Experience: 3-5 years.

Posted 3 days ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Bengaluru

Hybrid

EPAM has a presence across 40+ countries globally, with 55,000+ professionals and numerous delivery centers. Key locations are North America, Eastern Europe, Central Europe, Western Europe, APAC, and the Middle East, with development centers in India (Hyderabad, Pune & Bangalore).

Location: Bengaluru
Work Mode: Hybrid (2-3 days in office per week), full time

Job Description:
- 5-14 years of experience in Big Data and data-related technologies
- Expert-level understanding of distributed computing principles
- Expert-level knowledge of and experience with Apache Spark
- Hands-on programming with Python
- Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming (a minimal sketch appears below this listing)
- Experience integrating data from multiple sources such as RDBMS (SQL Server, Oracle), ERP systems, and files
- Good understanding of SQL queries, joins, stored procedures, and relational schemas
- Experience with NoSQL databases such as HBase, Cassandra, and MongoDB
- Knowledge of ETL techniques and frameworks
- Performance tuning of Spark jobs
- Experience with native GCP data services
- Ability to lead a team efficiently
- Experience designing and implementing Big Data solutions
- Practitioner of Agile methodology

WE OFFER
- Opportunity to work on technical challenges that may have impact across geographies
- Vast opportunities for self-development: online university, global knowledge sharing, learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored tech talks and hackathons
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package: health and medical benefits, retirement benefits, paid time off, flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)
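For illustration, a minimal PySpark Structured Streaming sketch of the stream-processing experience described above: it reads from a Kafka topic, aggregates per key, and writes running counts to the console. The broker address and topic name are hypothetical placeholders.

```python
# Requires pyspark plus the spark-sql-kafka package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-stream-demo")
    .getOrCreate()
)

# Read a stream from a hypothetical Kafka topic.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka keys/values arrive as bytes; cast the key to string and count per key.
counts = (
    events.select(col("key").cast("string"))
    .groupBy("key")
    .count()
)

# Stream the running counts to the console until interrupted.
query = (
    counts.writeStream
    .outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```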

Posted 3 days ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Job Role
- Very good understanding of the current work and the tools and technologies being used.
- Comprehensive knowledge of and clarity on BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python.
- Experience working with fact and dimension tables and SCD.
- Minimum 3 years' experience in GCP data engineering.
- Java/Python/Spark on GCP; programming experience in at least one of Python, Java, or PySpark, plus SQL.
- GCS (Cloud Storage), Composer (Airflow), and BigQuery experience (a minimal DAG sketch appears below this listing).
- Should have worked on handling big data.

Your Profile
- Strong data engineering experience using Java or Python, or Spark on Google Cloud.
- Pipeline development experience using Dataflow or Dataproc (Apache Beam, etc.).
- Any other GCP services or databases such as Datastore, Bigtable, Spanner, Cloud Run, or Cloud Functions.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here
- You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders.
- You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
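As a sketch of the Composer (Airflow) plus BigQuery combination named above, here is a minimal Airflow DAG that runs a scheduled BigQuery query, assuming the apache-airflow-providers-google package is installed. The DAG id, SQL, and table names are hypothetical placeholders.

```python
# pip install apache-airflow apache-airflow-providers-google
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_rollup",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run a rollup query in BigQuery; project/dataset/table are placeholders.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_orders",
        configuration={
            "query": {
                "query": """
                    SELECT region, SUM(amount) AS total
                    FROM `my-project.sales.orders`
                    GROUP BY region
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "sales",
                    "tableId": "daily_rollup",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```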

Posted 3 days ago

Apply

8.0 - 10.0 years

0 - 2 Lacs

Chennai

Hybrid

Total experience: 8+ years
Relevant experience: 5+ years
Notice period: Immediate to 15 days (currently serving notice period)
Mandatory skillset: Python, Dataflow, Dataproc, GCP, Dataform, Agile software development, BigQuery, Terraform, Data Fusion, Cloud SQL, Kafka
Mode of interview: L1 and L2 (virtual)
Integration & middleware skills: PySpark, Python, Agile, Cloud Storage, Data Fusion, Dataform, Git, Redis, SQL, Terraform, Dataflow, version control, Java
Location: Chennai, hybrid (4 days work from office)

Posted 3 days ago

Apply

3.0 - 6.0 years

6 - 8 Lacs

Noida

Work from Office

3+ years of experience as an engineer working in a GCP environment with its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Keywords: Python Development, Python, Bigquery, Pandas, Dataproc, Pyspark, Data Flow, Kafka, Cloud Storage, Terraform, Oops, Cloud Spanner, Hadoop, Java, Hive, Spark, Mapreduce, Big Data, Gcp, Aws, Javascript, Mysql, Postgresql, Sql Server, Oracle, Bigtable, Software Development, Sql*

Posted 3 days ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

3+ years of experience as an engineer working in a GCP environment with its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Keywords: dataproc, pyspark, data flow, kafka, cloud storage, terraform, oops, cloud spanner, hadoop, java, hive, spark, mapreduce, big data, gcp, aws, javascript, mysql, postgresql, sql server, oracle, bigtable, software development, sql*, python development*, python*, bigquery*, pandas*

Posted 3 days ago

Apply

6.0 - 7.0 years

4 - 8 Lacs

Lucknow

Work from Office

Work alongside our multidisciplinary team of developers and designers to create the next generation of enterprise software. Support the entire application lifecycle (concept, design, develop, test, release, and support). Responsible for end-to-end product development of a C++ based application, which may include application development based on a microservice architecture. Work with developers to implement best practices, introduce new tools, and improve processes. Stay up to date with new technology trends.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Minimum 6-7 years of experience, with strong knowledge of C++ systems programming; experienced with standard Unix, Mac, and Windows development environments and tools.
- Experience with network protocols such as UDP and TCP
- Troubleshooting complex product applications and deployments
- Proven experience building and troubleshooting highly robust, scalable software systems
- Excellent written and verbal communication skills

Preferred technical and professional experience
- C++, Java, or Golang; distributed systems; cloud infrastructure/DevOps; REST APIs
- Specific experience building large-scale, distributed software platforms, with knowledge of modern cloud storage platforms and APIs a plus
- Experience with OpenSSL
- Experience in kernel-level network and file system programming desired
- Performance engineering
- Deep professional knowledge of business unit processes and operations

In this position, you will develop and support the Core applications (Aspera High-Speed Transfer Server, Proxy, Transfer SDK) with a focus on security, quality, scalability, and performance. As a Backend Developer, you'll use your business and professional knowledge to work across cross-functional and global teams, providing technical and operational guidance and contributing to and leading initiatives with impact across the department.

Posted 4 days ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Lucknow

Work from Office

With fast file transfer and streaming solutions, IBM Aspera software moves data of any size across any distance. IBM Aspera takes a different approach to tackling the challenges of big data movement over global WANs. Rather than optimize or accelerate data transfer, Aspera eliminates underlying bottlenecks using a breakthrough transport technology, fully utilizing available network bandwidth to maximize speed and quickly scale up with no theoretical limit. We are looking for a talented Core Backend Developer to join our Core Team. In this position, you will develop and support the Core applications (Aspera High-Speed Transfer Server, Proxy, Transfer SDK) with a focus on security, quality, scalability, and performance.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Minimum 3-5 years of experience, with strong knowledge of C systems programming; experienced with standard Unix, Mac, and Windows development environments and tools.
- Experience with network protocols such as UDP and TCP
- Troubleshooting complex product applications and deployments
- Proven experience building and troubleshooting highly robust, scalable software systems
- Excellent written and verbal communication skills

Preferred technical and professional experience
- C++, Java, or Golang; distributed systems; cloud infrastructure/DevOps; REST APIs
- Specific experience building large-scale, distributed software platforms, with knowledge of modern cloud storage platforms and APIs a plus
- Experience with OpenSSL
- Experience in kernel-level network and file system programming desired
- Performance engineering
- Deep professional knowledge of business unit processes and operations

As a Backend Developer, you'll use your business and professional knowledge to work across cross-functional and global teams, providing technical and operational guidance and contributing to and leading initiatives with impact across the department.

Posted 4 days ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and to engage customers directly. (A minimal Cloud Storage client sketch appears below this listing.)

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience
- An individual with an ability to manage change and proven time management.
- Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
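For illustration, a minimal sketch of working with GCS from Python using the google-cloud-storage client, matching the Python-plus-GCS combination this listing names. The bucket and object names are hypothetical placeholders.

```python
# pip install google-cloud-storage
from google.cloud import storage

def upload_and_list(bucket_name: str, local_path: str, object_name: str) -> None:
    client = storage.Client()            # uses application-default credentials
    bucket = client.bucket(bucket_name)

    # Upload a local file to the bucket.
    blob = bucket.blob(object_name)
    blob.upload_from_filename(local_path)

    # List objects under the same prefix to confirm the upload.
    for b in client.list_blobs(bucket_name, prefix=object_name):
        print(b.name, b.size)

if __name__ == "__main__":
    # Hypothetical names; replace with a bucket you own.
    upload_and_list("my-demo-bucket", "report.csv", "reports/report.csv")
```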

Posted 4 days ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Design, build, and manage data integration solutions across hybrid and multi-cloud environments. Develop and optimize data pipelines to ensure scalability, performance, and reliability. Implement and manage cloud storage solutions such as cloud object stores. Leverage Apache Iceberg for scalable data lakehouse implementations (a minimal configuration sketch appears below this listing). Collaborate with data architects, analysts, and engineering teams to ensure seamless integration of structured and unstructured data sources. Ensure data security, governance, and compliance across platforms. Provide technical leadership, mentoring, and guidance to junior engineers. Stay updated on emerging trends in data engineering, cloud technologies, and data integration frameworks.

Required Skills & Qualifications
- 12-18 years of strong experience in data engineering and integration.
- Hands-on expertise with the Azure, AWS, and GCP cloud platforms.
- Proficiency in cloud storage technologies such as cloud object stores.
- Deep experience with Apache Iceberg for modern data lake and lakehouse implementations.
- Strong knowledge of ETL/ELT pipelines, data ingestion, and transformation frameworks.
- Proven track record in designing and deploying large-scale, enterprise-grade data integration solutions.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication and leadership abilities, with experience in mentoring teams.
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
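A minimal sketch of using Apache Iceberg from PySpark, assuming the iceberg-spark-runtime jar for your Spark version is on the classpath. The catalog name, warehouse path, and table names are hypothetical placeholders.

```python
# Requires pyspark plus the matching iceberg-spark-runtime package.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-demo")
    # Register a Hadoop-backed Iceberg catalog named "demo" (hypothetical).
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

# Create an Iceberg table and append a few rows.
spark.sql(
    "CREATE TABLE IF NOT EXISTS demo.db.orders "
    "(id BIGINT, amount DOUBLE) USING iceberg"
)
spark.sql("INSERT INTO demo.db.orders VALUES (1, 10.5), (2, 20.0)")

# Iceberg tracks table snapshots, enabling time travel over table history.
spark.sql("SELECT * FROM demo.db.orders").show()
spark.sql("SELECT snapshot_id, committed_at FROM demo.db.orders.snapshots").show()
```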

Posted 4 days ago

Apply

2.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Senior Data Engineer with 5-8 years of IT experience, including 2-3 years focused on GCP data services, you will be a valuable addition to our dynamic data and analytics team. Your primary responsibility will be to design, develop, and implement robust and insightful data-intensive solutions using GCP services. The role requires a deep understanding of data engineering, proficiency in SQL, and extensive experience with GCP services such as BigQuery, Dataflow, Datastream, Pub/Sub, Dataproc, and Cloud Storage, along with other key GCP services for data pipeline orchestration. You will be instrumental in the construction of a GCP-native cloud data platform. (A minimal Pub/Sub publishing sketch appears below this listing.)

Key Responsibilities:
- Lead and contribute to the development, deployment, and lifecycle management of applications on GCP, utilizing services such as Compute Engine, Kubernetes Engine (GKE), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud SQL, Cloud Storage, and more.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 5-8 years of overall IT experience, with hands-on experience in designing and developing data applications on GCP.
- In-depth expertise in GCP services and architectures, including compute, storage and databases, data and analytics, and operations and monitoring.
- Proven ability to translate business requirements into technical solutions.
- Strong analytical, problem-solving, and critical thinking skills.
- Effective communication and interpersonal skills for collaboration with technical and non-technical stakeholders.
- Experience in Agile development methodology.
- Ability to work independently, manage multiple priorities, and meet deadlines.

Preferred Skills (Nice to Have):
- Experience with other hyperscalers.
- Proficiency in Python or other scripting languages for data manipulation and automation.

If you are a highly skilled and experienced Data Engineer with a passion for leveraging GCP data services to drive innovation, we invite you to apply for this exciting opportunity in Gurugram or Hyderabad.
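For illustration, a minimal Pub/Sub publisher in Python of the kind this role's streaming work implies, using the google-cloud-pubsub client. The project and topic names are hypothetical placeholders.

```python
# pip install google-cloud-pubsub
import json

from google.cloud import pubsub_v1

PROJECT_ID = "my-project"        # hypothetical project
TOPIC_ID = "orders"              # hypothetical topic

def publish_order(order: dict) -> None:
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

    # Pub/Sub messages carry bytes; serialize the record as JSON.
    data = json.dumps(order).encode("utf-8")
    future = publisher.publish(topic_path, data, source="demo")
    print("Published message id:", future.result())  # blocks until acked

if __name__ == "__main__":
    publish_order({"order_id": 42, "amount": 99.5})
```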

Posted 5 days ago

Apply

4.0 - 5.0 years

4 - 5 Lacs

Chittoor

Work from Office

The Librarian will be responsible for managing, organizing, and maintaining library resources to support the academic, research, and administrative needs of students, faculty, and staff. The role involves developing library policies, facilitating digital and physical access to resources, and promoting effective use of library services.

Key Responsibilities:

1. Library Management
- Organize, maintain, and update physical and digital collections.
- Ensure proper cataloguing, classification, and indexing of resources.
- Monitor circulation (issue, return, renewals) and maintain accurate records.
- Supervise library staff, student assistants, and volunteers.

2. Resource Development
- Identify and procure books, journals, periodicals, and digital resources relevant to the institution's academic programs.
- Maintain vendor relationships and manage subscriptions.
- Develop e-library/digital database access for students and faculty.

3. User Services
- Provide reference and information services to students, faculty, and researchers.
- Conduct orientation sessions on library usage, databases, and referencing tools.
- Support faculty and students in academic writing, research, and citation management.

4. Policy & Compliance
- Frame and implement library usage policies.
- Ensure compliance with institutional, UGC/AICTE/NAAC, and other regulatory requirements.
- Maintain proper documentation for audits and accreditation visits.

5. Technology Integration
- Manage library management software (LMS) for circulation, cataloguing, and reporting.
- Implement digital repositories, online journals, and databases.
- Facilitate remote access to e-resources.

6. Administration & Reporting

Posted 5 days ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Work from Office

Designing, deploying, and managing applications and infrastructure on Google Cloud. Responsible for maintaining solutions that leverage Google-managed or self-managed services, utilizing both the Google Cloud Console and the command-line interface.

Required candidate profile:
- Designing and implementing cloud solutions
- Deploying and managing applications
- Monitoring and maintaining cloud infrastructure
- Utilizing cloud services
- Automation and DevOps

Posted 6 days ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Chennai

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data for functional business requirements and to engage customers directly.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience
- An individual with an ability to manage change and proven time management.
- Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 6 days ago

Apply

1.0 - 3.0 years

1 - 5 Lacs

Kolkata

Work from Office

About The Role
Project Role: Infra Tech Support Practitioner
Project Role Description: Provide ongoing technical support and maintenance of production and development systems and software products (both remote and onsite) and of configured services running on various platforms (operating within a defined operating model and processes). Provide hardware/software support and implement technology at the operating-system level across all server and network areas, and for particular software solutions/vendors/brands. Work includes L1 and L2 (basic and intermediate) troubleshooting.
Must-have skills: GCP Dataflow
Good-to-have skills: DevOps architecture
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Key Responsibilities:
a. Assessment of the Solution Architects' design from an operations point of view and implementation of it, along with end-user communication. Onboarding of new services, keeping all the different aspects in mind.
b. Carrying out day-to-day operations such as execution of service requests, changes, incidents, problems, and demands.
c. Protecting SLAs and KPIs.

Technical Experience:
a. Basic knowledge of UNIX commands, Python scripting, JSON, YAML, and macro-driven Excel.
b. In-depth knowledge of and experience with several GCP services, such as VPC networks, network services, hybrid connectivity, IAM & Admin, App Engine, Cloud Functions, Cloud Storage, Cloud Logging, GCP organizations, gcloud commands, and Google Workspace (G Suite).
c. Good experience with Terraform and GitLab.

Professional Experience:
Good verbal and written communication and presentation skills; interacts with clients at varying levels. Good team player.

Qualification: 15 years full time education

Posted 6 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google Cloud Platform Architecture
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work on developing innovative solutions to enhance user experience and streamline processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to design and develop applications.
- Implement best practices for application development.
- Troubleshoot and debug applications to ensure optimal performance.
- Stay updated on the latest technologies and trends in application development.
- Provide technical guidance and mentorship to junior team members.

Professional & Technical Skills:
- Must-have: proficiency in Google Cloud Platform architecture.
- Strong understanding of cloud computing principles.
- Experience designing scalable and secure cloud-based applications.
- Hands-on experience with Google Cloud services such as Compute Engine, BigQuery, and Cloud Storage.
- Knowledge of DevOps practices for continuous integration and deployment.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Platform architecture.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 6 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

We are seeking a Senior Data Architect with over 7 years of experience in the field, specifically in data architecture roles. As a Senior Data Architect, you will design and implement scalable, secure, and cost-effective data architectures on Google Cloud Platform (GCP). You will lead the design and development of data pipelines using tools such as BigQuery, Dataflow, and Cloud Storage, and architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. It will be your duty to ensure that the data architecture is aligned with business objectives, governance, and compliance requirements. Collaboration with stakeholders to define the data strategy and roadmap will be essential.

Responsibilities include:
- Design and deploy BigQuery solutions for optimized performance and cost efficiency.
- Build and maintain ETL/ELT pipelines for large-scale data processing.
- Use Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration.
- Implement best practices for data security, privacy, and compliance in cloud environments.
- Integrate machine learning workflows with data pipelines and analytics tools.
- Define data governance frameworks and manage data lineage.
- Lead data modeling efforts to ensure consistency, accuracy, and performance across systems.
- Optimize cloud infrastructure for scalability, performance, and reliability; mentor junior team members and ensure adherence to architectural standards.
- Collaborate with DevOps teams on the implementation of Infrastructure as Code (Terraform, Cloud Deployment Manager).
- Ensure high availability and disaster recovery solutions are integrated into data systems.
- Conduct technical reviews, audits, and performance tuning for data solutions; design multi-region and multi-cloud data architecture solutions.
- Stay updated on emerging technologies and trends in data engineering and GCP, driving innovation in data architecture and recommending new tools and services.

Preferred qualifications include a Google Cloud certification. Primary skills:
- 7+ years of data architecture experience.
- Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services.
- Strong proficiency in SQL, Python, or other data processing languages.
- Experience in cloud security, data governance, and compliance frameworks.
- Strong problem-solving skills and the ability to architect solutions for complex data environments.
- Leadership experience and excellent communication and collaboration skills.

Role: Senior Data Architect
Location: Trivandrum/Bangalore
Close Date: 14-03-2025

Posted 1 week ago

Apply

Exploring Cloud Storage Jobs in India

The cloud storage job market in India is rapidly growing as more companies are adopting cloud technology for their data storage needs. This has created numerous job opportunities for individuals with expertise in cloud storage solutions. In this article, we will explore the top hiring locations, average salary range, career path, related skills, and interview questions for job seekers interested in pursuing a career in cloud storage in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Mumbai
  4. Hyderabad
  5. Chennai

These cities are known for their thriving IT industries and have a high demand for cloud storage professionals.

Average Salary Range

The average salary range for cloud storage professionals in India varies based on experience and expertise. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

A typical career path in cloud storage may involve roles such as Cloud Engineer, Cloud Architect, Cloud Developer, and Cloud Solutions Architect. The progression often follows a path from Junior Developer to Senior Developer to Tech Lead, with opportunities to specialize in specific cloud platforms or technologies along the way.

Related Skills

In addition to expertise in cloud storage solutions, professionals in this field are often expected to have skills in areas such as:

  • DevOps
  • Networking
  • Security
  • Programming languages (e.g., Python, Java)
  • Database management

Interview Questions

  • What is a virtual machine and how is it used in cloud computing? (basic)
  • Explain the difference between public, private, and hybrid clouds. (medium)
  • How do you ensure data security in a cloud storage environment? (medium)
  • What are the key benefits of using cloud storage over traditional on-premise storage solutions? (basic)
  • Can you explain the concept of scalability in cloud storage? (medium)
  • How do you handle data migration in a cloud storage environment? (medium)
  • What is the difference between SaaS, PaaS, and IaaS? (basic)
  • How do you monitor and optimize cloud storage performance? (medium)
  • Explain the concept of data redundancy in cloud storage. (basic)
  • How do you handle disaster recovery in a cloud storage environment? (medium)
  • What are the key challenges of cloud storage adoption for businesses? (medium)
  • How do you ensure compliance with data protection regulations in cloud storage? (medium)
  • What is the role of a CDN (Content Delivery Network) in cloud storage? (medium)
  • How do you handle cloud storage cost optimization for a company? (medium)
  • Explain the concept of multi-tenancy in cloud storage. (basic)
  • How do you ensure data integrity in a cloud storage environment? (medium)
  • What are the key considerations for data encryption in cloud storage? (medium)
  • How do you handle data synchronization in a multi-cloud environment? (advanced)
  • Can you explain the concept of serverless computing in cloud storage? (advanced)
  • How do you address the challenges of data sovereignty in a global cloud storage setup? (advanced)
  • What are the key differences between block, file, and object storage in the cloud? (medium)
  • How do you handle data archival and retrieval in a cloud storage environment? (medium)
  • Explain the concept of data deduplication in cloud storage. (medium) (a short code sketch appears after this list)
  • How do you handle data access control and permissions in a cloud storage environment? (medium)
  • What are the key considerations for data backup and recovery in cloud storage? (medium)
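As referenced in the deduplication question above, here is a minimal content-addressable deduplication sketch in Python: files are split into fixed-size chunks, each unique chunk is stored once under its SHA-256 digest, and a file becomes an ordered list of digests. Real systems add variable-size chunking, compression, and durable storage; this is only a conceptual illustration.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking; real systems often use content-defined chunks

class DedupStore:
    """Stores each unique chunk once, keyed by its SHA-256 digest."""

    def __init__(self) -> None:
        self.chunks: dict[str, bytes] = {}      # digest -> chunk bytes
        self.files: dict[str, list[str]] = {}   # filename -> ordered digests

    def put(self, name: str, data: bytes) -> None:
        digests = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # duplicate chunks stored once
            digests.append(digest)
        self.files[name] = digests

    def get(self, name: str) -> bytes:
        return b"".join(self.chunks[d] for d in self.files[name])

if __name__ == "__main__":
    store = DedupStore()
    payload = b"A" * 10000
    store.put("a.bin", payload)
    store.put("b.bin", payload)              # identical file: no new chunks stored
    assert store.get("b.bin") == payload
    print("unique chunks stored:", len(store.chunks))  # 2, not 6 naive writes
```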

Closing Remark

As you explore opportunities in cloud storage jobs in India, remember to continuously enhance your skills, stay updated on industry trends, and prepare confidently for interviews. With the right combination of expertise and preparation, you can build a successful career in this rapidly growing field. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
