
288 Cloud Storage Jobs - Page 6

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 7.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have hands-on Python and SQL experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data against functional business requirements and engage directly with customers.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- 5 to 7 years of relevant experience as a technical analyst working with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer.
- Thrives in collaborative environments that use agile methodologies to encourage creative design thinking, and finds innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
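The core of this role is querying BigQuery from Python. A minimal sketch of that workflow, assuming the google-cloud-bigquery client library; the dataset and table names are hypothetical placeholders, not from the posting:

```python
# Minimal BigQuery query from Python (sketch; dataset and table names
# are hypothetical placeholders).
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

query = """
    SELECT order_date, SUM(amount) AS total
    FROM `my_project.sales.orders`
    GROUP BY order_date
    ORDER BY order_date DESC
    LIMIT 10
"""

for row in client.query(query).result():  # blocks until the job finishes
    print(row.order_date, row.total)
```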

Posted 1 month ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have hands-on Python and SQL experience; proactive, collaborative, and able to respond to critical situations. Able to analyse data against functional business requirements and engage directly with customers.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- 5 to 7 years of relevant experience as a technical analyst working with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer.
- Thrives in collaborative environments that use agile methodologies to encourage creative design thinking, and finds innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 1 month ago

Apply

7.0 - 9.0 years

9 - 13 Lacs

Hyderabad, Pune

Work from Office

Key Responsibilities:
1. Cloud Infrastructure Management:
   - Design, deploy, and manage scalable and secure infrastructure on Google Cloud Platform (GCP).
   - Implement best practices for GCP IAM, VPCs, Cloud Storage, ClickHouse, Apache Superset onboarding, and other GCP services.
2. Kubernetes and Containerization:
   - Manage and optimize Google Kubernetes Engine (GKE) clusters for containerized applications.
   - Implement Kubernetes best practices, including pod scaling, resource allocation, and security policies.
3. CI/CD Pipelines:
   - Build and maintain CI/CD pipelines using tools like Cloud Build, Stratus, GitLab CI/CD, or ArgoCD.
   - Automate deployment workflows for containerized and serverless applications.
4. Security and Compliance:
   - Ensure adherence to security best practices for GCP, including IAM policies, network security, and data encryption.
   - Conduct regular audits to ensure compliance with organizational and regulatory standards.
5. Collaboration and Support:
   - Work closely with development teams to containerize applications and ensure smooth deployment on GCP.
   - Provide support for troubleshooting and resolving infrastructure-related issues.
6. Cost Optimization:
   - Monitor and optimize GCP resource usage to ensure cost efficiency.
   - Implement strategies to reduce cloud spend without compromising performance.

Required Skills and Qualifications:
1. Certifications: Must hold a Google Cloud Professional DevOps Engineer or Google Cloud Professional Cloud Architect certification.
2. Cloud Expertise: Strong hands-on experience with GCP services, including GKE, Cloud Functions, Cloud Storage, BigQuery, and Cloud Pub/Sub.
3. DevOps Tools: Proficiency in tools like Terraform, Ansible, Stratus, GitLab CI/CD, or Cloud Build; experience with containerization tools like Docker.
4. Kubernetes Expertise: In-depth knowledge of Kubernetes concepts such as pods, deployments, services, ingress, config maps, and secrets; familiarity with tools like kubectl, Helm, and Kustomize.
5. Programming and Scripting: Strong scripting skills in Python, Bash, or Go; familiarity with YAML and JSON for configuration management.
6. Monitoring and Logging: Experience with monitoring tools like Prometheus, Grafana, or Google Cloud Operations Suite.
7. Networking: Understanding of cloud networking concepts, including VPCs, subnets, firewalls, and load balancers.
8. Soft Skills: Strong problem-solving and troubleshooting skills; excellent communication and collaboration abilities; ability to work in an agile, fast-paced environment.
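Among the responsibilities above is auditing security settings. A minimal sketch of one such scripted audit, assuming the google-cloud-storage Python client, that flags publicly readable buckets:

```python
# Audit GCS buckets for public access (sketch; assumes google-cloud-storage
# is installed and Application Default Credentials are configured).
from google.cloud import storage

PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

client = storage.Client()
for bucket in client.list_buckets():
    policy = bucket.get_iam_policy(requested_policy_version=3)
    for binding in policy.bindings:
        exposed = PUBLIC_MEMBERS & set(binding["members"])
        if exposed:
            print(f"{bucket.name}: role {binding['role']} granted to {exposed}")
```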

Posted 1 month ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Key Responsibilities:
1. Cloud Infrastructure Management:
   - Design, deploy, and manage scalable and secure infrastructure on Google Cloud Platform (GCP).
   - Implement best practices for GCP IAM, VPCs, Cloud Storage, ClickHouse, Apache Superset onboarding, and other GCP services.
2. Kubernetes and Containerization:
   - Manage and optimize Google Kubernetes Engine (GKE) clusters for containerized applications.
   - Implement Kubernetes best practices, including pod scaling, resource allocation, and security policies.
3. CI/CD Pipelines:
   - Build and maintain CI/CD pipelines using tools like Cloud Build, Stratus, GitLab CI/CD, or ArgoCD.
   - Automate deployment workflows for containerized and serverless applications.
4. Security and Compliance:
   - Ensure adherence to security best practices for GCP, including IAM policies, network security, and data encryption.
   - Conduct regular audits to ensure compliance with organizational and regulatory standards.
5. Collaboration and Support:
   - Work closely with development teams to containerize applications and ensure smooth deployment on GCP.
   - Provide support for troubleshooting and resolving infrastructure-related issues.
6. Cost Optimization:
   - Monitor and optimize GCP resource usage to ensure cost efficiency.
   - Implement strategies to reduce cloud spend without compromising performance.

Required Skills and Qualifications:
1. Certifications: Must hold a Google Cloud Professional DevOps Engineer or Google Cloud Professional Cloud Architect certification.
2. Cloud Expertise: Strong hands-on experience with GCP services, including GKE, Cloud Functions, Cloud Storage, BigQuery, and Cloud Pub/Sub.
3. DevOps Tools: Proficiency in tools like Terraform, Ansible, Stratus, GitLab CI/CD, or Cloud Build; experience with containerization tools like Docker.
4. Kubernetes Expertise: In-depth knowledge of Kubernetes concepts such as pods, deployments, services, ingress, config maps, and secrets; familiarity with tools like kubectl, Helm, and Kustomize.
5. Programming and Scripting: Strong scripting skills in Python, Bash, or Go; familiarity with YAML and JSON for configuration management.
6. Monitoring and Logging: Experience with monitoring tools like Prometheus, Grafana, or Google Cloud Operations Suite.
7. Networking: Understanding of cloud networking concepts, including VPCs, subnets, firewalls, and load balancers.
8. Soft Skills: Strong problem-solving and troubleshooting skills; excellent communication and collaboration abilities; ability to work in an agile, fast-paced environment.

Posted 1 month ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office

Mandatory skill: ETL / GCP / BigQuery.

Responsibilities:
- Develop, implement, and optimize ETL/ELT pipelines for processing large datasets efficiently.
- Work extensively with BigQuery for data processing, querying, and optimization.
- Utilize Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub for data ingestion, storage, and event-driven processing.
- Perform performance tuning and testing of the ELT platform to ensure high efficiency and scalability.
- Debug technical issues, perform root cause analysis, and provide solutions for production incidents.
- Ensure data quality, accuracy, and integrity across data pipelines.
- Collaborate with cross-functional teams to define technical requirements and deliver solutions.
- Work independently on assigned tasks while maintaining high levels of productivity and efficiency.

Skills Required:
- Proficiency in SQL and PL/SQL for querying and manipulating data.
- Experience in Python for data processing and automation.
- Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery (must-have), Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub.
- Experience with GitHub and CI/CD pipelines for automation and deployment.
- Performance tuning and performance testing of ELT processes.
- Strong analytical and debugging skills to resolve data and pipeline issues efficiently.
- Self-motivated and able to work independently as an individual contributor.
- Good understanding of data modeling, database design, and data warehousing concepts.
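A typical ingestion step in such a pipeline loads files from Cloud Storage into BigQuery. A minimal sketch, assuming the google-cloud-bigquery client; the bucket URI and table name are hypothetical:

```python
# Load CSV files from Cloud Storage into BigQuery (sketch; URIs and table
# names are hypothetical placeholders).
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the header row
    autodetect=True,              # infer the schema from the data
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/orders-*.csv",
    "my_project.sales.orders",
    job_config=job_config,
)
load_job.result()  # wait for completion; raises on failure
print(client.get_table("my_project.sales.orders").num_rows, "rows total")
```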

Posted 1 month ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Develop, optimize, and maintain scalable data pipelines using Python and PySpark. Design and implement data processing workflows leveraging GCP services such as BigQuery, Dataflow, Cloud Functions, and Cloud Storage.
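Dataflow pipelines are written with the Apache Beam Python SDK. A minimal batch sketch, runnable locally on the DirectRunner; the file paths are hypothetical, and targeting Dataflow would only require changing the runner in the pipeline options:

```python
# Minimal Apache Beam pipeline (sketch; paths are hypothetical; runs
# locally with the DirectRunner).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(runner="DirectRunner")

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/events.csv")
        | "ParseAmount" >> beam.Map(lambda line: float(line.split(",")[1]))
        | "Sum" >> beam.CombineGlobally(sum)
        | "Format" >> beam.Map(str)
        | "Write" >> beam.io.WriteToText("gs://my-bucket/output/total")
    )
```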

Posted 1 month ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Kolkata

Work from Office

Design, build, and maintain data pipelines on Google Cloud Platform, using tools like BigQuery and Dataflow. Focus on optimizing data storage, processing, and analytics to support business intelligence initiatives.

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Kolkata

Work from Office

Design and implement data processing solutions using Google Cloud Platform. You will focus on building scalable data pipelines and optimizing data workflows. Expertise in Google Cloud services, data engineering, and Big Data is required.

Posted 1 month ago

Apply

5.0 - 8.0 years

27 - 42 Lacs

Bengaluru

Work from Office

Job Summary
As a Software Engineer, your responsibilities will include developing and maintaining cloud-based solutions. You will focus on solving complex problems and on developing, testing, automating, and collaborating with the Software Engineering team to deploy features in a production environment. You will also design and implement managed cloud services based on given requirements. We expect excellent coding skills and a lead role in designing and implementing managed cloud services. Prior experience with filesystems is an added advantage, as is the ability to quickly learn existing code and architecture.

Job Requirements
- Excellent problem solver, proficient coder, and designer.
- Thorough understanding of and extensive experience with block/file technologies, with hands-on experience designing and developing software solutions.
- Proficient in at least one of C, C++, or Golang; experience with Python or Java/C# is an added advantage.
- Thorough understanding of Linux or other Unix-like operating systems.
- Strong in data structures and algorithms.
- Expertise in REST API design and implementation.
- Prior experience with filesystem development and distributed system design is desirable.
- Understanding of container technologies, preferably Kubernetes and Docker, and experience with cloud service APIs (e.g., AWS, Azure, or GCP) is desirable.
- Continuously monitor, analyze, and measure system health, availability, and latency using Google-native tooling; develop and implement steps to improve system and application performance, availability, and reliability.
- Knowledge of infrastructure such as hypervisors and cloud storage, and experience with cloud services including databases, caching, object and block storage, scaling, monitoring, load balancers, networking, etc., is an added advantage.
- Mentor new team members, participate in interviews, and contribute to building high-performance teams.
- Work on development, bug fixes/updates, spec updates, customer RCAs, and automation.
- Strong oral and written communication skills.
- Engage in incident management processes, including 24x7 on-call rotations (per the follow-the-sun model), to resolve production issues within agreed SLAs/SLOs.

Education
B.E/B.Tech or M.S in Computer Science or a related technical field, with 5+ years of hands-on coding experience.
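The requirements emphasize REST API design. A minimal, dependency-free sketch of a JSON endpoint, written in Python for consistency with the other sketches on this page (the posting itself targets C/C++/Go); the route and payload are illustrative:

```python
# Minimal REST-style JSON endpoint (sketch; standard library only,
# resource data is illustrative).
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

VOLUMES = {"vol1": {"size_gb": 100, "state": "online"}}  # illustrative data

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET /volumes/<name> returns one resource as JSON, else 404.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "volumes" and parts[1] in VOLUMES:
            body = json.dumps(VOLUMES[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404, "resource not found")

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), Handler).serve_forever()
```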

Posted 1 month ago

Apply

4.0 - 9.0 years

10 - 17 Lacs

Chennai

Work from Office

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using BigQuery, Dataproc, Pub/Sub, and Cloud Storage on Google Cloud Platform (GCP).
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
- Troubleshoot data pipeline failures or errors in real time using log analysis and debugging techniques.
- Develop automation scripts in Python to streamline data processing tasks and improve efficiency.
- Ensure compliance with security standards by implementing access controls, encryption, and monitoring mechanisms.
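Event-driven pipelines like this usually start from a Pub/Sub subscription. A minimal streaming pull subscriber sketch, assuming the google-cloud-pubsub client; the project and subscription names are hypothetical:

```python
# Minimal Pub/Sub streaming pull subscriber (sketch; project and
# subscription names are hypothetical placeholders).
import concurrent.futures

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Received: {message.data!r}")
    message.ack()  # acknowledge so the message is not redelivered

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # block for 30s, then shut down
except concurrent.futures.TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()
```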

Posted 1 month ago

Apply

10.0 - 15.0 years

1 - 6 Lacs

Hyderabad

Hybrid

Role: GCP Data Architect
Experience: 10+ years
Work location: Hyderabad (hybrid, work from office)
Notice period: Immediate joiners to 30 days max (preference for someone who can join within 15 days)
Shift timing: 2:30 PM to 11:30 PM (IST)

We are looking for a GCP Data Architect with deep technical expertise in cloud-native data platforms and architecture, who also brings experience in building practices/CoEs and engaging in pre-sales solutioning. This role will be instrumental in driving cloud data transformations for enterprise clients, shaping reusable accelerators, and leading conversations with key stakeholders.

Required Skills & Qualifications:
- 10+ years of overall experience, with 3+ years as an architect on GCP data platforms.
- Expertise in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Looker, Data Catalog, and IAM.
- Hands-on experience with Terraform or similar IaC tools.
- Proficiency in Python, SQL, or Apache Beam for data processing pipelines.
- Solid grasp of data governance, security, lineage, and compliance frameworks.
- Strong understanding of hybrid/multi-cloud data strategies, data lakehouses, and real-time analytics.
- Demonstrated ability to build internal CoEs or practices with reusable assets and frameworks.
- Experience collaborating with CxOs, data leaders, and enterprise architects.
- Strong communication skills, written and verbal, for stakeholder and customer engagement.

Preferred Certifications:
- Professional Cloud Architect – Google Cloud
- Professional Data Engineer – Google Cloud

Posted 1 month ago

Apply

6.0 - 10.0 years

8 - 10 Lacs

Mumbai

Work from Office

Period: Immediate
Employment Mode: Contract

Description:
Front-End Technologies:
- Angular: versions 10, 14, and 18. Experience developing dynamic, responsive web applications with Angular, including components, services, and routing; upgrading Angular applications from older to newer versions. Good communication skills; drives communication with the customer.
- Secondary skill, VueJS: proficiency in creating single-page applications (SPAs) and state management with VueJS.

Cloud and Infrastructure:
- MS Azure: familiarity with Microsoft Azure services, including cloud storage, hosting, and managing resources within the Azure ecosystem.

Third-Party Integration:
- Proven experience integrating third-party APIs and services, such as Media Bank and SharePoint, into internal systems to ensure seamless data exchange and functionality.

Posted 1 month ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Position Overview: The Production Support Consultant is a critical role centered on maintaining the robustness and efficiency of our data processing environment. This position requires a hands-on approach to triaging production issues, diving deep into data problems, and refining and optimizing our data pipelines. This is a production support / DevOps role involving active collaboration with our investment teams, data engineers, and data vendors.

Responsibilities:
- Serve as the first line of defense for production issues, swiftly triaging and prioritizing incidents to ensure minimal disruption to business operations.
- Liaise with external technology vendors and exchanges to coordinate changes and resolve connectivity and market data issues.
- Collaborate with the data engineering team to streamline and optimize data pipelines, ensuring efficient data flow and quality across the organization.
- Configure and optimize production vendor data sourcing jobs, ensuring they run reliably and efficiently, and address any scheduling conflicts or failures promptly.
- Provide timely support to investment researchers for data-related queries and challenges.
- Engage closely with internal teams on system upgrades, data implementations, and incident post-mortems to prevent future issues.
- Review and refine current operational processes, introducing tools or methods that can reduce incident recurrence and improve response times.
- Work in a fast-paced environment on multiple projects simultaneously.

Requirements:
- A degree in Computer Science, Data Science, Information Systems, or a related field, or equivalent experience.
- 7+ years of experience triaging and resolving production-level issues in a timely manner.
- Proficiency in Python, especially in the context of data operations and pipeline optimizations.
- Practical experience in SQL querying, with the ability to efficiently diagnose and rectify data anomalies.
- Strong problem-solving skills, with an aptitude for diving deep into complex technical challenges.
- Familiarity with enterprise technology tools: Linux, SQL Server, Git.
- Finance/market data experience: familiarity with vendors such as Bloomberg, Refinitiv, and Nasdaq is a must.
- Excellent communication skills, both written and verbal, with the ability to articulate technical issues to non-technical stakeholders.
- Ability to work New York daytime hours or New York evening/overnight hours.

Nice to Haves:
- Hands-on knowledge of Tidal Enterprise Scheduler, including configuring, optimizing, and troubleshooting.
- Experience in Python-based data engineering, particularly in optimizing data pipelines and integrations.
- Understanding of cloud fundamentals and best practices in cloud storage, permissions, and cost management.
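Much of the triage described above is SQL-driven. A self-contained sketch of two common data-anomaly checks (duplicate rows and calendar gaps), with sqlite3 standing in for the production SQL Server instance and an illustrative schema:

```python
# Two common data-quality triage queries (sketch; sqlite3 stands in for
# the production database, and the schema is illustrative).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE prices (symbol TEXT, trade_date TEXT, close REAL);
    INSERT INTO prices VALUES
        ('ABC', '2024-01-02', 10.0),
        ('ABC', '2024-01-02', 10.0),   -- duplicate row
        ('ABC', '2024-01-04', 10.4);   -- gap: 2024-01-03 missing
""")

# Check 1: duplicate (symbol, trade_date) rows.
dupes = conn.execute("""
    SELECT symbol, trade_date, COUNT(*) AS n
    FROM prices GROUP BY symbol, trade_date HAVING n > 1
""").fetchall()
print("duplicates:", dupes)

# Check 2: dates with no following trading day (calendar gaps).
gaps = conn.execute("""
    SELECT p.symbol, p.trade_date AS last_seen
    FROM prices p
    WHERE NOT EXISTS (
        SELECT 1 FROM prices q
        WHERE q.symbol = p.symbol
          AND q.trade_date = date(p.trade_date, '+1 day')
    )
    AND p.trade_date < (SELECT MAX(trade_date) FROM prices)
""").fetchall()
print("gap after:", gaps)
```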

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Capgemini Invent
Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

Your Role
- Solution design & architecture
- Implementation & deployment
- Technical leadership & guidance
- Client engagement & collaboration
- Performance monitoring & optimization

Your Profile
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3-8 years of experience designing, implementing, and managing data solutions.
- 3-8 years of hands-on experience with Google Cloud Platform (GCP) data services.
- Strong expertise in core GCP data services, including BigQuery (data warehousing), Cloud Storage (data lake), Dataflow (ETL/ELT), Cloud Composer (workflow orchestration with Apache Airflow), Pub/Sub and Dataflow (streaming data), Cloud Data Fusion (graphical data integration), and Dataproc (managed Hadoop and Spark).
- Proficiency in SQL and experience with data modeling techniques.
- Experience with at least one programming language (e.g., Python, Java, Scala).
- Experience with Infrastructure-as-Code (IaC) tools such as Terraform or Cloud Deployment Manager.
- Understanding of data governance, security, and compliance principles in a cloud environment.
- Experience with CI/CD pipelines and DevOps practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a collaborative team.

What you will love about working here
We recognize the significance of flexible work arrangements. Be it remote work or flexible work hours, you will get an environment that supports a healthy work-life balance. At the heart of our mission is your career growth; our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with deep industry expertise and a partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
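Cloud Composer orchestration, listed in the profile above, is written as Apache Airflow DAGs in Python. A minimal daily-DAG sketch; the DAG id, schedule, and task body are illustrative placeholders:

```python
# Minimal Airflow DAG for Cloud Composer (sketch; DAG id, schedule, and
# the task body are illustrative placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load() -> None:
    print("extract from source, load to BigQuery")  # real work goes here

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```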

Posted 1 month ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Chennai

Work from Office

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Responsibilities:
- Deliver chat and voice solutions according to requirements (GCP + Dialogflow, GCP + AWS, GCP + Cisco CVP).
- Integrate chat and voice solutions with Google services (gChat, Cloud Storage, gDrive, gMail, gSheet, etc.).
- Integrate chat and voice solutions with business applications (SAP, ServiceNow, Salesforce, etc.).
- Discover the capabilities of available APIs and consume them to meet solution requirements.
- Write micro applications in TypeScript and Node.js to achieve desired functionality and automation.
- Discover the best ways to deliver requirements within GCP using the available technology stack.

Primary Skills:
- Very good TypeScript skills (optionally Node.js).
- Very good understanding of authorisation concepts (API key, OAuth).
- Broad experience in exploring and consuming APIs through various methods.
- Good understanding of the Pub/Sub concept in messaging systems.
- Technical/informatics background with a coding mindset.
- Skilled at discovering ways to meet solution requirements with the available technology stack.

Secondary Skills:
- Basic front-end skills (React).
- Experience working in the Google environment (GCP, Dialogflow).
- Kore.ai, Oracle Digital Assistant, Google Dialogflow.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fuelled by its market-leading capabilities in AI, cloud, and data, combined with deep industry expertise and a partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
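The primary skills above name the two common API authorisation styles. A sketch contrasting them, written in Python for consistency with the other sketches on this page (the posting itself asks for TypeScript); the endpoint URLs, API key, and OAuth credentials are hypothetical:

```python
# Consuming a REST API with the two auth styles named in the posting
# (sketch; endpoint URLs, key, and credentials are hypothetical).
import requests

BASE = "https://api.example.com/v1"

# Style 1: API key, commonly sent as a header or query parameter.
r1 = requests.get(f"{BASE}/messages", headers={"X-Api-Key": "MY_API_KEY"})

# Style 2: OAuth 2.0 bearer token, obtained from a token endpoint first.
token = requests.post(
    "https://auth.example.com/oauth/token",
    data={"grant_type": "client_credentials",
          "client_id": "MY_ID", "client_secret": "MY_SECRET"},
).json()["access_token"]
r2 = requests.get(f"{BASE}/messages",
                  headers={"Authorization": f"Bearer {token}"})

print(r1.status_code, r2.status_code)
```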

Posted 1 month ago

Apply

2.0 - 6.0 years

6 - 10 Lacs

Pune

Work from Office

Job Title: SAS OCR Associate Engineer
Corporate Title: Analyst
Location: Pune, India

Role Description
Shared Application & Services (SAS) defines the standardized use and further development of shared platforms and applications. As a result, SAS offers reusable, scalable, and cost-efficient solutions for your end-to-end business processes. A SAS OCR Associate Engineer is responsible for designing, implementing, and managing cloud infrastructure and services on Google Cloud.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Cloud infrastructure management: design, deploy, and manage scalable, secure, and cost-effective cloud environments on GCP.
- Automation & scripting: develop Infrastructure as Code (IaC) using Terraform, Deployment Manager, or other tools.
- Security & compliance: implement security best practices and IAM policies, and ensure compliance with organizational and regulatory standards.
- Networking & connectivity: configure and manage VPCs, subnets, firewalls, VPNs, and interconnects for secure cloud networking.
- CI/CD & DevOps: set up CI/CD pipelines using Cloud Build, Jenkins, GitHub Actions, or similar tools for automated deployments.
- Monitoring & logging: implement monitoring and alerting using Cloud Logging and Monitoring or third-party tools.
- Cost optimization: analyze and optimize cloud spending by leveraging committed-use discounts, autoscaling, and right-sizing resources.
- Disaster recovery & backup: design backup, high-availability, and disaster recovery strategies using Cloud Storage, snapshots, and multi-region deployments.
- Database management: deploy and manage GCP databases like Cloud SQL.
- Containerization & Kubernetes: deploy and manage containerized applications using GKE (Google Kubernetes Engine).

Your skills and experience
- Bachelor's degree from an accredited college or university with a concentration in Computer Science, or equivalent.
- More than 3 years of relevant work experience.
- Strong experience with GCP services like Cloud Storage, IAM, networking, Kubernetes, and serverless technologies.
- Proficiency in Infrastructure as Code (Terraform).
- Knowledge of DevOps practices, CI/CD tools, and GitHub Actions workflows.
- Understanding of security, IAM, networking, and compliance in cloud environments.
- Strong problem-solving skills and ability to troubleshoot cloud-based infrastructure.
- Google Cloud certifications (e.g., Associate Cloud Engineer, Professional Cloud Architect, or Professional DevOps Engineer) are a plus.
- Work within an agile methodology, participating in sprint planning, daily stand-ups, and retrospectives.
- Collaborate with stakeholders, including business analysts, product managers, and other team members.
- Keep up with industry trends, new technologies, and best practices; be flexible and adapt to evolving project requirements and technologies.
- Good to have: GCP Document AI; software development experience in Java (Spring Boot), Python (FastAPI), or React.
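For the monitoring and logging duties above, Cloud Logging can also be queried programmatically. A minimal sketch using the google-cloud-logging client; the filter string is illustrative:

```python
# Query recent error-level log entries (sketch; assumes google-cloud-logging
# and Application Default Credentials; the filter is illustrative).
from google.cloud import logging as cloud_logging

client = cloud_logging.Client()
log_filter = 'severity>=ERROR AND resource.type="gce_instance"'

for entry in client.list_entries(filter_=log_filter,
                                 order_by=cloud_logging.DESCENDING,
                                 max_results=10):
    print(entry.timestamp, entry.severity, entry.payload)
```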

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Key Responsibilities:
- Design, develop, test, and deploy Python applications on GCP.
- Leverage GCP services (e.g., Cloud Functions, BigQuery, Pub/Sub, Cloud Storage, GKE) to build scalable solutions.
- Integrate APIs and third-party services into cloud applications.
- Write efficient, reusable, and testable Python code.
- Collaborate with DevOps and Data Engineering teams to optimize cloud infrastructure and pipelines.
- Monitor application performance and implement improvements.
- Troubleshoot and resolve issues across multiple environments (dev/test/prod).
- Ensure adherence to security and compliance standards in cloud deployments.
- Contribute to architectural discussions and decisions related to cloud-native designs.
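Python Cloud Functions are plain handlers registered with the Functions Framework. A minimal HTTP function sketch; the function name and payload are illustrative:

```python
# Minimal HTTP Cloud Function (sketch; runnable locally with the
# functions-framework CLI; function name and payload are illustrative).
import json

import functions_framework

@functions_framework.http
def handle(request):
    name = request.args.get("name", "world")  # flask.Request under the hood
    body = json.dumps({"message": f"hello, {name}"})
    return body, 200, {"Content-Type": "application/json"}
```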

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Noida, Bhubaneswar, Greater Noida

Work from Office

- GA4, Firebase Analytics
- BigQuery (SQL, optimization, partitioning, clustering)
- Looker / Looker Studio (dashboarding, modeling)
- GCP tools: Cloud Storage, Pub/Sub, Dataflow, Cloud Functions
- GCP certifications, A/B testing, and product analytics preferred
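Partitioning and clustering, called out above, are set at table creation. A minimal sketch of the corresponding BigQuery DDL issued from Python; the dataset, table, and column names are hypothetical:

```python
# Create a partitioned, clustered BigQuery table via DDL (sketch; dataset,
# table, and column names are hypothetical placeholders).
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
    CREATE TABLE IF NOT EXISTS `my_project.analytics.events`
    (
        event_ts   TIMESTAMP,
        user_id    STRING,
        event_name STRING
    )
    PARTITION BY DATE(event_ts)      -- prunes scanned bytes by day
    CLUSTER BY user_id, event_name   -- co-locates rows for common filters
"""
client.query(ddl).result()
```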

Posted 1 month ago

Apply

8.0 - 13.0 years

4 - 8 Lacs

Bengaluru

Work from Office

We are looking for strong offshore resources who can start within a week or two and can clear client evaluations.

Snowflake Data Engineer (offshore):
1. Minimum 8+ years of IT experience, with data engineering in Snowflake.
2. Good experience creating and designing data pipelines in Snowflake according to business requirements.
3. Must have experience with DBT and Python.
4. Proven hands-on experience with Snowflake architecture and features (e.g., Snowpipe, Tasks, Streams, Time Travel, Cloning, Snowsight, Dynamic Tables, External Functions).
5. Expertise in complex data integration from various sources into target systems through intermediary databases, as well as intricate data parsing, transformation, and sourcing of data for analytical and QA testing purposes.
6. Develop and optimize complex, high-performance SQL queries, stored procedures, and functions within Snowflake for data transformation and analysis.
7. Good stakeholder connects; able to drive design decisions.
8. Excellent analytical and problem-solving skills, with an ability to troubleshoot complex data issues.
9. Experience with cloud storage services like AWS and integrating them with Snowflake.
10. Strong understanding of data warehousing principles, ETL/ELT methodologies, and data modeling (dimensional, 3NF).
11. Should be a standalone contributor, willing to work collaboratively with the client during some overlapping hours.
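Programmatic access to Snowflake from Python goes through the official connector. A minimal connection-and-query sketch, assuming the snowflake-connector-python package; all connection values are hypothetical placeholders:

```python
# Connect to Snowflake and run a query (sketch; all connection values
# are hypothetical placeholders).
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",      # prefer key-pair auth or SSO in practice
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")
    print("Snowflake version:", cur.fetchone()[0])
finally:
    conn.close()
```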

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Chennai

Work from Office

We are looking for an experienced Data Visualization Specialist to design, develop, and optimize data dashboards and reports. The ideal candidate will have expertise in QlikView V12 or Spotfire and will work closely with business teams to transform raw data into meaningful insights.

Key Responsibilities:
- Design, develop, and maintain QlikView V12 / Spotfire dashboards and reports.
- Gather business requirements and translate them into data visualization solutions.
- Optimize QlikView expressions, scripts, and Spotfire transformations for better efficiency.
- Work with large datasets and various data sources, including relational databases, cloud storage, and APIs.
- Collaborate with stakeholders, business analysts, and data teams to enhance reporting capabilities.
- Perform data validation, quality checks, and troubleshooting to ensure data accuracy.
- Implement best practices for data visualization, governance, and security.
- Provide user training and documentation for business users and internal teams.
- Stay updated with the latest BI trends, tools, and technologies to improve reporting solutions.

Required Skills:
- Strong experience in QlikView V12 and/or Spotfire.
- Proficiency in writing SQL queries and database optimization.
- Experience working with large datasets and multiple data sources.
- Strong analytical and problem-solving skills.
- Ability to work independently and collaborate with cross-functional teams.
- Excellent verbal and written communication skills.

Posted 1 month ago

Apply

4.0 - 7.0 years

10 - 14 Lacs

Pune

Work from Office

As part of our journey to move our most critical business services to cloud storage, UBS is seeking a junior cloud reliability engineer to join our Technology Services team. You will be an integrated member of this global team. This is a highly visible, individual-contributor role based in India: you will work closely with the Storage leadership team on a global technology product being launched for the bank. Cloud Azure Storage is part of Distributed Hosting in Technology Services. It's a truly global team, distributed across India, Poland, Switzerland, the United Kingdom, and the United States; you will be based in Pune (EON).

Must-have skills:
- 5+ years working with Azure
- Azure Storage (Blob, ADLS Gen2, Azure Files)
- Strong DevOps concepts: CI/CD, pipelines, build/release process, Infrastructure as Code
- Hands-on experience developing ARM templates and Terraform
- Scripting expertise using Azure PowerShell or the Azure CLI
- Solid understanding of Azure Policy; developing custom policy is a major plus

Good-to-have skills:
- Azure Storage (Queue, Tables, Azure NetApp Files), Azure Backup (Recovery Services / Backup Vault)
- Storage utilities experience (AzCopy, Storage Explorer, Defender for Storage)
- Working knowledge of QA and testing within a DevOps environment
- Working knowledge of security concepts such as encryption, VNETs, Private Link, etc.
- Experience working in highly regulated and governed industries such as finance and healthcare
- Crafting and improving technical documentation

What does a typical day look like?
- Agile Scrum environment: daily stand-up, weekly backlog refinement, and biweekly sprint review, planning, and retrospectives.
- Production stability actionables that call for an engineering and reliability approach to fast remediation and solutioning, e.g., automation scope and dashboarding views.
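Day-to-day automation against Blob storage can also use the Azure SDK for Python (the posting's scripting focus is PowerShell and the Azure CLI; this is an equivalent illustration). A minimal sketch; the connection string, container, and file names are hypothetical:

```python
# List containers and upload a blob with the Azure SDK for Python (sketch;
# the connection string, container, and file names are hypothetical).
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")

for container in service.list_containers():
    print("container:", container.name)

blob = service.get_blob_client(container="reports", blob="daily.csv")
with open("daily.csv", "rb") as fh:
    blob.upload_blob(fh, overwrite=True)
```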

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

Responsibilities:
- Lead and mentor a team of data engineers, providing technical guidance, setting best practices, and overseeing task execution for the migration project.
- Design, develop, and architect scalable ETL processes to extract, transform, and load petabytes of data from on-premises SQL Server to GCP Cloud SQL for PostgreSQL.
- Oversee the comprehensive analysis of existing SQL Server schemas, data types, stored procedures, and complex data models, defining strategies for their optimal conversion and refactoring for PostgreSQL.
- Establish and enforce rigorous data validation, quality, and integrity frameworks throughout the migration lifecycle, ensuring accuracy and consistency.
- Collaborate strategically with database administrators, application architects, business stakeholders, and security teams to define migration scope, requirements, and cutover plans.
- Lead the development and maintenance of advanced scripts (primarily Python) for automating large-scale migration tasks, complex data transformations, and reconciliation processes.
- Proactively identify, troubleshoot, and lead the resolution of complex data discrepancies, performance bottlenecks, and technical challenges during migration.
- Define and maintain comprehensive documentation standards for migration strategies, data mapping, transformation rules, and post-migration validation procedures.
- Ensure data governance, security, and compliance standards are meticulously applied throughout the migration process, including data encryption and access controls within GCP.
- Implement a schema conversion or custom schema mapping strategy for the SQL Server to PostgreSQL shift.
- Refactor and translate complex stored procedures and T-SQL logic into PostgreSQL-compatible constructs while preserving functional equivalence.
- Develop and execute comprehensive data reconciliation strategies to ensure consistency and parity between legacy and migrated datasets post-cutover (see the sketch after this list).
- Design fallback procedures and lead post-migration verification and support to ensure business continuity.
- Ensure metadata cataloging and data lineage tracking using GCP-native or integrated tools.

Must-Have Skills:
- Expertise in data engineering, specifically for Google Cloud Platform (GCP).
- Deep understanding of relational database architecture, advanced schema design, data modeling, and performance tuning.
- Expert-level SQL proficiency, with extensive hands-on experience in both T-SQL (SQL Server) and PostgreSQL.
- Hands-on experience with data migration processes, including moving datasets from on-premises databases to cloud storage solutions.
- Proficiency in designing, implementing, and optimizing complex ETL/ELT pipelines for high-volume data movement, leveraging tools and custom scripting.
- Strong knowledge of GCP services: Cloud SQL, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Cloud Composer, Cloud Functions, and BigQuery.
- Solid understanding of data governance, security, and compliance practices in the cloud, including the management of sensitive data during migration.
- Strong programming skills in Python or Java for building data pipelines and automating processes.
- Experience with real-time data processing using Pub/Sub, Dataflow, or similar GCP services.
- Experience with CI/CD practices and tools like Jenkins, GitLab, or Cloud Build for automating the data engineering pipeline.
- Knowledge of data modeling and best practices for structuring cloud data storage for optimal query performance and analytics in GCP.
- Familiarity with observability and monitoring tools in GCP (e.g., Stackdriver, Prometheus) for real-time data pipeline visibility and alerting.

Good-to-Have Skills:
- Direct experience with GCP Database Migration Service, Storage Transfer Service, or similar cloud-native migration tools.
- Familiarity with data orchestration using tools like Cloud Composer (based on Apache Airflow) for managing workflows.
- Experience with containerization tools like Docker and Kubernetes for deploying data pipelines in a scalable manner.
- Exposure to DataOps tools and methodologies for managing data workflows.
- Experience with machine learning platforms like AI Platform in GCP to integrate with data pipelines.
- Familiarity with data lake architecture and the integration of BigQuery with Google Cloud Storage or Dataproc.
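A minimal sketch of the row-count reconciliation step referenced in the responsibilities above, using DB-API connections; sqlite3 stands in for both the SQL Server source and the Cloud SQL PostgreSQL target, and the table list is hypothetical:

```python
# Row-count reconciliation between source and target (sketch; sqlite3
# stands in for both databases, and the table list is hypothetical).
import sqlite3

def row_counts(conn, tables):
    """Return {table: row count} for the given DB-API connection."""
    return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
            for t in tables}

TABLES = ["customers", "orders"]

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.executescript("""
        CREATE TABLE customers (id INTEGER);
        CREATE TABLE orders (id INTEGER);
        INSERT INTO customers VALUES (1), (2);
    """)
target.execute("INSERT INTO orders VALUES (99)")  # simulate a mismatch

src, tgt = row_counts(source, TABLES), row_counts(target, TABLES)
for table in TABLES:
    status = "OK" if src[table] == tgt[table] else "MISMATCH"
    print(f"{table}: source={src[table]} target={tgt[table]} {status}")
```

In practice this extends to per-column checksums and sampled row comparisons, but row counts are the usual first gate after cutover.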

Posted 1 month ago

Apply

12.0 - 15.0 years

15 - 19 Lacs

Chennai

Work from Office

Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED
Designation: Data Solution Architect

Job Description:
- Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization.
- Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization.
- Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions.
- Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage.
- Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions.
- Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements.
- Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation.

Profile Description:
- Experience: 12-15 years in data architecture, with extensive expertise in Google Cloud Platform (GCP).
- Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM; proficiency in data modeling, ETL processes, and data warehousing.
- Qualifications: Master's degree in Computer Science, Data Engineering, or a related field.
- Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects; ability to balance technical and business needs in designing data solutions.
- Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred.
- Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments; familiarity with big data technologies like Apache Hadoop and Spark.
- Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders; ability to lead and mentor a team of data engineers and architects.
- Tools: Experience with version control (Git), CI/CD pipelines, and automation tools; proficient in SQL, Python, and data visualization tools like Looker or Power BI.

Posted 1 month ago

Apply

6.0 - 11.0 years

35 - 90 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

We are looking for a cloud engineer to join us and work with our engineering team to optimize, implement, and maintain our organization's cloud-based systems.

Posted 1 month ago

Apply

7.0 - 12.0 years

35 - 90 Lacs

Hyderabad, Pune, Bangalore/ Bengaluru

Hybrid

We are looking for a competent cloud architect to manage our cloud architecture and positioning in the cloud environment. This is a strategic role that maintains all cloud systems, including front-end platforms, servers, storage, and management networks.

Posted 1 month ago

Apply