9.0 - 11.0 years
11 - 13 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Do you want to build the next generation of Linode cloud storage? Do you love leading and motivating successful teams? Join the Linode Storage Engineering Team. Our Storage Team is responsible for developing and maintaining Akamai's cloud storage platform for Linode. Our enterprise-grade solution is an essential element of Akamai's Connected Cloud platform. We provide a scalable and secure storage solution for Linode's customers to build their businesses.

Partner with the best
As a Senior Software Engineer II Lead, you will focus on building highly usable, reliable, and scalable storage solutions. Our platform empowers enterprises to create modern solutions to the world's problems. As an SSE II Lead, you will lead a team working on storage services used by both our customers and our internal teams:
- Creating new features, or enhancing existing functionality, from design through testing and deployment
- Working in an agile sprint environment to deliver quality software on a regular and consistent basis
- Measuring and optimizing distributed system performance
- Collaborating with our architecture, QA, and operations teams to create a world-class storage service
- Improving internal processes and systems on a consistent basis
- Managing and developing the team through hiring, training, coaching, and mentoring

Do what you love
To be successful in this role you will:
- Have 7+ years of relevant experience and a Bachelor's Degree in Computer Science or equivalent
- Have development experience in Golang and/or C++, and prior experience working with Kubernetes
- Have extensive experience developing software in a distributed, microservices environment
- Have a track record of delivering reliable and secure software
- Have experience managing/mentoring teams and working in an agile DevOps environment
- Have good knowledge of object storage and block storage technologies within cloud environments

Work in a way that works for you
Learn what makes Akamai a great place to work. Connect with us on social and see what life at Akamai is like! We power and protect life online by solving the toughest challenges, together. At Akamai, we're curious, innovative, collaborative, and tenacious. We celebrate diversity of thought, and we hold an unwavering belief that we can make a meaningful difference. Our teams use their global perspectives to put customers at the forefront of everything they do, so if you are people-centric, you'll thrive here.

Working for you
At Akamai, we will provide you with opportunities to grow, flourish, and achieve great things. Our benefit options are designed to meet your individual needs and budget, both today and in the future. We provide benefits surrounding all aspects of your life: your health, your finances, your family, your time at work, and your time pursuing other endeavors.

About us / Join us
Are you seeking an opportunity to make a real difference in a company with a global reach and exciting services and clients? Come join us and grow with a team of people who will energize and inspire you!
Posted 2 months ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are hiring a Storage Engineer to manage and optimize enterprise storage systems. Ideal for professionals with a strong background in data storage, backup, and disaster recovery.

Key Responsibilities:
- Design and manage SAN, NAS, and object storage solutions
- Perform storage provisioning, replication, and tuning
- Implement backup and recovery strategies
- Monitor storage health, performance, and capacity

Required Skills & Qualifications:
- Experience with EMC, NetApp, Pure Storage, or similar platforms
- Proficiency in storage protocols (iSCSI, NFS, FC, SMB)
- Familiarity with storage automation and scripting
- Bonus: Cloud storage (S3, Azure Blob, GCP Cloud Storage)

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa Reddy
Delivery Manager
Integra Technologies
Posted 2 months ago
5.0 - 10.0 years
15 - 30 Lacs
Noida
Remote
Role & Responsibilities

As a Data Engineer with a focus on pipeline migration from SAS to Google Cloud Platform (GCP) technologies, you will tackle intricate problems and create value for our business by designing and deploying reliable, scalable solutions tailored to the company's data landscape. You will be responsible for the development of custom-built data pipelines on the GCP stack, ensuring seamless migration of existing SAS pipelines.

Responsibilities:
- Design, develop, and implement data pipelines on the GCP stack, with a focus on migrating existing pipelines from SAS to GCP technologies.
- Develop modular and reusable code to support complex ingestion frameworks, simplifying the process of loading data into data lakes or data warehouses from multiple sources.
- Collaborate with analysts and business process owners to translate business requirements into technical solutions.
- Utilize your coding expertise in scripting languages (Python, SQL, PySpark) to extract, manipulate, and process data effectively.
- Leverage your expertise in various GCP technologies, including BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, and Vertex AI, to enhance data warehousing solutions.
- Maintain high standards of development practice, including technical design, solution development, systems configuration, testing, documentation, issue identification, and resolution, writing clean, modular, and sustainable code.
- Understand and implement CI/CD processes using tools like Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
- Participate in data quality and validation processes to ensure data integrity and reliability.
- Optimize performance of data pipelines and storage solutions, addressing bottlenecks.
- Collaborate with security teams to ensure compliance with industry standards for data security and governance.
- Communicate technical solutions to engineering teams and business stakeholders.

Required Skills & Qualifications:
- 5-13 years of experience in software development, data engineering, business intelligence, or a related field, with a proven track record in manipulating, processing, and extracting value from large datasets.
- Extensive experience with GCP technologies in the data warehousing space, including BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, and Vertex AI.
- Proficiency in Python, SQL, and PySpark for data manipulation and pipeline creation.
- Experience with SAS, SQL Server, and SSIS is a significant advantage, particularly for transitioning legacy systems to modern GCP solutions.
- Ability to develop reusable, modular code for complex ingestion frameworks and multi-use pipelines.
- Understanding of CI/CD processes and tools, such as Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
- Proven experience in migrating data pipelines from SAS to GCP technologies.
- Strong problem-solving abilities and a proactive approach to identifying and implementing solutions.
- Familiarity with industry best practices for data security, data governance, and compliance in cloud environments.
- Bachelor's degree in Computer Science, Information Technology, or a related technical field, or equivalent practical experience.
- GCP Certified Data Engineer (preferred).
- Excellent verbal and written communication skills, with the ability to advocate for technical solutions to a diverse audience including engineering teams and business stakeholders.
Willingness to work in the afternoon shift from 3 PM to 12 AM IST.
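For illustration only: a minimal Apache Beam sketch of the kind of Dataflow pipeline this role describes, reading CSV rows from Cloud Storage and loading them into BigQuery. The project id, bucket, table, and two-column schema are hypothetical, not part of the posting.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line: str) -> dict:
    """Turn one naive CSV line into a BigQuery-ready dict (no quoting/escaping)."""
    customer_id, amount = line.split(",")
    return {"customer_id": customer_id, "amount": float(amount)}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",       # use "DirectRunner" to test locally
        project="my-project",          # hypothetical project id
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
            | "Parse" >> beam.Map(parse_row)
            | "Write" >> beam.io.WriteToBigQuery(
                "my-project:analytics.transactions",
                schema="customer_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```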
Posted 2 months ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are hiring a Cloud Operations Engineer to manage and optimize cloud-based environments. Ideal for engineers passionate about automation, monitoring, and cloud-native technologies.

Key Responsibilities:
- Maintain cloud infrastructure (AWS, Azure, GCP)
- Automate deployments and system monitoring
- Ensure availability, performance, and cost optimization
- Troubleshoot incidents and resolve system issues

Required Skills & Qualifications:
- Hands-on experience with cloud platforms and DevOps tools
- Proficiency in scripting (Python, Bash) and IaC (Terraform, CloudFormation)
- Familiarity with logging/monitoring tools (CloudWatch, Datadog, etc.)
- Bonus: Experience with Kubernetes or serverless architectures

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
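As a small, hedged example of the scripting-plus-monitoring combination this role asks for, a boto3 sketch that creates a CloudWatch CPU alarm. The alarm name, instance id, and SNS topic ARN are all hypothetical.

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

# Alarm when average CPU on one (hypothetical) EC2 instance exceeds 80%
# for two consecutive 5-minute periods, notifying an SNS topic.
cloudwatch.put_metric_alarm(
    AlarmName="high-cpu-web-01",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                    # 5-minute datapoints
    EvaluationPeriods=2,           # 2 consecutive breaches
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:ops-alerts"],
)
```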
Posted 2 months ago
15.0 - 20.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must-Have Skills: Cloud Based Service Management Process Design
Good-to-Have Skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Cloud Services Engineer, you will act as a vital link between clients and Accenture's operations teams, facilitating support and managing escalations. Your typical day will involve communicating the health of service delivery to stakeholders, addressing performance issues, and ensuring that cloud orchestration and automation capabilities are functioning optimally. You will hold performance meetings to discuss data and trends, ensuring that services meet the expected service level agreements with minimal downtime, thereby contributing to the overall efficiency and effectiveness of cloud services.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate regular communication between clients and internal teams to ensure alignment on service delivery.
- Analyze performance metrics and prepare reports to inform stakeholders of service health and areas for improvement.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Cloud Based Service Management Process Design.
- Strong understanding of cloud service models and deployment strategies.
- Experience with cloud orchestration tools and automation frameworks.
- Ability to analyze and interpret service performance data.
- Familiarity with incident management and escalation processes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Cloud Based Service Management Process Design.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 2 months ago
3.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must-Have Skills: SUSE Linux Administration
Good-to-Have Skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders, explain any performance issues or risks, and ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. You will hold performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Proactively identify and address potential issues in Cloud services.
- Collaborate with cross-functional teams to optimize Cloud orchestration processes.
- Develop and implement strategies to enhance Cloud automation capabilities.
- Analyze performance data to identify trends and areas for improvement.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SUSE Linux Administration.
- Strong understanding of Cloud orchestration and automation.
- Experience in managing and troubleshooting Cloud services.
- Knowledge of scripting languages for automation tasks.
- Hands-on experience with monitoring and alerting tools.
- Good-to-Have Skills: Experience with DevOps practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SUSE Linux Administration.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 2 months ago
6.0 - 8.0 years
30 - 35 Lacs
Pune
Work from Office
Job Title: Senior Engineer
Location: Pune, India
Corporate Title: AVP

Role Description
Investment Banking is a technology-centric business, with an increasing move to real-time processing and an increasing appetite from customers for integrated systems and access to supporting data. This means that technology is more important than ever for the business. The IB CARE Platform aims to increase the productivity of both Google Cloud and on-prem application development by providing a frictionless build and deployment platform that offers service and data reusability. The platform provides the chassis and standard components of an application, ensuring reliability, usability, and safety, and gives on-demand access to services needed to build, host, and manage applications on the cloud/on-prem. In addition to technology services, the platform aims to have compliance baked in, enforcing controls/security and reducing application team involvement in SDLC and ORR controls, enabling teams to focus more on application development and release to production faster. We are looking for a platform engineer to join a global team working across all aspects of the platform, from GCP/on-prem infrastructure and application deployment through to the development of CARE-based services. Deutsche Bank is one of the few banks with the scale and network to compete aggressively in this space, and the breadth of investment in this area is unmatched by our peers. Joining the team is a unique opportunity to help build a platform to support some of our most mission-critical processing systems.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your Key Responsibilities
As a CARE platform engineer you will work across the board on activities to build and support the platform and liaise with tenants. Key responsibility areas:
- Manage and monitor cloud computing systems and provide technical support to ensure the systems' efficiency and security.
- Work with platform leads and platform engineers at a technical level.
- Liaise with tenants regarding onboarding and provide platform expertise.
- Contribute to the platform offering as part of Sprint deliverables.
- Support the production platform as part of the wider team.

Your Skills and Experience
- Understanding of GCP and services such as GKE, IAM, identity services, and Cloud SQL.
- Kubernetes/service mesh configuration.
- Experience in IaC tooling such as Terraform.
- Proficient in SDLC/DevOps best practices.
- GitHub experience, including Git workflow.
- Exposure to modern deployment tooling, such as ArgoCD, desirable.
- Programming experience (such as Java/Python) desirable.
- A strong team player comfortable in a cross-cultural and diverse operating environment.
- Results-oriented, with the ability to deliver under tight timelines.
- Ability to successfully resolve conflicts in a globally matrixed organization.
- Excellent communication and collaboration skills.
- Must be comfortable with navigating ambiguity to extract meaningful risk insights.
How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 2 months ago
6.0 - 11.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Your Role
In this role you will play a key part in:
- Developing and implementing Generative AI / AI solutions on Google Cloud Platform
- Working with cross-functional teams to design and deliver AI-powered products and services
- Developing, versioning, and executing Python code
- Deploying models as endpoints in the Dev environment

Must-Have Skills
- Solid understanding of Python
- Experience with deep learning frameworks such as TensorFlow, PyTorch, or JAX
- Experience with natural language processing (NLP) and machine learning (ML)
- Experience with Cloud Storage, Compute Engine, Vertex AI, Cloud Functions, Pub/Sub, etc.
- Hands-on experience with Generative AI support in Vertex AI, specifically with Generative AI models like Gemini, Vertex AI Search, etc.

Your Profile
- 4-6+ years of experience in AI development
- Experience with Google Cloud Platform, specifically delivering an AI solution on the Vertex AI platform
- Experience in developing and deploying AI solutions

What you'll love about working here
Choosing Capgemini means having the opportunity to make a difference, whether for the world's leading businesses or for society. It means that when the future doesn't look as bright as you'd like, you have the opportunity to make change, to rewrite it. When you join Capgemini, you don't just start a new job. You become part of something bigger: a diverse collective of free-thinkers, entrepreneurs, and experts, all working together to unleash human energy through technology, for an inclusive and sustainable future. You can exponentially grow your career by being part of innovative projects and taking advantage of our extensive Learning & Development programs. With us, you will experience an inclusive, safe, healthy, and flexible work environment to bring out the best in you! You also get a chance to make positive social change and build a better world by taking an active role in our Corporate Social Responsibility and Sustainability initiatives. And whilst you make a difference, you will also have a lot of fun.
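For flavor, a minimal sketch of calling a Gemini model through the Vertex AI SDK, the kind of hands-on work listed above. The project id, region, and model id are assumptions, and the SDK surface changes quickly, so treat this as illustrative rather than definitive.

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Hypothetical project and region; authentication comes from the
# environment (e.g. Application Default Credentials).
vertexai.init(project="my-project", location="us-central1")

model = GenerativeModel("gemini-1.5-pro")  # model id is an assumption
response = model.generate_content(
    "Summarize the key risks in this contract clause: ..."
)
print(response.text)
```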
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
At BCE Global Tech, immerse yourself in exciting projects that are shaping the future of both consumer and enterprise telecommunications. This involves building innovative mobile apps to enhance user experiences and enable seamless connectivity on the go. Thrive in diverse roles like Full Stack Developer, Backend Developer, UI/UX Designer, DevOps Engineer, Cloud Engineer, Data Science Engineer, and Scrum Master, at a workplace that encourages you to freely share your bold and different ideas. If you are passionate about technology and eager to make a difference, we want to hear from you! Apply now to join our dynamic team in Bengaluru.

We are seeking a talented Site Reliability Engineer (SRE) to join our team. The ideal candidate will have a strong background in software engineering and systems administration, with a passion for building scalable and reliable systems. As an SRE, you will collaborate with development and operations teams to ensure our services are reliable, performant, and highly available.

Key Responsibilities
- Ensure the 24/7 operations and reliability of data services in our production GCP and on-premise Hadoop environments
- Collaborate with the data engineering development team to design, build, and maintain scalable, reliable, and secure data pipelines and systems
- Develop and implement monitoring, alerting, and incident response strategies to proactively identify and resolve issues before they impact production
- Drive the implementation of security and reliability best practices across the software development life cycle
- Contribute to the development of tools and automation to streamline the management and operation of data services
- Participate in on-call rotation and respond to incidents in a timely and effective manner
- Continuously evaluate and improve the reliability, scalability, and performance of data services

Technology Skills
- 4+ years of experience in site reliability engineering or a similar role
- Strong experience with Google Cloud Platform (GCP) services, including BigQuery, Dataflow, Pub/Sub, and Cloud Storage
- Experience with on-premise Hadoop environments and related technologies (HDFS, Hive, Spark, etc.)
- Proficiency in at least one programming language (Python, Scala, Java, Go, etc.)

Required Qualifications
- Bachelor's degree in computer science, engineering, or a related field
- 8-10 years of experience as an SRE
- Proven experience as an SRE, DevOps engineer, or similar role
- Strong problem-solving skills and ability to work under pressure
- Excellent communication and collaboration skills
- Flexible to work in EST time zones (9-5 EST)

Additional Information
- Job Type: Full Time
- Work Profile: Hybrid (Work from Office / Remote)
- Years of Experience: 8-10 years
- Location: Bangalore

What We Offer
- Competitive salaries and comprehensive health benefits
- Flexible work hours and remote work options
- Professional development and training opportunities
- A supportive and inclusive work environment
Posted 2 months ago
4.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Data Transformation:
- Utilize Data Build Tool (dbt) to transform raw data into curated data models according to business requirements.
- Implement data transformations and aggregations to support analytical and reporting needs.

Orchestration and Automation:
- Design and implement automated workflows using Google Cloud Composer to orchestrate data pipelines and ensure timely data delivery.
- Monitor and troubleshoot data pipelines, identifying and resolving issues proactively.
- Develop and maintain documentation for data pipelines and workflows.

GCP Expertise:
- Leverage GCP services, including BigQuery, Cloud Storage, and Pub/Sub, to build a robust and scalable data platform.
- Optimize BigQuery performance and cost through efficient query design and data partitioning.
- Implement data security and access controls in accordance with banking industry standards.

Collaboration and Communication:
- Collaborate with the Solution Architect and Data Modeler to understand data requirements and translate them into technical solutions.
- Communicate effectively with team members and stakeholders, providing regular updates on project progress.
- Participate in code reviews and contribute to the development of best practices.

Data Pipeline Development:
- Design, develop, and maintain scalable and efficient data pipelines using Google Cloud Dataflow to ingest data from various sources, including relational databases (RDBMS), data streams, and files.
- Implement data quality checks and validation processes to ensure data accuracy and consistency.
- Optimize data pipelines for performance and cost-effectiveness.

Banking Domain Knowledge (Preferred):
- Understanding of banking data domains, such as customer data, transactions, and financial products.
- Familiarity with regulatory requirements and data governance standards in the banking industry.

Required Experience:
- Bachelor's degree in computer science, engineering, or a related field.
- ETL knowledge.
- 4-9 years of experience in data engineering, with a focus on building data pipelines and data transformations.
- Strong proficiency in SQL and experience working with relational databases.
- Hands-on experience with Google Cloud Platform (GCP) services, including Dataflow, BigQuery, Cloud Composer, and Cloud Storage.
- Experience with data transformation tools, preferably Data Build Tool (dbt).
- Proficiency in Python or other scripting languages is a plus.
- Experience with data orchestration and automation.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Experience with data streams like Pub/Sub or similar.
- Experience working with file formats such as CSV, JSON, and Parquet.

Primary Skills: GCP, Dataflow, BigQuery, Cloud Composer, Cloud Storage, Data Pipeline, Composer, SQL, DBT, DWH Concepts.
Secondary Skills: Python, banking domain knowledge, Pub/Sub, cloud certifications (e.g., Data Engineer), Git or any other version control system.
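To make the Composer-plus-dbt orchestration above concrete, here is a hedged sketch of an Airflow DAG (Cloud Composer runs Airflow). The schedule, script paths, and dbt project location are assumptions; in practice dbt is often invoked via a dedicated operator or a Cloud Run job rather than a plain BashOperator.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",   # 2 AM daily (assumed)
    catchup=False,
) as dag:
    # Hypothetical ingestion script staged in the Composer GCS bucket.
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="python /home/airflow/gcs/dags/scripts/ingest.py",
    )
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /home/airflow/gcs/data/dbt_project && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /home/airflow/gcs/data/dbt_project && dbt test",
    )
    # Ingest first, then transform, then validate the models.
    ingest >> dbt_run >> dbt_test
```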
Posted 2 months ago
3.0 - 7.0 years
10 - 20 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 8 to 24 LPA
Exp: 3 to 7 years
Location: Gurgaon (Hybrid)
Notice: Immediate to 30 days!

Job Title: Senior Data Engineer

Job Summary:
We are looking for an experienced Senior Data Engineer with 5+ years of hands-on experience in cloud data engineering platforms, specifically AWS, Databricks, and Azure. The ideal candidate will play a critical role in designing, building, and maintaining scalable data pipelines and infrastructure to support our analytics and business intelligence initiatives.

Key Responsibilities:
- Design, develop, and optimize scalable data pipelines using AWS services (e.g., S3, Glue, Redshift, Lambda).
- Build and maintain ETL/ELT workflows leveraging Databricks and Apache Spark for processing large datasets.
- Work extensively with Azure data services such as Azure Data Lake, Azure Synapse, Azure Data Factory, and Azure Databricks.
- Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver high-quality data solutions.
- Ensure data quality, reliability, and security across multiple cloud platforms.
- Monitor and troubleshoot data pipelines, implement performance tuning, and optimize resource usage.
- Implement best practices for data governance, metadata management, and documentation.
- Stay current with emerging cloud data technologies and industry trends to recommend improvements.

Required Qualifications:
- 5+ years of experience in data engineering with strong expertise in AWS, Databricks, and Azure cloud platforms.
- Hands-on experience with big data processing frameworks, particularly Apache Spark.
- Proficiency in building complex ETL/ELT pipelines and managing data workflows.
- Strong programming skills in Python, Scala, or Java.
- Experience working with structured and unstructured data in cloud storage solutions.
- Knowledge of SQL and experience with relational and NoSQL databases.
- Familiarity with CI/CD pipelines and DevOps practices in cloud environments.
- Strong analytical and problem-solving skills with an ability to work independently and in teams.

Preferred Skills:
- Experience with containerization and orchestration tools (Docker, Kubernetes).
- Familiarity with machine learning pipelines and tools.
- Knowledge of data modeling, data warehousing, and analytics architecture.
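A minimal PySpark sketch of the Databricks-style ELT work described above: read raw events from cloud storage, clean them, and write partitioned Delta output. All paths and column names are hypothetical, and the Delta format assumes a Databricks (or Delta Lake-enabled) runtime.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_elt").getOrCreate()

# Hypothetical raw zone: JSON event files landed by an upstream producer.
raw = spark.read.json("s3://raw-bucket/events/2024/*/*.json")

clean = (
    raw.dropDuplicates(["event_id"])                 # basic de-duplication
       .filter(F.col("event_ts").isNotNull())        # drop incomplete rows
       .withColumn("event_date", F.to_date("event_ts"))
)

(
    clean.write.format("delta")                      # Delta Lake on Databricks
         .mode("overwrite")
         .partitionBy("event_date")                  # prune reads by date
         .save("s3://curated-bucket/events/")
)
```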
Posted 2 months ago
3.0 - 7.0 years
10 - 20 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 8 to 24 LPA
Exp: 3 to 7 years
Location: Gurgaon (Hybrid)
Notice: Immediate to 30 days!

Job Profile:
Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. A collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.

Responsibilities:
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
- Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud.
- Use scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost.

Must Have:
- Client engagement experience and collaboration with cross-functional teams
- Data engineering background in Databricks
- Capable of working effectively as an individual contributor or in collaborative team environments
- Effective communication and thought leadership with a proven record

Candidate Profile:
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas
- 3+ years of experience, primarily in data engineering
- Hands-on experience with SQL, Python, Databricks, and cloud platforms like Azure
- Prior experience managing and delivering end-to-end projects
- Outstanding written and verbal communication skills
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges
- Able to understand cross-cultural differences and work with clients across the globe
Posted 2 months ago
2.0 - 7.0 years
4 - 9 Lacs
Coimbatore
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; may also include deep learning, neural networks, chatbots, and image processing.
Must-Have Skills: Google Cloud Machine Learning Services
Good-to-Have Skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.

Roles & Responsibilities:
- Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage.
- Optimize and monitor data workflows for performance, scalability, and reliability.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions.
- Implement data security and governance measures, ensuring compliance with industry standards.
- Automate data workflows and processes for operational efficiency.
- Troubleshoot and resolve technical issues related to data pipelines and platforms.
- Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing.

Professional & Technical Skills:
Must Have:
- Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
- Expertise in SQL and experience with data modeling and query optimization.
- Solid programming skills in Python for data processing and ETL development.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming.
- Strong understanding of data security, encryption, and IAM policies on GCP.
Good to Have:
- Experience with Dialogflow or CCAI tools.
- Knowledge of machine learning pipelines and integration with AI/ML services on GCP.
- Certifications such as Google Professional Data Engineer or Google Cloud Architect.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services, with 3-5 years of overall experience.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.
Posted 2 months ago
7.0 - 12.0 years
20 - 25 Lacs
Chennai, Bengaluru
Work from Office
We are looking for a Senior GCP Data Engineer / GCP Technical Lead with strong expertise in Google Cloud Platform (GCP), Apache Spark, and Python to join our growing data engineering team. The ideal candidate will have extensive experience working with GCP data services and should be capable of leading technical teams, designing robust data pipelines, and interacting directly with clients to gather requirements and ensure project delivery.

Project Duration: 1 year, extendable

Role & Responsibilities:
- Design, develop, and deploy scalable data pipelines and solutions using GCP services like Dataproc and BigQuery.
- Lead and mentor a team of data engineers to ensure high-quality deliverables.
- Collaborate with cross-functional teams and client stakeholders to define technical requirements and deliver solutions aligned with business goals.
- Optimize data processing and transformation workflows for performance and cost-efficiency.
- Ensure adherence to best practices in cloud data architecture, data security, and governance.

Mandatory Skills:
- Google Cloud Platform (GCP), especially Dataproc and BigQuery
- Apache Spark
- Python programming

Preferred Skills:
- Experience working with large-scale data processing frameworks.
- Exposure to DevOps/CI-CD practices in a cloud environment.
- Hands-on experience with other GCP tools like Cloud Composer, Pub/Sub, or Cloud Storage is a plus.

Soft Skills:
- Strong communication and client interaction skills.
- Ability to work independently and as part of a distributed team.
- Excellent problem-solving and team management capabilities.
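As a flavor of the Dataproc-plus-BigQuery work this role centers on, a minimal PySpark sketch using the spark-bigquery connector. Table and bucket names are hypothetical, and the connector jar is assumed to be available on the cluster (recent Dataproc images ship it).

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_aggregation").getOrCreate()

# Read a (hypothetical) BigQuery table directly into a Spark DataFrame.
orders = (
    spark.read.format("bigquery")
         .option("table", "my-project.sales.orders")
         .load()
)

daily = orders.groupBy("order_date").agg(F.sum("amount").alias("revenue"))

# Writing back to BigQuery requires a staging bucket for load files.
(
    daily.write.format("bigquery")
         .option("table", "my-project.sales.daily_revenue")
         .option("temporaryGcsBucket", "my-staging-bucket")
         .mode("overwrite")
         .save()
)
```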
Posted 2 months ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Google Cloud Platform Architecture
Good-to-Have Skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work on developing innovative solutions to enhance user experience and streamline processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to design and develop applications.
- Implement best practices for application development.
- Troubleshoot and debug applications to ensure optimal performance.
- Stay updated with the latest technologies and trends in application development.
- Provide technical guidance and mentorship to junior team members.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Google Cloud Platform Architecture.
- Strong understanding of cloud computing principles.
- Experience with designing scalable and secure cloud-based applications.
- Hands-on experience with Google Cloud services such as Compute Engine, BigQuery, and Cloud Storage.
- Knowledge of DevOps practices for continuous integration and deployment.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Platform Architecture.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 2 months ago
3.0 - 5.0 years
10 - 13 Lacs
Chennai
Work from Office
- 3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.)
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
- 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.)
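An illustrative PySpark equivalent of the SQL window-function skills listed above: rank each customer's orders by recency and keep the latest one. Column names and the toy data are assumptions.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window_demo").getOrCreate()

orders = spark.createDataFrame(
    [("c1", "2024-01-01", 100.0), ("c1", "2024-02-01", 50.0),
     ("c2", "2024-01-15", 75.0)],
    ["customer_id", "order_date", "amount"],
)

# Equivalent of ROW_NUMBER() OVER (PARTITION BY customer_id
#                                  ORDER BY order_date DESC) in SQL.
w = Window.partitionBy("customer_id").orderBy(F.col("order_date").desc())

latest = (
    orders.withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)      # most recent order per customer
          .drop("rn")
)
latest.show()
```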
Posted 2 months ago
5.0 - 8.0 years
20 - 22 Lacs
Chennai
Work from Office
- Minimum 5 years of in-depth experience in Java/Spring Boot
- Minimum 3 years of experience in Angular, with the ability to develop rich UI screens and custom/reusable components
- Minimum 2 years of GCP experience, working with GCP BigQuery, Google Cloud Storage, Cloud Run, and Pub/Sub
- Minimum 2 years of experience using CI/CD pipelines like Tekton
- 1-2 years of experience deploying Google Cloud services using Terraform
- Experience mentoring other software engineers and delivering systemic change
- 5+ years of experience in J2EE
Posted 2 months ago
2.0 - 5.0 years
1 - 4 Lacs
Bengaluru
Work from Office
Key Responsibilities
- Support day-to-day coordination and communication within the production team
- Manage and organize video assets, folders, and project files
- Track progress of ongoing promos and ensure timelines are met
- Maintain updated records of edits, versions, and approvals
- Assist with basic documentation and workflow setup
- Demonstrate strong organizational and time management skills
- Show familiarity with cloud storage platforms like Google Drive and Dropbox
- Communicate effectively and maintain a proactive mindset
- Display interest in digital content production and short-form video
Posted 2 months ago
10.0 - 14.0 years
10 - 16 Lacs
Pune
Work from Office
Role Overview:
The Senior Tech Lead - GCP Data Engineering leads the design, development, and optimization of advanced data solutions. The jobholder has extensive experience with GCP services, data architecture, and team leadership, with a proven ability to deliver scalable and secure data systems.

Responsibilities:
- Lead the design and implementation of GCP-based data architectures and pipelines.
- Architect and optimize data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Provide technical leadership and mentorship to a team of data engineers.
- Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
- Ensure best practices in data security, governance, and compliance.
- Troubleshoot and resolve complex technical issues in GCP data environments.
- Stay updated on the latest GCP technologies and industry trends.

Key Technical Skills & Responsibilities
- Overall 10+ years of experience with GCP and data warehousing concepts, including coding, reviewing, testing, and debugging.
- Experience as an architect on GCP implementation or migration data projects.
- Understanding of data lakes and data lake architectures, and best practices for storing, loading, and retrieving data from data lakes.
- Experience developing and maintaining pipelines on the GCP platform, with an understanding of best practices for bringing on-prem data to the cloud: file loading, compression, parallelization of loads, optimization, etc.
- Working knowledge of and/or experience with Google Data Studio, Looker, and other visualization tools.
- Working knowledge of Hadoop and Python/Java would be an added advantage.
- Experience in designing and planning BI solutions; debugging, monitoring, and troubleshooting BI solutions; creating and deploying reports; and writing relational and multidimensional database queries.
- Any experience in a NoSQL environment is a plus.
- Must be good with Python and PySpark for data pipeline building.
- Must have experience working with streaming data sources and Kafka.
- GCP services: Cloud Storage, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore/Firestore, Dataflow, Dataproc, Data Fusion, Dataprep, Pub/Sub, Data Studio, Looker, Data Catalog, Cloud Composer, Cloud Scheduler, Cloud Functions.

Eligibility Criteria:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Extensive experience with GCP data services and tools.
- GCP certification (e.g., Professional Data Engineer, Professional Cloud Architect).
- Experience with machine learning and AI integration in GCP environments.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
- Proven leadership experience in managing technical teams.
- Excellent problem-solving and communication skills.
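As one concrete illustration of the load-optimization points above (file loading, partitioning), a hedged google-cloud-bigquery sketch that loads compressed Parquet files into a date-partitioned, clustered table. Project, dataset, table, and bucket names are assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    time_partitioning=bigquery.TimePartitioning(field="event_date"),
    clustering_fields=["customer_id"],   # prunes scans on common filters
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# One load job fans out over all matching files in parallel.
load_job = client.load_table_from_uri(
    "gs://my-lake/events/*.parquet",
    "my-project.analytics.events",
    job_config=job_config,
)
load_job.result()  # block until the load completes
print(f"Loaded {load_job.output_rows} rows")
```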
Posted 2 months ago
5.0 - 8.0 years
17 - 20 Lacs
Kolkata
Work from Office
Key Responsibilities
- Architect and implement scalable data solutions using GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.) and Snowflake.
- Lead the end-to-end data architecture, including ingestion, transformation, storage, governance, and consumption layers.
- Collaborate with business stakeholders, data scientists, and engineering teams to define and deliver the enterprise data strategy.
- Design robust data pipelines (batch and real-time) ensuring high data quality, security, and availability.
- Define and enforce data governance, data cataloging, and metadata management best practices.
- Evaluate and select appropriate tools and technologies to optimize data architecture and cost efficiency.
- Mentor junior architects and data engineers, guiding them on design best practices and technology standards.
- Collaborate with DevOps teams to ensure smooth CI/CD pipelines and infrastructure automation for data.

Skills & Qualifications:
- 3+ years of experience in data architecture, data engineering, or enterprise data platform roles.
- 3+ years of hands-on experience with Google Cloud Platform (especially BigQuery, Dataflow, Cloud Composer, Data Catalog).
- 3+ years of experience designing and implementing Snowflake-based data solutions.
- Deep understanding of modern data architecture principles (Data Lakehouse, ELT/ETL, Data Mesh, etc.).
- Proficient in Python, SQL, and orchestration tools like Airflow / Cloud Composer.
- Experience in data modeling (3NF, star, snowflake schemas) and designing data marts and warehouses.
- Strong understanding of data privacy, compliance (GDPR, HIPAA), and security principles in cloud environments.
- Familiarity with tools like dbt, Apache Beam, Looker, Tableau, or Power BI is a plus.
- Excellent communication and stakeholder management skills.
- GCP or Snowflake certification preferred (e.g., GCP Professional Data Engineer, SnowPro).

Preferred Qualifications:
- Experience working with hybrid or multi-cloud data strategies.
- Exposure to ML/AI pipelines and support for data science workflows.
- Prior experience leading architecture reviews, PoCs, and technology roadmaps.
Posted 2 months ago
1.0 - 5.0 years
3 - 7 Lacs
Chandigarh
Work from Office
Key Responsibilities
- Assist in building and maintaining data pipelines on GCP using services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.
- Support data ingestion, transformation, and storage processes for structured and unstructured datasets.
- Participate in performance tuning and optimization of existing data workflows.
- Collaborate with data analysts, engineers, and stakeholders to ensure reliable data delivery.
- Document code, processes, and architecture for reproducibility and future reference.
- Debug issues in data pipelines and contribute to their resolution.
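A minimal sketch of the Pub/Sub ingestion step named above: publish one JSON record to a topic. The project and topic ids are assumptions.

```python
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "raw-events")  # hypothetical ids

record = {"event_id": "e-001", "amount": 42.5}
# publish() takes bytes and returns a future resolving to the message id.
future = publisher.publish(topic_path, json.dumps(record).encode("utf-8"))
print(f"Published message id: {future.result()}")
```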
Posted 2 months ago
1.0 - 5.0 years
3 - 7 Lacs
Gurugram
Work from Office
Key Responsibilities
- Assist in building and maintaining data pipelines on GCP using services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.
- Support data ingestion, transformation, and storage processes for structured and unstructured datasets.
- Participate in performance tuning and optimization of existing data workflows.
- Collaborate with data analysts, engineers, and stakeholders to ensure reliable data delivery.
- Document code, processes, and architecture for reproducibility and future reference.
- Debug issues in data pipelines and contribute to their resolution.
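And a hedged sketch of the consuming side of the same Pub/Sub flow: pull messages from a subscription and acknowledge them. The project and subscription ids are assumptions.

```python
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path("my-project", "raw-events-sub")


def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    record = json.loads(message.data.decode("utf-8"))
    print(f"Received event {record.get('event_id')}")
    message.ack()                      # mark as processed


streaming_pull = subscriber.subscribe(sub_path, callback=handle)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds, then stop
except TimeoutError:
    streaming_pull.cancel()
```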
Posted 2 months ago
4.0 - 6.0 years
4 - 8 Lacs
Pune
Work from Office
We are looking for an experienced AWS Cloud Engineer to join our technical team. The ideal candidate should have a minimum of 3+ years of hands-on experience working with AWS Cloud and its services. The role requires expertise in containerization, Kubernetes, and a solid understanding of various AWS cloud services.

Key Responsibilities:
- Design, implement, and maintain cloud infrastructure on AWS using services like EKS, ECR, IAM, VPC, Route 53, S3, and more.
- Work with AWS container services and Kubernetes to deploy, manage, and scale applications in Amazon EKS (Elastic Kubernetes Service).
- Provide continuous support and maintenance of AWS-based infrastructure, including compute, storage, networking, and security.
- Ensure the security and compliance of the AWS environment using tools such as AWS Control Tower, AWS Trusted Advisor, and AWS Security Hub.
- Use Terraform and Ansible for Infrastructure as Code (IaC) implementations and automation.
- Collaborate with DevOps teams to implement and support CI/CD pipelines, utilizing tools like Jenkins, Git, GitHub/Bitbucket, Maven, Docker, and Helm.
- Monitor and optimize cloud infrastructure performance using tools such as AWS CloudWatch and Dynatrace.
- Work closely with security teams to implement AWS security best practices and ensure compliance.

Required Skills & Experience:
- Minimum of 4+ years of experience working in AWS Cloud environments.
- In-depth experience with AWS Kubernetes Services (EKS) and Elastic Container Registry (ECR).
- Proficiency in key AWS services such as IAM, VPC, Route 53, ELB, S3, and AWS security services (Control Tower, Trusted Advisor, Security Hub).
- Experience in configuring and managing AWS compute, storage, networking, security, and cost management services.
- Strong experience with CI/CD pipelines, AWS DevOps tools (Jenkins, Git, GitHub, Bitbucket), and containerization technologies (Docker, Kubernetes).
- Experience with monitoring tools such as Dynatrace and AWS CloudWatch.
- Proficiency in Linux system administration.
- Experience with Infrastructure as Code (IaC) tools like Ansible and Terraform is highly preferred.
- AWS certifications (AWS Certified Solutions Architect, AWS Certified DevOps Engineer) are a plus.
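An illustrative boto3 snippet of the kind of AWS automation this role involves: inventorying EKS clusters and S3 buckets in an account. The region is an assumption; credentials come from the environment.

```python
import boto3

session = boto3.Session(region_name="ap-south-1")  # region is assumed

# List EKS clusters and report each one's status.
eks = session.client("eks")
for name in eks.list_clusters()["clusters"]:
    status = eks.describe_cluster(name=name)["cluster"]["status"]
    print(f"EKS cluster {name}: {status}")

# List all S3 buckets visible to these credentials.
s3 = session.client("s3")
for bucket in s3.list_buckets()["Buckets"]:
    print(f"S3 bucket: {bucket['Name']}")
```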
Posted 2 months ago
1.0 - 3.0 years
10 - 15 Lacs
Kolkata, Gurugram, Bengaluru
Hybrid
Salary: 10 to 16 LPA
Exp: 1 to 3 years
Location: Gurgaon / Bangalore / Kolkata (Hybrid)
Notice: Immediate to 30 days!
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer
Posted 2 months ago