
15 AWS MSK Jobs

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

8.0 - 12.0 years

0 Lacs

Bangalore, Karnataka

On-site

Role Overview:
You will be responsible for architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions. The role involves designing scalable data architectures with Snowflake, integrating cloud technologies such as AWS, Azure, and GCP and ETL/ELT tools like DBT, and guiding teams in proper data modeling, transformation, security, and performance optimization.

Key Responsibilities:
- Architect and deliver highly scalable, distributed, cloud-based enterprise data solutions
- Design scalable data architectures with Snowflake and integrate cloud technologies (AWS, Azure, GCP) and ETL/ELT tools such as DBT
- Guide teams in proper data modeling (star and snowflake schemas), transformation, security, and performance optimization
- Load data from disparate data sets and translate complex functional and technical requirements into detailed design
- Deploy Snowflake features such as data sharing, events, and lake-house patterns
- Implement data security, data access controls, and related design
- Understand relational and NoSQL data stores, methods, and approaches (star and snowflake, dimensional modeling)
- Use AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
- Implement Lambda and Kappa architectures
- Work with Big Data frameworks and related technologies; experience in Hadoop and Spark is mandatory
- Use AWS compute services such as EMR, Glue, and SageMaker, and storage services such as S3, Redshift, and DynamoDB
- Work with AWS streaming services such as Kinesis, SQS, and MSK
- Troubleshoot and tune performance in the Spark framework (Spark core, SQL, and Spark Streaming)
- Use workflow tools such as Airflow, NiFi, or Luigi
- Apply application DevOps tools (Git, CI/CD frameworks), with experience in Jenkins or GitLab and source-code management services such as CodePipeline, CodeBuild, and CodeCommit
- Work with AWS CloudWatch, CloudTrail, AWS Config, and Config Rules

Qualifications Required:
- 8-12 years of relevant experience
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modeling techniques using Python/Java
- Strong expertise in end-to-end implementation of Cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS
- Proficiency in AWS, Databricks, and Snowflake data warehousing, including SQL and Snowpipe
- Experience in data security, data access controls, and design
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Good knowledge of Big Data frameworks and related technologies; experience in Hadoop and Spark is mandatory
- Good experience with AWS compute services (EMR, Glue, SageMaker) and storage services (S3, Redshift, DynamoDB)
- Experience with AWS streaming services such as Kinesis, SQS, and MSK
- Troubleshooting and performance-tuning experience in the Spark framework (Spark core, SQL, and Spark Streaming)
- Experience with one of the workflow tools Airflow, NiFi, or Luigi
- Good knowledge of application DevOps tools (Git, CI/CD frameworks), with experience in Jenkins or GitLab and source-code management services such as CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, CloudTrail, AWS Config, and Config Rules

If you are interested in this opportunity, please share your profile at dhamma.b.bhawsagar@pwc.com.
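For context on the AWS streaming services this posting lists (Kinesis, SQS, MSK), below is a minimal Python sketch of producing to an Amazon MSK topic: it looks up the cluster's TLS bootstrap brokers with boto3 and publishes a message with kafka-python. The cluster ARN and topic name are placeholders, not details from the posting.

```python
# Minimal sketch (placeholder names, not from the posting): publish one message
# to an Amazon MSK topic. Assumes the kafka-python package and network access
# to the cluster's brokers.
import json
import boto3
from kafka import KafkaProducer

CLUSTER_ARN = "arn:aws:kafka:ap-south-1:123456789012:cluster/example/uuid"  # placeholder
TOPIC = "orders-events"  # placeholder topic name

# Resolve the TLS bootstrap broker string for the cluster.
kafka_client = boto3.client("kafka")
brokers = kafka_client.get_bootstrap_brokers(ClusterArn=CLUSTER_ARN)["BootstrapBrokerStringTls"]

# Produce a JSON-encoded record over TLS.
producer = KafkaProducer(
    bootstrap_servers=brokers.split(","),
    security_protocol="SSL",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"order_id": 42, "status": "CREATED"})
producer.flush()
```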

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

Maharashtra

On-site

As an AWS Senior Developer at PwC Advisory Acceleration Center, you will interact with the Offshore Manager/Onsite Business Analyst to understand requirements and take responsibility for end-to-end implementation of Cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS. Strong experience in AWS cloud technology, planning and organization skills, and the ability to work as a cloud developer/lead on an agile team providing automated cloud solutions will be crucial for success.

**Key Responsibilities:**
- Architect and deliver highly scalable, distributed, cloud-based enterprise data solutions
- Implement Cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS
- Utilize Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modeling techniques using Python/Java
- Translate complex requirements into detailed design
- Deploy Snowflake features such as data sharing, events, and lake-house patterns
- Ensure data security, access controls, and design
- Understand relational and NoSQL data stores, methods, and approaches (star and snowflake, dimensional modeling)
- Be proficient in Lambda and Kappa architectures
- Utilize AWS EMR, Glue, SageMaker, S3, Redshift, DynamoDB, and streaming services such as Kinesis, SQS, and MSK
- Troubleshoot and perform performance tuning in the Spark framework
- Use workflow tools such as Airflow, NiFi, or Luigi
- Apply application DevOps tools (Git, CI/CD frameworks) such as Jenkins or GitLab
- Use AWS CloudWatch, CloudTrail, AWS Config, and Config Rules
- Understand cloud data migration processes, methods, and the project lifecycle
- Apply analytical and problem-solving skills effectively
- Demonstrate good communication and presentation skills

**Qualifications Required:**
- BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA

**Desired Knowledge / Skills:**
- Experience in building stream-processing systems using solutions such as Storm or Spark Streaming
- Knowledge of Big Data ML toolkits such as Mahout, SparkML, or H2O
- Proficiency in Python
- Work experience in Offshore/Onsite engagements
- Familiarity with AWS services like STEP & Lambda

In addition to your technical skills, the ability to travel to client locations as per project requirements is essential for this role. PwC Advisory Acceleration Center in Bangalore offers a high-performance culture based on a passion for excellence, diversity, and inclusion, providing global leadership development frameworks and the latest digital technologies to support your career growth. Apply now if you believe PwC is the place where you can learn, grow, and excel.

Posted 5 days ago

Apply

3.0 - 5.0 years

12 - 14 Lacs

Mumbai

Work from Office

Department: DevOps
Industry: Information & Technology, preferably logistics & freight
Location: Mumbai
Experience Range: 3-5 years
Basic Qualification: BE/B.Tech/MCA/BCA or bachelor's degree in a related subject
Travel Requirements: No

What's the Role?
Help us keep builds, deployments, and infrastructure reliable for a logistics-tech SaaS built on MERN + GraphQL, Kafka (AWS MSK), MongoDB (Atlas), and AWS (EC2, S3, Lambda, API Gateway, CloudFront, Route 53, CodePipeline/CodeBuild/CodeDeploy, SES). We also maintain a customer ERP built in .NET + MS SQL. You'll collaborate with product engineers (and, as needed, data scientists) to ensure smooth releases and stable operations.

Responsibilities/Duties
- Implement and maintain CI/CD pipelines using GitHub and AWS CodePipeline/CodeBuild/CodeDeploy (or GitHub Actions) for Node.js/React and Lambda services.
- Manage core AWS services: EC2, S3, VPC/IAM/KMS, API Gateway, CloudFront, Route 53, Lambda, SES, and foundational MSK operations.
- Write and maintain Infrastructure as Code with Terraform or CDK/CloudFormation (networking, security groups, load balancers, certificates).
- Set up monitoring, logging, and alerting with CloudWatch (and optionally Prometheus/Grafana/Datadog) and keep alerts actionable.
- Support day-to-day Kafka/MSK needs (topic configuration, retention, basic ACLs, consumer-lag visibility).
- Handle MongoDB Atlas basics (backups/alerts, connectivity, performance snapshots, private endpoints/peering).
- Contribute to release management, environment configuration, and safe rollback paths; keep runbooks/documentation current.
- Follow security best practices: least-privilege IAM, secrets in SSM/Secrets Manager, TLS/ACM hygiene, basic WAF setup.
- Collaborate with the Data Science team as needed to wire data ingestion to S3/MSK, schedule simple batch jobs (EventBridge/Step Functions), provision compute (including occasional GPU EC2), containerize jobs, and add basic monitoring for data/model workflows.
- (Occasional) Assist with Windows/.NET + MS SQL operational tasks for the customer ERP (backups, scheduled jobs, patching).

Criteria for the Role!
- 3-5 years in DevOps/SRE/Platform roles supporting production web applications on AWS.
- Hands-on with several of: EC2, S3, Lambda, API Gateway, CloudFront, Route 53, IAM, CloudWatch, SES, and MSK fundamentals.
- Practical CI/CD experience for Node.js/React services and artifact/image workflows.
- IaC on real projects (Terraform or CDK/CloudFormation).
- Familiarity with MongoDB Atlas operations and Kafka concepts.
- Clear communication and a collaborative working style.

Nice to Have
- Cost tagging/budgets, GuardDuty/Security Hub, WAF/Shield basics.
- Docker familiarity; exposure to ECS/EKS.
- Windows/SQL Server basics (backups/restores).
- Any AWS certification.
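As a rough illustration of two of the duties listed above (actionable CloudWatch alerting and keeping secrets in Secrets Manager), here is a minimal boto3 sketch. The function name, secret ID, and SNS topic are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch (assumed names, not from the posting): read a secret and
# create a CloudWatch alarm on Lambda errors that notifies an SNS topic.
import json
import boto3

secrets = boto3.client("secretsmanager")
cloudwatch = boto3.client("cloudwatch")

# Fetch application credentials from Secrets Manager instead of hardcoding them.
db_creds = json.loads(
    secrets.get_secret_value(SecretId="prod/api/db-credentials")["SecretString"]
)

# Alarm when the (hypothetical) orders-api Lambda reports any errors within 5 minutes.
cloudwatch.put_metric_alarm(
    AlarmName="orders-api-lambda-errors",
    Namespace="AWS/Lambda",
    MetricName="Errors",
    Dimensions=[{"Name": "FunctionName", "Value": "orders-api"}],
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:devops-alerts"],
)
```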

Posted 3 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Why this job matters
Cloud Native Java Developer - to individually contribute to and drive the transformation of our existing Java microservices deployed on Amazon Elastic Kubernetes Service (EKS) to serverless AWS Lambda functions. The roles and responsibilities are below.

What you'll be doing

Key Responsibilities
- Develop and deploy serverless applications using Quarkus/Spring Boot and AWS Lambda
- Build RESTful APIs and event-driven microservices using cloud-native patterns
- Optimize cold-start performance using GraalVM native images
- Integrate with AWS services such as AWS API Gateway, S3, DynamoDB, and CloudWatch, and with Postgres
- Implement and manage Lambda authorizers (custom and token-based) for securing APIs
- Design and configure AWS API Gateway for routing, throttling, and securing endpoints
- Integrate OAuth 2.0 authentication flows using Azure Active Directory as the identity provider
- Decent understanding of resilience patterns
- Write unit and integration tests using JUnit, Mockito, and Quarkus testing tools
- Collaborate with DevOps teams to automate deployments using AWS SAM, CDK, or Terraform
- Monitor and troubleshoot production issues using AWS observability tools

Migration Responsibilities
- Analyse existing Spring Boot microservices deployed on Kubernetes to identify candidates for serverless migration
- Refactor services to be stateless, event-driven, and optimized for short-lived execution
- Replace Kubernetes ingress and service discovery with API Gateway and Lambda triggers
- Migrate persistent state and configuration to AWS-native services (e.g., DynamoDB, S3, Secrets Manager)
- Redesign CI/CD pipelines to support serverless deployment workflows
- Ensure performance, cost-efficiency, and scalability in the new architecture
- Document migration strategies, patterns, and best practices for future reference

Technical Proficiency
- Strong industry experience of 4+ years with command of Java 8+, with a deep understanding of functional interfaces (Function, Predicate, Supplier, Consumer), the Streams API, lambda expressions, and Optional
- Proficiency in Java concurrency, including thread management, ExecutorService, CompletableFuture, and parallel streams; designing thread-safe components and understanding concurrency pitfalls
- Understanding of AWS EKS (Elastic Kubernetes Service), Docker containers, and Kubernetes fundamentals: experience with resource requests and limits, pod autoscaling, and K8s networking
- Familiarity with transitioning workloads from EKS to serverless environments.
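The role targets Quarkus/Java, but the token-based Lambda authorizer contract mentioned above is language-agnostic. The minimal Python sketch below (with a hypothetical shared-secret check, not this team's implementation) shows the IAM policy shape API Gateway expects back from an authorizer.

```python
# Minimal sketch of a token-based API Gateway Lambda authorizer (assumed logic).
# A real deployment would validate an OAuth 2.0 JWT issued by Azure AD instead
# of comparing a static token.
import os

def handler(event, context):
    token = event.get("authorizationToken", "")
    # Hypothetical check; replace with signature/claims validation of the bearer token.
    effect = "Allow" if token == f"Bearer {os.environ.get('EXPECTED_TOKEN', '')}" else "Deny"
    return {
        "principalId": "caller",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Action": "execute-api:Invoke",
                    "Effect": effect,
                    "Resource": event["methodArn"],
                }
            ],
        },
        # Optional key/value pairs passed through to the backend integration.
        "context": {"tokenChecked": "true"},
    }
```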

Posted 4 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

At PwC, the focus in data and analytics engineering is on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. You play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will concentrate on designing and building data infrastructure and systems to enable efficient data processing and analysis. Your responsibilities include developing and implementing data pipelines, data integration, and data transformation solutions.

As an AWS Architect / Manager at PwC - AC, you will interact with the Offshore Manager/Onsite Business Analyst to understand the requirements and will be responsible for end-to-end implementation of Cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS. Strong experience in AWS cloud technology is required, along with planning and organization skills. You will work as a cloud architect/lead on an agile team, provide automated cloud solutions, and monitor the systems routinely to ensure that all business goals are met per the business requirements.

**Position Requirements:**

**Must Have:**
- Experience in architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions
- Strong expertise in end-to-end implementation of Cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modeling techniques using Python/Java
- Design of scalable data architectures with Snowflake, integrating cloud technologies (AWS, Azure, GCP) and ETL/ELT tools such as DBT
- Guiding teams in proper data modeling (star and snowflake schemas), transformation, security, and performance optimization
- Experience in loading disparate data sets and translating complex functional and technical requirements into detailed design
- Deploying Snowflake features such as data sharing, events, and lake-house patterns
- Experience with data security and data access controls and design
- Understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake, dimensional modeling)
- Good knowledge of AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
- Proficiency in Lambda and Kappa architectures
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Knowledge of Big Data frameworks and related technologies, with experience in Hadoop and Spark
- Strong experience in AWS compute services such as EMR, Glue, and SageMaker and storage services such as S3, Redshift, and DynamoDB
- Experience with AWS streaming services such as Kinesis, SQS, and MSK
- Troubleshooting and performance-tuning experience in the Spark framework (Spark core, SQL, and Spark Streaming)
- Experience in workflow tools such as Airflow, NiFi, or Luigi
- Knowledge of application DevOps tools (Git, CI/CD frameworks), with experience in Jenkins or GitLab and source-code management services such as CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, CloudTrail, AWS Config, and Config Rules
- Understanding of cloud data migration processes, methods, and the project lifecycle
- Business/domain knowledge in Financial Services, Healthcare, Consumer Markets, Industrial Products, Telecommunication, Media and Technology, or Deal Advisory, along with technical expertise
- Experience in leading technical teams, guiding and mentoring team members
- Analytical and problem-solving skills
- Communication and presentation skills
- Understanding of Data Modeling and Data Architecture

**Desired Knowledge/Skills:**
- Experience in building stream-processing systems using solutions such as Storm or Spark Streaming
- Experience in Big Data ML toolkits such as Mahout, SparkML, or H2O
- Knowledge of Python
- AWS Architecture certification desirable
- Experience in Offshore/Onsite engagements
- Experience in AWS services like STEP & Lambda
- Project management skills with consulting experience in complex program delivery

**Professional and Educational Background:**
BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA

**Minimum Years of Experience Required:**
Candidates with 8-12 years of hands-on experience

Posted 4 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As an AWS Developer at PwC's Advisory Acceleration Center, you will collaborate with the Offshore Manager and Onsite Business Analyst to comprehend requirements and take charge of implementing Cloud data engineering solutions on AWS, such as Enterprise Data Lake and Data Hub. With a focus on architecting and delivering scalable cloud-based enterprise data solutions, you will bring your expertise in end-to-end implementation of Cloud data engineering solutions using tools like Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modeling techniques using Python/Java. Your responsibilities will include loading disparate data sets, translating complex requirements into detailed designs, and deploying Snowflake features like data sharing, events, and lake-house patterns.

You are expected to possess a deep understanding of relational and NoSQL data stores, including star and snowflake dimensional modeling, and demonstrate strong hands-on expertise in AWS services such as EMR, Glue, SageMaker, S3, Redshift, DynamoDB, and AWS streaming services like Kinesis, SQS, and MSK. Troubleshooting and performance-tuning experience in the Spark framework, familiarity with workflow tools like Airflow, NiFi, or Luigi, and proficiency in application DevOps tools like Git, CI/CD frameworks, Jenkins, and GitLab are essential for this role.

Desired skills include experience in building stream-processing systems using solutions like Storm or Spark Streaming, knowledge of Big Data ML toolkits such as Mahout, SparkML, or H2O, proficiency in Python, and exposure to Offshore/Onsite engagements and AWS services like STEP & Lambda.

Candidates with 2-4 years of hands-on experience in Cloud data engineering solutions, a professional background in BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA, and a passion for problem-solving and effective communication are encouraged to apply to be part of PwC's dynamic and inclusive work culture, where learning, growth, and excellence are at the core of our values. Join us at PwC, where you can make a difference today and shape the future tomorrow!

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As an AWS Developer at PwC's Acceleration Center in Bangalore, you will be responsible for the end-to-end implementation of Cloud data engineering solutions like Enterprise Data Lake and Data Hub in AWS. You will collaborate with the Offshore Manager/Onsite Business Analyst to understand requirements and architect scalable, distributed, cloud-based enterprise data solutions. Your role will involve hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modeling techniques using Python/Java.

You must have a deep understanding of relational and NoSQL data stores, methods, and approaches such as star and snowflake dimensional modeling. Strong expertise in AWS services like EMR, Glue, SageMaker, S3, Redshift, DynamoDB, and streaming services like Kinesis, SQS, and MSK is essential. Troubleshooting and performance-tuning experience in the Spark framework, along with knowledge of workflow tools like Airflow, NiFi, or Luigi, is required. Experience with application DevOps tools like Git, CI/CD frameworks, Jenkins, or GitLab is preferred. Familiarity with AWS CloudWatch, CloudTrail, AWS Config, Config Rules, and cloud data migration processes is expected. Good analytical, problem-solving, communication, and presentation skills are essential for this role.

Desired skills include building stream-processing systems using Storm or Spark Streaming, experience with Big Data ML toolkits like Mahout, SparkML, or H2O, and knowledge of Python. Exposure to Offshore/Onsite engagements and AWS services like STEP and Lambda would be a plus.

Candidates with 2-4 years of hands-on experience in cloud data engineering solutions and a background in BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA are encouraged to apply. Travel to client locations may be required based on project needs. This position falls under the Advisory line of service and the Technology Consulting horizontal, with the designation of Associate, based in Bangalore, India.

If you are passionate about working in a high-performance culture that values diversity, inclusion, and professional development, PwC could be the ideal place for you to grow and excel in your career. Apply now to be part of a global team dedicated to solving important problems and making a positive impact on the world.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Senior AWS Data Engineer - Cloud Data Platform at Teamware Solutions, a division of Quantum Leap Consulting Pvt. Ltd, located in Bangalore, you will be responsible for end-to-end implementation of Cloud data engineering solutions like Enterprise Data Lake and Data Hub in AWS. Working onsite in an office environment five days a week, you will collaborate with the Offshore Manager and Onsite Business Analyst to understand the requirements and deliver scalable, distributed, cloud-based enterprise data solutions.

You should have a strong background in AWS cloud technology, with 4-8 years of hands-on experience. Proficiency in architecting and delivering highly scalable solutions is a must, along with expertise in Cloud data engineering solutions, Lambda or Kappa architectures, data management concepts, and data modeling. You should be proficient in AWS services such as EMR, Glue, S3, Redshift, and DynamoDB, and have experience with Big Data frameworks like Hadoop and Spark. Additionally, you must have hands-on experience with AWS compute and storage services, AWS streaming services, troubleshooting and performance tuning in the Spark framework, and knowledge of application DevOps tools like Git and CI/CD frameworks. Familiarity with AWS CloudWatch, CloudTrail, AWS Config, Config Rules, security, key management, data migration processes, and analytical skills is required. Good communication and presentation skills are essential for this role.

Desired skills include experience in building stream-processing systems, Big Data ML toolkits, Python, Offshore/Onsite engagements, workflow tools like Airflow, NiFi, or Luigi, and AWS services like STEP & Lambda. A professional background in BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA is preferred, and an AWS Certified Data Engineer certification is recommended.

If you are interested in this position and meet the qualifications mentioned above, please send your resume to netra.s@twsol.com.

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Why this job matters
Cloud Native Java Developer - to individually contribute to and drive the transformation of our existing Java microservices deployed on Amazon Elastic Kubernetes Service (EKS) to serverless AWS Lambda functions. The roles and responsibilities are below.

What you'll be doing

Key Responsibilities
- Develop and deploy serverless applications using Quarkus/Spring Boot and AWS Lambda
- Build RESTful APIs and event-driven microservices using cloud-native patterns
- Optimize cold-start performance using GraalVM native images
- Integrate with AWS services such as AWS API Gateway, S3, DynamoDB, and CloudWatch, and with Postgres
- Implement and manage Lambda authorizers (custom and token-based) for securing APIs
- Design and configure AWS API Gateway for routing, throttling, and securing endpoints
- Integrate OAuth 2.0 authentication flows using Azure Active Directory as the identity provider
- Decent understanding of resilience patterns
- Write unit and integration tests using JUnit, Mockito, and Quarkus testing tools
- Collaborate with DevOps teams to automate deployments using AWS SAM, CDK, or Terraform
- Monitor and troubleshoot production issues using AWS observability tools

Migration Responsibilities
- Analyse existing Spring Boot microservices deployed on Kubernetes to identify candidates for serverless migration
- Refactor services to be stateless, event-driven, and optimized for short-lived execution
- Replace Kubernetes ingress and service discovery with API Gateway and Lambda triggers
- Migrate persistent state and configuration to AWS-native services (e.g., DynamoDB, S3, Secrets Manager)
- Redesign CI/CD pipelines to support serverless deployment workflows
- Ensure performance, cost-efficiency, and scalability in the new architecture
- Document migration strategies, patterns, and best practices for future reference

Technical Proficiency
- Strong industry experience of 4+ years with command of Java 8+, with a deep understanding of functional interfaces (Function, Predicate, Supplier, Consumer), the Streams API, lambda expressions, and Optional
- Proficiency in Java concurrency, including thread management, ExecutorService, CompletableFuture, and parallel streams; designing thread-safe components and understanding concurrency pitfalls
- Understanding of AWS EKS (Elastic Kubernetes Service), Docker containers, and Kubernetes fundamentals: experience with resource requests and limits, pod autoscaling, and K8s networking
- Familiarity with transitioning workloads from EKS to serverless environments.

Posted 2 months ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Gurugram

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Drupal
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions with team members to ensure that the design aligns with business objectives and technical feasibility, while also participating in the iterative process of application development to refine and enhance the solutions provided. You should have knowledge of PHP, Laravel, Drupal; HTML, CSS; SQL; Auth0, Terraform; AWS basics, AWS DevOps, AWS SAM (Lambda); Cloudflare, Cloudflare Workers; REST APIs; GitHub; and web servers.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Mentor junior team members to enhance their skills and knowledge in application design.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Drupal.
- Strong understanding of application design principles and methodologies.
- Experience with user interface design and user experience best practices.
- Familiarity with web development technologies such as HTML, CSS, and JavaScript.
- Ability to analyze and optimize application performance.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Drupal.
- This position is based at our Gurugram office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 2 months ago

Apply

10.0 - 15.0 years

12 - 16 Lacs

Hyderabad

Work from Office

JD for Data Engineering Lead - Python:
Data Engineering Lead with at least 7 to 10 years of experience in Python and the following AWS services: AWS SQS, AWS MSK, AWS RDS Aurora DB, Boto3, API Gateway, and CloudWatch. Responsibilities include providing architectural guidance to the offshore team, reviewing code, and troubleshooting errors. Very strong SQL knowledge is a must; the candidate should be able to understand and build complex queries. Familiarity with GitLab (repos and CI/CD pipelines) is required. He/she should work closely with the Virtusa onshore team as well as the enterprise architect and other client teams at onsite as needed. Experience in API development using Python is a plus. Experience in building MDM solutions is a plus.
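To illustrate the Python + Boto3 + SQS combination this JD calls out, here is a minimal sketch of sending and long-polling a queue; the queue URL and message payload are placeholders, not part of the JD.

```python
# Minimal sketch (placeholder queue URL): send one message, then long-poll
# for messages and delete each one after it is processed.
import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-queue"  # placeholder

# Producer side: enqueue a JSON payload.
sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps({"event": "customer_updated", "id": 7}))

# Consumer side: long-poll up to 20 seconds, process, then delete.
response = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20)
for message in response.get("Messages", []):
    payload = json.loads(message["Body"])
    print("processing", payload)
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])
```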

Posted 2 months ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

JD for Data Engineer - Python:
At least 5 to 8 years of experience in AWS Python programming; the candidate should be able to design, build, test, and deploy code. The candidate should have worked on Lambda-based API development and should have experience using the following AWS services: AWS SQS, AWS MSK, AWS RDS Aurora DB, and Boto3. Very strong SQL knowledge is a must; the candidate should be able to understand and build complex queries. He/she should work closely with the enterprise architect and other client teams at onsite as needed. Experience in building solutions using Kafka would be a good value addition (optional).
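Since this JD asks for Lambda-based processing alongside SQS, the sketch below shows a minimal Python handler for an SQS-triggered Lambda that reports partial batch failures; the payload fields and processing step are hypothetical.

```python
# Minimal sketch of an SQS-triggered Lambda handler (assumed payload shape).
# Returning batchItemFailures lets Lambda re-drive only the failed messages
# when the event source mapping has partial batch response enabled.
import json

def handler(event, context):
    failures = []
    for record in event.get("Records", []):
        try:
            body = json.loads(record["body"])
            # Hypothetical processing step, e.g. upsert into Aurora or call an API.
            print("processing message", body.get("id"))
        except Exception:
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}
```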

Posted 2 months ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Senior Data Engineer - Snowflake, AWS, Cortex AI & Horizon Catalog

Role Summary:
We are seeking an experienced Senior Data Engineer with deep expertise in modernizing Data & Analytics platforms on Snowflake, leveraging AWS services, Cortex AI, and Horizon Catalog for high-performance, AI-driven data management. The role involves designing scalable data architectures, integrating AI-powered automation, and optimizing data governance, lineage, and analytics frameworks.

Key Responsibilities:
- Architect and modernize enterprise Data & Analytics platforms on Snowflake, utilizing AWS, Cortex AI, and Horizon Catalog.
- Design and optimize Snowflake-based Lakehouse architectures, integrating AWS services (S3, Redshift, Glue, Lambda, EMR, etc.).
- Leverage Cortex AI for AI-driven data automation, predictive analytics, and workflow orchestration.
- Implement Horizon Catalog for enhanced data lineage, governance, metadata management, and security.
- Develop high-performance ETL/ELT pipelines, integrating Snowflake with AWS and AI-powered automation frameworks.
- Utilize Snowflake's native capabilities such as Snowpark, Streams, Tasks, and Dynamic Tables for real-time data processing.
- Establish data quality automation, lineage tracking, and AI-enhanced data governance strategies.
- Collaborate with data scientists, ML engineers, and business stakeholders to drive AI-led data initiatives.
- Continuously evaluate emerging AI and cloud-based data engineering technologies to improve efficiency and innovation.

Qualifications we seek in you!

Minimum Qualifications:
- Experience in Data Engineering, AI-powered automation, and cloud-based analytics.
- Expertise in Snowflake (Warehousing, Snowpark, Streams, Tasks, Dynamic Tables).
- Strong experience with AWS services (S3, Redshift, Glue, Lambda, EMR).
- Deep understanding of Cortex AI for AI-driven data engineering automation.
- Proficiency in Horizon Catalog for metadata management, lineage tracking, and data governance.
- Advanced knowledge of SQL, Python, and Scala for large-scale data processing.
- Experience in modernizing Data & Analytics platforms and migrating on-premises solutions to Snowflake.
- Strong expertise in Data Quality, AI-driven Observability, and ModelOps for data workflows.
- Familiarity with Vector Databases and Retrieval-Augmented Generation (RAG) architectures for AI-powered analytics.
- Excellent leadership, problem-solving, and stakeholder collaboration skills.

Preferred Skills:
- Experience with Knowledge Graphs (Neo4j, TigerGraph) for structured enterprise data systems.
- Exposure to Kubernetes, Terraform, and CI/CD pipelines for scalable cloud deployments.
- Background in streaming technologies (Kafka, Kinesis, AWS MSK, Snowflake Snowpipe).

Why Join Us?
- Lead Data & AI platform modernization initiatives using Snowflake, AWS, Cortex AI, and Horizon Catalog.
- Work on cutting-edge AI-driven automation for cloud-native data architectures.
- Competitive salary, career progression, and an opportunity to shape next-gen AI-powered data solutions.

Why join Genpact?
- Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation.
- Make an impact - drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
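As a rough illustration of the Snowpark-based processing mentioned in the responsibilities above, here is a minimal Python sketch that builds a small aggregate with the Snowpark DataFrame API and saves it as a table; the connection parameters, table names, and columns are placeholders, not details from the posting.

```python
# Minimal Snowpark sketch (placeholder credentials and table/column names).
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, count

session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# Aggregate shipped orders per day and persist the result as a table.
orders = session.table("RAW_ORDERS")
daily_shipped = (
    orders.filter(col("STATUS") == "SHIPPED")
          .group_by(col("ORDER_DATE"))
          .agg(count(col("ORDER_ID")).alias("SHIPPED_COUNT"))
)
daily_shipped.write.save_as_table("DAILY_SHIPPED_ORDERS", mode="overwrite")

session.close()
```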

Posted 2 months ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Drupal
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions with team members to brainstorm innovative solutions and ensure that the applications align with business objectives. Your role will also include reviewing design documents and providing feedback to enhance application performance and user experience, all while maintaining a focus on quality and efficiency. You must have knowledge of Adobe Analytics; PHP, Laravel, Drupal; HTML, CSS; JavaScript, Stencil.js, Vue.js; React; Python; Auth0, Terraform; Azure, Azure-ChatGPT; GenAI basics; AWS SAM (Lambda), AWS EC2, AWS S3, AWS RDS, AWS DynamoDB, AWS SNS, AWS SQS, AWS SES; Cloudflare, Cloudflare Workers; REST APIs; GitHub; web servers; and SQL.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Drupal.
- Strong understanding of web development principles and best practices.
- Experience with content management systems and their implementation.
- Familiarity with front-end technologies such as HTML, CSS, and JavaScript.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Drupal.
- This position is based at our Mumbai office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 3 months ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Drupal
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will also engage in discussions with team members to ensure that the design aligns with business objectives and technical feasibility, while continuously iterating on your designs based on feedback and testing outcomes. Your role will be pivotal in ensuring that the applications developed are user-friendly, efficient, and meet the highest standards of quality. You must have knowledge of Adobe Analytics; PHP, Laravel, Drupal; HTML, CSS; JavaScript, Vue.js; React; Python; GenAI basics; AWS SAM (Lambda), AWS EC2, AWS S3, AWS RDS, AWS DynamoDB, AWS SNS, AWS SQS, AWS SES; Cloudflare; REST APIs; GitHub; web servers; and SQL.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and brainstorming sessions to foster innovative solutions.
- Mentor junior team members to enhance their skills and knowledge.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Drupal.
- Strong understanding of application design principles and methodologies.
- Experience with front-end technologies such as HTML, CSS, and JavaScript.
- Familiarity with database management systems and data modeling.
- Ability to create and maintain technical documentation.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Drupal.
- This position is based at our Mumbai office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 3 months ago

Apply