7.0 - 12.0 years
18 - 30 Lacs
bengaluru
Remote
Position Summary: The AWS Engineer is responsible for end-to-end deployment, configuration, and reliability of AWS-based product demo environments, integrating GenAI pipelines and engineering practices. The role demands deep cloud infrastructure skills (ECS, Lambda, RDS, S3) and automation expertise (Terraform). Key Responsibilities: Architect, provision, and maintain AWS demo environments leveraging services such as EC2, ECS/Fargate, Lambda, S3, RDS, DynamoDB, IAM, KMS, API Gateway, CloudWatch, and AWS Glue. Build and maintain Terraform modules to automate environment provisioning, resets, and decommissioning. Integrate AWS GenAI solutions (Bedrock, SageMaker) for scenario automation and real-time demo enrichment. Collaborate with the Demo COE and Product SMEs to evaluate scenarios and implement custom flows. Implement RBAC, feature-flag toggling, and configuration management via AWS Systems Manager and AppConfig. Ensure environment health with automated monitoring, alerting, and load testing (CloudWatch, AWS Load Balancer, X-Ray). Support automated QA gates, metrics capture, and post-demo knowledge management. Provide secondary support for cloud data engineering (AWS Glue, Redshift, Athena) for ingesting and transforming data. Contribute to hybrid cloud demos and integrations with Azure as needed. Qualifications: Minimum 5 years of AWS engineering experience, with strong C# or Java development. Terraform expertise for AWS IaC. Experience with GenAI or ML services (Bedrock, SageMaker, Lambda). Professional certification in AWS is preferred.
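A Terraform-driven environment lifecycle like the one described (provision, reset, decommission) is often fed by small helper scripts. The Python sketch below renders a `terraform.tfvars.json` payload for one demo environment; all names here (`render_tfvars`, the tag keys, the TTL convention) are illustrative assumptions, not part of this role's actual tooling.

```python
import json

def render_tfvars(env_name: str, owner: str, ttl_hours: int = 8) -> str:
    """Render a terraform.tfvars.json payload for one demo environment.

    Tagging every stack with an expiry window lets a scheduled
    decommission job find and destroy stale demo environments.
    (Hypothetical convention, for illustration only.)
    """
    if not env_name.isidentifier():
        raise ValueError("env_name must be a safe identifier")
    return json.dumps(
        {
            "environment": env_name,
            "tags": {"owner": owner, "ttl_hours": ttl_hours, "purpose": "demo"},
        },
        indent=2,
        sort_keys=True,
    )
```

A wrapper script would write this string next to the module and invoke `terraform apply -var-file=...`.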
Posted 2 weeks ago
6.0 - 9.0 years
30 - 32 Lacs
coimbatore
Work from Office
Job Summary: We are seeking a skilled Cloud Migration Consultant with hands-on experience in assessing and migrating complex applications from AWS to Azure. The ideal candidate will work closely with Microsoft business units, participating in application intake, assessment, and migration planning. This role includes creating migration artifacts, leading client interactions, and supporting application modernization initiatives on Azure with occasional AWS exposure. Key Responsibilities: Assess application readiness by documenting architecture, dependencies, and migration strategies. Conduct interviews with stakeholders and generate discovery insights using tools like Azure Migrate, CloudockIt, and PowerShell. Develop architecture diagrams, migration playbooks, and manage Azure DevOps boards. Set up and configure applications in on-premises and cloud environments, primarily Azure. Support proof-of-concepts (PoCs) and provide expert advice on migration and modernization options. Collaborate with application, database, and infrastructure teams to ensure smooth transition to migration factory teams. Track project progress, identify blockers and risks, and report timely status updates to leadership. Required Skills and Qualifications: Minimum 4 years of experience in cloud migration and application assessment. Strong expertise in Azure IaaS and PaaS services (e.g., VMs, App Services, Azure Data Factory). Familiarity with AWS IaaS and PaaS components (e.g., EC2, RDS, Glue, S3). Proficient in programming languages and frameworks including Java (Spring Boot), C#, .NET, Python, Angular, React.js, and REST APIs. Working knowledge of Kafka, Docker, Kubernetes, and Azure DevOps. Solid understanding of network infrastructure including VNets, NSGs, Firewalls, and WAFs. Experience with IAM concepts and technologies such as OAuth, SAML, Okta, and SiteMinder. Exposure to Big Data technologies like Databricks, Hadoop, Oracle, and DocumentDB. 
Preferred Qualifications: Azure or AWS cloud certifications. Prior experience with enterprise-scale cloud migration projects, especially within the Microsoft ecosystem. Excellent communication skills and proven ability to manage stakeholder relationships effectively. Location : Hyderabad/ Bangalore/ Coimbatore/ Pune
Posted 3 weeks ago
5.0 - 9.0 years
6 - 9 Lacs
kolkata
Work from Office
We're Hiring: Lead AI Engineer
Experience: 5+ Years
Location: Kolkata
Company: Decorpot
Key Skills & Expertise:
* AI Frameworks: TensorFlow, PyTorch, TensorFlow.js, Hugging Face
* 3D Integration: Three.js, Babylon.js, WebGL
* Cloud Platforms: AWS (SageMaker, S3), Google Cloud AI, Azure ML
* Version Control: Git (GitHub/GitLab)
* Collaboration Tools: JIRA, Slack, Trello (Agile workflows)
* Testing: Postman (API), Selenium (Web apps)
Apply Now: perambhargav.reddy@decorpot.com
Posted 3 weeks ago
3.0 - 8.0 years
8 - 16 Lacs
chennai
Work from Office
Strong experience in handling and managing Jenkins pipeline deployments and troubleshooting plugin and build failures. Strong experience in GitLab CI and template creation is a must. Strong Linux administration knowledge and experience. Good understanding of source code tools like GitLab, GitHub, etc. Experience in handling GitLab runner setup and Jenkins cloud K8s node setup is a must. Good foundational AWS knowledge with basic services like EC2, EBS, S3, RDS, etc. Experience in Fortify, Black Duck, and SonarQube scans and regression/unit testing pipelines is mandatory. Basic understanding of scripting (Ansible/Python) is needed. CI/CD administrative/analytical experience in handling continuous deployment is a must. Knowledge of handling Jenkins in a container platform environment would be an added advantage. Knowledge of Jenkins administration, patch updates, and HA/DR exercises would be an added advantage. Good understanding of Argo CD is an added advantage. Good knowledge of K8s and Docker is an added advantage. Good knowledge of GitLab architecture components like Redis cache, Nginx, Puma, Workhorse, Sidekiq, etc. would be an added advantage.
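As a rough illustration of the build-failure troubleshooting this role involves, a small script can pre-classify Jenkins log excerpts before a human digs in. The failure signatures below are invented examples, not an official or exhaustive list.

```python
import re

# Hypothetical failure signatures seen in CI build logs.
FAILURE_PATTERNS = {
    "plugin": re.compile(r"Plugin .* failed to load", re.I),
    "oom": re.compile(r"java\.lang\.OutOfMemoryError"),
    "scm": re.compile(r"(fatal: could not read|Couldn't find any revision)", re.I),
}

def classify_failure(log_text: str) -> str:
    """Return the first matching failure category, or 'unknown'."""
    for label, pattern in FAILURE_PATTERNS.items():
        if pattern.search(log_text):
            return label
    return "unknown"
```

A triage bot could tag failed builds with the category, so plugin issues and OOMs land with the right owner immediately.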
Posted 3 weeks ago
3.0 - 5.0 years
10 - 13 Lacs
pune, mumbai (all areas)
Work from Office
JD: Data Scientist (AI/ML & Analytics)
Responsibilities/Duties
• Product collaboration: Translate product needs into clear problem statements, evaluation plans, and straightforward solution flows; contribute to UI/UX considerations for model inputs/outputs and edge cases.
• Applied ML: Build and iterate on models for logistics use cases (e.g., chatbot, forecasting, ETA, anomaly detection, recommendations, document parsing), favoring simple, reliable approaches first.
• Data preparation: Explore and prepare data from MongoDB Atlas, Kafka/MSK streams, and S3; implement reproducible feature pipelines and basic data quality checks.
• Evaluation & rollout: Define metrics and baselines; run offline evaluations and support phased/A-B rollouts with clear success criteria.
• Integration: Package models as small services or batch jobs; collaborate with Backend/Frontend to design APIs/GraphQL contracts, payloads, and error handling that fit our core platform.
• DevOps collaboration: Request/provision suitable compute (EC2; occasional GPU), containerize jobs (Docker), set up light scheduling (EventBridge/Step Functions, or Airflow if used), and add basic monitoring/alerts.
• Analytics & insights: Perform focused analyses to support product decisions and present findings with clear visuals and concise summaries.
• Quality & governance: Follow coding standards; version datasets/models where practical; respect access controls for customer data; keep documentation current (problem briefs, data lineage, model cards, runbooks).
Criteria for the Role
• 3–5 years in applied machine learning with at least one end-to-end production delivery.
• Strong Python (pandas, numpy, scikit-learn; plus XGBoost/LightGBM).
• Comfortable with SQL and working with MongoDB/event data (Kafka/MSK).
• Experience deploying models as APIs or batch jobs (Docker; Git-based workflows); familiarity with CI/CD concepts.
• Understanding of evaluation design for classification/regression/time-series; pragmatic feature engineering.
• Clear communication and a collaborative, documentation-friendly approach.
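The "metrics and baselines" duty can be made concrete with a tiny example: before any forecasting or ETA model ships, a naive baseline (predict the previous observation) sets the error bar the model must beat. This is a generic sketch, not code from the role; the sample series is made up.

```python
def mae(actual, predicted):
    """Mean absolute error over two equal-length sequences."""
    assert len(actual) == len(predicted)
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def naive_forecast(series):
    # Predict each point as the previous observation (last-value baseline).
    return series[:-1]

# Hypothetical ETA observations (e.g., delivery times in hours).
series = [10.0, 12.0, 11.0, 13.0]
baseline_error = mae(series[1:], naive_forecast(series))
```

Any candidate model is then evaluated offline against `baseline_error`; if it cannot beat the last-value baseline, it is not worth deploying.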
Posted 3 weeks ago
6.0 - 9.0 years
30 - 32 Lacs
hyderabad, pune, bengaluru
Work from Office
Job Summary: We are seeking a skilled Cloud Migration Consultant with hands-on experience in assessing and migrating complex applications from AWS to Azure. The ideal candidate will work closely with Microsoft business units, participating in application intake, assessment, and migration planning. This role includes creating migration artifacts, leading client interactions, and supporting application modernization initiatives on Azure with occasional AWS exposure. Key Responsibilities: Assess application readiness by documenting architecture, dependencies, and migration strategies. Conduct interviews with stakeholders and generate discovery insights using tools like Azure Migrate, CloudockIt, and PowerShell. Develop architecture diagrams, migration playbooks, and manage Azure DevOps boards. Set up and configure applications in on-premises and cloud environments, primarily Azure. Support proof-of-concepts (PoCs) and provide expert advice on migration and modernization options. Collaborate with application, database, and infrastructure teams to ensure smooth transition to migration factory teams. Track project progress, identify blockers and risks, and report timely status updates to leadership. Required Skills and Qualifications: Minimum 4 years of experience in cloud migration and application assessment. Strong expertise in Azure IaaS and PaaS services (e.g., VMs, App Services, Azure Data Factory). Familiarity with AWS IaaS and PaaS components (e.g., EC2, RDS, Glue, S3). Proficient in programming languages and frameworks including Java (Spring Boot), C#, .NET, Python, Angular, React.js, and REST APIs. Working knowledge of Kafka, Docker, Kubernetes, and Azure DevOps. Solid understanding of network infrastructure including VNets, NSGs, Firewalls, and WAFs. Experience with IAM concepts and technologies such as OAuth, SAML, Okta, and SiteMinder. Exposure to Big Data technologies like Databricks, Hadoop, Oracle, and DocumentDB. 
Preferred Qualifications: Azure or AWS cloud certifications. Prior experience with enterprise-scale cloud migration projects, especially within the Microsoft ecosystem. Excellent communication skills and proven ability to manage stakeholder relationships effectively. Location : Hyderabad/ Bangalore/ Coimbatore/ Pune
Posted 3 weeks ago
6.0 - 9.0 years
10 - 18 Lacs
pune
Hybrid
Job Description: Experience: 6-9 yrs. Role & responsibilities: Responsible for building agentic workflows using modern LLM orchestration frameworks to automate and optimize complex business processes in the Travel domain. Individual contributor (IC), owning end-to-end development of intelligent agents and services that power customer experiences, recommendations, and backend automation. Design and implement agentic and autonomous workflows using frameworks such as LangGraph, LangChain, and CrewAI. Translate business problems in the Travel domain into intelligent LLM-powered workflows. Own at least two AI use-case implementations from design to production deployment. Build and expose RESTful and GraphQL APIs to support internal and external consumers. Develop and maintain robust Python-based microservices using FastAPI or Django. Collaborate with product managers, data engineers, and backend teams to design seamless AI-driven user experiences. Deploy and maintain workflows and APIs on AWS with best practices in scalability and security. Nice to have: Experience in Big Data technologies (Hadoop, Teradata, Snowflake, Spark, Redshift, Kafka, etc.) for data processing. Experience with data management processes on AWS is a huge plus. AWS certification. Hands-on experience building applications with LangGraph, LangChain, and CrewAI. Experience working with AWS services: Lambda, API Gateway, S3, ECS, DynamoDB. Proven track record of implementing at least two AI/LLM-based use cases in production. Strong problem-solving skills with the ability to deconstruct complex problems into actionable AI workflows. Experience building scalable, production-grade APIs using FastAPI or Django. Strong command of Python and software engineering best practices. Solid understanding of multithreading, IO operations, and scalability patterns in backend systems.
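Stripped of any particular framework, the agentic workflows described here reduce to a routing loop: inspect state, pick a tool, apply it, repeat until done. The sketch below is a framework-free stand-in (LangGraph and CrewAI provide much richer graph/crew abstractions); the travel-flavored tool names and return values are made up for illustration.

```python
def route(state: dict) -> str:
    """Hypothetical routing rule: look up the fare first, then book."""
    if "fare" not in state:
        return "lookup_fare"
    if "booking_id" not in state:
        return "book"
    return "done"

# Stand-in tools; in a real workflow these would call LLMs or travel APIs.
TOOLS = {
    "lookup_fare": lambda s: {**s, "fare": 120.0},
    "book": lambda s: {**s, "booking_id": "BK-1"},
}

def run(state: dict, max_steps: int = 5) -> dict:
    """Drive the agent loop until the router reports 'done' or steps run out."""
    for _ in range(max_steps):
        step = route(state)
        if step == "done":
            break
        state = TOOLS[step](state)
    return state
```

The `max_steps` cap is the simple guard every agent loop needs so a mis-routing bug cannot spin forever.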
Posted 3 weeks ago
7.0 - 12.0 years
15 - 18 Lacs
mumbai, delhi / ncr, bengaluru
Work from Office
About Role: We are seeking an experienced AWS Cloud Architect to design, implement, and optimize our cloud infrastructure and data platform solutions. The ideal candidate will bring deep expertise in AWS services, Infrastructure as Code (IaC), security, networking, and data platforms, with a proven track record in delivering large-scale, reliable, and secure cloud environments. Roles & Responsibilities Design and implement secure, scalable AWS cloud architectures (compute, storage, networking, data, CI/CD). Develop Infrastructure as Code (Terraform/CloudFormation/CDK) and automated provisioning pipelines. Ensure security, compliance, and governance across AWS environments. Optimize cloud costs, performance, and scalability (autoscaling, caching, multi-AZ/region). Architect data platforms with Kafka, S3, RDS, DynamoDB, and analytics integrations. Define and implement HA, DR, backup, and recovery strategies. Establish observability frameworks (CloudWatch, X-Ray, logging, alerting). Lead CI/CD pipelines, release management, and deployment strategies (blue/green, canary). Collaborate with developers and stakeholders to deliver reliable, high-performing solutions. Required Skills & Qualifications Proven experience architecting AWS solutions at enterprise scale. Expertise with AWS services: EC2, ECS/EKS, Lambda, VPC, RDS, S3, Kinesis/Kafka, IAM, CloudWatch, CloudTrail, Route53, ALB/NLB. Strong skills in Infrastructure as Code (Terraform, CloudFormation, CDK). Experience with CI/CD pipelines and container orchestration (EKS or ECS/Fargate). Strong background in security, compliance, and identity management. Data platform experience with Kafka, RDS/Postgres, DynamoDB, data lakes. Excellent documentation and architecture diagramming skills (ArchiMate preferred). Certifications (Preferred) AWS Certified Solutions Architect Professional (or equivalent experience). Other relevant AWS specialty certifications (Networking, Security, Data Analytics) are a plus. 
Location-Delhi NCR,Bangalore,Chennai,Pune,Kolkata,Ahmedabad,Mumbai,Hyderabad,Remote
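One of the deployment strategies this role lists (canary) comes down to weighted traffic routing; ALB supports this natively via weighted target groups, but the core idea fits in a few lines. The function below is a conceptual sketch, not production routing code.

```python
import random

def choose_target(canary_weight: float, rng=random.random) -> str:
    """Return 'canary' with probability canary_weight, else 'stable'.

    rng is injectable so the split is testable deterministically.
    """
    if not 0.0 <= canary_weight <= 1.0:
        raise ValueError("weight must be in [0, 1]")
    return "canary" if rng() < canary_weight else "stable"
```

A rollout controller would start with a small weight (say 0.05), watch error-rate alarms, and either ramp the weight up or snap it back to 0.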
Posted 3 weeks ago
5.0 - 10.0 years
16 - 20 Lacs
mumbai, goregaon
Work from Office
Role Overview We are seeking a highly skilled Engineering Manager with deep expertise in the MERN stack (MongoDB, Express, React, Node.js), AWS infrastructure, and DevOps practices. This role requires both hands-on technical leadership and strong people management to lead a team of engineers building scalable, high-performance applications. Key Responsibilities Lead, mentor, and manage a team of full-stack developers working primarily with MERN. Own architecture decisions, code quality, and engineering practices across multiple microservices. Collaborate with Product, Design, and QA teams to define and deliver on product roadmaps. Implement CI/CD pipelines, infrastructure as code, and automated testing strategies. Ensure system scalability, security, and performance optimization across services. Drive sprint planning, code reviews, and technical documentation standards. Work closely with DevOps to maintain uptime and operational excellence. Required Skills 6+ years of experience with full-stack JavaScript development (MERN stack) 2+ years in a leadership/managerial role Strong understanding of Node.js backend and API development Hands-on with React.js, component design, and front-end state management Proficient in MongoDB and designing scalable NoSQL schemas Experience in AWS services (EC2, S3, RDS, Lambda, CloudWatch, IAM) Working knowledge of Docker, GitHub Actions, or similar CI/CD tools Familiarity with monitoring tools like New Relic, Datadog, or Prometheus Solid experience managing agile workflows and team velocity
Posted 3 weeks ago
7.0 - 9.0 years
30 - 32 Lacs
chennai, bengaluru
Work from Office
Hiring Cloud Engineers for an 8-month contract role based in Chennai or Bangalore with hybrid/remote flexibility. The ideal candidate will have 8+ years of IT experience, including 4+ years in AWS cloud migrations, with strong hands-on expertise in AWS MGN, EC2, EKS, Terraform, and scripting using Python or Shell. Responsibilities include leading lift-and-shift migrations, automating infrastructure, migrating storage to EBS, S3, and EFS, and modernizing legacy applications. AWS/Terraform certifications and experience in monolithic and microservices architectures are preferred. Keywords: Cloud Engineer, AWS Migration, AWS MGN
Posted 3 weeks ago
5.0 - 8.0 years
8 - 10 Lacs
hyderabad, bengaluru, delhi / ncr
Work from Office
We are seeking a skilled Application Support Engineer to manage and support multiple invoice processing and development projects. The ideal candidate should have experience in production support, issue resolution, process automation, and working with tools like Jiffy and Python scripting. Responsibilities: Provide end-to-end support, including: running scheduled tasks based on received files; issue resolution and root cause analysis; handling new change requests and enhancements. Support and development in DEV environments for document and invoice processing (VOLVO & DXC projects). Ongoing support and development for the Gatwick application. Collaborate with cross-functional teams to ensure smooth delivery of features and fixes. Utilize the Jiffy automation tool for process automation and monitoring. Write and maintain scripts in Python to automate repetitive support tasks. Document processes and provide clear communication on resolutions and updates. Key skills: Experience in unit testing, distributed systems, and API development is essential. Requirements * Minimum 8 years of hands-on development experience in a similar role * Bachelor's degree in Information Technology, Computer Science, Software Engineering, or a related field * Very good level of English communication (spoken and written) * Ability to work Indian Standard Time (IST) hours, with flexibility to adjust start times during onboarding to overlap with Australian business hours for the first week * Ability to navigate ambiguity and deliver in a complex environment * Strong collaborative and leadership skills. Location: Remote - Bengaluru, Hyderabad, Delhi/NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
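The "running scheduled tasks based on received files" duty typically starts with a triage step before any processing runs. The sketch below shows one such rule in Python; the file-name conventions (including the `VOLVO_` prefix) are assumptions for illustration only, not the projects' actual naming scheme.

```python
from pathlib import PurePosixPath

def triage(filename: str) -> str:
    """Decide what to do with an incoming invoice file by name.

    Hypothetical routing: unsupported formats are rejected, files for a
    specific client prefix go to a dedicated queue, the rest go to the
    default queue.
    """
    path = PurePosixPath(filename)
    if path.suffix.lower() not in {".pdf", ".xml", ".csv"}:
        return "reject"
    if path.stem.startswith("VOLVO_"):
        return "volvo_queue"
    return "default_queue"
```

A scheduler would call this per received file and log the decision, giving root-cause analysis a clear audit trail when something lands in the wrong queue.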
Posted 3 weeks ago
6.0 - 8.0 years
10 - 20 Lacs
pune
Hybrid
JD: 6+ years of experience in data engineering, specifically in cloud environments like AWS. (Note: do not share Data Science profiles.) Proficiency in Python and PySpark for data processing and transformation tasks. Solid experience with AWS Glue for ETL jobs and managing data workflows. Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration. Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2. Technical Skills: Deep understanding of ETL concepts and best practices. Strong knowledge of SQL for querying and manipulating relational and semi-structured data. Experience with Data Warehousing and Big Data technologies, specifically within AWS. Additional Skills: Experience with AWS Lambda for serverless data processing and orchestration. Understanding of AWS Redshift for data warehousing and analytics. Familiarity with Data Lakes, Amazon EMR, and Kinesis for streaming data processing. Knowledge of data governance practices, including data lineage and auditing. Familiarity with CI/CD pipelines and Git for version control. Experience with Docker and containerization for building and deploying applications. Design and Build Data Pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes. ETL Development: Develop and maintain Extract, Transform, and Load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets. Data Workflow Automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs. Data Integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms. Optimization and Scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.
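The ETL shape referenced throughout this posting (extract, transform, load) is the same whether the engine is PySpark on Glue or plain Python. A minimal stdlib illustration, with made-up column names:

```python
import csv
import io

# Stand-in for a file landed in S3; the schema is invented.
RAW = "order_id,amount\n1,10.5\n2,\n3,7.25\n"

def extract(text: str) -> list[dict]:
    """Read raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[dict]:
    """Drop rows with a missing amount, cast the rest to typed values."""
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in rows
        if r["amount"]
    ]

loaded = transform(extract(RAW))  # "load" here is just holding the result
```

In a Glue job the same steps become a `DynamicFrame` read, a mapping/filter transform, and a sink write, but the extract-transform-load decomposition is identical.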
Posted 3 weeks ago
10.0 - 17.0 years
15 - 30 Lacs
visakhapatnam
Hybrid
Key Responsibilities Lead and mentor engineering teams, fostering best practices in Cloud architecture and engineering Design and implement secure, scalable and robust cloud infrastructure and platform services on AWS and Azure, adhering to Cloud and industry best practices Develop and maintain infrastructure as code (IaC) for the Cloud ecosystem using tools like AWS CloudFormation, Terraform, ARM, Chef and Ansible Establishes Architecture technical design, engineering, and documentation standards as a reference for all new systems implementations and workload migrations Hands-on skills with Cloud computing, containers/orchestration, auto scaling, CI/CD and pipeline optimization Implement observability frameworks , including monitoring, logging, and alerting, to ensure system reliability and performance Enforce security best practices , covering IAM policies, encryption, network segmentation, and compliance with standards such as GxP, CIS, NIST, HIPAA, and ISO Collaborate with stakeholders across R&D, clinical, regulatory, and manufacturing to understand business needs and translate them into technical solutions Drive cost and performance optimization across various Cloud environments, ensuring efficient use of resources
Posted 3 weeks ago
5.0 - 10.0 years
15 - 25 Lacs
coimbatore
Hybrid
Strong Java 8/17 hands-on experience.
1. Java Full Stack (strong Java backend development with strong microservices development experience; basic hands-on experience with Angular or React)
2. Java Full Stack (strong Java backend development with strong microservices development experience; strong Angular/React UI hands-on experience) / Java Cloud - AWS (Lambda, EC2, S3)
3. Java Backend Developer (strong Core Java (Java, Collections, Spring Core, Spring JDBC, JPA) development with medium hands-on microservices development experience)
4. Experience: Min 6 yrs, Max 13 yrs
Notice: Immediate joiners - 30 days
Location: Coimbatore
Interview: Virtual (2 rounds)
Interested candidates can share their profile by email: brindag@hexaware.com
Please mention the subject: "Profile for Java Lead"
Please mention the below:
Total Exp:
Current & Preferred Location:
Current & Expected CTC:
Notice period & Last working date:
Education:
Any offer holding & details:
Current company:
Skills - Primary & secondary:
Posted 3 weeks ago
10.0 - 14.0 years
27 - 30 Lacs
nagpur, pune
Work from Office
Skills Required: Strong expertise in Java, Spring Boot, Microservices Hands-on experience with AWS (EKS, EC2, S3, Lambda, RDS) Proficient in REST APIs, Kafka, SQL/NoSQL Databases Excellent problem-solving & communication skills If interested, kindly share your CV at [mayuri.jain@apex1team]
Posted 3 weeks ago
8.0 - 12.0 years
30 - 35 Lacs
pune
Work from Office
Job Summary Zywave is looking for an experienced and hands-on Technical Lead with deep expertise in Cube.js, Chart.js, .NET Core, and AWS to lead the development of intelligent and scalable data reporting solutions. The ideal candidate will be passionate about building real-time dashboards, integrating AI-enhanced insights, and mentoring a team in delivering impactful visual analytics for the insurance domain. Key Responsibilities Lead the design and development of interactive dashboards using Cube.js and Chart.js. Architect backend services and APIs with .NET Core to power analytics and reporting layers. Collaborate with AI/ML teams to integrate machine learning models into reporting workflows. Oversee the deployment and management of data reporting infrastructure on AWS, ensuring performance, scalability, and cost-efficiency. Work closely with cross-functional teams (data engineers, analysts, and product stakeholders) to define and deliver business-focused visualizations. Drive optimization of data querying, caching, and rendering for real-time and historical insights. Review code, enforce best practices, and mentor junior and mid-level engineers. Champion continuous improvements in performance, reliability, and security of reporting systems. Keep pace with the latest advancements in data visualization, cloud platforms, and AI technologies. Qualifications Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 8+ years of software development experience, with at least 3 years in a technical leadership role. Proven hands-on experience with Cube.js, Chart.js, and .NET Core. Strong knowledge of AWS services (e.g., Lambda, API Gateway, S3, RDS). Expertise in SQL, RESTful API development, and data modeling. Experience integrating AI/ML models into production-level applications or dashboards. Familiarity with frontend frameworks like React or Vue.js for embedding visual content.
Understanding of DevOps practices, CI/CD pipelines, and code quality standards. Prior experience in the insurance domain is a strong plus.
Mandatory Skills: Git, .NET Core, SQL
Good-to-Have Skills: Cube.js, Chart.js, RESTful APIs, Data Modeling, AWS, Prompt Engineering
Domain Experience: Prior exposure to the insurance domain is preferred.
Work Mode: 5 Days Work from Office
Posted 3 weeks ago
7.0 - 12.0 years
10 - 15 Lacs
hyderabad
Work from Office
We are seeking a Lead/Senior Data Engineer with 7-12 years of experience to architect, develop, and optimize data solutions in a cloud-native environment. The role requires strong expertise in AWS Glue, PySpark, and Python with a proven ability to design scalable data pipelines and frameworks for large-scale enterprise systems. Prior exposure to financial services or regulated environments is a strong advantage. Key Responsibilities Design and implement secure, scalable pipelines using AWS Glue, PySpark, EMR, S3, Lambda, and other AWS services. Lead ETL development for structured and semi-structured data, ensuring high performance and reliability. Build reusable frameworks, automation tools, and CI/CD pipelines with AWS CodePipeline, Jenkins, or GitLab. Mentor junior engineers, conduct code reviews, and enforce best practices. Implement data governance practices including quality, lineage, and compliance standards. Collaborate with product, analytics, compliance, and DevOps teams to align technical solutions with business goals. Optimize workflows for cost efficiency, scalability, and speed. Prepare technical documentation and present architectural solutions to stakeholders. Requirements Strong hands-on experience with AWS Glue, PySpark, Python, and AWS services (EMR, S3, Lambda, Redshift, Athena). Proficiency in ETL workflows, Airflow (or equivalent), and DevOps practices. Solid knowledge of data governance, lineage, and agile methodologies. Excellent communication and stakeholder engagement skills. Financial services or regulated environment background preferred.
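The data governance responsibility above ("quality, lineage, and compliance standards") is often implemented as a table of named rules applied row by row before data is promoted. A minimal sketch; the rule names and field names are invented, and in a Glue/PySpark job the same rules would run as DataFrame filters.

```python
# Hypothetical row-level data-quality rules keyed by name.
RULES = {
    "not_null_id": lambda row: row.get("id") is not None,
    "amount_non_negative": lambda row: row.get("amount", 0) >= 0,
}

def check(rows: list[dict]) -> list[tuple[int, str]]:
    """Return (row_index, rule_name) for every failed rule."""
    failures = []
    for i, row in enumerate(rows):
        for name, rule in RULES.items():
            if not rule(row):
                failures.append((i, name))
    return failures
```

Pipelines typically gate promotion on `check` returning an empty list, and write the failure tuples to an audit table for lineage and compliance review.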
Posted 3 weeks ago
5.0 - 7.0 years
0 - 3 Lacs
bengaluru
Work from Office
Key Responsibilities
• Architect and manage AWS services (EC2, S3, IAM, CloudFront, EKS/ECS) to deliver secure and scalable environments.
• Design, build, and maintain CI/CD pipelines (GitLab, GitHub Actions, Jenkins) to enable reliable and efficient deployments.
• Manage containerized workloads using Docker and Kubernetes (EKS/ECS).
• Implement Infrastructure as Code (Terraform/CloudFormation) and drive automation across environments.
• Troubleshoot infrastructure and application issues in real time, ensuring minimal downtime and fast recovery.
• Monitor and optimize system performance, availability, and cost using tools like Datadog, Prometheus, and CloudWatch.
• Work with relational and non-relational databases, including MySQL, MongoDB, and Redis.
• Integrate and maintain messaging queues with RabbitMQ/AMQP.
• Collaborate with developers, QA, and product teams to deliver features reliably and quickly.
• Ensure security best practices across cloud environments and CI/CD pipelines.
Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience as a DevOps Engineer or similar role.
• Strong expertise in AWS (EC2, S3, IAM, CloudFront, EKS, ECS); certification preferred.
• Proficiency in scripting languages (Python, Bash, PowerShell).
• Hands-on with Docker, Kubernetes, and Infrastructure as Code (Terraform/CloudFormation).
• Experience with CI/CD tools like GitLab, GitHub Actions, and Jenkins.
• Knowledge of networking, security principles, and DevOps best practices.
• Strong problem-solving and troubleshooting skills.
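Troubleshooting with minimal downtime usually leans on retry-with-backoff helpers inside deployment and health-check scripts. A generic sketch in Python; the probe being retried is a stand-in, not a specific service of this role.

```python
import time

def retry(fn, attempts: int = 3, base_delay: float = 0.0):
    """Call fn up to `attempts` times with exponential backoff.

    Returns fn's result on the first success; re-raises the last
    exception if every attempt fails.
    """
    last_exc = None
    for i in range(attempts):
        try:
            return fn()
        except Exception as exc:  # broad on purpose for this sketch
            last_exc = exc
            time.sleep(base_delay * (2 ** i))  # 1x, 2x, 4x, ...
    raise last_exc
```

A rollout script would wrap its health probe in `retry(probe, attempts=5, base_delay=2.0)` so a pod that is still warming up does not trigger a false rollback.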
Posted 3 weeks ago
6.0 - 11.0 years
0 - 0 Lacs
chennai
Hybrid
Job details: Title: AWS Data Engineer. Type: Hybrid. Location: Chennai. Key Skills: AWS Glue, Redshift, S3, Lambda, Athena. Hands-on experience as a Data Engineer with AWS: Glue, Lambda, SQL, Python, Redshift. Must have working knowledge in designing and implementing data pipelines on any of the cloud providers (AWS is preferred). Must be able to work with large volumes of data coming from various sources. Perform data cleansing, data validation, etc. Hands-on ETL developer who is good at Python and SQL. AWS services like Glue, Glue crawlers, Lambda, Redshift, Athena, S3, EC2, IAM; monitoring and logging mechanisms: AWS CloudWatch, setting up alerts. Deployment knowledge on cloud. Integrate CI/CD pipelines to build artifacts and deploy changes to higher environments. Scheduling frameworks: Airflow, AWS Step Functions. Excellent communication skills; should be able to work collaboratively with other teams.
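The data cleansing and validation mentioned here can be illustrated with a small normalization-and-dedup pass. The field names below are assumptions for illustration, not the actual schema; in a Glue job the same logic would be a map plus a distinct/window step.

```python
def cleanse(records: list[dict]) -> list[dict]:
    """Normalize a hypothetical email field and drop blanks/duplicates."""
    seen = set()
    out = []
    for rec in records:
        email = rec.get("email", "").strip().lower()
        if not email or email in seen:
            continue  # drop blank and duplicate emails
        seen.add(email)
        out.append({**rec, "email": email})
    return out
```

Keeping the cleansing rules in one pure function like this makes them easy to unit-test before they run against large source volumes.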
Posted 3 weeks ago
5.0 - 10.0 years
15 - 25 Lacs
chennai
Work from Office
Hiring For Top IT Company - Designation: AWS Data Engineer. Skills: AWS, Glue, Lambda, SQL, Python, Redshift, S3. Location: Chennai. Exp: 5+ yrs. Call: Akshita - 9785478741, Surbhi - 8058357806, Ambika - 9672301543. Thanks, Team Converse
Posted 3 weeks ago
3.0 - 5.0 years
7 - 14 Lacs
bengaluru
Work from Office
We are seeking a talented DevOps Engineer with expertise in the technical skills outlined in the provided document. Below are the updated responsibilities and qualifications tailored to this role. Responsibilities: Design, deploy, and manage scalable AWS infrastructure utilizing services such as EC2, S3, RDS, Lambda, CloudFormation, IAM, VPC, Load Balancers, and Auto Scaling. Build and maintain robust CI/CD pipelines using Jenkins and AWS CodePipeline to streamline development and deployment processes. Implement and manage configuration management tools, including Ansible, Chef, and Puppet, to ensure system consistency and automation. Deploy and orchestrate containerized applications using Docker, Kubernetes, and Amazon ECS. Maintain and administer version control systems, primarily Git and GitHub. Develop and manage Infrastructure as Code (IaC) solutions using Terraform to automate provisioning and resource management. Write and maintain automation scripts in Python, Bash, and PowerShell to support operational workflows. Configure and manage networking components such as VPCs, subnets, security groups, and Route 53 for DNS management. Implement and oversee monitoring and logging solutions using AWS CloudWatch, the ELK Stack (Elasticsearch, Logstash, Kibana), and Prometheus. Administer various operating systems, including Linux, Ubuntu, and Windows, ensuring optimal performance and security. Configure and manage web servers, including Apache Tomcat and Nginx, for application hosting and performance optimization. Criteria: Proven hands-on experience with a broad range of AWS services, including EC2, S3, RDS, Lambda, CloudFormation, IAM, VPC, Load Balancers, and Auto Scaling. Proficient in implementing and managing CI/CD pipelines using tools like Jenkins and AWS CodePipeline. Strong background in configuration management with tools such as Ansible, Chef, and Puppet. 
In-depth knowledge and practical experience with containerization and orchestration technologies, including Docker, Kubernetes, and Amazon ECS. Skilled in version control systems, particularly Git and GitHub. Extensive experience in developing and managing Infrastructure as Code (IaC) using Terraform. Advanced scripting abilities in Python, Bash, and PowerShell for automation and system management. Solid understanding of networking fundamentals and hands-on experience configuring VPCs, subnets, security groups, and DNS management with Route 53. Familiarity with monitoring and logging tools such as AWS CloudWatch, the ELK Stack (Elasticsearch, Logstash, Kibana), and Prometheus. Experience managing and troubleshooting across multiple operating systems, including Linux, Ubuntu, and Windows. Competency in configuring and administering web servers such as Apache Tomcat and Nginx. Experience: 3 - 5 years.
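As an illustration of the networking-fundamentals automation this posting describes (VPCs, subnets, security groups), here is a minimal sketch using only the Python standard library; the CIDR values are hypothetical examples, not part of the posting.

```python
# Verify that planned VPC subnet CIDRs do not overlap before provisioning.
# Uses only the standard library; all CIDR values are hypothetical.
import ipaddress

def find_overlaps(cidrs):
    """Return every pair of subnet CIDRs that overlap each other."""
    nets = [ipaddress.ip_network(c) for c in cidrs]
    overlaps = []
    for i in range(len(nets)):
        for j in range(i + 1, len(nets)):
            if nets[i].overlaps(nets[j]):
                overlaps.append((cidrs[i], cidrs[j]))
    return overlaps

subnets = ["10.0.0.0/24", "10.0.1.0/24", "10.0.0.128/25"]
print(find_overlaps(subnets))  # [('10.0.0.0/24', '10.0.0.128/25')]
```

A check like this is the kind of small guardrail script such a role would wire into a Terraform plan review or CI pipeline.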
Posted 3 weeks ago
4.0 - 9.0 years
8 - 18 Lacs
bengaluru
Work from Office
Job Requirements Mandatory Skills Bachelor's degree in computer science, data science, engineering, mathematics, information systems, or a related technical discipline 7+ years of relevant experience in data engineering roles Detailed knowledge of data warehouse technical architectures, data modelling, infrastructure components, ETL/ELT and reporting/analytic tools and environments, data structures, and hands-on SQL coding Proficient in one or more programming languages: Java, Python, Ruby, Scala Experienced with AWS services such as Redshift, S3, EC2, Lambda, Athena, EMR, AWS Glue, Data Pipeline Exposure to data visualization and reporting with tools such as Amazon QuickSight, Metabase, Tableau, or similar software Experience building metrics decks and dashboards for KPIs, including the underlying data models Understanding of how to design, implement, and maintain a platform providing secured access to large datasets Primary Roles and Responsibilities An AWS Data Engineer is responsible for designing, building, and maintaining the data infrastructure of an organization using AWS cloud services. This includes creating data pipelines, integrating data from various sources, and implementing data security and privacy measures. The AWS Data Engineer will also be responsible for monitoring and troubleshooting data flows and optimizing data storage and processing for performance and cost efficiency. Preferred Skills Master's degree in computer science, data science, engineering, mathematics, information systems, or a related technical discipline 7+ years of work experience with ETL, data modelling, and data architecture. Experience or familiarity with newer analytics tools such as AWS Lake Formation, SageMaker, DynamoDB, Lambda, Elasticsearch.
Experience with data streaming services, e.g. Kinesis or Kafka Ability to develop experimental and analytic plans for data modeling processes, use of strong baselines, and the ability to accurately determine cause-and-effect relationships Proven track record of partnering with business owners to understand requirements and developing analyses to solve their business problems Proven analytical and quantitative ability and a passion for enabling customers to use data and metrics to back up assumptions, develop business cases, and complete root cause analysis
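To illustrate the kind of pipeline work this role centers on (creating data pipelines, building the data models behind metrics decks), here is a minimal extract-transform-load sketch in plain Python; the record fields and aggregation are hypothetical stand-ins for what would normally run on Glue or EMR against S3 and Redshift.

```python
# Minimal, self-contained ETL sketch: extract raw records, aggregate a
# metric per customer, "load" the result. Field names are hypothetical.
from collections import defaultdict

def extract(rows):
    """Simulate reading raw records from a source, dropping invalid ones."""
    return [r for r in rows if r.get("amount") is not None]

def transform(rows):
    """Aggregate spend per customer - a typical metrics building block."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["customer_id"]] += float(r["amount"])
    return dict(totals)

def load(totals):
    """Stand-in for writing to a warehouse table."""
    return sorted(totals.items())

raw = [
    {"customer_id": "c1", "amount": 10.0},
    {"customer_id": "c2", "amount": 5.5},
    {"customer_id": "c1", "amount": 2.5},
    {"customer_id": "c3", "amount": None},  # dropped during extract
]
result = load(transform(extract(raw)))
print(result)  # [('c1', 12.5), ('c2', 5.5)]
```

In a real AWS pipeline each stage would map to a managed service (e.g. Glue jobs for transform, Redshift COPY for load), but the extract/transform/load separation is the same.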
Posted 3 weeks ago
3.0 - 8.0 years
22 - 32 Lacs
pune
Work from Office
Experience 2-5 years (P2) - 15 LPA; Experience 5-8 years (P3) - 25 LPA; Experience 8-10 years (P4) - 32 LPA. Notice period: Immediate to 15 days. Must-have skills: DevOps + AWS + Maven + Helm + Jenkins + Shell Scripting + Kubernetes [Primary responsibility of candidate]: Involved in supporting infrastructure architecture, system performance and the overall infrastructure operating environment for on-premise infrastructure, cloud computing platforms, or both. Support new application system initiatives and drive development of infrastructure architecture, system integration, acceptance, performance management practice and performance testing. Run infrastructure services in on-premise, hybrid or public cloud environments to support application systems by working closely with Applications Service counterparts. Understand the systems operations environment and drive the Architecture Review and Governance Process to ensure smooth and sustained operations (including resiliency requirements).
[Project requirements to qualify candidates]: Degree in Computer Science, Computer or Electronics Engineering, Information Technology, or equivalent Minimum 5 years of relevant working experience, with a validated record of applying architecture design capabilities to infrastructure management for both on-premise and cloud workloads. Certification in a cloud technology platform (in the Architecture, DevOps or System Administration/SysOps track) preferred. The successful candidate needs to demonstrate deep and/or broad technical expertise in the area of Infrastructure Services, with an appreciation of one or more areas of Infrastructure Management, Cloud Computing and DevOps Engineering concepts. Proactive and dedicated individual with good leadership and multi-tasking capabilities Good interpersonal, oral and written skills, with the ability to present ideas and influence partners at different levels [Candidate's Tech Stack]: AWS - EC2, S3, VPC, EKS, Lambda, CloudWatch, Transit Gateway, Network Firewall, IAM, Transfer Family IaC - Terraform Kubernetes Helm Grafana & Prometheus
Posted 3 weeks ago
2.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
Experian is a global data and technology company that powers opportunities for individuals and businesses worldwide. Our unique combination of data, analytics, and software allows us to redefine lending practices, prevent fraud, simplify healthcare, create marketing solutions, and provide deeper insights into the automotive market. With a presence in various markets such as financial services, healthcare, automotive, agribusiness, and insurance, we assist millions of individuals in achieving their financial goals while saving them time and money. As part of our commitment to unlocking the power of data, we are looking for a skilled professional to join our team as a Business Intelligence Developer. In this role, you will be responsible for developing client-facing standardized reports using Business Intelligence tools like Sigma and Tableau. You will conduct data profiling on source data with minimal documentation, troubleshoot data independently, perform detailed data analyses, and develop complex SQL code. Additionally, you will write secure, stable, testable, and maintainable Python code with minimal defects. Your key responsibilities will include performing root cause analysis, proposing solutions, and taking ownership of the next steps for issue resolution. You will create and maintain report specifications and process documentations as part of the required data deliverables. Collaborating with Engineering teams, you will discover and leverage data introduced into the environment and serve as a liaison between business and technical teams to achieve project objectives by delivering cross-functional reporting solutions. 
To be successful in this role, you must have a minimum of 7 years of experience in BI visualization development and support, along with 2 years of experience in Sigma report development and support, 2 years of experience in Tableau Server administration, 3 years of experience with the AWS data ecosystem (Redshift, S3, etc.), and 2 years of experience with Python. Experience in an Agile environment and familiarity with MWAA and Business Objects are considered advantageous. Moreover, you should possess excellent customer-facing communication skills, be a highly motivated self-starter, detail-oriented, and able to work independently to formulate innovative solutions. If you are someone who can multitask and prioritize an evolving workload in a fast-paced environment and provide on-call production support, we encourage you to apply. A BS degree or higher in MIS or engineering fields is required for this position. Join Experian, where we celebrate uniqueness and prioritize our people. Our culture and focus on DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, and volunteering have earned us accolades such as World's Best Workplaces 2024, Great Place To Work in 24 countries, and Glassdoor Best Places to Work 2024. Find out more about Experian Life on social or our Careers Site to understand why we are dedicated to creating a better tomorrow together.
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
The ideal candidate for the Data QA position should have 6 to 10 years of experience in ETL + DWH testing and cloud testing with expertise in AWS (S3 & Glue) and Azure, along with proficiency in SQL Server, including writing SQL queries. Additionally, familiarity with Snowflake, ADF, and API testing is essential. Key technical skills required for this role include a strong understanding of Boomi or similar integration tools like MuleSoft, Informatica, Talend, or Workato, experience with Databricks, in-depth knowledge of ETL testing, proficiency in writing SQL queries, and good technical documentation skills. The candidate should also possess strong communication skills to effectively interact with both technical and non-technical stakeholders. Desirable skills for this role include data testing on AWS using Athena, the ability to run and troubleshoot AWS Glue jobs, an understanding of data warehousing concepts, and proficiency in Python and Apache Spark. The working hours for this position are from 11:00 am to 8:00 pm, with flexibility required for potential overlap with EST as per project requirements. This is a full-time position with a duration of 1 year, and the candidate is expected to work 2/3 days from the office in Bangalore, Pune, Mumbai, Hyderabad, or Noida, based on the hybrid model.
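As an illustration of the SQL-driven ETL testing this role asks for, here is a minimal reconciliation sketch that compares row counts and a column checksum between a source and a target table. It uses in-memory SQLite as a stand-in for the actual warehouse, and the table and column names are hypothetical.

```python
# ETL reconciliation check: row counts and an amount checksum must match
# between source and target. SQLite stands in for the real warehouse;
# table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 100.0), (2, 250.0), (3, 75.5)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)",
                [(1, 100.0), (2, 250.0), (3, 75.5)])

def reconcile(cur, src, tgt):
    """Return (counts_match, sums_match) for two tables."""
    (src_count,), = cur.execute(f"SELECT COUNT(*) FROM {src}")
    (tgt_count,), = cur.execute(f"SELECT COUNT(*) FROM {tgt}")
    (src_sum,), = cur.execute(f"SELECT ROUND(SUM(amount), 2) FROM {src}")
    (tgt_sum,), = cur.execute(f"SELECT ROUND(SUM(amount), 2) FROM {tgt}")
    return src_count == tgt_count, src_sum == tgt_sum

counts_ok, sums_ok = reconcile(cur, "src_orders", "tgt_orders")
print(counts_ok, sums_ok)  # True True
```

The same count-and-checksum pattern carries over directly to Athena or SQL Server queries run against real Glue job outputs.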
Posted 3 weeks ago