
303 AWS Services Jobs - Page 10

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

9 - 19 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid

Technical Aspects:
1) Proficiency in designing, developing, testing, and deploying Node.js applications in AWS.
2) Primary skill: Node.js; secondary: AWS.
3) At least 5+ years of Node.js experience, including its frameworks (Express.js).
4) At least 3+ years of AWS experience.
5) Good knowledge of microservices architecture.
6) Good knowledge of advanced JavaScript concepts and ES6+ features.
7) Thorough understanding of relational and NoSQL databases.
8) Excellent Node.js/JavaScript coding skills with good debugging techniques.
9) Excellent logical and problem-solving skills.
10) Understanding of API architecture styles and their security best practices.
11) Experience writing unit tests for backend API components using the Mocha or Jest frameworks.
12) Knowledge of Docker operations such as image creation, pushing images to a repository, and containerization.
13) Understanding of CI/CD pipeline processes.
14) Understanding of core AWS services, their uses, and basic AWS architecture best practices.
15) Proficiency in developing, deploying, and debugging cloud-based applications using AWS.
16) Knowledge of AWS services such as SNS, SQS, CloudTrail, Load Balancers, and Route 53.
17) Proficiency in AWS services such as ECS, EC2, S3, Lambda, API Gateway, CloudWatch, CloudFormation, DynamoDB, AppConfig, and Secrets Manager.
18) Ability to rapidly learn and take advantage of new concepts, business models, and technologies.
19) Working experience with dev and support tools such as Jira, Confluence, and ServiceNow.
Role / Responsibilities / Soft Skills:
1) Working experience on dev/support projects following Agile methodology.
2) Client-facing work experience with US-based clients is preferred.
3) Excellent communication, analytical, and team-leading abilities.
4) Proactively identify recurring issues and suggest improvements to products, services, or processes to enhance customer satisfaction.
5) Provide prompt and effective technical support to customers via channels such as email or a ticketing system.
6) Collaborate with cross-functional teams, including Development, DevOps, Business, and QA, to resolve complex technical issues.
7) Document technical designs, troubleshooting steps, solutions, and best practices for internal and external knowledge sharing.

Posted 1 month ago

Apply

12.0 - 14.0 years

14 - 18 Lacs

Hyderabad, Gurugram, Chennai

Work from Office

About the Role: Grade Level (for internal use): 11
The Team: The Infrastructure team is a global team split across the US, Canada, and the UK. The team is responsible for building and maintaining the platforms used by Index Management teams to calculate and rebalance our high-profile indices.
The Impact: You will be responsible for the development and expansion of the platforms that calculate and rebalance indices for S&P Dow Jones Indices, ensuring that the relevant teams have continuous access to up-to-date benchmarks and indices.
What's in it for you: In this role, you will be a key player on the Infrastructure Engineering team, managing the automation of systems administration in the AWS Cloud environment used to run index applications. You will build solutions to automate resource provisioning and infrastructure administration in AWS Cloud for our index applications. There will also be a smaller element of L3 support for developers with more complex queries.
Responsibilities:
- Create DevOps pipelines to deliver Infrastructure as Code (IaC).
- Build workflows to create immutable infrastructure in AWS.
- Develop automation for provisioning compute instances and storage.
- Build AMI images for the cloud.
- Develop Ansible playbooks and automate execution of routine Linux scripts.
- Provision resources in AWS using CloudFormation templates (see the sketch after this listing).
- Deploy immutable infrastructure in AWS using CloudFormation, Ansible, Python, etc.
- Orchestrate container deployment.
- Configure Security Groups, Roles, and IAM policies in AWS.
- Monitor infrastructure and develop utilization reports.
- Implement and maintain version control systems, configuration management tools, and other DevOps-related technologies.
- Design and implement automation tools and frameworks for continuous integration, delivery, and deployment.
- Develop and write scripts for pipeline automation using relevant scripting languages such as Groovy and YAML.
- Configure continuous delivery workflows for various environments, e.g., development, staging, and production.
- Use Jenkins to create pipelines for infrastructure deployment and management in the cloud.
- Evaluate new AWS services and solutions.
- Focus on IaC and build reusable workflows for infrastructure in the cloud.
- Troubleshoot production issues in AWS and infrastructure pipelines.
- Remediate violations in IaC.
- Develop IaC that meets industry security standards.
- Interact effectively with global customers, business users, and IT employees.
Basic Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or Engineering, or an equivalent qualification, is preferred; relevant equivalent work experience also considered.
- Red Hat Linux and AWS certifications preferred.
- Strong experience in infrastructure engineering and automation.
- Very good experience in AWS Cloud systems administration.
- Experience developing Ansible scripts and Jenkins integration.
- Expertise with DevOps tools (Jenkins, Python, Boto, Ansible, GitHub, Artifactory).
- Expertise in the automation tools used to develop CI/CD pipelines.
- Proficiency in Jenkins and Groovy for creating dynamic and responsive CI/CD pipelines.
- Good experience in Red Hat Linux scripting.
- First-class communication skills: written, verbal, and presenting.
Preferred Qualifications:
- A minimum of 12 years of industry experience in cloud and infrastructure.
- Administer Red Hat Linux operating systems; deploy OS patches and perform upgrades.
- Configure filesystems and allocate storage.
- Develop Unix scripts and scripts for automation of infrastructure provisioning.
- Monitor infrastructure and develop utilization reports.
- Evaluate new AWS services and solutions.
- Experience working with customers to diagnose a problem and work toward resolution.
- Excellent verbal and written communication skills.
- Understanding of various load balancers in a large data center environment.
Location: Hyderabad, Gurugram, Chennai, Mumbai, Maharashtra
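As a minimal sketch of the provisioning automation this role describes, the snippet below uses boto3 to launch a CloudFormation stack and wait for completion; the stack name, template file, and parameter values are placeholder assumptions, not part of the posting.

```python
# Illustrative sketch only: deploy a CloudFormation stack with boto3.
# Stack name, template file, and parameters are placeholder assumptions.
import boto3

def deploy_stack(stack_name: str, template_path: str) -> str:
    cfn = boto3.client("cloudformation")
    with open(template_path) as f:
        template_body = f.read()

    response = cfn.create_stack(
        StackName=stack_name,
        TemplateBody=template_body,
        Parameters=[{"ParameterKey": "InstanceType", "ParameterValue": "t3.micro"}],
        Capabilities=["CAPABILITY_NAMED_IAM"],  # needed when the template creates IAM resources
    )

    # Block until the stack reaches CREATE_COMPLETE (raises on failure/timeouts).
    waiter = cfn.get_waiter("stack_create_complete")
    waiter.wait(StackName=stack_name)
    return response["StackId"]

if __name__ == "__main__":
    print(deploy_stack("index-app-network", "network.yaml"))
```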

Posted 1 month ago

Apply

10.0 - 15.0 years

7 - 10 Lacs

Mumbai

Work from Office

JOB OVERVIEW: As part of Business Focused IT, the candidate will be in charge of scaling up and managing an enterprise-wide data platform that supports the analytical needs of the complete Pharma business (extensible to other businesses as required). The platform should be flexible enough to support future business operations and provide intuitive, storytelling-style analytics. This position is part of the Analytics Center of Excellence.
KEY STAKEHOLDERS (INTERNAL): Corporate IT, Infra, Business Stakeholders, Site Teams
KEY STAKEHOLDERS (EXTERNAL): Product Vendors, Consultants, Developers
REPORTING STRUCTURE: Reports to the Corporate CIO; 2+ direct reports.
EXPERIENCE:
- 10+ years of relevant experience with a business background
- Understanding of the principles of data management
- Knowledge of data taxonomies
- Knowledge of data systems and processes
- Hands-on experience setting up data platforms from scratch in a captive analytics delivery team in a comparable business enterprise
- In-depth understanding of Snowflake
- Experience designing, building, and optimizing ETL pipelines using tools such as Airflow, AWS Glue, and PySpark (see the sketch after this listing)
- Experience with BI tools such as Qlik and Power BI to design interactive dashboards and reporting solutions
SKILLS & COMPETENCIES:
- Excellent written and verbal communication
- Business case development
- Excellent interpersonal skills
- Team player with team-building capabilities
- Budgeting, forecasting, and funds management
- Strong connections with OEMs and cloud players in the space of next-generation analytics
- Deep technical expertise in prominent ETL, DB, MDM, and analytical tools
- Exposure to SQL
- Strong expertise in the Snowflake data platform (data modeling, ETL processes, performance optimization) and AWS services (S3, Glue, Lambda, and more)
KEY ROLES & RESPONSIBILITIES:
- Develop and maintain the global data marketplace (data lake)
- Manage the sourcing and acquisition of internal (including IT and OT) and external data sets
- Ensure adherence of data to enterprise business rules and, especially, to legal and regulatory requirements
- Define data quality standards for cross-functional data used in BI/analytics models and reports
- Provide input into data integration standards and the enterprise data architecture
- Model and design the application data structure, storage, and integration, leading the database analysis, design, and build effort
- Review database deliverables throughout development, ensuring quality, traceability to requirements, and adherence to all quality management plans and standards
- Develop strategies for data acquisition, dissemination, and archival
- Manage the data architecture within the big data solution (Hadoop, Cloudera, etc.)
- Model and design the big data structure, storage, and integration, leading the database analysis, design, visualization, and build effort
- Work with partners and vendors (in a multi-vendor environment) for various capabilities
- Continuously review the analytics stack for performance improvements, reduce overall TCO through cost optimization, and improve predictive capabilities
- Bring thought leadership in analytics to make Piramal Pharma an analytics-driven business, and help drive business KPIs
- Prepare Analytics Platform budgets (CAPEX and OPEX) for assigned initiatives and roll out initiatives within budget and projected timelines
- Drive the MDM strategy and implementation initiative
- Own overall delivery and customer satisfaction for business services: interaction with business leads, project status management and reporting, implementation management, and identifying further opportunities for automation within PPL
- Ensure IT compliance in all project rollouts as per regulatory guidelines
- Conduct change management and impact analysis for approved enhancements
- Uphold data integrity requirements following ALCOA+ guidelines
- Monitor SLAs and KPIs as agreed with the business, offering root-cause analysis and risk mitigation action plans when needed
- Drive awareness and learning across Piramal Pharma on the Enterprise Data Platform
QUALIFICATION: Graduation in an Engineering discipline; a postgraduate degree is preferable. A certification or degree in Analytics (or a related stream) would be an advantage. Experience in the Pharma domain would be an added advantage.
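A minimal PySpark sketch of the kind of ETL work described above, reading raw CSV files from an S3 landing zone and writing curated Parquet; the bucket paths and column names are placeholder assumptions.

```python
# Illustrative ETL sketch: raw CSV in S3 -> cleaned Parquet in a curated zone.
# Bucket paths and column names are placeholder assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pharma-sales-etl").getOrCreate()

raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("s3://example-raw-zone/sales/")
)

curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .filter(F.col("quantity") > 0)
)

(
    curated.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-zone/sales/")
)
```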

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 14 Lacs

Pune

Work from Office

Role & Responsibilities
Mandatory skills: API, Java, Databricks, and AWS
Detailed JD (Roles and Responsibilities):
Technical:
- Two or more years of API development experience (specifically REST APIs using Java, Spring Boot, Hibernate)
- Two or more years of data engineering experience with the respective tools and technologies (e.g., Apache Spark, Databricks, SQL databases, NoSQL databases, data lake concepts)
- Working knowledge of test-driven development
- Working knowledge of DevOps and lean development principles such as continuous integration and continuous delivery/deployment using tools like Git
- Working knowledge of ETL, data modeling, data warehousing, and working with large-scale datasets
- Working knowledge of AWS services such as Lambda, RDS, ECS, DynamoDB, API Gateway, S3, etc.
Good to have:
- AWS Developer certification or working experience in AWS or other cloud technologies
- Passionate, creative, and eager to learn new and complex technical areas
- Accountable, curious, and collaborative, with an intense focus on product quality
- Skilled in interpersonal communication and able to explain complex topics to non-technical audiences
- Experience working in an agile team environment
Interested candidates can share their updated resume at recruiter.wtr26@walkingtree.in

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 25 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Hybrid

Job Summary: We are seeking a highly motivated and experienced Senior Data Engineer to join our team. This role requires a deep curiosity about our business and a passion for technology and innovation. You will be responsible for designing and developing robust, scalable data engineering solutions that drive our business intelligence and data-driven decision-making processes. If you thrive in a dynamic environment and have a strong desire to deliver top-notch data solutions, we want to hear from you. Key Responsibilities: Collaborate with agile teams to design and develop cutting-edge data engineering solutions. Build and maintain distributed, low-latency, and reliable data pipelines ensuring high availability and timely delivery of data. Design and implement optimized data engineering solutions for Big Data workloads to handle increasing data volumes and complexities. Develop high-performance real-time data ingestion solutions for streaming workloads. Adhere to best practices and established design patterns across all data engineering initiatives. Ensure code quality through elegant design, efficient coding, and performance optimization. Focus on data quality and consistency by implementing monitoring processes and systems. Produce detailed design and test documentation, including Data Flow Diagrams, Technical Design Specs, and Source to Target Mapping documents. Perform data analysis to troubleshoot and resolve data-related issues. Automate data engineering pipelines and data validation processes to eliminate manual interventions. Implement data security and privacy measures, including access controls, key management, and encryption techniques. Stay updated on technology trends, experimenting with new tools, and educating team members. Collaborate with analytics and business teams to improve data models and enhance data accessibility. Communicate effectively with both technical and non-technical stakeholders. Qualifications: Education: Bachelors degree in Computer Science, Computer Engineering, or a related field. Experience: Minimum of 5+ years in architecting, designing, and building data engineering solutions and data platforms. Proven experience in building Lakehouse or Data Warehouses on platforms like Databricks or Snowflake. Expertise in designing and building highly optimized batch/streaming data pipelines using Databricks. Proficiency with data acquisition and transformation tools such as Fivetran and DBT. Strong experience in building efficient data engineering pipelines using Python and PySpark. Experience with distributed data processing frameworks such as Apache Hadoop, Apache Spark, or Flink. Familiarity with real-time data stream processing using tools like Apache Kafka, Kinesis, or Spark Structured Streaming. Experience with various AWS services, including S3, EC2, EMR, Lambda, RDS, DynamoDB, Redshift, and Glue Catalog. Expertise in advanced SQL programming and performance tuning. Key Skills: Strong problem-solving abilities and perseverance in the face of ambiguity. Excellent emotional intelligence and interpersonal skills. Ability to build and maintain productive relationships with internal and external stakeholders. A self-starter mentality with a focus on growth and quick learning. Passion for operational products and creating outstanding employee experiences.
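As a sketch of the real-time ingestion work mentioned above, the snippet below reads a Kafka topic with Spark Structured Streaming and appends to a Delta table; the broker address, topic name, and storage paths are placeholder assumptions, and it presumes the Kafka and Delta connectors are available (as on Databricks).

```python
# Illustrative streaming ingestion sketch (Spark Structured Streaming + Delta).
# Broker address, topic name, and storage paths are placeholder assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-stream-ingest").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1.example.com:9092")
    .option("subscribe", "clickstream-events")
    .option("startingOffsets", "latest")
    .load()
    .select(F.col("key").cast("string"), F.col("value").cast("string"), "timestamp")
)

query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "s3://example-lake/checkpoints/clickstream/")
    .outputMode("append")
    .start("s3://example-lake/bronze/clickstream/")
)

query.awaitTermination()
```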

Posted 1 month ago

Apply

3.0 - 8.0 years

0 - 1 Lacs

Gurgaon/Gurugram, Delhi / NCR

Work from Office

We are seeking a skilled Technical Support & DevOps Engineer to handle customer technical queries, manage integrations, oversee development projects, and maintain our AWS cloud infrastructure. The ideal candidate will also be responsible for team training and mentoring. This role requires a blend of technical expertise, problem-solving skills, and leadership abilities. Key Responsibilities: Customer Support & Integration: Resolve customer technical queries related to product functionality, APIs, and integrations. Assist customers with onboarding, troubleshooting, and optimizing product usage. Collaborate with the development team to escalate and resolve complex issues. Development Project Management: Oversee and coordinate software development projects, ensuring timely delivery. Work closely with cross-functional teams (Product, Engineering, QA) to align on project goals. Track progress, identify risks, and implement mitigation strategies. AWS Cloud Infrastructure Management: Deploy, monitor, and maintain scalable and secure AWS cloud services (EC2, S3, RDS, Lambda, etc.). Optimize cloud costs, performance, and reliability. Implement CI/CD pipelines and automation for seamless deployments. Team Training & Management: Mentor and train junior team members on technical processes and best practices. Foster a collaborative environment by conducting knowledge-sharing sessions. Assist in performance evaluations and skill development initiatives. Required Skills & Qualifications: Bachelors degree in Computer Science, Engineering, or related field. 3+ years of experience in technical support, DevOps, or cloud infrastructure in a SAAS environment. Strong expertise in AWS services (EC2, S3, RDS, Lambda, CloudFront, IAM, etc.). Proficiency in scripting (Bash, Python, or similar). Knowledge of CI/CD pipelines and automation tools. Experience managing development projects using Agile/Scrum methodologies. Excellent problem-solving, communication, and leadership skills.

Posted 1 month ago

Apply

3.0 - 5.0 years

2 - 2 Lacs

Ahmedabad

Remote

Expected Notice Period : 30 Days Shift : (GMT+11:00) Australia/Melbourne (AEDT) Opportunity Type : Remote Placement Type : Full Time Indefinite Contract(40 hrs a week/160 hrs a month) (*Note: This is a requirement for one of Uplers' client - Okkular) What do you need for this opportunity? Must have skills required: Communication Skills, problem-solvers, Agentic AI, AWS services (Lambda, FastAI, LangChain, Large Language Model (LLM), Natural Language Processing (NLP), PyTorch, Sagemaker, Step Functions), Go Lang, Python Okkular is Looking for: About the job Company Description: We are a leading provider of fashion e-commerce solutions, leveraging generative AI to empower teams with innovative tools for merchandising and product discovery. Our mission is to enhance every product page with engaging customer-centric narratives, propelling accelerated growth and revenue generation. Join us in shaping the future of online fashion retail through cutting-edge technology and unparalleled creativity within the Greater Melbourne Area. Role Description: This is a full-time remote working position in India as a Senior AI Engineer . The Senior AI Engineer will be responsible for pattern recognition, neural network development, software development, and natural language processing tasks on a daily basis. Qualifications: Proficiency in sklearn, PyTorch, and fastai for implementing algorithms and training/improving models. Familiarity with Docker, AWS cloud services like Lambda, SageMaker, Bedrock. Familiarity with Streamlit. Knowledge of LangChain, LlamaIndex, Ollama, OpenRouter, and other relevant technologies. Expertise in pattern recognition and neural networks. Experience in Agentic AI development. Strong background in Computer Science and Software Development. Knowledge of Natural Language Processing (NLP). Ability to work effectively in a fast-paced environment and collaborate with cross-functional teams. Strong problem-solving skills and attention to detail. Masters or PhD in Computer Science, AI, or a related field is preferred, but not mandatory. Strong experience in the field is sufficient alternative. Prior experience in fashion e-commerce is advantageous. Languages: Python, Golang

Posted 1 month ago

Apply

7.0 - 12.0 years

25 - 40 Lacs

Gurugram

Remote

Job Title: Senior Data Engineer
Location: Remote
Job Type: Full-time
YoE: 7 to 10 years of relevant experience
Shift: 6.30 pm to 2.30 am IST
Job Purpose: The Senior Data Engineer designs, builds, and maintains scalable data pipelines and architectures to support the Denials AI workflow under the guidance of the Team Lead, Data Management. This role ensures data is reliable, HIPAA-compliant, and optimized.
Duties & Responsibilities:
- Collaborate with the Team Lead and cross-functional teams to gather and refine data requirements for Denials AI solutions.
- Design, implement, and optimize ETL/ELT pipelines using Python, Dagster, DBT, and AWS data services (Athena, Glue, SQS); a small sketch follows this listing.
- Develop and maintain data models in PostgreSQL; write efficient SQL for querying and performance tuning.
- Monitor pipeline health and performance; troubleshoot data incidents and implement preventive measures.
- Enforce data quality and governance standards, including HIPAA compliance for PHI handling.
- Conduct code reviews, share best practices, and mentor junior data engineers.
- Automate deployment and monitoring tasks using infrastructure-as-code and AWS CloudWatch metrics and alarms.
- Document data workflows, schemas, and operational runbooks to support team knowledge transfer.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 5+ years of hands-on experience building and operating production-grade data pipelines.
- Solid experience with workflow orchestration tools (Dagster) and transformation frameworks (DBT), or similar tools such as Microsoft SSIS, AWS Glue, or Airflow.
- Strong SQL skills on PostgreSQL for data modeling and query optimization, or on similar technologies (Microsoft SQL Server, Oracle, AWS RDS).
- Working knowledge of AWS data services: Athena, Glue, SQS, SNS, IAM, and CloudWatch.
- Basic proficiency in Python and Python data frameworks (Pandas, PySpark).
- Experience with version control (GitHub) and CI/CD for data projects.
- Familiarity with healthcare data standards and HIPAA compliance.
- Excellent problem-solving skills, attention to detail, and ability to work independently.
- Strong communication skills, with experience mentoring or leading small technical efforts.
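The sketch below illustrates one piece of such a pipeline: a Dagster asset that runs an Athena query via boto3 and waits for it to finish. The database name, SQL, and S3 output location are placeholder assumptions.

```python
# Illustrative sketch: a Dagster asset that executes an Athena query with boto3.
# Database name, SQL, and S3 output location are placeholder assumptions.
import time

import boto3
from dagster import asset

@asset
def denied_claims_extract() -> str:
    athena = boto3.client("athena")
    execution = athena.start_query_execution(
        QueryString="SELECT claim_id, denial_code FROM claims WHERE status = 'DENIED'",
        QueryExecutionContext={"Database": "claims_db"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/denials/"},
    )
    query_id = execution["QueryExecutionId"]

    # Poll until Athena reports a terminal state.
    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(5)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {query_id} ended in state {state}")
    return query_id
```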

Posted 1 month ago

Apply

0.0 - 4.0 years

2 - 4 Lacs

Pune

Work from Office

About the Job: We are looking for a highly motivated and skilled Software Engineer to join our dynamic team in Pune with 2+ years of experience in SQL databases, AWS data pipelines, and Python programming. The ideal candidate should understand API structures, push/pull APIs, API testing, and the basics of mobile application functionality. Role & responsibilities: Design, develop, and manage SQL databases, ensuring efficient data storage and retrieval. Work with AWS services such as EC2, RDS, Lambda, and AWS data pipelines for processing and managing data. Write basic Python scripts for data processing, automation, and integration. Understand API structures, including RESTful APIs, push-pull mechanisms, and how APIs interact with databases and applications. Work with mobile application teams to understand data flow and optimize API interactions. Monitor, debug, and optimize cloud-based data workflows and pipelines. Required Skills & Qualifications: SQL Database: Strong understanding of relational databases, query optimization, and data management. AWS Services: Experience with EC2, RDS, Lambda, and AWS data pipeline setup and management. Python Programming: Basic scripting skills for automation and data handling. API Knowledge: Understanding of API structures, push/pull API mechanisms, API testing, and integration. Mobile Application Basics: Knowledge of how mobile apps interact with back-end systems and APIs. Debugging & Testing: Ability to test, troubleshoot, and validate data flows, APIs, and system integrations. Problem-Solving: Analytical mindset to optimize workflows and data processes. Preferred Qualifications: Experience with cloud-based data architectures. Exposure to serverless computing and event-driven architectures. Basic understanding of front-end and back-end application development. Soft Skills Required Encompass problem-solving. Good Communication Teamwork. Ability to learn and adapt to new technologies. Note: To Apply: Interested candidates should submit their resume to hr@areete.ai or apply through the Naukri.com portal. Contact Person HR Suchita (9226243023)
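As a small illustration of the API-to-database flow described above, here is a hypothetical Lambda handler that validates a JSON payload from API Gateway (proxy integration) and writes it to a DynamoDB table; the table name and payload fields are assumptions.

```python
# Illustrative sketch: API Gateway (proxy integration) -> Lambda -> DynamoDB.
# Table name and payload fields are placeholder assumptions.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("device_events")  # hypothetical table name

def lambda_handler(event, context):
    try:
        body = json.loads(event.get("body") or "{}")
        item = {
            "device_id": body["device_id"],      # partition key (assumed)
            "recorded_at": body["recorded_at"],  # sort key (assumed)
            "reading": str(body["reading"]),
        }
    except (KeyError, json.JSONDecodeError) as exc:
        return {"statusCode": 400, "body": json.dumps({"error": str(exc)})}

    table.put_item(Item=item)
    return {"statusCode": 201, "body": json.dumps({"stored": item["device_id"]})}
```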

Posted 1 month ago

Apply

4.0 - 8.0 years

10 - 16 Lacs

Mumbai, Bengaluru

Work from Office

Must have:
- Understanding of the product life cycle (requirement analysis, design, architecture).
- Hands-on coding experience in J2EE technologies, Core Java, JavaScript, and web services (SOAP, RESTful).
- Good experience with at least one application server (WebLogic, WebSphere, Apache servers, etc.).
- Database experience with at least one DB such as Oracle, MySQL, or MS SQL.
- Knowledge of or experience with open-source development tools and frameworks (Eclipse, ANT, JUnit, PMD, SVN, etc.).
- Ability to understand UML design diagrams; ideally should have designed some modules in prior projects using OOAD and UML.
- Extensive experience with multi-tier transactional architecture and design.
- Passionate about test-driven, clean, automatically tested code and continuous deployment.
Good to have:
- Prior work with or knowledge of newer technologies such as big data in the cloud, Hadoop, and Spark.
- Prior work with or knowledge of NoSQL, ORM, and AWS services (EC2, SNS, SQS, S3, CloudFront).
- Prior work, knowledge, or MOOCs on machine learning and AI.
- Experience in the performance engineering domain (testing, tuning, profiling) is a definite add-on.
- Expertise with identity management, authentication, and information security best practices.
- Expertise in working with complex cross-domain applications.

Posted 1 month ago

Apply

25.0 - 30.0 years

75 - 90 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Project description: We're seeking a strong and creative DevOps engineer (Bamboo / Atlassian Suite) eager to solve challenging problems of scale and work on cutting-edge technologies. In this project, you will have the opportunity to write code that will impact thousands of users every month. You'll apply your critical thinking and technical skills to develop cutting-edge software, and you'll have the opportunity to interact with teams across disciplines. At Luxoft, our culture is one that thrives on solving difficult problems, focusing on product engineering based on hypothesis testing to empower people to come up with ideas. In this new adventure, you will have the opportunity to collaborate with a world-class team in the field of insurance by building a holistic solution and interacting with multidisciplinary teams. The project is to upgrade and modernise a Visual Basic/SQL DB application and two Lotus Notes applications (all 25-year-old applications, involving migration of 50+ million records out of Lotus Notes with zero data loss) into modern digital cloud and hybrid microservices-based applications running on a native Kubernetes container platform.
Responsibilities:
- Support the migration of 40+ applications to the AWS Public Cloud.
- Ensure that all application deployments are automated using DevOps processes.
- Due to compliance restrictions, Azure DevOps cannot be used for Germany; therefore, we must use Zurich's existing Atlassian DevOps platform.
- Application deployments must be executed via Bamboo.
- The primary focus will be on the German Business Unit, deploying applications across AWS services, including EC2 and ROSA (Red Hat OpenShift Service on AWS).
Skills (must have):
- 5+ years of experience with Azure/AWS services.
- Hands-on experience with Bamboo for application deployment.
- Experience deploying applications into AWS EC2 instances and ROSA containers is highly desirable.

Posted 1 month ago

Apply

6.0 - 10.0 years

20 - 35 Lacs

Pune

Hybrid

Design, implement, and optimize ETL/ELT pipelines to ingest, transform, and load data into AWS Redshift from various sources. Strong background in Python scripting, AWS services (Lambda, S3, Redshift), and data integration & pipeline development; a small load-step sketch follows this listing.
Required Candidate Profile:
- 6+ years of experience in BI development and data engineering
- Python/R scripting for data processing and automation
- AWS services: Lambda, S3, and Redshift
- Data warehousing
- Proficiency in SQL
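A hedged sketch of the S3-to-Redshift load step implied above, using the Redshift Data API via boto3; the cluster identifier, database, IAM role ARN, and table names are placeholder assumptions.

```python
# Illustrative sketch: load staged S3 data into Redshift via the Redshift Data API.
# Cluster identifier, database, IAM role ARN, and table names are assumptions.
import boto3

def copy_into_redshift() -> str:
    client = boto3.client("redshift-data")
    copy_sql = """
        COPY analytics.daily_sales
        FROM 's3://example-staging/daily_sales/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
        FORMAT AS PARQUET;
    """
    response = client.execute_statement(
        ClusterIdentifier="example-cluster",
        Database="analytics",
        DbUser="etl_user",
        Sql=copy_sql,
    )
    # The statement id can be polled later with describe_statement().
    return response["Id"]

if __name__ == "__main__":
    print(copy_into_redshift())
```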

Posted 1 month ago

Apply

2.0 - 6.0 years

1 - 4 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Work Timings: 2 PM to 11 PM
Job Description:
- AWS services: Lambda, API Gateway, S3, DynamoDB, Step Functions, SQS, AppSync, CloudWatch Logs, X-Ray, EventBridge, Amazon Pinpoint, Cognito, KMS
- Infrastructure as Code (IaC): AWS CDK, CodePipeline (planned); a small CDK sketch follows this listing
- Serverless architecture & event-driven design
- Cloud monitoring & observability: CloudWatch Logs, X-Ray, custom metrics
- Security & compliance: IAM roles and permissions boundaries, PHI/PII tagging, Cognito, KMS, HIPAA standards, isolation patterns, access control
- Cost optimization: S3 lifecycle policies, serverless tiers, service selection (e.g., Pinpoint vs. SES)
- Scalability & resilience: auto-scaling, DLQs, retry/backoff, circuit breakers
- CI/CD pipeline concepts
- Documentation & workflow design
- Cross-functional collaboration and AWS best practices
Skills: X-Ray, cloud monitoring, AWS, CI/CD pipelines, documentation, security & compliance, cost optimization, AWS services, infrastructure, Infrastructure as Code (IaC), cross-functional collaboration, observability, serverless architecture, APIs, event-driven design, scalability, workflow design, AWS best practices, resilience
Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
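As a minimal illustration of infrastructure-as-code for this kind of serverless stack, here is a hypothetical AWS CDK (v2, Python) app that defines a Lambda function behind an API Gateway REST API; the construct names, runtime, and asset path are assumptions, and the app would be deployed with the usual `cdk deploy` workflow.

```python
# Illustrative sketch: AWS CDK v2 (Python) stack with Lambda + API Gateway.
# Construct IDs, runtime, and the "lambda/" asset directory are assumptions.
from aws_cdk import App, Stack, aws_apigateway as apigw, aws_lambda as _lambda
from constructs import Construct

class ApiStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        handler = _lambda.Function(
            self,
            "EventsHandler",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="app.handler",                  # app.py defining handler(event, context)
            code=_lambda.Code.from_asset("lambda"),  # local directory with the handler code
        )

        # Proxy all API routes to the Lambda function.
        apigw.LambdaRestApi(self, "EventsApi", handler=handler)

app = App()
ApiStack(app, "ExampleServerlessStack")
app.synth()
```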

Posted 1 month ago

Apply

4.0 - 8.0 years

20 - 32 Lacs

Hyderabad, Gurugram

Work from Office

• Designing, developing & deploying cloud-based data platforms using AWS • Integrating & processing structured & unstructured data from various sources • Troubleshooting data platform issues. WhatsApp (ANUJ - 8249759636) for more details.

Posted 1 month ago

Apply

8.0 - 12.0 years

25 - 40 Lacs

Hyderabad

Remote

Job Title: Senior Backend Developer - JavaScript & Node.js
Location: Remote
Job Type: Full-time
Role: Individual Contributor
Experience: Minimum 8+ years
Key Responsibilities:
- Develop, maintain, and optimize backend services using Node.js and JavaScript.
- Architect and deploy applications using AWS Lambda and the Serverless Framework.
- Ensure efficient integration of AWS services such as Cognito, DynamoDB, RDS, ECS, ECR, EC2, and IAM.
- Implement and manage containerized environments using Docker.
- Collaborate with cross-functional teams to ensure seamless application performance.
- Design and optimize database interactions, ensuring high availability and performance.
- Troubleshoot and resolve technical issues related to backend services.
- Implement security best practices for cloud-based applications.
Required Skills & Experience:
- Strong expertise in Node.js & JavaScript.
- Deep understanding of AWS Lambda and the Serverless Framework.
- Hands-on experience with Docker and container orchestration tools.
- Proven ability to work with AWS services (Cognito, DynamoDB, RDS, ECS, ECR, EC2, IAM).
- Strong knowledge of RESTful APIs and microservices architecture.
- Hands-on experience writing SQL.
- Experience with CI/CD pipelines for efficient deployment.
- Ability to optimize backend performance and scalability.
- Solid understanding of security and compliance in cloud environments.
Preferred Qualifications:
- Experience with monitoring and logging tools (AWS CloudWatch, AWS X-Ray).
- Familiarity with Terraform or Infrastructure-as-Code (IaC) concepts.
- Previous experience with high-traffic applications and scalable systems.
Interested candidates can share their resume with, or refer a friend to, Pavithra.tr@enabledata.com for a quick response.

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 10 Lacs

Kolkata

Work from Office

Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications. Perks and benefits: Mention available facilities and benefits the company is offering with this job.

Posted 1 month ago

Apply

4.0 - 6.0 years

14 - 15 Lacs

Mumbai

Work from Office

4+ years of experience in cloud pre-sales, solutioning, or consulting. Design scalable, secure & cost-effective AWS architectures for customers. Prepare solution documents, RFP responses & effort estimates. Conduct proofs of concept (POCs) and demos to validate proposed solutions.
Required Candidate Profile: Familiar with DevOps practices & tools (e.g., Terraform, Jenkins, Git). Understanding of AWS services such as EC2, S3, RDS, Lambda, VPC, IAM, CloudFormation. Experience in cloud migration projects / hybrid cloud environments.

Posted 1 month ago

Apply

9.0 - 14.0 years

30 - 37 Lacs

Hyderabad

Hybrid

Primary Responsibilities: Design, implement, and maintain scalable, reliable, and secure infrastructure on AWS and EKS Develop and manage observability and monitoring solutions using Datadog, Splunk, and Kibana Collaborate with development teams to ensure high availability and performance of microservices-based applications Automate infrastructure provisioning, deployment, and monitoring using Infrastructure as Code (IaC) and CI/CD pipelines Build and maintain GitHub Actions workflows for continuous integration and deployment Troubleshoot production issues and lead root cause analysis to improve system reliability Ensure compliance with healthcare data standards and regulations (e.g., HIPAA, HL7, FHIR) Work closely with data engineering and analytics teams to support healthcare data pipelines and analytics platforms Mentor junior engineers and contribute to SRE best practices and culture Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: Bachelors degree in Engineering (B.Tech) or equivalent in Computer Science, Information Technology, or a related field 10+ years of experience in Site Reliability Engineering, DevOps, or related roles Hands-on experience with AWS services, EKS, and container orchestration Experience with healthcare technology solutions, health data interoperability standards (FHIR, HL7), and healthcare analytics Experience with GitHub Actions or similar CI/CD tools Solid expertise in Datadog, Splunk, Kibana, and other observability tools Deep understanding of microservices architecture and distributed systems Proficiency in Python for scripting and automation Solid scripting and automation skills (e.g., Bash, Terraform, Ansible) Proven excellent problem-solving, communication, and collaboration skills Preferred Qualifications: Certifications in AWS, Kubernetes, or healthcare IT (e.g., AWS Certified DevOps Engineer, Certified Kubernetes Administrator) Experience with security and compliance in healthcare environments
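As a small example of the Python automation such an SRE role calls for, the sketch below publishes a custom CloudWatch metric that an alarm or dashboard could track; the namespace, metric name, and dimension values are placeholder assumptions.

```python
# Illustrative sketch: publish a custom CloudWatch metric for pipeline health.
# Namespace, metric name, and dimensions are placeholder assumptions.
import boto3

def report_failed_messages(count: int) -> None:
    cloudwatch = boto3.client("cloudwatch")
    cloudwatch.put_metric_data(
        Namespace="Example/DataPipelines",
        MetricData=[
            {
                "MetricName": "FailedMessages",
                "Dimensions": [{"Name": "Pipeline", "Value": "claims-ingest"}],
                "Value": float(count),
                "Unit": "Count",
            }
        ],
    )

if __name__ == "__main__":
    report_failed_messages(3)
```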

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Designation: Java Developer
Experience: 5+ years
Location: Bangalore/Mumbai
Notice Period: Immediate joiners
JD: Please find below the JD for the position.
- Proven experience in Java development, with a strong understanding of object-oriented programming principles.
- Experience with AWS services, including ECS, S3, RDS, ElastiCache, and CloudFormation.
- Experience with microservices architecture and RESTful API design.
- Strong problem-solving skills and attention to detail.
- Experience in the financial services industry, particularly in trading or risk management, is a plus.
- Excellent communication and collaboration skills.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Designation: Python + AWS
Experience: 5+ years
Work Location: Bangalore / Mumbai
Notice Period: Immediate joiners / serving notice period
Job Description - Mandatory Skills:
- Python data structures: pandas, NumPy
- Data operations: DataFrames, dicts, JSON, lists, tuples, strings
- OOP & APIs (Flask/FastAPI)
- AWS services (IAM, EC2, Lambda, S3, DynamoDB, etc.); a small combined sketch follows this listing
Sincerely, Sonia TS
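A minimal sketch combining the listed skills: a FastAPI endpoint that loads a CSV from S3 into a pandas DataFrame and returns a small summary. The bucket name, object key, and column names are placeholder assumptions.

```python
# Illustrative sketch: FastAPI + boto3 + pandas.
# Bucket name, object key, and column names are placeholder assumptions.
import io

import boto3
import pandas as pd
from fastapi import FastAPI

app = FastAPI()
s3 = boto3.client("s3")

@app.get("/sales/summary")
def sales_summary():
    obj = s3.get_object(Bucket="example-analytics-bucket", Key="sales/latest.csv")
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))
    return {
        "rows": int(len(df)),
        "total_amount": float(df["amount"].sum()),
        "by_region": df.groupby("region")["amount"].sum().to_dict(),
    }

# Local run (assumption): uvicorn app:app --reload
```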

Posted 1 month ago

Apply

6.0 - 11.0 years

9 - 19 Lacs

Hyderabad

Work from Office

Location: Hyderabad (Preferred) / Bangalore (If strong candidates are available) Type: Contractual Key Responsibilities: Use data mappings and models provided by the data modeling team to build robust Snowflake data pipelines . Design and implement pipelines adhering to 2NF/3NF normalization standards . Develop and maintain ETL processes for integrating data from multiple ERP and source systems . Build scalable and secure Snowflake data architecture supporting Data Quality (DQ) needs. Raise CAB requests via Carriers change process and manage production deployments . Provide UAT support and ensure smooth transition of finalized pipelines to support teams. Create and maintain comprehensive technical documentation for traceability and handover. Collaborate with data modelers, business stakeholders, and governance teams to enable DQ integration. Optimize complex SQL queries , perform performance tuning , and ensure data ops best practices . Requirements: Strong hands-on experience with Snowflake Expert-level SQL skills and deep understanding of data transformation Solid grasp of data architecture and 2NF/3NF normalization techniques Experience with cloud-based data platforms and modern data pipeline design Exposure to AWS data services like S3, Glue, Lambda, Step Functions (preferred) Proficiency with ETL tools and working in Agile environments Familiarity with Carrier CAB process or similar structured deployment frameworks Proven ability to debug complex pipeline issues and enhance pipeline scalability Strong communication and collaboration skills
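As an illustration of the Snowflake pipeline work described above, here is a hedged sketch using the snowflake-connector-python package to merge staged ERP records into a normalized target table; the connection parameters, table names, and columns are placeholder assumptions.

```python
# Illustrative sketch: upsert staged ERP records into a normalized Snowflake table.
# Connection parameters, table names, and columns are placeholder assumptions.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ERP_DB",
    schema="CORE",
)

merge_sql = """
    MERGE INTO core.customer AS tgt
    USING staging.customer_raw AS src
      ON tgt.customer_id = src.customer_id
    WHEN MATCHED THEN UPDATE SET tgt.name = src.name, tgt.updated_at = src.updated_at
    WHEN NOT MATCHED THEN INSERT (customer_id, name, updated_at)
      VALUES (src.customer_id, src.name, src.updated_at)
"""

with conn.cursor() as cur:
    cur.execute(merge_sql)
    print(f"Rows affected: {cur.rowcount}")
conn.close()
```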

Posted 1 month ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad

Work from Office

What you will do Role Description: We are seeking a Senior Data Engineer with expertise in Graph Data technologies to join our data engineering team and contribute to the development of scalable, high-performance data pipelines and advanced data models that power next-generation applications and analytics. This role combines core data engineering skills with specialized knowledge in graph data structures, graph databases, and relationship-centric data modeling, enabling the organization to leverage connected data for deep insights, pattern detection, and advanced analytics use cases. The ideal candidate will have a strong background in data architecture, big data processing, and Graph technologies and will work closely with data scientists, analysts, architects, and business stakeholders to design and deliver graph-based data engineering solutions. Roles & Responsibilities: Design, build, and maintain robust data pipelines using Databricks (Spark, Delta Lake, PySpark) for complex graph data processing workflows. Own the implementation of graph-based data models, capturing complex relationships and hierarchies across domains. Build and optimize Graph Databases such as Stardog, Neo4j, Marklogic or similar to support query performance, scalability, and reliability. Implement graph query logic using SPARQL, Cypher, Gremlin, or GSQL, depending on platform requirements. Collaborate with data architects to integrate graph data with existing data lakes, warehouses, and lakehouse architectures. Work closely with data scientists and analysts to enable graph analytics, link analysis, recommendation systems, and fraud detection use cases. Develop metadata-driven pipelines and lineage tracking for graph and relational data processing. Ensure data quality, governance, and security standards are met across all graph data initiatives. Mentor junior engineers and contribute to data engineering best practices, especially around graph-centric patterns and technologies. Stay up to date with the latest developments in graph technology, graph ML, and network analytics. What we expect of you Must-Have Skills: Hands-on experience in Databricks, including PySpark, Delta Lake, and notebook-based development. Hands-on experience with graph database platforms such as Stardog, Neo4j, Marklogic etc. Strong understanding of graph theory, graph modeling, and traversal algorithms Proficiency in workflow orchestration, performance tuning on big data processing Strong understanding of AWS services Ability to quickly learn, adapt and apply new technologies with strong problem-solving and analytical skills Excellent collaboration and communication skills, with experience working with Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices. 
Good-to-Have Skills: Good to have deep expertise in Biotech & Pharma industries Experience in writing APIs to make the data available to the consumers Experienced with SQL/NOSQL database, vector database for large language models Experienced with data modeling and performance tuning for both OLAP and OLTP databases Experienced with software engineering best-practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven etc.), automated unit testing, and Dev Ops Education and Professional Certifications Masters degree and 3 to 4 + years of Computer Science, IT or related field experience Bachelors degree and 5 to 8 + years of Computer Science, IT or related field experience AWS Certified Data Engineer preferred Databricks Certificate preferred Scaled Agile SAFe certification preferred Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills Ability to work effectively with global, virtual teams High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly, be organized and detail oriented. Strong presentation and public speaking skills.
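To illustrate the relationship-centric modeling mentioned above, a small hedged sketch using the official Neo4j Python driver to merge two nodes and a relationship with Cypher; the connection URI, credentials, and the compound/target graph model are assumptions, not taken from the posting.

```python
# Illustrative sketch: write a simple relationship into Neo4j with Cypher.
# URI, credentials, and the node/relationship model are placeholder assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "example-password"))

def link_compound_to_target(compound_id: str, target_id: str) -> None:
    cypher = """
        MERGE (c:Compound {id: $compound_id})
        MERGE (t:Target {id: $target_id})
        MERGE (c)-[:BINDS_TO]->(t)
    """
    with driver.session() as session:
        session.run(cypher, compound_id=compound_id, target_id=target_id)

link_compound_to_target("CMP-001", "TGT-042")
driver.close()
```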

Posted 1 month ago

Apply

2.0 - 6.0 years

3 - 6 Lacs

Chennai

Remote

Experience: 1.5 to 5 years Location: Remote Employment Type: Full-Time Job Summary: We are looking for talented Full Stack Python Developers (Junior and Senior levels) who are passionate about building scalable web applications. You will work closely with cross-functional teams to design, develop, and deliver robust enterprise solutions using modern technologies such as Python, ReactJS, AWS, and more. Responsibilities : Design, develop, test, deploy, and maintain scalable enterprise web applications. Build responsive front-end applications using ReactJS . Develop robust backend services and RESTful APIs using Python (Django / Flask / FastAPI). Work on Microservices architecture and cloud-based platforms such as AWS . Utilize Docker and Terraform for DevOps activities and infrastructure management. Participate in code reviews and Agile Scrum practices. (Senior Role) Architect solutions and ensure adherence to coding standards. (Senior Role) Mentor junior developers and contribute to technical leadership. Requirements : For Junior Developer (2 to 3 years): 2 to 3 years of experience with Python (Django / Flask / FastAPI). Experience contributing to Microservices architecture. Proficient in ReactJS , JavaScript/jQuery , CSS , HTML5 . Familiarity with Postgres , DynamoDB , SQL queries. Exposure to AWS , Docker , Terraform is a plus. Strong problem-solving and collaboration skills. Eagerness to learn and work in a fast-paced environment. For Senior Developer (3 to 5 years): 3 to 5 years of hands-on experience with Python (Django / Flask / FastAPI). Proven experience building and architecting Microservices . Proficient in ReactJS , JavaScript/jQuery , CSS , HTML5 . Strong experience with Postgres , DynamoDB , SQL queries. Hands-on experience with Terraform , Docker , AWS services. Familiarity with AWS S3 , ElasticSearch is a plus. Strong problem-solving, leadership, and communication skills. Ability to mentor junior team members and drive best practices.

Posted 1 month ago

Apply

6.0 - 9.0 years

14 - 24 Lacs

Kochi

Work from Office

Greetings from Cognizant!! #MegaWalkIn We have an exciting opportunity for the #AWS Services role with Cognizant. Join us if you match the criteria below!!
Primary Skill: AWS Services
Experience: 6-9 years
Job Location: PAN India
Interview Day: 14 Jun 2025 - Saturday
Interview Location: Kochi
Interview Mode: Walk-in Drive (Cognizant Technology Solutions, Infopark Phase 2, Kakkanad, Kochi, Kerala 682030)
Interested candidates, apply here >> https://forms.office.com/r/t2X9WiRS9T
Regards, Vinosha TAG-HR

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Pune

Work from Office

About the company: Our esteemed client is a leading global systems integrator and business transformation consulting organisation. Our client helps companies innovate and transform by leveraging its unique insights, differentiated services, and flexible partnering models. The company is among the Top mobile application development companies in India, and also a pioneer in web application development and Automation Testing. They have gained the trust of more than 300 offshore clients from 30+ countries worldwide and has become a trustworthy software partner. Job Overview: We are seeking a skilled Senior Python Developer with extensive experience in Python development and hands-on expertise in AWS cloud services. The ideal candidate will play a crucial role in developing, maintaining, and deploying backend services and cloud infrastructure. This position is primarily focused on Python development, complemented by AWS tasks. What you will do: Python Development : Design, develop, test, and deploy scalable and efficient backend solutions using Python. Write clean, maintainable, and well-documented code following best practices. Implement APIs, data processing workflows, and integrations with various services. Troubleshoot, debug, and optimize existing Python applications for performance and scalability. Collaborate with cross-functional teams, including frontend developers, QA engineers, and product managers, to deliver high-quality software. Conduct code reviews and mentor junior developers on best practices. AWS Cloud Management : Design, configure, and maintain AWS infrastructure components such as EC2, S3, Lambda, RDS, and CloudFormation. Implement and manage CI/CD pipelines using AWS services like CodePipeline, CodeBuild, and CodeDeploy. Monitor and optimize cloud resource usage, ensuring cost-effective and reliable cloud operations. Set up security best practices on AWS, including IAM roles, VPC configurations, and data encryption. Troubleshoot cloud-related issues, perform regular updates, and apply patches as needed. What you will bring to the table: Bachelor's degree in Computer Science, Engineering, or a related field. 5+ years of hands-on experience in Python development. 3+ years of experience working with AWS services including EC2, S3, Lambda, RDS, CloudFormation, and Secrets Manager. Experience with modern AWS tools and services including API Gateway, DynamoDB, ECS, Amplify, CloudFront, Shield, OpenSearch (ElasticSearch). Strong knowledge of serverless architecture and deployment using the Serverless Framework. Proficiency in web frameworks such as Django, Flask, or FastAPI. Strong understanding of RESTful API design and integration via API Gateways. Solid experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, MongoDB). Familiarity with DevOps practices and CI/CD tools. Experience with version control systems, particularly Git. Strong problem-solving skills and attention to detail. Excellent communication skills and ability to work in a team environment. Perks and benefits As per industry standard
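As a small, hedged example of the routine backend/AWS work a role like this involves, the sketch below issues a time-limited S3 presigned download URL with boto3; the bucket name, object key, and expiry are placeholder assumptions.

```python
# Illustrative sketch: issue a time-limited presigned URL for an S3 object.
# Bucket name, object key, and expiry are placeholder assumptions.
import boto3

def get_download_url(bucket: str, key: str, expires_in: int = 900) -> str:
    s3 = boto3.client("s3")
    return s3.generate_presigned_url(
        ClientMethod="get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires_in,  # seconds
    )

if __name__ == "__main__":
    print(get_download_url("example-reports-bucket", "exports/2024-06-report.pdf"))
```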

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies