2.0 - 6.0 years
0 Lacs
karnataka
On-site
Infiniti Research is looking for an AWS DevOps Engineer to join their team in Bangalore. The ideal candidate should have 1.5 to 3 years of experience and be proficient in AWS cloud technologies, Docker, scripting, and automation. The role involves managing cloud infrastructure, developing CI/CD pipelines, containerizing applications with Docker, scripting for automation, monitoring system performance, collaborating with development teams, and maintaining documentation.

Key Responsibilities
- Cloud Infrastructure Management: Design, implement, and maintain scalable and secure AWS infrastructure.
- CI/CD Pipeline Development: Develop and manage pipelines to automate build, test, and deployment processes.
- Containerization: Use Docker to containerize applications across different environments.
- Scripting and Automation: Write and maintain Bash scripts for automation and system management.
- Monitoring and Performance Optimization: Monitor system performance, troubleshoot issues, and implement optimizations.
- Collaboration: Work closely with development teams to integrate workflows into the CI/CD process.
- Documentation: Maintain clear and comprehensive documentation of infrastructure, processes, and configurations.

Required Skills and Qualifications
- AWS Expertise: Strong experience with AWS services and architecture best practices.
- Docker Proficiency: Hands-on experience with Docker and Dockerfiles.
- Bash Scripting: Proven experience in writing and executing Bash scripts.
- Python Knowledge: Familiarity with Python for scripting tasks is a plus.
- AWS Certification: AWS certification preferred.
- CI/CD Experience: Experience with CI/CD tools such as Azure DevOps, Azure Pipelines, and AWS CodePipeline.

Education and Experience
- Bachelor's degree in Computer Science, Engineering, or a related field.

If you have any queries, please contact ramyasrikarthika@infinitiresearch.com or visit www.infinitiresearch.com.
Posted 3 days ago
5.0 - 10.0 years
15 - 30 Lacs
Pune, Ahmedabad
Work from Office
As a Senior Platform Engineer, you are expected to design and develop key components that power our platform. You will be building a secure, scalable, and highly performant distributed platform that connects multiple cloud platforms like AWS, Azure, and GCP.

Job Title: Sr. Platform Engineer
Location: Ahmedabad/Pune
Experience: 5+ Years
Educational Qualification: UG: BS/MS in Computer Science, or other engineering/technical degree

Responsibilities:
- Take full ownership of developing, maintaining, and enhancing specific modules of our cloud management platform, ensuring they meet our standards for scalability, efficiency, and reliability.
- Design and implement serverless applications and event-driven systems that integrate seamlessly with AWS services, driving the platform's innovation forward.
- Work closely with cross-functional teams to conceptualize, design, and implement advanced features and functionalities that align with our business goals.
- Utilize your deep expertise in cloud architecture and software development to provide technical guidance and best practices to the engineering team, enhancing the platform's capabilities.
- Stay ahead of the curve by researching and applying the latest trends and technologies in the cloud industry, incorporating these insights into the development of our platform.
- Solve complex technical issues, providing advanced support and guidance to both internal teams and external stakeholders.

Requirements:
- A minimum of 5 years of relevant experience in platform or application development, with a strong emphasis on Python and AWS cloud services.
- Proven expertise in serverless development and event-driven architecture design, with a track record of developing and shipping high-quality SaaS platforms on AWS.
- Comprehensive understanding of cloud computing concepts, architectural best practices, and AWS services, including but not limited to Lambda, RDS, DynamoDB, and API Gateway.
- Solid knowledge of object-oriented programming (OOP), SOLID principles, and experience with relational and NoSQL databases.
- Proficiency in developing and integrating RESTful APIs and familiarity with source control systems like Git.
- Exceptional problem-solving skills, capable of optimizing complex systems.
- Excellent communication skills, capable of effectively collaborating with team members and engaging with stakeholders.
- A strong drive for continuous learning and staying updated with industry developments.

Nice to Have:
- AWS Certified Solutions Architect, AWS Certified Developer, or other relevant cloud development certifications.
- Experience with the AWS Boto3 SDK for Python.
- Exposure to other cloud platforms such as Azure or GCP.
- Knowledge of containerization and orchestration technologies, such as Docker and Kubernetes.

Experience:
- 5 years of relevant experience in platform or application development, with a strong emphasis on Python and AWS cloud services.
- 1+ years of experience working on applications built using serverless architecture.
- 1+ years of hands-on experience with microservices architecture in live projects.
- 1+ years of experience applying Domain-Driven Design principles in projects.
- 1+ years of experience working with event-driven architecture in real-world applications.
- 1+ years of experience integrating, consuming, and maintaining AWS services.
- 1+ years of experience working with Boto3 in Python.
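As a rough illustration of the serverless, event-driven style this posting describes, the sketch below shows a minimal AWS Lambda handler that reacts to S3 object-created events and records metadata in DynamoDB with Boto3. The table name `platform-events` and the event shape are assumptions for illustration, not details from the posting.

```python
import json
import os

import boto3

# Table name is an assumed example; in practice it would come from configuration.
TABLE_NAME = os.environ.get("EVENTS_TABLE", "platform-events")

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


def handler(event, context):
    """Record each S3 object-created event as an item in DynamoDB."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        table.put_item(
            Item={
                "pk": f"{bucket}/{key}",
                "bucket": bucket,
                "key": key,
                "event_time": record.get("eventTime", ""),
            }
        )
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```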
Posted 1 week ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
Role Description: As a Senior Software Engineer - AWS Python at Incedo, you will be responsible for developing and maintaining applications on the Amazon Web Services (AWS) platform. You will be expected to have a strong understanding of Python and AWS technologies, including EC2, S3, RDS, and Lambda.

Roles & Responsibilities:
- Writing high-quality code, participating in code reviews, designing systems of varying complexity and scope, and creating high-quality documents substantiating the architecture.
- Engaging with clients, understanding their technical requirements, and planning and liaising with other team members to develop the technical design and approach to deliver end-to-end solutions.
- Mentoring and guiding junior team members, reviewing their code, establishing quality gates, building and deploying code using CI/CD pipelines, applying secure coding practices, adopting unit-testing frameworks, improving coverage, etc. Responsible for the team's growth.

Technical Skills:
- Must have: Python, FastAPI, Uvicorn, SQLAlchemy, boto3, Lambda serverless, PyMySQL
- Nice to have: AWS Lambda, Step Functions, ECR, ECS, S3, SNS, SQS, Docker, CI/CD
- Proficiency in the Python programming language
- Experience in developing and deploying applications on AWS
- Knowledge of serverless computing and AWS Lambda
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner.
- Must understand the company's long-term vision and align with it.
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks, and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team.

Qualifications
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
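To illustrate the FastAPI + Boto3 combination this posting lists, here is a minimal, hedged sketch of an API endpoint that lists objects in an S3 bucket. The bucket name `example-reports-bucket` is a placeholder, and the endpoint design is an assumption rather than anything specified in the role.

```python
import boto3
import uvicorn
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Example S3 listing API")
s3 = boto3.client("s3")

# Bucket name is a placeholder for illustration only.
BUCKET = "example-reports-bucket"


@app.get("/objects")
def list_objects(prefix: str = ""):
    """Return up to 100 object keys under the given prefix."""
    try:
        response = s3.list_objects_v2(Bucket=BUCKET, Prefix=prefix, MaxKeys=100)
    except s3.exceptions.NoSuchBucket:
        raise HTTPException(status_code=404, detail="bucket not found")
    return {"keys": [item["Key"] for item in response.get("Contents", [])]}


if __name__ == "__main__":
    # Uvicorn serves the FastAPI app locally for development.
    uvicorn.run(app, host="0.0.0.0", port=8000)
```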
Posted 1 week ago
5.0 - 8.0 years
8 - 12 Lacs
Noida
Work from Office
Automation test engineer with experience using AWS and scripting in Python. Knowledge of the Boto3 framework is required. The test engineer should be able to test infrastructure provisioned using CDK (created and deleted), test the full pipeline, and write scripts to test the persona (role).

Experience Required: 5-8 years

Involves execution of testing, monitoring, and operational activities of various complexity based on the assigned portfolio, ensuring adherence to established service levels and standards.
- Executes identified test programs for a variety of specializations to support effective testing and monitoring of controls within business groups and across the Bank.
- Understands the business/group strategy and develops and maintains knowledge of end-to-end processes.
- Executes testing activities and any other operational activities within required service level agreements or standards.
- Develops knowledge related to the program and/or area of specialty.
- Develops and maintains effective relationships with internal and external business partners/stakeholders to execute work and fulfill service delivery expectations.
- Participates in planning and implementation of operational programs and executes within required service level agreements and standards.
- Supports change management of varying scope and type; tasks typically focused on execution and sustainment activities.
- Executes various operational activities/requirements to ensure timely, accurate, and efficient service delivery.
- Ensures consistent, high-quality practices/work and the achievement of business results in alignment with business/group strategies and with productivity goals.
- Analyzes automated test results and provides initial feedback on test results.
- Analyzes root causes of any errors discovered to provide for effective communication of issues to appropriate parties.
- Develops insights and recommends continuous improvements based on test results.
- Creates and maintains adequate monitoring support documentation, such as narratives, flowcharts, process flows, and testing summaries, to support the results of the reviews, including the write-up of findings/issues for reporting.

Mandatory Competencies
- QE - Test Automation Preparation
- Beh - Communication
- QA/QE - QA Automation - Python
- Data Science and Machine Learning - Data Science and Machine Learning - Python
- Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
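As a hedged example of testing CDK-provisioned infrastructure with Python and Boto3, the pytest sketch below checks that an S3 bucket and a Lambda function actually exist after deployment. The resource names are illustrative assumptions; a real test would read them from stack outputs or environment configuration.

```python
import boto3
import pytest

# Resource names are illustrative assumptions, not values from the posting.
EXPECTED_BUCKET = "example-data-bucket"
EXPECTED_FUNCTION = "example-ingest-function"


@pytest.fixture(scope="module")
def s3_client():
    return boto3.client("s3")


@pytest.fixture(scope="module")
def lambda_client():
    return boto3.client("lambda")


def test_bucket_was_provisioned(s3_client):
    # head_bucket raises a ClientError if the bucket does not exist or is not accessible.
    s3_client.head_bucket(Bucket=EXPECTED_BUCKET)


def test_lambda_was_provisioned(lambda_client):
    config = lambda_client.get_function_configuration(FunctionName=EXPECTED_FUNCTION)
    assert config["Runtime"].startswith("python")
```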
Posted 1 week ago
3.0 - 8.0 years
6 - 10 Lacs
Gurugram
Work from Office
Understands the process flow and the impact on the project module outcome. Works on coding assignments for specific technologies based on the project requirements and documentation available. Debugs basic software components and identifies code defects. Focuses on building depth in project-specific technologies. Expected to develop domain knowledge along with technical skills. Effectively communicates with team members, project managers, and clients, as required. A proven high-performer and team player, with the ability to take the lead on projects.

Responsibilities:
- Design and create S3 buckets and folder structures (raw, cleansed_data, output, script, temp-dir, spark-ui)
- Develop AWS Lambda functions (Python/Boto3) to download Bhav Copy via REST API and ingest it into S3
- Author and maintain AWS Glue Spark jobs to: partition data by scrip, year, and month; convert CSV to Parquet with Snappy compression
- Configure and run AWS Glue Crawlers to populate the Glue Data Catalog
- Write and optimize AWS Athena SQL queries to generate business-ready datasets
- Monitor, troubleshoot, and tune data workflows for cost and performance
- Document architecture, code, and operational runbooks
- Collaborate with analytics and downstream teams to understand requirements and deliver SLAs

Technical Skills
- 3+ years hands-on experience with AWS data services (S3, Lambda, Glue, Athena)
- PostgreSQL basics
- Proficient in SQL and data partitioning strategies
- Experience with Parquet file formats and compression techniques (Snappy)
- Ability to configure Glue Crawlers and manage the AWS Glue Data Catalog
- Understanding of serverless architecture and best practices in security, encryption, and cost control
- Good documentation, communication, and problem-solving skills

Qualifications
- 3-5 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
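To make the ingestion step concrete, here is a minimal sketch of a Lambda handler that downloads a daily file over HTTP and lands it in S3 under a date-partitioned raw/ prefix, in the spirit of the pipeline described above. The source URL and bucket name are placeholders, not the project's actual endpoints.

```python
import datetime
import os

import boto3
import requests

s3 = boto3.client("s3")

# Both values are placeholders for illustration; the real source URL and bucket
# would come from the project's configuration.
SOURCE_URL = "https://example.com/bhavcopy/latest.csv"
BUCKET = os.environ.get("RAW_BUCKET", "example-bhavcopy-bucket")


def handler(event, context):
    """Download the daily file and land it under the raw/ prefix, partitioned by date."""
    response = requests.get(SOURCE_URL, timeout=30)
    response.raise_for_status()

    today = datetime.date.today()
    key = f"raw/year={today.year}/month={today.month:02d}/bhavcopy_{today.isoformat()}.csv"
    s3.put_object(Bucket=BUCKET, Key=key, Body=response.content)
    return {"bucket": BUCKET, "key": key, "bytes": len(response.content)}
```

A Glue job would then pick up objects under raw/, repartition them by scrip, year, and month, and write Parquet with Snappy compression for Athena to query.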
Posted 1 week ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Senior Principal Consultant - Generative AI - Application Development (Senior Developer)

We are looking for a Senior Application Developer to join our product engineering team. This role requires hands-on experience in designing and developing scalable application components with a strong focus on API development, middleware orchestration, and data transformation workflows. You will be responsible for building foundational components that integrate data pipelines, orchestration layers, and user interfaces, enabling next-gen digital and AI-powered experiences.

Key Responsibilities:
- Design, develop, and manage robust APIs and middleware services using Python frameworks like FastAPI and Uvicorn, ensuring scalable and secure access to platform capabilities.
- Develop end-to-end data transformation workflows and pipelines using LangChain, spacy, tiktoken, presidio-analyzer, and llm-guard, enabling intelligent content and data processing.
- Implement integration layers and orchestration logic for seamless communication between data sources, services, and UI using technologies like OpenSearch, boto3, requests-aws4auth, and urllib3.
- Work closely with UI/UX teams to integrate APIs into modern front-end frameworks such as ReactJS, Redux Toolkit, and Material UI.
- Build configurable modules for ingestion, processing, and output using Python libraries like PyMuPDF, openpyxl, and Unidecode for handling structured and unstructured data.
- Implement best practices for API security, data privacy, and anonymization using tools like presidio-anonymizer and llm-guard.
- Drive continuous improvement in performance, scalability, and reliability of the platform architecture.

Qualifications we seek in you:
Minimum Qualifications
- Experience in software development in enterprise/web applications
- Languages & Frameworks: Python, JavaScript/TypeScript, FastAPI, ReactJS, Redux Toolkit
- Libraries & Tools: langchain, presidio-analyzer, PyMuPDF, spacy, rake-nltk, inflection, openpyxl, tiktoken
- APIs & Integration: FastAPI, requests, urllib3, boto3, opensearch-py, requests-aws4auth
- UI/UX: ReactJS, Material UI, LESS
- Cloud & DevOps: AWS SDKs, API gateways, logging, and monitoring frameworks (optional experience with serverless is a plus)

Preferred Qualifications:
- Strong understanding of API lifecycle management, REST principles, and microservices.
- Experience in data transformation, document processing, and middleware architecture.
- Exposure to AI/ML or Generative AI workflows using LangChain or OpenAI APIs.
- Prior experience working on secure and compliant systems involving user data.
- Experience in CI/CD pipelines, containerization (Docker), and cloud-native deployments (AWS preferred).

Why join Genpact?
- Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
- Make an impact - Drive change for global enterprises and solve business challenges that matter
- Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
At Velsera, we are committed to revolutionizing the pace of medicine. Established in 2023 by the collaboration of Seven Bridges and Pierian, our primary goal is to expedite the discovery, development, and dissemination of groundbreaking insights that can change lives for the better. We specialize in offering cutting-edge software solutions and professional services that cater to various aspects of the healthcare industry, including:
- AI-powered multimodal data harmonization and analytics for drug discovery and development
- IVD development, validation, and regulatory approval
- Clinical NGS interpretation, reporting, and adoption

Headquartered in Boston, MA, we are in a phase of rapid growth, with teams expanding across different countries to meet the increasing demands of our clients.

As a Python Developer at Velsera, your responsibilities will include:
- Development: Crafting clean, efficient, and well-documented Python code to fulfill project requirements
- API Development: Creating RESTful APIs and integrating third-party APIs when necessary
- Testing: Composing unit tests and integration tests to ensure high code quality and functionality
- Collaboration: Collaborating closely with cross-functional teams to implement new features and enhance existing ones
- Code Review: Participating in peer code reviews and offering constructive feedback to team members
- Maintenance: Debugging, troubleshooting, and enhancing the existing codebase to boost performance and scalability. Proactively identifying technical debt items and proposing solutions to address them
- Documentation: Maintaining detailed and accurate documentation for code, processes, and design
- Continuous Improvement: Staying updated with the latest Python libraries, frameworks, and industry best practices.

To excel in this role, you should bring:
- Experience: A minimum of 6 years of hands-on experience in Python development
- Technical Skills: Proficiency in Python 3.x, familiarity with popular Python libraries (e.g., NumPy, pandas, Flask, boto3), experience in developing Lambda functions, a strong understanding of RESTful web services and APIs, familiarity with relational databases (e.g., PostgreSQL) and NoSQL databases (e.g., MongoDB), knowledge of version control systems (e.g., Git), experience with Docker and containerization, experience with AWS services such as ECR, Batch jobs, Step Functions, CloudWatch, etc.; experience with Jenkins is a plus
- Problem-Solving Skills: Strong analytical and debugging skills with the ability to troubleshoot complex issues
- Soft Skills: Strong written and verbal communication skills, the ability to work independently and collaboratively in a team environment, and a detail-oriented approach with the capacity to manage multiple tasks and priorities.

Preferred skills include experience working in the healthcare or life sciences domain, a strong understanding of application security and OWASP best practices, hands-on experience with serverless architectures (e.g., AWS Lambda), and proven experience in mentoring junior developers and conducting code reviews.

Velsera offers a range of benefits, including a Flexible & Hybrid Work Model to support work-life balance and an Engaging & Fun Work Culture that includes vibrant workplace events, celebrations, and engaging activities to make every workday enjoyable.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
noida, uttar pradesh
On-site
You have an exciting opportunity to join our team as a Cloud Infrastructure Engineer with a focus on AWS CDK and expertise in Python or TypeScript. In this role, you will be responsible for developing scalable and secure cloud infrastructure components that support modern applications. Whether you excel at scripting with Python or creating CDK constructs with TypeScript, we are looking for individuals who are passionate about infrastructure automation and software engineering.

As a Cloud Infrastructure Engineer, your primary responsibilities will include designing, building, and maintaining AWS infrastructure using AWS CDK in Python or TypeScript. You will also be tasked with developing reusable CDK constructs to model various components such as VPCs, Lambda functions, EC2 instances, IAM policies, and more. Additionally, you will automate deployments using the CDK CLI, manage dependencies and environments, implement tests for infrastructure code, troubleshoot deployment issues, and collaborate closely with DevOps and Architecture teams to ensure secure and scalable cloud solutions.

To succeed in this role, you should possess at least 4+ years of DevOps experience with a strong background in AWS, along with 2+ years of experience working with AWS CDK in Python or TypeScript. Deep knowledge of AWS services such as EC2, Lambda, VPC, S3, IAM, and Security Groups is essential. Experience with AWS CLI, Boto3 (for Python), or Node.js/npm (for TypeScript) is also required. Familiarity with infrastructure test frameworks, CI/CD processes, and troubleshooting CloudFormation templates is a plus.

If you have experience with Docker and Kubernetes, exposure to Terraform or multi-IaC environments, and a strong understanding of AWS security, scalability, and cost optimization practices, it would be considered a nice-to-have for this role. Join us in this dynamic opportunity where you can contribute to building cutting-edge cloud infrastructure and be part of a team that values collaboration, innovation, and excellence in cloud solutions.
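As a hedged illustration of the kind of CDK construct modeling this posting mentions, here is a minimal stack sketch assuming AWS CDK v2 for Python. It wires an S3 bucket to a Lambda function and grants read access via a generated IAM policy; the construct names, runtime choice, and inline handler are illustrative assumptions only.

```python
from aws_cdk import App, Stack, aws_lambda as _lambda, aws_s3 as s3
from constructs import Construct


class ExampleStack(Stack):
    """Small stack wiring an S3 bucket to a Lambda function with read access."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Names and settings below are illustrative defaults, not project requirements.
        bucket = s3.Bucket(self, "DataBucket", versioned=True)

        fn = _lambda.Function(
            self,
            "ProcessorFunction",
            runtime=_lambda.Runtime.PYTHON_3_11,  # assumed runtime; pick what the project supports
            handler="index.handler",
            code=_lambda.Code.from_inline(
                "def handler(event, context):\n    return {'ok': True}\n"
            ),
        )

        # grant_read attaches a least-privilege read policy to the function's role.
        bucket.grant_read(fn)


app = App()
ExampleStack(app, "ExampleStack")
app.synth()
```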
Posted 2 weeks ago
0.0 - 5.0 years
4 - 9 Lacs
Chennai
Remote
Coordinating with development teams to determine application requirements. Writing scalable code using the Python programming language. Testing and debugging applications. Developing back-end components. Required Candidate profile: Knowledge of Python and related frameworks, including Django and Flask. A deep understanding of multi-process architecture and the threading limitations of Python. Perks and benefits: Flexible Work Arrangements.
Posted 2 weeks ago
5.0 - 8.0 years
14 - 22 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Hiring For Top IT Company - Designation: Python Developer Skills: Python + PySpark Location: Bangalore/Mumbai Exp: 5-8 yrs Best CTC Contact: 9783460933, 9549198246, 9982845569, 7665831761, 6377522517, 7240017049 Team Converse
Posted 2 weeks ago
4.0 - 8.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Role: DevOps/SRE Engineer with Python We are looking for a talented and experienced DevOps/Site Reliability Engineer (SRE) with a strong proficiency in Python to join our team at Cloud Raptor. The ideal candidate will be responsible for optimizing our company's production environment and ensuring the reliability and stability of our systems. Key Responsibilities: 1. Collaborate with development teams to design, develop, and maintain infrastructure for our highly available and scalable applications. 2. Automate processes using Python scripting to streamline the deployment and monitoring of our applications. 3. Monitor and manage cloud infrastructure on AWS, including EC2, S3, RDS, and Lambda. 4. Implement and manage CI/CD pipelines for automated testing and deployment of applications. 5. Troubleshoot and resolve production issues, ensuring high availability and performance of our systems. 6. Collaborate with cross-functional teams to ensure security, scalability, and reliability of our infrastructure. 7. Develop and maintain documentation for system configurations, processes, and procedures. Key Requirements: 1. Bachelor's degree in Computer Science, Engineering, or a related field. 2. 3+ years of experience in a DevOps/SRE role, with a strong focus on automation and infrastructure as code. 3. Proficiency in Python scripting for automation and infrastructure management. 4. Hands-on experience with containerization technologies such as Docker and Kubernetes. 5. Strong knowledge of cloud platforms such as AWS, including infrastructure provisioning and management. 6. Experience with monitoring and logging tools such as Prometheus, Grafana, and ELK stack. 7. Knowledge of CI/CD tools like Jenkins or Github Actions. 8. Familiarity with configuration management tools such as Ansible, Puppet, or Chef. 9. Strong problem-solving and troubleshooting skills, with an ability to work in a fast-paced and dynamic environment. 10. Excellent communication and collaboration skills to work effectively with cross-functional teams.
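As a small, hedged example of the Python automation this role calls for, the sketch below uses Boto3 to report running EC2 instances, the kind of script an SRE might wire into monitoring or inventory checks. The region default is an assumption for illustration.

```python
import boto3


def list_running_instances(region_name="us-east-1"):
    """Return (instance_id, instance_type, name_tag) tuples for running EC2 instances."""
    ec2 = boto3.client("ec2", region_name=region_name)
    paginator = ec2.get_paginator("describe_instances")
    running = []
    for page in paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    ):
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
                running.append(
                    (instance["InstanceId"], instance["InstanceType"], tags.get("Name", ""))
                )
    return running


if __name__ == "__main__":
    for instance_id, instance_type, name in list_running_instances():
        print(f"{instance_id}\t{instance_type}\t{name}")
```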
Posted 3 weeks ago
10.0 - 15.0 years
12 - 16 Lacs
Hyderabad
Work from Office
JD for Data Engineering Lead - Python: Data Engineering Lead with at least 7 to 10 years of experience in Python with the following AWS services: AWS SQS, AWS MSK, AWS RDS Aurora DB, Boto3, API Gateway, and CloudWatch. Providing architectural guidance to the offshore team, reviewing code, and troubleshooting errors. Very strong SQL knowledge is a must; should be able to understand and build complex queries. Familiar with GitLab (repos and CI/CD pipelines). He/she should be working closely with the Virtusa onshore team as well as the enterprise architect and other client teams at onsite as needed. Experience in API development using Python is a plus. Experience in building MDM solutions is a plus.
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
JD for Data Engineer - Python: At least 5 to 8 years of experience in AWS Python programming; able to design, build, test, and deploy code. The candidate should have worked on Lambda-based API development. Should have experience in using the following AWS services: AWS SQS, AWS MSK, AWS RDS Aurora DB, Boto3. Very strong SQL knowledge is a must; should be able to understand and build complex queries. He/she should be working closely with the enterprise architect and other client teams at onsite as needed. Having experience in building solutions using Kafka would be a good value addition (optional).
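As a hedged sketch of the Lambda-plus-SQS pattern this posting references, here is a minimal handler that processes SQS-triggered records and pushes malformed payloads to a dead-letter queue with Boto3. The queue URL is a placeholder, and the handling logic is an assumption for illustration only.

```python
import json

import boto3

# Queue URL is a placeholder for illustration only.
DLQ_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-dead-letter-queue"
sqs = boto3.client("sqs")


def handler(event, context):
    """Process SQS-triggered records; push malformed payloads to a dead-letter queue."""
    processed = 0
    for record in event.get("Records", []):
        try:
            payload = json.loads(record["body"])
        except json.JSONDecodeError:
            sqs.send_message(QueueUrl=DLQ_URL, MessageBody=record["body"])
            continue
        # Real processing of `payload` (e.g. an Aurora write) would happen here.
        processed += 1
    return {"processed": processed}
```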
Posted 1 month ago
10.0 - 15.0 years
6 - 14 Lacs
Bengaluru
Work from Office
AsyncIO, FastAPI, Tornado, Flask, gRPC, Netmiko, Boto3, Pandas, Celery, Pytest
Posted 1 month ago
4.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
As a Senior Cloud Platform Back-End Engineer with a strong background in AWS tools and services, you will join the Data & AI Solutions - Engineering team in our Healthcare R&D business. Your expertise will enhance the development and continuous improvement of a critical AWS-cloud-based analytics platform, supporting our R&D efforts in drug discovery. This role involves implementing the technical roadmap and maintaining existing functionalities. You will adapt to evolving technologies, manage infrastructure and security, design and implement new features, and oversee seamless deployment of updates. Additionally, you will implement strategies for data archival and optimize the data lifecycle processes for efficient storage management in compliance with regulations. Join a multicultural team working in agile methodologies with high autonomy. The role requires office presence at our Bangalore location.

Who You Are:
- University degree in Computer Science, Engineering, or a related field
- Proficiency in Python, especially with the boto3 library for interacting with AWS services programmatically, infrastructure as code with AWS CDK, and AWS Lambda functions
- Experience with API development and management: designing, developing, and managing APIs using AWS API Gateway and other relevant API frameworks
- Strong understanding of AWS security best practices, IAM policies, encryption, auditing, and regulatory compliance (e.g. GDPR)
- Experience with application performance monitoring and tracing solutions like AWS CloudWatch, X-Ray, and OpenTelemetry
- Proficiency in navigating and utilizing various AWS tools and services
- System design skills in a cloud environment
- Experience with SQL and data integration into Snowflake
- Familiarity with Microsoft Entra ID for identity and access management
- Willingness to work in a multinational environment and cross-functional teams distributed between the US, Europe (mostly Germany), and India
- Sense of accountability and ownership; fast learner
- Fluency in English and excellent communication skills
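To illustrate the monitoring side of this role, here is a minimal, hedged sketch that publishes a custom CloudWatch metric with Boto3, the kind of programmatic service interaction the posting highlights. The namespace, metric name, and dimensions are assumed examples rather than platform conventions.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")


def publish_job_duration(job_name: str, duration_seconds: float) -> None:
    """Publish a custom metric so dashboards and alarms can track pipeline runtimes."""
    # Namespace and dimension names are assumed examples, not platform conventions.
    cloudwatch.put_metric_data(
        Namespace="ExampleAnalyticsPlatform",
        MetricData=[
            {
                "MetricName": "JobDurationSeconds",
                "Dimensions": [{"Name": "JobName", "Value": job_name}],
                "Value": duration_seconds,
                "Unit": "Seconds",
            }
        ],
    )


if __name__ == "__main__":
    publish_job_duration("nightly-ingest", 42.5)
```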
Posted 1 month ago
5.0 - 7.0 years
27 - 30 Lacs
Hyderabad, Chennai
Work from Office
Experience required: 7+ years Core Generative AI & LLM Skills: * 5+ years in Software Engineering, 1+ year in Generative AI. * Strong understanding of LLMs, prompt engineering, and RAG. * Experience with multi-agent system design (planning, delegation, feedback). * Hands-on with LangChain (tools, memory, callbacks) and LangGraph (multi-agent orchestration). * Proficient in using vector DBs (OpenSearch, Pinecone, FAISS, Weaviate). * Skilled in Amazon Bedrock and integrating LLMs like Claude, Titan, Llama. * Strong Python (LangChain, LangGraph, FastAPI, boto3). * Experience building MCP servers/tools. * Designed robust APIs, integrated external tools with agents. * AWS proficiency: Lambda, API Gateway, DynamoDB, S3, Neptune, Bedrock Agents * Knowledge of data privacy, output filtering, audit logging * Familiar with AWS IAM, VPCs, and KMS encryption Desired Skills: * Integration with Confluence, CRMs, knowledge bases, etc. * Observability with Langfuse, OpenTelemetry, Prompt Catalog * Understanding of model alignment & bias mitigation
Posted 1 month ago
6.0 - 9.0 years
14 - 22 Lacs
Pune, Chennai
Work from Office
Hiring For Top IT Company - Designation: Python Developer Skills: AWS SDK + AI services integration Location: Pune/Chennai Exp: 6-8 yrs Best CTC Contact: Surbhi: 9887580624, Anchal: 9772061749, Gitika: 8696868124, Shivani: 7375861257 Team Converse
Posted 1 month ago
2.0 - 6.0 years
2 - 6 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Roles and Responsibilities: The engineer is expected to build and support solutions that pre-process the paper image claims to extract data, build pipelines using serverless solutions, and invoke AI/ML processes to populate claim data from the submitted claims. The engineer will also be working on building metrics, monitoring, and operational dashboards.

Required Skills:
- Strong hands-on experience with Python, Boto3, and test-driven development techniques such as unit testing and gameday testing.
- Hands-on experience in writing unit tests with Python.
- Hands-on experience with common AWS services such as Lambda, Step Functions, DynamoDB, S3, and CloudWatch.
- Experience in deploying applications to development and test environments.
- Enters an existing team and learns rapidly about the overall goals of the solution. Collaborates with the rest of the team to explore paths towards the overall goals. Participates in peer reviews and deployments.
- Executes; understands that work is not complete until it is implemented. When analysis is complete and decisions have been made, the work has only just begun.
- Embraces an agile mindset to adjust to best achieve the overall goals; is not locked into initial decisions. At the same time, develops plans in advance to find a healthy balance of preparedness and flexibility, as appropriate for each situation's needs.
- Rapidly raises defects, and reflects on where prior judgment was incorrect in the spirit of growth. Good news travels fast, bad news faster. Addresses the mistakes of others in the spirit of learning and growth. Models these behaviors in the team retrospective.
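As a hedged example of the test-driven style this posting asks for, the pytest sketch below stubs the Boto3 S3 client with unittest.mock so a handler's logic can be unit-tested without real AWS calls. The `summarize_claim` function is a hypothetical stand-in, not the team's actual code.

```python
from unittest.mock import MagicMock

# --- hypothetical handler under test --------------------------------------
def summarize_claim(s3_client, bucket: str, key: str) -> dict:
    """Read a claim object and return a small summary record."""
    obj = s3_client.get_object(Bucket=bucket, Key=key)
    body = obj["Body"].read().decode("utf-8")
    return {"key": key, "lines": len(body.splitlines())}


# --- unit test with a mocked boto3 client ----------------------------------
def test_summarize_claim_counts_lines():
    fake_body = MagicMock()
    fake_body.read.return_value = b"line1\nline2\nline3"
    fake_s3 = MagicMock()
    fake_s3.get_object.return_value = {"Body": fake_body}

    result = summarize_claim(fake_s3, "claims-bucket", "claims/0001.txt")

    fake_s3.get_object.assert_called_once_with(Bucket="claims-bucket", Key="claims/0001.txt")
    assert result == {"key": "claims/0001.txt", "lines": 3}
```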
Posted 1 month ago
8.0 - 13.0 years
9 - 14 Lacs
Bengaluru
Work from Office
8+ years of experience combined between backend and data platform engineering roles. Worked on large-scale distributed systems. 5+ years of experience building data platforms with (one of) Apache Spark, Flink, or similar frameworks. 7+ years of experience programming with Java. Experience building large-scale data/event pipelines. Experience with relational SQL and NoSQL databases, including Postgres/MySQL, Cassandra, MongoDB. Demonstrated experience with EKS, EMR, S3, IAM, KDA, Athena, Lambda, networking, ElastiCache, and other AWS services.
Posted 1 month ago
8.0 - 11.0 years
7 - 11 Lacs
Hyderabad
Work from Office
HIH - Software Engineering Associate Advisor

Position Overview
The successful candidate will be a member of our US Medical Integration Solutions ETL team. They will play a major role in the design and development of the ETL application in support of various portfolio projects.

Responsibilities
- Analyze business requirements and translate them into ETL architecture and data rules
- Serve as advisor and subject matter expert on project teams
- Manage both employees and consultants on multiple ETL projects
- Oversee and review all design and coding from developers to ensure they follow company standards and best practices, as well as architectural direction
- Assist in data analysis and metadata management
- Test planning and execution
- Effectively operate within a team of technical and business professionals
- Assess new talent and mentor direct reports on best practices
- Review all designs and code from developers

Qualifications
Desired Skills & Experience:
- 8-11 years of experience in Java, Python, and PySpark to support new development as well as existing applications
- 7+ years of experience with cloud technologies, specifically AWS
- Experience in AWS services such as Lambda, Glue, S3, MWAA, API Gateway and Route 53, DynamoDB, RDS MySQL, SQS, CloudWatch, Secrets Manager, KMS, IAM, EC2 and Auto Scaling Groups, VPC and Security Groups
- Experience with Boto3, Pandas, and Terraform for building infrastructure as code
- Experience with the IBM DataStage ETL tool
- Experience with CI/CD methodologies and processes and the development of these processes
- DevOps experience
- Knowledge of writing SQL
- Data mapping: source to target, target to multiple formats
- Experience in the development of data extraction and load processes in a parallel framework
- Understanding of normalized and de-normalized data repositories
- Ability to define ETL standards and processes
- SQL standards/processes/tools: mapping of data sources, ETL development, monitoring, reporting and metrics, with a focus on data quality
- Experience with DB2/z/OS, Oracle, SQL Server, Teradata, and other database environments
- Unix experience
- Excellent problem-solving and organizational skills
- Strong teamwork and interpersonal skills and ability to communicate with all management levels
- Leads others toward technical accomplishments and collaborative project team efforts
- Very strong communication skills, both verbal and written, including technical writing
- Strong analytical and conceptual skills

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
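To make the Boto3-plus-Pandas ETL combination above concrete, here is a minimal, hedged sketch of an extract/load pair: reading a CSV object from S3 into a DataFrame and writing the transformed result back as Parquet. Bucket names, keys, and the transform are placeholders for illustration only.

```python
import io

import boto3
import pandas as pd


def load_csv_from_s3(bucket: str, key: str) -> pd.DataFrame:
    """Extract step: read a CSV object from S3 into a DataFrame."""
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket=bucket, Key=key)
    return pd.read_csv(io.BytesIO(obj["Body"].read()))


def write_parquet_to_s3(frame: pd.DataFrame, bucket: str, key: str) -> None:
    """Load step: write the transformed frame back to S3 as Parquet."""
    buffer = io.BytesIO()
    frame.to_parquet(buffer, index=False)  # requires pyarrow or fastparquet installed
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buffer.getvalue())


if __name__ == "__main__":
    # Bucket and keys are placeholders for illustration only.
    df = load_csv_from_s3("example-etl-bucket", "incoming/members.csv")
    df = df.dropna(subset=["member_id"])  # trivial example transform
    write_parquet_to_s3(df, "example-etl-bucket", "curated/members.parquet")
```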
Posted 1 month ago
4.0 - 8.0 years
9 - 14 Lacs
Bengaluru
Work from Office
The Cigna International Health unit uses Amazon Web Services (AWS) services and custom, proprietary solutions implemented in AWS to pre-process paper health care claims from around the world. The volume is expected to reach significantly higher levels per day, as expansion of the initiative is one of the top priorities for International Health. To do so, the engineer is expected to build and support solutions that pre-process the paper image claims to extract data, build pipelines using serverless solutions, and invoke AI/ML processes to populate claim data from the submitted claims. The engineer will also be working on building metrics, monitoring, and operational dashboards.

Required Skills:
- Strong hands-on experience with Python, Boto3, and test-driven development techniques such as unit testing and gameday testing.
- Hands-on experience in writing unit tests with Python.
- Hands-on experience with common AWS services such as Lambdas, Step Functions, DynamoDB, S3, and CloudWatch.
- Experience in deploying applications to development and test environments.
- Enters an existing team and learns rapidly about the overall goals of the solution. Collaborates with the rest of the team to explore paths towards the overall goals. Participates in peer reviews and deployments.
- Executes; understands that work is not complete until it is implemented. When analysis is complete and decisions have been made, the work has only just begun.
- Embraces an agile mindset to adjust to best achieve the overall goals; is not locked into initial decisions. At the same time, develops plans in advance to find a healthy balance of preparedness and flexibility, as appropriate for each situation's needs.
- Rapidly raises defects, and reflects on where prior judgment was incorrect in the spirit of growth. Good news travels fast, bad news faster. Addresses the mistakes of others in the spirit of learning and growth. Models these behaviors in the team retrospective.

About The Cigna Group
Cigna Healthcare, a division of The Cigna Group, is an advocate for better health through every stage of life. We guide our customers through the health care system, empowering them with the information and insight they need to make the best choices for improving their health and vitality. Join us in driving growth and improving lives.
Posted 2 months ago
1.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Minimum Qualifications:
- BA/BSc/B.E./BTech degree from a Tier I or II college in Computer Science, Statistics, Mathematics, Economics, or related fields
- 1 to 4 years of experience in working with data and conducting statistical and/or numerical analysis
- Strong understanding of how data can be stored and accessed in different structures
- Experience with writing computer programs to solve problems
- Strong understanding of data operations such as sub-setting, sorting, merging, aggregating, and CRUD operations
- Ability to write SQL code and familiarity with R/Python and Linux shell commands
- Be willing and able to quickly learn about new businesses, database technologies, and analysis techniques
- Ability to tell a good story and support it with numbers and visuals
- Strong oral and written communication

Preferred Qualifications:
- Experience working with large datasets
- Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3)
- Experience building analytics applications leveraging R, Python, Tableau, Looker, or other tools
- Experience in geo-spatial analysis with PostGIS, QGIS
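As a hedged sketch of the AWS analytics work mentioned above, the snippet below submits an Athena query with Boto3, polls until it completes, and fetches the result rows. The database name, output location, and example SQL are placeholders for illustration only.

```python
import time

import boto3

athena = boto3.client("athena")

# All identifiers below are placeholders for illustration.
DATABASE = "example_analytics"
OUTPUT_LOCATION = "s3://example-athena-results/queries/"


def run_query(sql: str) -> list:
    """Submit a query, wait for it to finish, and return the result rows."""
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": DATABASE},
        ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
    )
    query_id = execution["QueryExecutionId"]

    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    results = athena.get_query_results(QueryExecutionId=query_id)
    return results["ResultSet"]["Rows"]


if __name__ == "__main__":
    rows = run_query("SELECT city, COUNT(*) AS trips FROM example_table GROUP BY city LIMIT 10")
    print(rows)
```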
Posted 2 months ago
4.0 - 9.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Minimum Qualifications:
- BA/BSc/B.E./BTech degree from a Tier I or II college in Computer Science, Statistics, Mathematics, Economics, or related fields
- 1 to 4 years of experience in working with data and conducting statistical and/or numerical analysis
- Strong understanding of how data can be stored and accessed in different structures
- Experience with writing computer programs to solve problems
- Strong understanding of data operations such as sub-setting, sorting, merging, aggregating, and CRUD operations
- Ability to write SQL code and familiarity with R/Python and Linux shell commands
- Be willing and able to quickly learn about new businesses, database technologies, and analysis techniques
- Ability to tell a good story and support it with numbers and visuals
- Strong oral and written communication

Preferred Qualifications:
- Experience working with large datasets
- Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3)
- Experience building analytics applications leveraging R, Python, Tableau, Looker, or other tools
- Experience in geo-spatial analysis with PostGIS, QGIS
Posted 2 months ago
3.0 - 8.0 years
9 - 18 Lacs
Hyderabad
Hybrid
Data Engineer with Python development experience
Experience: 3+ Years
Mode: Hybrid (2-3 days/week)
Location: Hyderabad

Key Responsibilities
- Develop, test, and deploy data processing pipelines using AWS serverless technologies such as AWS Lambda, Step Functions, DynamoDB, and S3.
- Implement ETL processes to transform and process structured and unstructured data efficiently.
- Collaborate with business analysts and other developers to understand requirements and deliver solutions that meet business needs.
- Write clean, maintainable, and well-documented code following best practices.
- Monitor and optimize the performance and cost of serverless applications.
- Ensure high availability and reliability of the pipeline through proper design and error handling mechanisms.
- Troubleshoot and debug issues in serverless applications and data workflows.
- Stay up-to-date with emerging technologies in the AWS and serverless ecosystem to recommend improvements.

Required Skills and Experience
- 3-5 years of hands-on Python development experience, including experience with libraries like boto3, Pandas, or similar tools for data processing.
- Strong knowledge of AWS services, especially Lambda, S3, DynamoDB, Step Functions, SNS, SQS, and API Gateway.
- Experience building data pipelines or workflows to process and transform large datasets.
- Familiarity with serverless architecture and event-driven programming.
- Knowledge of best practices for designing secure and scalable serverless applications.
- Proficiency in version control systems (e.g., Git) and collaboration tools.
- Understanding of CI/CD pipelines and DevOps practices.
- Strong debugging and problem-solving skills.
- Familiarity with database systems, both SQL (e.g., RDS) and NoSQL (e.g., DynamoDB).

Preferred Qualifications
- AWS certifications (e.g., AWS Certified Developer Associate or AWS Certified Solutions Architect Associate).
- Familiarity with testing frameworks (e.g., pytest) and ensuring test coverage for Python applications.
- Experience with Infrastructure as Code (IaC) tools such as AWS CDK, CloudFormation.
- Knowledge of monitoring and logging tools.
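As a hedged illustration of the serverless orchestration this posting describes, the sketch below starts a Step Functions execution for a newly arrived S3 object using Boto3. The state machine ARN and input shape are placeholders for illustration, not details from the role.

```python
import json
import uuid

import boto3

stepfunctions = boto3.client("stepfunctions")

# Placeholder ARN for illustration only.
STATE_MACHINE_ARN = "arn:aws:states:ap-south-1:123456789012:stateMachine:example-etl-workflow"


def start_pipeline_run(bucket: str, key: str) -> str:
    """Start one workflow execution for a newly arrived object and return its ARN."""
    response = stepfunctions.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        name=f"ingest-{uuid.uuid4()}",  # unique execution name for traceability
        input=json.dumps({"bucket": bucket, "key": key}),
    )
    return response["executionArn"]


if __name__ == "__main__":
    print(start_pipeline_run("example-raw-bucket", "incoming/2024-06-01/data.json"))
```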
Posted 2 months ago