Jobs
Interviews

286 RDS Jobs - Page 5

Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

4.0 - 8.0 years

6 - 14 Lacs

Hyderabad

Work from Office

Role: AWS SysOps Engineer
Total Experience Required: 4-8 years
Mandatory Skills: AWS, Linux
Location: Hyderabad (WFO, 5 days)
Notice Period: Immediate to 15 days maximum
If interested, please reach out to siva.avula@healthfirsttech.com

Company Description: Healthfirst Technologies is a pioneering company in product design and development across various domains. We specialize in end-to-end product development and management, with a focus on the healthcare and insurance industries. Our team of experts and over 20 years of strategic product engineering experience enable us to create best-of-breed products that address key business challenges.

Requirements:
Experience: 8+ years of experience in system administration, cloud infrastructure, and AWS services. Proven experience with AWS services including EC2, S3, RDS, VPC, CloudFormation, Lambda, and IAM. Strong background in Linux/Unix system administration.
Certifications: AWS Certified SysOps Administrator - Associate or equivalent certification. Additional AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified DevOps Engineer) are a plus.
Technical Skills: Proficiency in scripting languages such as Python, Bash, or PowerShell. Experience with configuration management tools like Ansible, Puppet, or Chef. Knowledge of containerization technologies such as Docker and Kubernetes.
Soft Skills: Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Ability to work independently and manage multiple priorities in a fast-paced environment.
Preferred Qualifications: Experience with CI/CD pipelines and tools such as Jenkins or AWS CodePipeline. Familiarity with database management and optimization in both relational and NoSQL databases. Understanding of ITIL practices and experience in an IT service management role. Knowledge of network architecture and security principles.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience in system operations and IT infrastructure management. Strong knowledge of AWS and Azure services. Experience with automation tools and scripting languages. Familiarity with security practices and risk management. Excellent problem-solving and troubleshooting skills. Strong communication and collaboration abilities.
Preferred Skills: AWS and Azure certifications. Experience with DevOps practices and tools. Knowledge of network configurations and security protocols.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You have hands-on experience in AWS Cloud Java development and are an expert in implementing AWS services such as EC2, VPC, S3, Lambda, Route 53, RDS, DynamoDB, ELB/ALB/NLB, ECS, SNS, SQS, CloudWatch, and API Gateway. You also have knowledge of EFS/S3 for file storage and Cognito for authorization. Additionally, you have strong knowledge of containerization and have worked on AWS ECS/ECR. You are proficient in inter-service communication through REST, gRPC, or messaging (SQS, Kafka). You have experience writing unit test cases with JUnit and strong notions of security best practices. Your expertise extends to AWS CDK and CDK Pipelines for IaC. You are capable of implementing service discovery, load balancing, and circuit breaker patterns. You have an understanding of logging and monitoring services like AWS CloudTrail, CloudWatch, GuardDuty, and other AWS security services. Experience with CI/CD tools, DevOps implementation, and HA/DR setup is part of your skill set. You possess excellent communication and collaboration skills.

Your responsibilities include hands-on work with technologies like Java, Spring Boot, REST APIs, JPA, Kubernetes, messaging systems, and Tomcat/JBoss. You develop and maintain microservices using Java, Spring Boot, and Spring Cloud. You design RESTful APIs with clear contracts and efficient data exchange. Ensuring cloud security and compliance with industry standards is part of your routine. You maintain cloud infrastructure using the AWS Cloud Development Kit (CDK) and implement security best practices, including data encryption and adherence to security protocols.

Qualifications required for this role include 6+ years of hands-on experience in AWS Cloud Java development, expertise in implementing AWS services, strong knowledge of containerization, and inter-service communication skills. You must have a solid understanding of security best practices, knowledge of file storage and authorization, and experience writing unit test cases. Familiarity with serverless approaches using AWS Lambda is also essential. Nice-to-have qualifications include proven expertise in AWS CDK and CDK Pipelines; implementing service discovery, load balancing, and circuit breaker patterns; familiarity with logging and monitoring services; and experience with CI/CD tools, DevOps implementation, and HA/DR setup. Excellent communication and collaboration skills are a bonus for working effectively in a team-oriented environment.

About Virtusa: Virtusa embodies values such as teamwork, quality of life, and professional and personal development. By joining Virtusa, you become part of a global team of 27,000 people who care about your growth. You will have the opportunity to work on exciting projects, utilize state-of-the-art technologies, and advance your career with us. At Virtusa, great minds come together to nurture new ideas and foster excellence in a collaborative team environment.
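The circuit breaker pattern this posting asks for can be sketched minimally in Python (a language-neutral illustration; in the role's actual Java/Spring stack, a library such as Resilience4j provides a production implementation). The class name and thresholds below are illustrative, not from the posting:

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: opens after max_failures consecutive
    failures, rejects calls while open, and allows a trial call
    (half-open) once reset_timeout seconds have elapsed."""

    def __init__(self, max_failures=3, reset_timeout=30.0):
        self.max_failures = max_failures
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # None means the circuit is closed

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: call rejected")
            # Timeout elapsed: half-open, let one trial call through
            self.opened_at = None
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0  # any success closes the circuit fully
        return result
```

The point of the pattern is the fast-fail branch: while the circuit is open, callers get an immediate error instead of piling load onto an already failing downstream service.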

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As a Data Engineer 2 at GoKwik, you will collaborate closely with product managers, data scientists, business intelligence teams, and SDEs to develop and implement data-driven strategies. Your role will involve identifying, designing, and executing process improvements to enhance data models, architectures, pipelines, and applications. You will play a vital role in continuously optimizing data processes and overseeing data management, governance, security, and analysis to ensure data quality and security across all product verticals. Additionally, you will design, create, and deploy new data models and pipelines as necessary to achieve high performance, operational excellence, accuracy, and reliability in the system.

Your responsibilities will include using tools and technologies to establish a data architecture that supports new data initiatives and next-gen products. You will focus on building test-driven products and pipelines that are easily maintainable and reusable. Furthermore, you will design and build infrastructure for data extraction, transformation, and loading from various data sources, supporting the marketing and sales teams.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Mathematics, or relevant computer programming training, along with a minimum of 4 years of experience in data engineering. Proficiency in SQL, relational databases, query authoring, data pipelines, and architectures, plus experience working with cross-functional teams in a dynamic environment, is essential, as is experience with Python, data pipeline tools, and AWS cloud services. We are looking for individuals who are independent, resourceful, analytical, and adept at problem-solving. The ability to adapt to changing environments, excellent communication skills, and a collaborative mindset are crucial for success in this role. If you are passionate about tackling challenging problems at scale and making a significant impact within a dynamic and entrepreneurial setting, we welcome you to join our team at GoKwik.
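The extraction-transformation-loading infrastructure described above reduces to three composable stages. A toy sketch follows; the record fields and in-memory sink are invented for illustration, and a real pipeline would read from databases or S3 and write to a warehouse:

```python
def extract(rows):
    """'Extract': yield raw source records (here, an in-memory
    stand-in for a database cursor or S3 object stream)."""
    yield from rows

def transform(record):
    """Normalize one record: trim and lowercase the email,
    coerce the amount to a float."""
    return {
        "order_id": record["order_id"],
        "email": record["email"].strip().lower(),
        "amount": float(record["amount"]),
    }

def load(records, sink):
    """'Load': append cleaned records to a sink (stand-in for a
    batched INSERT or warehouse COPY) and return the row count."""
    count = 0
    for rec in records:
        sink.append(rec)
        count += 1
    return count

def run_pipeline(rows, sink):
    # Generator chaining keeps the pipeline streaming: no stage
    # materializes the whole dataset in memory.
    return load((transform(r) for r in extract(rows)), sink)
```

Keeping each stage a pure function is what makes such pipelines "test-driven, maintainable, and reusable" in the posting's terms: every stage can be unit-tested with plain dicts.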

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be the Infrastructure & Performance Test Engineer responsible for designing, executing, and optimizing load, stress, and distributed testing strategies across cloud-based systems. Your expertise in HTTP/HTTPS traffic analysis, monitoring tools, and reporting, along with a solid understanding of AWS infrastructure and performance at scale, will be crucial for this role.

Your key responsibilities will include planning and conducting load testing, stress testing, and distributed load testing to simulate real-world traffic patterns. You will create and manage test datasets to ensure accurate simulations and validations, monitor and analyze HTTP/HTTPS calls and system metrics during test execution, and use tools like JMeter, Gatling, k6, or Locust for performance testing. Additionally, you will automate end-to-end test cases using Selenium for UI validation when necessary and collaborate with DevOps to test upgrades and infrastructure changes.

You should possess 3-5 years of experience in performance/infrastructure testing or DevOps QA, proficiency with load testing tools such as JMeter, Gatling, and k6, familiarity with Selenium for UI test automation, a strong understanding of HTTP/HTTPS protocols and API testing, and experience with AWS infrastructure and monitoring tools like CloudWatch and X-Ray. Experience in distributed test execution, parallel load generation, and validating latency and response times, along with strong scripting skills in Python, Bash, or similar, is also required. Preferred qualifications include experience with continuous integration environments like Jenkins and GitHub Actions, exposure to Infrastructure as Code (IaC) tools such as Terraform or CloudFormation, and previous experience with major system upgrades and verifying post-upgrade performance baselines.
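The latency and response-time validation described above boils down to percentile math over collected samples. A minimal sketch using the nearest-rank method follows; in practice the samples would come from JMeter or Gatling result files rather than an in-memory list:

```python
def percentile(samples, p):
    """Nearest-rank percentile (p in (0, 100]) of response-time samples."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    # Nearest-rank method: rank = ceil(p/100 * N), then a 0-based index.
    # -(-a // b) is integer ceiling division.
    rank = max(1, -(-p * len(ordered) // 100))
    return ordered[int(rank) - 1]

def latency_report(samples_ms):
    """Summarize a load-test run the way load-testing tools report it:
    min, median, tail percentiles, and max, all in milliseconds."""
    return {
        "min": min(samples_ms),
        "p50": percentile(samples_ms, 50),
        "p95": percentile(samples_ms, 95),
        "p99": percentile(samples_ms, 99),
        "max": max(samples_ms),
    }
```

The tail percentiles (p95/p99) rather than the mean are what matter when verifying post-upgrade performance baselines, since means hide the slow requests users actually notice.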

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be joining as a talented SDE1 - DevOps Engineer with the exciting opportunity to help build a top-notch DevOps infrastructure that can scale to the next 100M users. As an ideal candidate, you will be expected to tackle a variety of challenges with enthusiasm and take full ownership of your responsibilities.

Your main responsibilities will include running a highly available cloud-based software product on AWS, designing and implementing new systems in close collaboration with the software development team, setting up and maintaining CI/CD systems, and automating the deployment of software. You will also be tasked with continuously enhancing the security posture and operational efficiency of the Amber platform, as well as optimizing operational costs.

To excel in this role, you should have at least 2-3 years of experience in a DevOps/SRE role. You must have hands-on experience with AWS services such as ECS, EKS, RDS, ElastiCache, and CloudFront, as well as familiarity with Google Cloud Platform. Proficiency with Infrastructure as Code tools like Terraform, CI/CD tools like Jenkins and GitHub Actions, and scripting languages such as Python and Bash is essential. Additionally, you should have a strong grasp of SCM in GitHub, networking concepts, and experience with observability and monitoring tools like Grafana, Loki, Prometheus, and ELK. Prior exposure to an on-call rotation and mentoring junior DevOps engineers would be advantageous. While not mandatory, knowledge of NodeJS and Ruby, including their platforms and workflows, would be considered a plus for this role.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Kochi, Kerala

On-site

As a Java Backend Developer on our team specializing in the IoT domain, your role will involve designing, developing, and deploying scalable microservices using Spring Boot, SQL databases, and AWS services. You will play a pivotal role in guiding the backend development team, implementing DevOps best practices, and optimizing cloud infrastructure to ensure high-performance, secure services.

Your key responsibilities will include architecting and implementing high-performance backend services using Java (Spring Boot), developing RESTful APIs and event-driven microservices with a focus on scalability and reliability, designing and optimizing SQL databases (PostgreSQL, MySQL), and deploying applications on AWS using services like ECS, Lambda, RDS, S3, and API Gateway. In addition, you will implement CI/CD pipelines using tools such as GitHub Actions, Jenkins, or similar, monitor and optimize backend performance, ensure best practices for security, authentication, and authorization using OAuth, JWT, and IAM roles, and collaborate with the team to maintain high standards of efficiency and quality.

The ideal candidate will possess expertise in Java (Spring Boot, Spring Cloud, Spring Security), microservices architecture, API development, SQL (PostgreSQL, MySQL), ORM (JPA, Hibernate), DevOps tools (Docker, Kubernetes, Terraform, CI/CD, GitHub Actions, Jenkins), AWS cloud services (EC2, Lambda, ECS, RDS, S3, IAM, API Gateway, CloudWatch), messaging systems (Kafka, RabbitMQ, SQS, MQTT), testing frameworks (JUnit, Mockito, integration testing), and logging and monitoring tools (ELK Stack, Prometheus, Grafana). Preferred skills include experience in the IoT domain, previous work experience in startups, familiarity with event-driven architecture using Apache Kafka, knowledge of Infrastructure as Code (IaC) with Terraform, and exposure to serverless architectures.

In return, we offer a competitive salary with performance-based incentives, the opportunity to lead and mentor a high-performing tech team, hands-on experience with cutting-edge cloud and microservices technologies, and a collaborative, fast-paced work environment where your skills and expertise will be valued and further developed. If you have experience in the IoT domain and are enthusiastic about contributing to a dynamic team focused on innovation and excellence, we invite you to apply for this full-time, on-site/hybrid Java Backend Developer position in Kochi.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

AppGlide is a growth partner for fast-growing software product companies globally, leveraging the latest technology and processes to enable SaaS companies to serve their customers better and grow faster. Located in Chennai, AppGlide is led by a team of IIT & IIM alumni.

We are seeking a Python Engineering Manager to join our customer's solutions engineering team. In this role, you will collaborate closely with customers to deploy solutions across global customer deployments. Our client offers an industry-leading AIOps platform designed to provide instant, real-time actionable insights for managing multi-domain network and application infrastructures. By consolidating multiple data sources into a user-friendly platform, IT teams can troubleshoot network issues efficiently, minimize downtime, reduce MTTR, and enhance operational efficiency.

The ideal candidate should have:
- 2+ years of experience managing and hiring engineers on teams with 5-10 reports, and a total of 6+ years in fast-paced engineering roles.
- Familiarity with technologies such as Django, Python, React, Postgres, SQL, AWS, GCP, RDS, Docker, and Kubernetes.
- A degree in Computer Science/Information Technology.
- Experience managing high-performance teams in startup-like environments.
- Proficiency in planning team roadmaps and goals as a product-minded engineering leader.
- Ability to collaborate with cross-functional departments to drive successful business outcomes.
- A solid understanding of architecture, design, and data flow in applications for effective script design.
- Capability to develop, debug, and maintain automation scripts/code independently.

Responsibilities include:
- Developing expertise in the customer's operational analytics solution for network, application, and security workflows.
- Hands-on deployment of the product to meet customer requirements.
- Collaborating with the engineering team on deployment plans for POCs and leading deployments to completion.
- Enhancing product features and quality through enhancement proposals and customer reviews.

At AppGlide, we foster a culture of mutual respect and ownership, prioritizing employees' work-life balance and providing ownership of work streams. We invest in our employees' training and development with structured learning plans. This position is based in Chennai.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

6 - 12 Lacs

Noida

Work from Office

We are looking for an experienced Java Developer with strong experience in Amazon Web Services (AWS) to join our dynamic development team. As a Java AWS Developer, you will play a key role in designing, developing, and maintaining cloud-based applications and services. You will work with a variety of AWS cloud technologies and Java frameworks to deliver high-performance, scalable, and secure solutions.

Responsibilities: Develop, test, and deploy Java-based applications using AWS cloud services. Collaborate with cross-functional teams to design and implement cloud-native applications, microservices, and solutions using AWS services (EC2, S3, Lambda, RDS, SQS, SNS, etc.). Leverage AWS DevOps tools (such as CodePipeline, CodeDeploy, CloudFormation, etc.) to automate deployments and infrastructure provisioning. Write clean, efficient, and maintainable code using Java and related frameworks (Spring Boot, Hibernate). Troubleshoot, debug, and optimize applications running in AWS environments. Create and maintain detailed technical documentation. Monitor and improve the performance, scalability, and security of AWS-based applications. Follow best practices for cloud architecture and software development processes. Stay up to date with the latest AWS services and Java technologies.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience). 3+ years of professional experience in Java development. 3+ years of hands-on experience with AWS services (EC2, S3, Lambda, RDS, etc.). Strong proficiency in the Java programming language and Java frameworks like Spring Boot, Hibernate, etc. Experience with microservices architecture, containerization (Docker, Kubernetes), and RESTful API development. Familiarity with CI/CD pipelines and tools like Jenkins, Git, and AWS CodePipeline. Experience with cloud security practices and ensuring the security of AWS services and applications. Knowledge of database management systems (SQL/NoSQL) and data modeling. Strong problem-solving skills and ability to troubleshoot complex issues in cloud environments. Excellent communication and collaboration skills.

Preferred Skills: AWS certifications (AWS Certified Developer - Associate or AWS Certified Solutions Architect - Associate). Experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation. Familiarity with Agile methodologies and working in Agile teams. Knowledge of monitoring and logging tools like CloudWatch, ELK Stack, or similar.

Mandatory Competencies:
Programming Language - Java - Core Java (Java 8+)
Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc.
Fundamental Technical Skills - Programming, Multithreading, Collections
Database - SQL Server - SQL Packages
Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
Middleware - Java Middleware - Spring Boot
Programming Language - Java - OOPS Concepts
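One concrete piece of the troubleshooting and optimization work described above is retrying throttled AWS API calls. AWS's general guidance is exponential backoff with full jitter; a sketch follows (shown in Python for brevity; parameter values are illustrative, and the AWS SDKs apply a similar strategy internally):

```python
import random

def backoff_delays(max_retries=5, base=0.5, cap=8.0, rng=random.random):
    """Full-jitter exponential backoff: the i-th delay is drawn
    uniformly from [0, min(cap, base * 2**i)), so concurrent clients
    back off at growing but decorrelated intervals."""
    return [min(cap, base * (2 ** i)) * rng() for i in range(max_retries)]
```

A caller would sleep for each delay between attempts; the jitter matters because deterministic backoff makes all throttled clients retry in lockstep and re-trigger the throttle.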

Posted 2 weeks ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Pune

Work from Office

Key Responsibilities:
Design and develop scalable applications using Python and AWS services
Debug and resolve production issues across complex distributed systems
Architect solutions aligned with business strategies and industry standards
Lead and mentor a team of India-based developers; guide career development
Ensure technical deliverables meet the highest standards of quality and performance
Research and integrate emerging technologies and processes into the development strategy
Document solutions in compliance with SDLC standards using defined templates
Assemble large, complex datasets based on functional and non-functional requirements
Handle operational issues and recommend improvements to the technology stack
Facilitate end-to-end platform integration across enterprise-level applications

Required Skills:
Technical Skills: Python, Data Engineering, Debugging & Troubleshooting
Cloud & Architecture: AWS (EC2, EKS, Glue, Lambda, S3, EMR, RDS, API Gateway), Step Functions, CloudFront, EventBridge, AppFlow, System Integration
Tools & Processes: Terraform, CI/CD pipelines, Airflow (MWAA), QuickSight, SDLC, Documentation Templates

Qualifications:
10+ years of software development experience, preferably in financial/trading applications
5+ years of people management and mentoring experience
Proven track record in technical leadership and architecture planning
Expertise in developing applications using Python and the AWS stack
Strong grasp of Terraform and automated CI/CD processes
Exceptional multitasking and prioritization capabilities

Posted 2 weeks ago

Apply

6.0 - 10.0 years

11 - 12 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled DevOps Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using DevOps practices and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in DevOps, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements: Bachelor's degree in Computer Science Engineering or a related field. 6 to 10+ years of experience in full-stack development, with a strong focus on DevOps.

DevOps with AWS Data Engineer - Roles & Responsibilities:
Use AWS services like EC2, VPC, S3, IAM, RDS, and Route 53.
Automate infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation.
Build and maintain CI/CD pipelines using tools like AWS CodePipeline, Jenkins, and GitLab CI/CD.
Automate build, test, and deployment processes for Java applications.
Use Ansible, Chef, or AWS Systems Manager for managing configurations across environments.
Containerize Java apps using Docker. Deploy and manage containers using Amazon ECS, EKS (Kubernetes), or Fargate.
Set up monitoring and logging using Amazon CloudWatch, Prometheus + Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), and AWS X-Ray for distributed tracing.
Manage access with IAM roles/policies. Use AWS Secrets Manager / Parameter Store for managing credentials. Enforce security best practices, encryption, and audits.
Automate backups for databases and services using AWS Backup, RDS snapshots, and S3 lifecycle rules. Implement Disaster Recovery (DR) strategies.
Work closely with development teams to integrate DevOps practices (cross-functional collaboration). Document pipelines, architecture, and troubleshooting runbooks.
Monitor and optimize AWS resource usage. Use AWS Cost Explorer, Budgets, and Savings Plans.

Must-Have Skills: Experience working on Linux-based infrastructure. Excellent understanding of Ruby, Python, Perl, and Java. Configuring and managing databases such as MySQL and Mongo. Excellent troubleshooting. Selecting and deploying appropriate CI/CD tools. Working knowledge of various tools, open-source technologies, and cloud services. Awareness of critical concepts in DevOps and Agile principles. Managing stakeholders and external interfaces. Setting up tools and required infrastructure. Defining and setting development, testing, release, update, and support processes for DevOps operation. The technical skills to review, verify, and validate the software code developed in the project.
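To make the S3 lifecycle-rule item concrete, here is a sketch that builds a lifecycle configuration dict in the shape boto3's `put_bucket_lifecycle_configuration` expects. The prefix and day counts are made-up examples, not values from the posting:

```python
def s3_backup_lifecycle(prefix="backups/", glacier_after=30, expire_after=365):
    """Lifecycle configuration for a backup prefix: transition objects
    to Glacier after `glacier_after` days, delete them after
    `expire_after` days. Returned in boto3's expected shape."""
    return {
        "Rules": [
            {
                "ID": "backup-retention",
                "Filter": {"Prefix": prefix},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": glacier_after, "StorageClass": "GLACIER"}
                ],
                "Expiration": {"Days": expire_after},
            }
        ]
    }
```

Applying it would look like `s3.put_bucket_lifecycle_configuration(Bucket="my-backups", LifecycleConfiguration=s3_backup_lifecycle())`, where the bucket name is hypothetical; the same rule can equally be declared in Terraform or CloudFormation.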

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have at least 10 years of experience. You should be proficient in setting up, configuring, and integrating API gateways in AWS. Your expertise should include API frameworks, XML/JSON, REST, and data protection in software design, build, test, and documentation. Experience with various AWS services such as Lambda, S3, CDN (CloudFront), SQS, SNS, EventBridge, API Gateway, Glue, and RDS is required. You should be able to articulate and implement projects using these AWS services effectively. Your role will involve improving business processes through effective integration solutions.

Location: Bangalore, Chennai, Pune, Mumbai, Noida
Notice Period: Immediate joiners

If you meet the requirements mentioned above, please apply for this position by filling out the form with your full name, email, phone, and cover letter, and uploading your CV/resume (PDF, DOC, and DOCX formats accepted). By submitting the form, you agree to the storage and handling of your data by this website.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You are a Senior Software Engineer at Elevance Health, a prominent health company in America dedicated to enhancing lives and simplifying healthcare. Elevance Health is the largest managed healthcare company in the Blue Cross Blue Shield (BCBS) Association, serving over 45 million lives across 14 states. This Fortune 500 company is currently ranked 20th and led by Gail Boudreaux, a prominent figure on the Fortune list of most powerful women.

Your role will be within Carelon Global Solutions (CGS), a subsidiary of Elevance Health focused on simplifying complex operational processes in the healthcare system. CGS brings together a global team of innovators across various locations, including Bengaluru and Gurugram in India, to optimize healthcare operations effectively and efficiently.

As a Senior Software Engineer, your primary responsibility involves collaborating with data architects to implement data models and ensure seamless integration with AWS services. You will be responsible for supporting, monitoring, and resolving production issues to meet SLAs, being available 24/7 for business application support. You should have hands-on experience with technologies like Snowflake, Python, AWS S3/Athena, RDS, CloudWatch, Lambda, and more. Your expertise should include handling nested JSON files, analyzing daily loads and issues, working closely with admin/architect teams, and understanding complex job and data flows in the project.

To qualify for this role, you need a Bachelor's degree in Information Technology/Data Engineering or equivalent education and experience, along with 5-8 years of overall IT experience and 2-9 years with AWS services. Experience in agile development processes is preferred. You are expected to have skills in Snowflake, AWS services, complex SQL queries, and technologies like Hadoop, Kafka, HBase, Sqoop, and Scala. Your ability to analyze, research, and solve technical problems will be crucial for success in this role.

Carelon promises limitless opportunities for its associates, emphasizing growth, well-being, purpose, and belonging. With a focus on learning and development, an innovative culture, and comprehensive rewards, Carelon offers a supportive environment for personal and professional growth. Carelon is an equal opportunity employer that values diversity and inclusivity. If you require accommodations due to a disability, you can request the Reasonable Accommodation Request Form. This is a full-time position that offers a competitive benefits package and a conducive work environment.
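The nested-JSON handling this posting asks for usually starts with flattening records before loading them into a relational or Snowflake table. A minimal sketch follows; the key separator and sample record are illustrative, not from the posting:

```python
def flatten_json(obj, parent_key="", sep="."):
    """Flatten nested dicts (as parsed from a nested JSON file) into a
    single level with dotted keys, so each leaf maps to one column."""
    flat = {}
    for key, value in obj.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse into nested objects, carrying the key prefix down
            flat.update(flatten_json(value, full_key, sep))
        else:
            flat[full_key] = value
    return flat
```

Lists need a separate policy (explode to rows, or keep as VARIANT in Snowflake), which is where most of the real design work in such pipelines lives.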

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be a valuable member of the data engineering team, focusing on developing data pipelines, transforming data, exploring new data patterns, optimizing current data feeds, and implementing enhancements. Your primary responsibilities will draw on your expertise in RDBMS concepts, hands-on experience with the AWS Cloud platform and its services (including IAM, EC2, Lambda, RDS, Timestream, Glue, etc.), familiarity with data streaming tools like Kafka, practical knowledge of ETL/ELT tools, and understanding of Snowflake, PostgreSQL, or another database system. Ideally, you should also have a good grasp of data modeling techniques to further bolster your capabilities in this role.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

11 - 12 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled DevOps Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using DevOps practices and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in DevOps, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements: Bachelor's degree in Computer Science Engineering or a related field. 5 to 10+ years of experience in full-stack development, with a strong focus on DevOps.

DevOps with AWS Data Engineer - Roles & Responsibilities:
Use AWS services like EC2, VPC, S3, IAM, RDS, and Route 53.
Automate infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation.
Build and maintain CI/CD pipelines using tools like AWS CodePipeline, Jenkins, and GitLab CI/CD.
Automate build, test, and deployment processes for Java applications.
Use Ansible, Chef, or AWS Systems Manager for managing configurations across environments.
Containerize Java apps using Docker. Deploy and manage containers using Amazon ECS, EKS (Kubernetes), or Fargate.
Set up monitoring and logging using Amazon CloudWatch, Prometheus + Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), and AWS X-Ray for distributed tracing.
Manage access with IAM roles/policies. Use AWS Secrets Manager / Parameter Store for managing credentials. Enforce security best practices, encryption, and audits.
Automate backups for databases and services using AWS Backup, RDS snapshots, and S3 lifecycle rules. Implement Disaster Recovery (DR) strategies.
Work closely with development teams to integrate DevOps practices (cross-functional collaboration). Document pipelines, architecture, and troubleshooting runbooks.
Monitor and optimize AWS resource usage. Use AWS Cost Explorer, Budgets, and Savings Plans.

Must-Have Skills: Experience working on Linux-based infrastructure. Excellent understanding of Ruby, Python, Perl, and Java. Configuring and managing databases such as MySQL and Mongo. Excellent troubleshooting. Selecting and deploying appropriate CI/CD tools. Working knowledge of various tools, open-source technologies, and cloud services. Awareness of critical concepts in DevOps and Agile principles. Managing stakeholders and external interfaces. Setting up tools and required infrastructure. Defining and setting development, testing, release, update, and support processes for DevOps operation. The technical skills to review, verify, and validate the software code developed in the project.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The client is a global technology consulting and digital solutions company with a vast network of entrepreneurial professionals spread across more than 30 countries. They cater to over 700 clients, leveraging their domain and technology expertise to drive competitive differentiation, enhance customer experiences, and improve business outcomes.

As part of the agile team, you will be responsible for developing applications, leading design sprints, and ensuring timely deliveries. Your role will involve designing and implementing low-latency, high-availability, high-performance applications. You will also ensure code modularity using microservices architecture in both frontend and backend development, following best practices in backend API development. Throughout the software development lifecycle, you will write code that is maintainable, clear, and concise. Your technical leadership will be crucial in mentoring team members to help them achieve their goals. Additionally, you will manage application deployment with a focus on security, scalability, and reliability, and manage and evolve automated testing setups for backend and frontend applications to enable faster bug reporting and fixing.

A solid understanding of RESTful API design and database design and management, along with experience with version control systems, will be essential for this role. Strong problem-solving and communication skills are required, along with proficiency in object-oriented programming, C# or VB.NET, and writing reusable libraries. Familiarity with design and architectural patterns such as Singleton and Factory, RDBMS such as SQL Server, Postgres, and MySQL, and writing clean, readable, and maintainable code will be beneficial. Experience in implementing automated testing platforms and unit tests, and in identifying opportunities to optimize code and improve performance, will be a valuable asset. Understanding best software engineering coding practices is essential for this position.

Nice-to-have skills include proficiency in AWS services such as EC2, S3, RDS, EKS, Lambda, CloudWatch, CloudFront, and VPC; experience with Git and DevOps tools such as Jenkins, UCD, Kubernetes, ArgoCD, and Splunk; and .NET/ReactJS skills.
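The Singleton and Factory patterns named in this posting can be sketched in a few lines of Python (the class and function names are illustrative, not from the posting):

```python
class ConnectionPool:
    """Singleton: every construction returns the one shared instance."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance


class PostgresStore:
    dialect = "postgresql"


class MySQLStore:
    dialect = "mysql"


def store_factory(dialect: str):
    """Factory: map a config string to a concrete store object."""
    stores = {"postgres": PostgresStore, "mysql": MySQLStore}
    try:
        return stores[dialect]()
    except KeyError:
        raise ValueError(f"unsupported dialect: {dialect}")
```

The factory keeps callers decoupled from concrete classes, which is what makes the "code modularity" requirement above testable in isolation.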

Posted 3 weeks ago

Apply

7.0 - 12.0 years

0 Lacs

maharashtra

On-site

As a Lead Data Engineer, you will leverage your 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL, Data Warehousing, Data Marts, Data Lakes, Big Data, Cloud (AWS), and Data Governance. Your expertise in a modern programming language such as Scala, Python, or Java, with a preference for Spark/PySpark, will be crucial in this role.

The role requires experience with configuration management and version control tools like Git, along with familiarity working within a CI/CD framework. Experience in building frameworks is a significant advantage. A minimum of 8 years of recent hands-on SQL programming experience in a Big Data environment is necessary, preferably with Hadoop/Hive. Proficiency in PostgreSQL, RDBMS, NoSQL, and columnar databases will be beneficial.

Your hands-on experience with AWS cloud data engineering components, including API Gateway, Glue, IoT Core, EKS, ECS, S3, RDS, Redshift, and EMR, will play a vital role in developing and maintaining ETL applications and data pipelines using big data technologies. Experience with Apache Kafka, Spark, and Airflow is a must-have for this position.

If you are excited about this opportunity and possess the required skills and experience, please share your CV with us at omkar@hrworksindia.com. We look forward to potentially welcoming you to our team. Regards, Omkar
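Pipelines of the kind described above follow one extract-transform-load shape regardless of scale. A minimal sketch in plain Python, with the stdlib sqlite3 module standing in for a warehouse such as Redshift (the table and field names are invented for illustration):

```python
import sqlite3


def run_pipeline(raw_rows):
    """Minimal ETL: extract rows, normalise and dedupe, load into a table."""
    conn = sqlite3.connect(":memory:")  # stand-in for Redshift/PostgreSQL
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
    seen = set()
    for row in raw_rows:                      # extract
        email = row["email"].strip().lower()  # transform: normalise
        if email in seen:                     # transform: dedupe
            continue
        seen.add(email)
        conn.execute("INSERT INTO users (email) VALUES (?)", (email,))
    conn.commit()                             # load
    return [r[0] for r in conn.execute("SELECT email FROM users ORDER BY email")]
```

In a Spark/Glue job the same three stages map to a read, a DataFrame transformation, and a write; only the engine changes.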

Posted 3 weeks ago

Apply

10.0 - 20.0 years

25 - 40 Lacs

Hyderabad

Hybrid

Role & responsibilities: We are seeking dynamic individuals to join our team as individual contributors, collaborating closely with stakeholders to drive impactful results.

Working hours: 5:30 pm to 1:30 am (Hybrid model)

Must-have skills:
1. 15 years of experience in design and delivery of Distributed Systems capable of handling petabytes of data in a distributed environment.
2. 10 years of experience in the development of Data Lakes with Data Ingestion from disparate data sources, including relational databases, flat files, APIs, and streaming data.
3. Experience in design and development of Data Platforms and data ingestion from disparate data sources into the cloud.
4. Expertise in core AWS services including IAM, VPC, EC2, EKS/ECS, S3, RDS, DMS, Lambda, CloudWatch, CloudFormation, and CloudTrail.
5. Proficiency in programming languages like Python and PySpark for efficient data processing, preferably Python.
6. Ability to architect and implement robust ETL pipelines using AWS Glue, Lambda, and Step Functions, defining data extraction methods, transformation logic, and data loading procedures across different data sources.
7. Experience in the development of Event-Driven Distributed Systems in the Cloud using Serverless Architecture.
8. Ability to work with the Infrastructure team on AWS service provisioning for databases, services, network design, IAM roles, and AWS clusters.
9. 2-3 years of experience working with DocumentDB or MongoDB environments.

Nice-to-have skills:
1. 10 years of experience in the development of Data Audit, Compliance, and Retention standards for Data Governance, and automation of the governance processes.
2. Experience in data modelling with NoSQL databases like DocumentDB.
3. Experience using column-oriented data file formats like Apache Parquet, with Apache Iceberg as the table format for analytical datasets.
4. Expertise in development of Retrieval-Augmented Generation (RAG) and Agentic Workflows for providing context to LLMs based on proprietary enterprise data.
5. Ability to develop re-ranking strategies using results from index and vector stores for LLMs to improve the quality of the output.
6. Knowledge of AWS AI services like AWS Entity Resolution and AWS Comprehend.
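The Glue + Step Functions orchestration named above is expressed in the Amazon States Language. A minimal sketch of such a definition rendered from Python (the job name and Lambda ARN are hypothetical placeholders):

```python
import json


def etl_state_machine(glue_job: str, notify_fn_arn: str) -> str:
    """Render a minimal Amazon States Language definition: run a Glue job
    synchronously with retries, then invoke a notification Lambda."""
    definition = {
        "Comment": "Glue extract/transform, then a Lambda notification",
        "StartAt": "RunGlueJob",
        "States": {
            "RunGlueJob": {
                "Type": "Task",
                "Resource": "arn:aws:states:::glue:startJobRun.sync",
                "Parameters": {"JobName": glue_job},
                "Retry": [{"ErrorEquals": ["States.ALL"],
                           "IntervalSeconds": 30,
                           "MaxAttempts": 2,
                           "BackoffRate": 2.0}],
                "Next": "Notify",
            },
            "Notify": {
                "Type": "Task",
                "Resource": notify_fn_arn,
                "End": True,
            },
        },
    }
    return json.dumps(definition, indent=2)
```

The `.sync` service integration makes Step Functions wait for the Glue run to finish before moving on, which is what turns isolated jobs into an event-driven pipeline.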

Posted 3 weeks ago

Apply

8.0 - 11.0 years

30 - 35 Lacs

Hyderabad

Work from Office

NP: Immediate to 15 Days.

Required Skills & Qualifications:
- Strong experience in backend development using Java (Java 8 or later).
- Hands-on experience with front-end technologies such as Vue.js or Angular (with a strong preference for Vue.js).
- Solid understanding of PostgreSQL and the ability to write optimized SQL queries and stored procedures.
- AWS cloud experience with knowledge of services like EC2, RDS, S3, Lambda, API Gateway, etc.
- Experience building and consuming RESTful APIs.
- Proficiency with version control systems (e.g., Git).
- Familiarity with Agile/Scrum methodologies.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Posted 3 weeks ago

Apply

0.0 - 3.0 years

3 - 8 Lacs

Chennai

Hybrid

Key Responsibilities

AWS Infrastructure Management:
- Design, deploy, and manage AWS infrastructure using services such as EC2, ECS, EKS, Lambda, RDS, S3, VPC, and CloudFront
- Implement and maintain Infrastructure as Code using AWS CloudFormation, AWS CDK, or Terraform
- Optimize AWS resource utilization and costs through rightsizing, reserved instances, and automated scaling
- Manage multi-account AWS environments using AWS Organizations and Control Tower
- Implement disaster recovery and backup strategies using AWS services

CI/CD Pipeline Development:
- Build and maintain CI/CD pipelines using AWS CodePipeline, CodeBuild, CodeDeploy, and CodeCommit
- Integrate with third-party tools like Jenkins, GitLab CI, or GitHub Actions when needed
- Implement automated testing and security scanning within deployment pipelines
- Manage deployment strategies including blue-green deployments using AWS services
- Automate application deployments to ECS, EKS, Lambda, and EC2 environments

Container and Serverless Management:
- Deploy and manage containerized applications using Amazon ECS and Amazon EKS
- Implement serverless architectures using AWS Lambda, API Gateway, and Step Functions
- Manage container registries using Amazon ECR
- Optimize container and serverless application performance and costs
- Implement service mesh architectures using AWS App Mesh when applicable

Monitoring and Observability:
- Implement comprehensive monitoring using Amazon CloudWatch, AWS X-Ray, and AWS Systems Manager
- Set up alerting and dashboards for proactive incident management
- Configure log aggregation and analysis using CloudWatch Logs and Amazon OpenSearch
- Implement distributed tracing for microservices architectures
- Create and maintain operational runbooks and documentation

Security and Compliance:
- Implement AWS security best practices using IAM, Security Groups, NACLs, and AWS Config
- Manage secrets and credentials using AWS Secrets Manager and Systems Manager Parameter Store
- Implement compliance frameworks and automated security scanning
- Configure Amazon GuardDuty, Amazon Inspector, and AWS Security Hub for threat detection
- Manage SSL/TLS certificates using AWS Certificate Manager

Automation and Scripting:
- Develop automation scripts using Python, Bash, and the AWS CLI/SDK
- Create AWS Lambda functions for operational automation
- Implement event-driven automation using CloudWatch Events and EventBridge
- Automate backup, patching, and maintenance tasks using AWS Systems Manager
- Build custom tools and utilities to improve operational efficiency

Required Qualifications

AWS Expertise:
- Strong experience with core AWS services: EC2, S3, RDS, VPC, IAM, CloudFormation
- Experience with container services (ECS, EKS) and serverless technologies (Lambda, API Gateway)
- Proficiency with AWS networking concepts and security best practices
- Experience with AWS monitoring and logging services (CloudWatch, X-Ray)

Technical Skills:
- Expertise in Infrastructure as Code using CloudFormation, CDK, or Terraform
- Strong scripting skills in Python, Bash, or PowerShell
- Experience with CI/CD tools, preferably AWS native services and Bitbucket Pipelines
- Knowledge of containerization with Docker and orchestration with Kubernetes
- Understanding of microservices architecture and distributed systems
- Experience with configuration management and automation tools

DevOps Practices:
- Strong understanding of CI/CD best practices and GitOps workflows
- Experience with automated testing and deployment strategies
- Knowledge of monitoring, alerting, and incident response procedures
- Understanding of security scanning and compliance automation

AWS Services Experience:
- Compute & Containers: Amazon EC2, ECS, EKS, Fargate, Lambda, Batch
- Storage & Database: Amazon S3, EBS, EFS, RDS, DynamoDB, ElastiCache, Redshift
- Networking & Security: VPC, Route 53, CloudFront, ALB/NLB, IAM, Secrets Manager, Certificate Manager
- Developer Tools: CodePipeline, CodeBuild, CodeDeploy, CodeCommit, CodeArtifact
- Monitoring & Management: CloudWatch, X-Ray, Systems Manager, Config, CloudTrail, Amazon OpenSearch
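Automation scripts of the kind this role describes almost always need retry logic around throttled AWS API calls. A minimal sketch of exponential backoff with full jitter, in plain Python with no real AWS call (any SDK call would slot into `call`):

```python
import random
import time


def with_backoff(call, max_attempts=5, base_delay=0.5, sleep=time.sleep):
    """Retry a flaky callable with exponential backoff and full jitter.

    Re-raises the last exception once max_attempts is exhausted. The sleep
    function is injectable so tests can skip the actual waiting.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except Exception:
            if attempt == max_attempts:
                raise
            # full jitter: wait a random time in [0, base * 2^(attempt-1)]
            sleep(random.uniform(0, base_delay * 2 ** (attempt - 1)))
```

Jitter matters when many Lambda invocations retry at once; without it, synchronized retries re-trigger the throttling they are waiting out.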

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Remote

Skillset: PostgreSQL, Amazon Redshift, MongoDB, Apache Cassandra, AWS, ETL, Shell Scripting, Automation, Microsoft Azure

We are looking for motivated, forward-looking go-getters with the following skills for an exciting role.

Job Description:
- Monitor and maintain the performance, reliability, and availability of multiple database systems.
- Optimize complex SQL queries, stored procedures, and ETL scripts for better performance and scalability.
- Troubleshoot and resolve issues related to database performance, integrity, backups, and replication.
- Design, implement, and manage scalable data pipelines across structured and unstructured sources.
- Develop automation scripts for routine maintenance tasks using Python, Bash, or similar tools.
- Perform regular database health checks, set up alerting mechanisms, and respond to incidents proactively.
- Analyze performance bottlenecks and resolve slow-query issues and deadlocks.
- Work in DevOps/Agile environments, integrating with CI/CD pipelines for database operations.
- Collaborate with engineering, analytics, and infrastructure teams to integrate database solutions with applications and BI tools.
- Research and implement emerging technologies and best practices in database administration.
- Participate in capacity planning, security audits, and software upgrades for data infrastructure.
- Maintain comprehensive documentation related to database schemas, metadata, standards, and procedures.
- Ensure compliance with data privacy regulations and implement robust disaster recovery and backup strategies.

Desired skills:
- Database Systems: Hands-on experience with SQL-based databases (PostgreSQL, MySQL), Amazon Redshift, MongoDB, and Apache Cassandra.
- Scripting & Automation: Proficiency in scripting with Python, Shell, or similar to automate database operations.
- Cloud Platforms: Working knowledge of AWS (RDS, Redshift, EC2, S3, IAM, Lambda) and Azure SQL/Azure Cosmos DB.
- Big Data & Distributed Systems: Familiarity with Apache Spark for distributed data processing.
- Performance Tuning: Deep experience in performance analysis, indexing strategies, and query optimization.
- Security & Compliance: Experience with database encryption, auditing, access control, and GDPR/PII policies.
- Familiarity with Linux and Windows server administration is a plus.

Education & Experience: BE, B.Tech, MCA, or M.Tech from Tier 2/3 colleges, and Science Graduates; 5-8 years of work experience.
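The indexing and query-optimization work above can be seen end to end with the stdlib sqlite3 module standing in for PostgreSQL: the query plan flips from a full table scan to an index lookup once a suitable index exists (the table and column names are invented for illustration):

```python
import sqlite3


def plan(conn, sql):
    """Return the EXPLAIN QUERY PLAN detail strings for a statement."""
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT)")
query = "SELECT id FROM orders WHERE customer_id = 42"

before = plan(conn, query)  # no index on customer_id: full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(conn, query)   # planner now uses the index
```

PostgreSQL's `EXPLAIN ANALYZE` gives the same signal with real timings; the workflow of reading the plan, adding the index, and re-reading the plan is identical.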

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

Job Title: Azure Presales Engineer

About the Role: As a Cloud Presales Engineer specializing in Azure, you will play a critical role in our sales process by working closely with sales and technical teams to provide expert guidance and solutions for our clients. Leveraging your in-depth knowledge of Azure services, you will understand customer needs, design tailored cloud solutions, and drive the adoption of our cloud offerings. This position requires strong technical acumen, excellent communication skills, and a passion for cloud technologies.

Key Responsibilities

Solution Design and Architecture: Understand customer requirements and design effective cloud solutions using Azure services. Create architecture diagrams and detailed proposals tailored to customer needs. Collaborate with sales teams to define the scope of technical solutions and present them to customers.

Technical Expertise and Consultation: Act as a subject matter expert on AWS and Azure services, including EC2, S3, Lambda, RDS, VPC, IAM, CloudFormation, Azure Virtual Machines, Blob Storage, Functions, SQL Database, Virtual Network, Azure Active Directory, and ARM Templates. Provide technical support during the sales process, including product demonstrations, POCs (Proofs of Concept), and answering customer queries. Advise customers on best practices for cloud adoption, migration, and optimization.

Customer Engagement: Build and maintain strong relationships with customers, understanding their business challenges and technical needs. Conduct customer workshops, webinars, and training sessions to educate customers on Azure solutions and services. Gather customer feedback and insights to help shape product and service offerings.

Sales Support: Partner with sales teams to develop sales strategies and drive cloud adoption. Prepare and deliver compelling presentations, demonstrations, and product pitches to customers. Assist in the preparation of RFPs, RFQs, and other customer documentation.

Continuous Learning and Development: Stay up to date with the latest AWS and Azure services, technologies, and industry trends. Achieve and maintain relevant AWS and Azure certifications to demonstrate expertise. Share knowledge and best practices with internal teams to enhance overall capabilities.

Required Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience in a presales or technical consulting role, with a focus on cloud solutions. In-depth knowledge of AWS and Azure services, with hands-on experience in designing and implementing cloud-based architectures. Azure certifications (e.g., Microsoft Certified: Azure Solutions Architect Expert) are highly preferred. Strong understanding of cloud computing concepts, including IaaS, PaaS, SaaS, and hybrid cloud models. Excellent presentation, communication, and interpersonal skills. Ability to work independently and collaboratively in a fast-paced, dynamic environment.

Preferred Qualifications: Experience with other cloud platforms (e.g., Google Cloud) is a plus. Familiarity with DevOps practices, CI/CD pipelines, and infrastructure as code (IaC) using Terraform, CloudFormation, and ARM Templates. Experience with cloud security, compliance, and governance best practices. Background in software development, scripting, or system administration.

Join us to be part of an innovative team, shaping cloud solutions and driving digital transformation for our clients! (ref:hirist.tech)

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

kochi, kerala

On-site

You will be responsible for planning, implementing, and growing the AWS cloud infrastructure. Your role will involve building, releasing, and managing the configuration of all production systems. It will be essential to manage a continuous integration and deployment methodology for server-based technologies. Collaboration with architecture and engineering teams to design and implement scalable software services will also be part of your responsibilities. Ensuring system security through the utilization of best-in-class cloud security solutions will be crucial. Staying up to date with new technology options and vendor products is important, and you will be expected to evaluate which ones would be suitable for the company. Implementing continuous integration/continuous delivery (CI/CD) pipelines when needed will also fall under your purview.

You will have the opportunity to recommend process and architecture improvements, troubleshoot the system, and resolve problems across all platform and application domains. Overseeing pre-production acceptance testing to maintain the high quality of the company's services and products will be part of your duties.

Experience with Terraform, Ansible, Git, and CloudFormation will be beneficial for this role. Additionally, a solid background in Linux/Unix and Windows server system administration is required. Configuring AWS CloudWatch monitoring, creating and modifying scripts, and hands-on experience with MySQL are also essential skills. You should have experience in designing and building web environments on AWS, including working with services like EC2, ELB, RDS, and S3.

This is a full-time position with benefits such as Provident Fund and a yearly bonus. The work schedule is during the day shift, and the preferred experience level for AWS is 3 years. The work location is in person.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

This is a full-time on-site role for a PHP Laravel Developer based in Chennai. In this position, you will play a key role in developing and maintaining web applications utilizing the Laravel framework. Your responsibilities will include coding, debugging, testing, and deploying new features. Additionally, you will collaborate with cross-functional teams to create efficient and scalable solutions. To excel in this role, you must possess a strong proficiency in PHP and have hands-on experience with the Laravel framework. Familiarity with frontend technologies like HTML, CSS, and JavaScript is essential. Moreover, knowledge of database management systems, particularly MySQL, is required. Understanding RESTful APIs, integrating third-party services, and using version control systems like Git are also important aspects of this position. Candidates should have practical experience in schema design, query optimization, REST API, and AWS services such as EC2, S3, RDS, Lambda, and Redis. Proficiency in designing scalable and secure web applications, expertise in automated testing frameworks, and a solid grasp of web security practices are crucial for success in this role. The ideal candidate will be able to prioritize tasks effectively and work both independently and collaboratively as part of a team. Strong problem-solving and troubleshooting skills are essential, as is clear communication and the ability to work with others. A Bachelor's degree in computer science or a related field, or equivalent experience, is required. 
Requirements:
- Strong proficiency in PHP with the Laravel framework
- Experience in HTML, CSS, and JavaScript
- Knowledge of MySQL and RESTful APIs
- Familiarity with Git and version control systems
- Hands-on experience with schema design, query optimization, and REST APIs
- Profound knowledge of AWS services
- Demonstrated experience in designing scalable and secure web applications
- Expertise in automated testing frameworks
- Strong understanding of web security practices
- Ability to prioritize tasks and work independently or as part of a team
- Excellent problem-solving and troubleshooting skills
- Good communication and collaboration skills
- Bachelor's degree or equivalent experience in computer science or a related field

Experience: 4+ Years
Location: Chennai/Madurai
Interested candidates can share their CV at anushya.a@extendotech.com / 6374472538

Job Type: Full-time
Benefits: Health insurance, Provident Fund
Location Type: In-person
Schedule: Morning shift
Work Location: In person

Posted 3 weeks ago

Apply

10.0 - 17.0 years

0 Lacs

hyderabad, telangana

On-site

We have an exciting opportunity for an ETL Data Architect position with an AI-ML driven SaaS Solution Product Company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust Data Access Layer to provide consistent data access needs to the underlying heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems. In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee data performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving. The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, Document DB, as well as other services like MongoDB, Snowflake, etc., is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must. Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts to non-technical stakeholders. 
Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary, with a Master's degree preferred. Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. You are expected to stay updated on emerging trends in data technology, particularly in AI/ML applications for finance. Industry: IT Services and IT Consulting

Posted 3 weeks ago

Apply

7.0 - 12.0 years

10 - 15 Lacs

Bengaluru

Hybrid

Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 7+ years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and Graph/Vector Databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.
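The vector-database requirement above reduces to similarity search. A dependency-free sketch of cosine-similarity ranking, the primitive a vector store optimizes at scale (the toy 2-D vectors are invented for illustration; real embeddings have hundreds of dimensions):

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm


def top_k(query, docs, k=2):
    """Rank named document vectors by similarity to the query vector."""
    scored = sorted(docs.items(),
                    key=lambda kv: cosine(query, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:k]]
```

A production vector database replaces the exhaustive `sorted` pass with an approximate nearest-neighbour index, but the ranking criterion is the same.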

Posted 3 weeks ago

Apply