7.0 - 12.0 years
15 - 30 Lacs
Pune, Ahmedabad
Work from Office
We are seeking a seasoned Lead Platform Engineer with a strong background in platform development and a proven track record of leading technology design and teams. The ideal candidate will have at least 8 years of overall experience, with a minimum of 5 years in relevant roles. This position entails owning module design and spearheading the implementation process alongside a team of talented platform engineers.

Job Title: Lead Platform Engineer
Job Location: Ahmedabad/Pune (Work from Office)
Required Experience: 7+ Years
Educational Qualification: UG: BS/MS in Computer Science, or other engineering/technical degree

Key Responsibilities:
- Lead the design and architecture of robust, scalable platform modules, ensuring alignment with business objectives and technical standards.
- Drive the implementation of platform solutions, collaborating closely with platform engineers and cross-functional teams to achieve project milestones.
- Mentor and guide a team of platform engineers, fostering an environment of growth and continuous improvement.
- Stay abreast of emerging technologies and industry trends, incorporating them into the platform to enhance functionality and user experience.
- Ensure the reliability and security of the platform through comprehensive testing and adherence to best practices.
- Collaborate with senior leadership to set technical strategy and goals for the platform engineering team.

Requirements:
- Minimum of 8 years of experience in software or platform engineering, with at least 5 years in roles directly relevant to platform development and team leadership.
- Expertise in Python programming, with a solid foundation in writing clean, efficient, and scalable code.
- Proven experience in serverless application development, designing and implementing microservices, and working within event-driven architectures.
- Demonstrated experience in building and shipping high-quality SaaS platforms/applications on AWS, showcasing a portfolio of successful deployments.
- Comprehensive understanding of cloud computing concepts, AWS architectural best practices, and familiarity with a range of AWS services, including but not limited to Lambda, RDS, DynamoDB, and API Gateway.
- Exceptional problem-solving skills, with a proven ability to optimize complex systems for efficiency and scalability.
- Excellent communication skills, with a track record of effective collaboration with team members and successful engagement with stakeholders across various levels.
- Previous experience leading technology design and engineering teams, with a focus on mentoring, guiding, and driving the team towards achieving project milestones and technical excellence.

Good to Have:
- AWS Certified Solutions Architect, AWS Certified Developer, or other relevant cloud development certifications.
- Experience with the AWS Boto3 SDK for Python.
- Exposure to other cloud platforms such as Azure or GCP.
- Knowledge of containerization and orchestration technologies, such as Docker and Kubernetes.
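To give a concrete flavor of the serverless Python work this role centers on, here is a minimal sketch (not from the posting itself) of an API Gateway-backed Lambda handler that persists a payload to DynamoDB via the Boto3 SDK; the table name, environment variable, and field names are hypothetical.

```python
import json
import os

import boto3

# Hypothetical table name; in a real deployment this is injected via the
# function's environment configuration.
TABLE_NAME = os.environ.get("EVENTS_TABLE", "platform-events")

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


def handler(event, context):
    """API Gateway proxy handler: persist the request body and acknowledge."""
    body = json.loads(event.get("body") or "{}")
    item = {"pk": body.get("id", "unknown"), "payload": json.dumps(body)}
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"stored": item["pk"]}),
    }
```

In a real platform this handler would sit behind API Gateway's Lambda proxy integration, which is what gives the event its "body" field.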
Posted 1 week ago
4.0 - 5.0 years
4 - 6 Lacs
Vadodara
Work from Office
About the job: We are looking for a DevOps Engineer to maintain, upgrade, and manage our software, hardware, and networks, with a strong focus on AWS and containerization. Resourcefulness and problem-solving are essential in this role. You should be able to diagnose and resolve issues quickly while collaborating with interdisciplinary teams and users. Your goal will be to ensure that our AWS-based infrastructure, including containerized environments, runs smoothly, securely, and efficiently.

Responsibilities:
- Monitor and maintain systems, including configuration, security management, patching, automation, hardening, and upgrades.
- Set up and manage infrastructure on AWS, including EC2, RDS, S3, Route 53, and other AWS services.
- Manage containerized environments using Docker and Kubernetes (EKS).
- Build and manage CI/CD pipelines using Jenkins, particularly for containerized applications.
- Ensure system uptime, availability, reliability, and security across AWS environments and containerized workloads.
- Manage cloud monitoring and logging using AWS CloudWatch.
- Handle backup and disaster recovery strategies for AWS-based applications and databases.
- Collaborate with vendors and IT providers for specific AWS and containerization requirements.
- Document procedures, policies, and configurations in an internal wiki.
- Troubleshoot server-side and cloud-related issues, including container orchestration problems.
- Asset management and cost optimization in AWS.

Skills:
- Proven experience as a DevOps Engineer or System Administrator with a focus on AWS and containerization.
- In-depth knowledge of AWS services such as EC2, EKS (Kubernetes), RDS, S3, Load Balancers, CloudWatch, and Route 53.
- Strong experience with Docker and Kubernetes (preferably EKS).
- Experience with AWS security best practices (IAM, Security Groups, VPC).
- Familiarity with CI/CD pipelines and tools like Jenkins, especially for containerized environments.
- Strong scripting skills (Bash, Python, or PowerShell) for automation on AWS.
- Knowledge of system security, data backup, and recovery.
- Experience with databases, networks (LAN, WAN), and patch management.
- Excellent communication skills.

Nice to Have:
- Experience with Azure services, such as Azure Kubernetes Service (AKS), Virtual Machines, and Azure DevOps.
- Familiarity with Infrastructure as Code (IaC) tools such as Terraform, AWS CloudFormation, or Azure Resource Manager (ARM) templates.
- Azure certifications such as Azure Administrator Associate or Azure Solutions Architect Expert.
- Basic knowledge of Azure Active Directory, role-based access control (RBAC), and Azure Networking.

Benefits:
- Health insurance and personal accident insurance.
- Flexible working hours in a motivational environment.
- Paid time off and referral programs.
- Discover a rewarding work/life balance.
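As an illustration of the scripting this role calls for (Bash, Python, or PowerShell automation on AWS), here is a small hedged sketch in Python using Boto3 to report running EC2 instances and firing CloudWatch alarms; it reflects common operational practice, not this employer's actual tooling.

```python
import boto3

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")


def running_instances():
    """Yield (instance-id, instance-type) for all running EC2 instances."""
    paginator = ec2.get_paginator("describe_instances")
    filters = [{"Name": "instance-state-name", "Values": ["running"]}]
    for page in paginator.paginate(Filters=filters):
        for reservation in page["Reservations"]:
            for inst in reservation["Instances"]:
                yield inst["InstanceId"], inst["InstanceType"]


def alarms_in_alarm():
    """Return names of CloudWatch alarms currently in the ALARM state."""
    resp = cloudwatch.describe_alarms(StateValue="ALARM")
    return [alarm["AlarmName"] for alarm in resp["MetricAlarms"]]


if __name__ == "__main__":
    for instance_id, instance_type in running_instances():
        print(f"{instance_id}\t{instance_type}")
    print("Alarms firing:", alarms_in_alarm())
```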
Posted 1 week ago
5.0 - 7.0 years
10 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Data Engineer. Mandatory skills: AWS, Kafka, ETL, Glue, Lambda, Python, SQL
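For context on this stack, a minimal sketch of an AWS Glue ETL job in Python follows; the job arguments and the `id` column are placeholders, and the script assumes the standard Glue job environment rather than anything specific to this posting.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Job arguments; the source/target paths are placeholders supplied at runtime.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "SOURCE_PATH", "TARGET_PATH"])

sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw CSV, apply a trivial SQL cleanup, and write back as Parquet.
df = spark.read.option("header", "true").csv(args["SOURCE_PATH"])
df.createOrReplaceTempView("raw")
# "id" is a hypothetical key column used only for illustration.
cleaned = spark.sql("SELECT DISTINCT * FROM raw WHERE id IS NOT NULL")
cleaned.write.mode("overwrite").parquet(args["TARGET_PATH"])

job.commit()
```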
Posted 1 week ago
10.0 - 15.0 years
9 - 15 Lacs
Chennai, Bengaluru
Work from Office
Roles & Responsibilities:
- Lead data migration efforts from legacy systems (e.g., on-premises databases) to cloud-based platforms such as AWS.
- Collaborate with cross-functional teams to gather requirements and define migration strategies.
- Develop and implement migration processes to move legacy applications and data to cloud platforms like AWS.
- Write scripts and automation to support data migration, system configuration, and cloud infrastructure provisioning.
- Optimize existing data structures and processes for performance and scalability in the new environment.
- Ensure the migration adheres to performance, security, and compliance standards.
- Identify potential issues, troubleshoot, and implement fixes during the migration process.
- Maintain documentation of migration processes and post-migration maintenance plans.
- Provide technical support post-migration to ensure smooth operation of the migrated systems.

Primary Skills (Required):
- Proven experience in leading data migration projects and migrating applications, services, or data to cloud platforms (preferably AWS).
- Knowledge of migration tools such as AWS Database Migration Service (DMS), AWS Server Migration Service (SMS), and AWS Migration Hub.
- Expertise in data mapping, validation, transformation, and ETL processes.
- Proficiency in Python, Java, or similar programming languages.
- Experience with scripting languages such as Shell, PowerShell, or Bash.

Cloud Technologies (AWS focus):
- Strong knowledge of AWS services relevant to data migration (e.g., S3, Redshift, Lambda, RDS, DMS, Glue).
- Experience working with CI/CD pipelines (Jenkins, GitLab CI/CD) and infrastructure as code (IaC) using Terraform or AWS CloudFormation.
- Experience in database management and migrating relational (e.g., MySQL, PostgreSQL, Oracle) and non-relational (e.g., MongoDB) databases.
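Since the posting names AWS DMS as a core migration tool, here is a hedged sketch of driving a DMS replication task from Python with Boto3; the task ARN and the polling policy are illustrative assumptions, not details from the role.

```python
import time

import boto3

dms = boto3.client("dms")


def start_and_wait(task_arn: str, poll_seconds: int = 30) -> str:
    """Kick off a DMS replication task and poll until it stops or fails."""
    dms.start_replication_task(
        ReplicationTaskArn=task_arn,
        StartReplicationTaskType="start-replication",
    )
    while True:
        resp = dms.describe_replication_tasks(
            Filters=[{"Name": "replication-task-arn", "Values": [task_arn]}]
        )
        status = resp["ReplicationTasks"][0]["Status"]
        print("task status:", status)
        if status in ("stopped", "failed"):
            return status
        time.sleep(poll_seconds)


if __name__ == "__main__":
    # Placeholder ARN for illustration only.
    start_and_wait("arn:aws:dms:us-east-1:123456789012:task:EXAMPLE")
```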
Posted 2 weeks ago
3.0 - 6.0 years
10 - 17 Lacs
Noida
Hybrid
The right person for this position should have 3-6 years of experience in Backend Development. He/she should be passionate, tech-savvy, academically sound, and interested in cloud technologies such as AWS and Azure and in the technologies that drive the headless domain. He/she should be able to understand the Pentair product domain and develop products using industry best practices, and is required to be hands-on with Node.js, Golang, or Python and application development, building services for a SaaS-based platform for residential and commercial IoT. He/she should be able to define the low-level design for any problem statement.

Roles and Responsibilities:
- Develop the Smart Products & IoT technology within the Segment.
- Take responsibility for successful execution of Segment-focused projects aimed at developing smart products and IoT solutions; such projects/products include fully developed commercial products, minimum viable products, rapid prototypes, and proof-of-concepts. Ensure these projects follow the appropriate standard process used at Pentair (such as 3D and Rapid3D).
- Continuously innovate on existing IoT solutions with the latest applicable techniques to boost product capabilities and business value.
- Collaborate with Pentair-wide technical resources to develop IoT cloud, web, and mobile solutions and support IoT product solutions: design and develop the technical design document for software development; develop a detailed technical architecture block diagram for cloud solutions; code and implement the application layer as Infrastructure as Code and Software as a Service solutions; implement the production platform, building back-end automation tools; review product applications and create test platforms to review coding quality.
- Coordinate with the Product Engineer from the Filtration Business Unit to develop a project plan to integrate IoT; support product risk assessment, develop a guide for IoT design requirements, and support developing a test plan for integration.
- Provide solutions to issues related to the connection of networks and platforms.

Skills required:
- Bachelor's degree in computer science or equivalent.
- More than 3 years of working experience with Amazon Web Services infrastructure and Platform as a Service tools.
- 3+ years' experience with the programming languages Python, Java, and NodeJS.
- Extensive experience working with cloud-based datastores like S3, DynamoDB, and MongoDB.
- Deep understanding of mobile and web technology stacks, Swagger API specifications, and RESTful APIs.
- In-depth understanding of computer programming and network security.
- Expert understanding of data modeling, database design, and performance monitoring and tuning.
- Experience collaborating with global technology teams is a plus.
- Deep technical knowledge of Agile software development.
- Experience working with IoT vendors/third-party service providers is a plus.
- Willingness to travel up to 20% of the time to other Pentair sites and customer locations; travel will be both domestic and international.

Key Interfaces:
- Global project team members
- GEC Engineering Team
- External vendors and suppliers

Qualifications and Experience: M.Tech/B.Tech in Computer Science/Electronics Engineering from a good engineering college.

Other Requirements:
- Team player
- Good communication and presentation skills
- Ability to multitask
- Design thinking
- Passion for design and technology
- A can-do attitude
- Excellent interpersonal skills
Posted 2 weeks ago
5.0 - 8.0 years
12 - 18 Lacs
Pune, Delhi / NCR
Hybrid
5+ yrs of experience in deploying, enhancing, and troubleshooting AWS services (EC2, S3, RDS, VPC, CloudTrail, CloudFront, Lambda, EKS, ECS); 3+ yrs of experience with serverless technologies and services, Docker, and Kubernetes; experience in JavaScript, Bash, Python, and TypeScript.
Posted 2 weeks ago
6.0 - 10.0 years
0 - 2 Lacs
Gurugram
Remote
We are seeking an experienced AWS Data Engineer to join our dynamic team. The ideal candidate will have hands-on experience in building and managing scalable data pipelines on AWS utilizing Databricks, along with a deep understanding of the Software Development Life Cycle (SDLC), and will play a critical role in enabling our data architecture, driving data quality, and ensuring the reliable and efficient flow of data throughout our systems.

Required Skills:
- 7+ years of comprehensive experience working as a Data Engineer, with expertise in AWS services (S3, Glue, Lambda, etc.).
- In-depth knowledge of Databricks, pipeline development, and data engineering.
- 2+ years of experience working with Databricks for data processing and analytics.
- Ability to architect and design pipelines, e.g., with Delta Live Tables (see the sketch below).
- Proficient in programming languages such as Python, Scala, or Java for data engineering tasks.
- Experience with SQL and relational databases (e.g., PostgreSQL, MySQL).
- Experience with ETL/ELT tools and processes in a cloud environment.
- Familiarity with Big Data processing frameworks (e.g., Apache Spark).
- Experience with data modeling, data warehousing, and building scalable architectures.
- Ability to understand and implement security aspects when consuming data from different sources.

Preferred Qualifications:
- Experience with Apache Airflow or other workflow orchestration tools; Terraform, Python, and Spark experience preferred.
- AWS Certified Solutions Architect, AWS Certified Data Analytics Specialty, or similar certifications.
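As referenced above, a minimal Delta Live Tables sketch in Python is shown below. It assumes the Databricks DLT runtime, where the `dlt` module and a `spark` session are provided automatically; the source path and column names are hypothetical.

```python
import dlt
from pyspark.sql import functions as F

# Placeholder source path; in practice this comes from pipeline configuration.
RAW_PATH = "s3://example-bucket/raw/orders/"


@dlt.table(comment="Raw orders ingested as-is from S3.")
def orders_raw():
    return spark.read.format("json").load(RAW_PATH)


@dlt.table(comment="Orders with basic quality rules applied.")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # hypothetical rule
def orders_clean():
    return (
        dlt.read("orders_raw")
        .withColumn("ingested_at", F.current_timestamp())
        .dropDuplicates(["order_id"])
    )
```

Note that this only runs inside a configured DLT pipeline, not as a standalone script.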
Posted 2 weeks ago
4.0 - 8.0 years
9 - 19 Lacs
Gurugram, Chennai, Bengaluru
Work from Office
Skills: AWS Glue, Lambda, PySpark, Python, SQL
Posted 2 weeks ago
9.0 - 11.0 years
6 - 7 Lacs
Raipur
Work from Office
Job Title: System Administrator - Big Data (SA-BD)
Reports to: The Joint Chief Executive Officer, CHiPS
Number of Positions: 1

Responsibility Summary: Chhattisgarh infotech Promotion Society (CHiPS), Government of Chhattisgarh, invites applications from enterprising and aspiring candidates for the position of System Administrator - Big Data (SA-BD). CHiPS (www.chips.gov.in) is the nodal agency and prime mover for propelling IT growth and the implementation of IT and e-Governance projects in the State of Chhattisgarh. CHiPS is involved in the end-to-end implementation of mega IT projects such as SOC, SSDG, SWAN, GTS, e-Procurement, etc. A professional approach is being adopted for the implementation of IT projects using the services of e-governance experts and consultants from corporate and academia. ICT has the potential to significantly improve this contribution. In doing so, the Government of Chhattisgarh seeks to create an IT environment in the state wherein investments in IT are not only encouraged but actively facilitated. We aim to achieve quality and excellence in state government services for citizens, state transactions with citizens and businesses, and internal state governmental operations/functions through the strategic deployment of information technologies.

The role of the SA-BD is to ensure that the strategic and organizational objectives, as well as the values of CHiPS, are put into practice. In conjunction with other members of staff, they will ensure organizational growth through directing and managing operational activities so that they are delivered in accordance with the strategic objectives. The SA-BD will be responsible for monitoring systems to ensure the highest level of infrastructure performance; managing and coordinating all infrastructure projects to meet client needs; ensuring that standards and procedures are followed during the design and implementation of information systems; helping create organizational and program budgets in collaboration with the JCEO and other team members; and undertaking other miscellaneous tasks as and when they arise. They are required to work with the staff team and contribute to the development and implementation of organizational strategies, policies, and practices.

The candidate should be an outcome-oriented executive, capable of leading the creation of, and energizing, the institutional, human, and technical capacities necessary to realize the unique and ambitious ICT agenda of the State. S/he should also be familiar with a variety of the field's concepts, practices, and procedures; rely on extensive experience and judgment to plan and accomplish goals; and be capable of multi-tasking. A wide degree of creativity and latitude is expected to secure the necessary cooperation and convergence of resources and activities. The candidate should fulfil the mandatory requirements listed below and should embody a rich combination of the requirements listed as desired.

Mandatory:
Educational qualification: B.E./B.Tech. (Information Technology / Computer Science / Electronics & IT), or M.E./M.Tech., or M.Sc. in Mathematics/Statistics/Operations Research/Computer Science/IT, or Ph.D. in a quantitative discipline (such as Computer Science, Bioinformatics, or Statistics), recognized by or under the regulations of the relevant regulatory body, obtained upon successful completion of studies (excluding studies in distance-education mode) as a regularly enrolled student. In respect of degrees or diplomas awarded abroad, the candidate should submit relevant details establishing equivalence with the above qualification, and the decision of the Selection Committee regarding the acceptability of such a qualification as an equivalent qualification shall be final.

Age: The candidate should be energetic and dynamic, as the job profile entails extensive interaction with various stakeholders, and should be result-oriented. S/he should not be more than 35 years of age on the date of issue of the recruitment notice. For age-related relaxations, please refer to the Recruitment Rules.

Experience: At least 9 years' experience in the case of a Bachelor's degree, or 7 years' in the case of a Master's degree, in collecting, storing, processing, and analyzing huge sets of data. The candidate must have at least 4.5 years of relevant work experience as a Data Analyst/Scientist or in similar quantitative analysis positions. The primary focus will be on choosing optimal solutions for these purposes, then maintaining, implementing, and monitoring them, along with responsibility for integrating them with the architecture used across the organization. The candidate should be a highly technical computing architect who collaborates with our customers and partners on solutions in Big Data and Analytics. These engagements will focus on real-time and batch-based Big Data processing, business intelligence, and Machine Learning. This role will specifically focus on building innovative solutions that leverage the value of data.

Job Description: The right person will be highly technical and analytical and will possess significant software development and/or IT and networking implementation/consulting experience.
- Strong knowledge of and experience with statistics, and potentially other advanced math as well; able to translate numbers into insights.
- Programming experience, ideally in Python or Java and R.
- Build a new, innovative data platform based on big data technologies.
- Build development methodologies for company-wide principles and strategy.
- Drive innovation around data management and processing.
- Successfully lead a team of big data developers.
- Create technology-specific roadmaps as directed.
- Coordinate and work with the DevOps team to build and manage data infrastructure.
- Coordinate and work together with the data science and reporting teams.
- Deep knowledge of data mining, machine learning, natural language processing, or information retrieval.
- Experience processing large amounts of structured and unstructured data; MapReduce experience is a plus.
- Enough programming knowledge to clean and scrub noisy datasets.
- Work with state government departments to gauge and create demand for strategic solutions and fulfil the same.
- Must have the ability to reach out to and work with senior government officials in various departments.
- Manage the implementation of identified projects based on a broad and detailed knowledge of current and emerging technologies, and provide technical input into projects undertaken by or impacting the organization.
- Advise and inform management, departments, and Members on technical issues as part of the decision-making process for technical direction and procurement of new systems.

Desirable:
- Experience working within the software development or internet industries is highly desired; working knowledge of modern software development practices and technologies such as agile methodologies and DevOps is highly desired.
- Understanding of application, server, and network security is highly desired.
- Technical: web services development/deployment experience; IT systems and network engineering experience; security and compliance experience; etc.
- Operational: website/web services as well as traditional IT networking, operations, management, and security experience.
- Economic and business: RFP/acquisition support; market analysis; cost-benefit analysis.
- Knowledge of underlying infrastructure requirements such as networking, storage, and hardware optimization.
- Ability to think strategically about business, product, and technical challenges in an enterprise environment.
- Understanding of Agile methodologies, and the ability to apply these practices to analytics projects.
- Implementation and tuning experience in the Apache Hadoop ecosystem, including tools such as Hadoop Streaming, Spark, Pig, and Hive.
- Implementation and tuning experience of data warehousing platforms, including knowledge of data warehouse schema design, query tuning and optimization, and data migration and integration.
- Experience with requirements for the analytics presentation layer, including dashboards, reporting, and OLAP.

Technical Skills:
- Management of a Hadoop cluster with all included services; ability to solve any ongoing issues with operating the cluster.
- Proficiency with Hadoop v2, MapReduce, HDFS.
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming.
- Good knowledge of Big Data querying tools such as Pig, Hive, and Impala.
- Experience with Spark.
- Experience with integration of data from multiple data sources.
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB.
- Knowledge of various ETL techniques and frameworks, such as Flume.
- Experience with various messaging systems, such as Kafka or RabbitMQ.
- Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O.
- Good understanding of Lambda Architecture, along with its advantages and drawbacks.
- Experience with Cloudera / MapR / Hortonworks.

KINDLY NOTE: APPLICATIONS ARE ACCEPTED ON OR BEFORE JUNE 7th, 2025 ONLY.
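Given the posting's emphasis on stream processing with Spark Streaming and Kafka, here is a brief, hedged sketch of a Spark Structured Streaming job in Python that counts Kafka events per minute; the broker, topic, and windowing are illustrative, and running it requires the spark-sql-kafka connector package on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-demo").getOrCreate()

# Broker address and topic name are placeholders for illustration.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka rows expose key/value as binary; decode the value and count per minute.
counts = (
    events.select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp"),
    )
    .groupBy(F.window("timestamp", "1 minute"))
    .count()
)

query = (
    counts.writeStream
    .outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```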
Posted 3 weeks ago
6.0 - 9.0 years
10 - 20 Lacs
Hyderabad
Work from Office
This requirement is to source profiles with 6-9 years of overall experience, including a minimum of 4 years in data engineering. Note (added based on observation): look for combinations with Informatica, IICS, Python (if not Informatica, we can submit with Talend), plus PySpark, SQL, Step Functions, Lambda, and EMR at a high level. Location: Hyderabad.

Key responsibilities and accountabilities:
- Design, build, and maintain complex ELT/ETL jobs that deliver business value.
- Extract, transform, and load data from various sources, including databases, APIs, and flat files, using IICS or Python/SQL.
- Translate high-level business requirements into technical specs.
- Conduct unit testing, integration testing, and system testing of data integration solutions to ensure accuracy and quality.
- Ingest data from disparate sources into the data lake and data warehouse.
- Cleanse and enrich data and apply adequate data quality controls.
- Provide technical expertise and guidance to team members on Informatica IICS/IDMC and data engineering best practices to guide the future development of MassMutual's Data Platform.
- Develop re-usable tools to help streamline the delivery of new projects.
- Collaborate closely with other developers and provide mentorship.
- Evaluate and recommend tools, technologies, processes, and reference architectures.
- Work in an Agile development environment, attending daily stand-up meetings and delivering incremental improvements.
- Participate in code reviews, ensure all solutions are aligned to architectural and requirement specifications, and provide feedback on code quality, design, and performance.

Knowledge, skills and abilities: Please refer to 'Education and experience'.

Education and experience:
- Bachelor's degree in computer science, engineering, or a related field; Master's degree preferred.
- Data: 5+ years of experience with data analytics and data warehousing; sound knowledge of data warehousing concepts.
- SQL: 5+ years of hands-on experience with SQL and query optimization for data pipelines.
- ELT/ETL: 5+ years of experience in Informatica and 3+ years of experience in IICS/IDMC.
- Migration: experience with Informatica on-prem to IICS/IDMC migration.
- Cloud: 5+ years' experience working in an AWS cloud environment.
- Python: 5+ years of hands-on development experience with Python.
- Workflow: 4+ years of experience with orchestration and scheduling tools (e.g., Apache Airflow; a minimal DAG sketch follows this posting).
- Advanced data processing: experience using data processing technologies such as Apache Spark or Kafka.
- Troubleshooting: experience with troubleshooting and root cause analysis to determine and remediate potential issues.
- Communication: excellent communication, problem-solving, organizational, and analytical skills; able to work independently and to provide leadership to small teams of developers.
- Reporting: experience with data reporting (e.g., MicroStrategy, Tableau, Looker) and data cataloging tools (e.g., Alation).
- Experience in the design and implementation of ETL solutions with effective design and optimized performance, and ETL development with industry-standard recommendations for job recovery, failover, logging, and alerting mechanisms.

Application Requirements: No special requirements.
Support Hours: India GCC - US (EST) hours overlap for 2-3 hours.
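The Airflow sketch referenced above: a minimal two-task DAG in Python illustrating the orchestration pattern the posting asks about. The DAG id, schedule, and task bodies are placeholders, not actual pipelines from this employer; it assumes Airflow 2.4+ for the `schedule` parameter.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull a batch from a source system.
    print("extracting...")


def load():
    # Placeholder: push the transformed batch to the warehouse.
    print("loading...")


with DAG(
    dag_id="daily_elt",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```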
Posted 3 weeks ago
10.0 - 15.0 years
22 - 37 Lacs
Hyderabad
Work from Office
Senior Java AWS Technical Lead - OneTax Platform

About the Role: Join our OneTax engineering team as a Senior Java AWS Technical Lead. You will drive the design and development of scalable, resilient tax processing systems on AWS. As a hands-on technical leader, you will shape architecture, mentor engineers, and ensure delivery excellence in a high-impact, compliance-driven environment.

Key Responsibilities:
- Lead and mentor a team of engineers, fostering technical growth and best practices.
- Architect and design robust, secure microservices and APIs using Java, Spring Boot, and AWS.
- Deliver hands-on solutions for complex business challenges, ensuring high performance and reliability.
- Integrate with internal and external systems, leveraging AWS services (EC2, Lambda, RDS, DynamoDB, S3, SQS, etc.).
- Drive DevOps, CI/CD, and automation for rapid, safe deployments.
- Collaborate with product, QA, and cross-functional teams to deliver high-quality features on schedule.
- Troubleshoot and optimize distributed systems for scalability and cost efficiency.
- Contribute to technical strategy, evaluating new technologies and driving innovation.

Required Qualifications:
- Bachelor's or Master's in computer science or a related field.
- 10+ years of software development experience, with deep expertise in Java and Spring Boot.
- Proven technical leadership and mentoring experience.
- Strong background in AWS architecture and cloud-native development.
- Experience with microservices, RESTful APIs, and distributed systems.
- Proficiency in relational and NoSQL databases.
- Solid understanding of DevOps, CI/CD, and containerization (Docker, ECS/EKS).
- Excellent problem-solving, debugging, and communication skills.
- Experience working in Agile teams.

Preferred:
- Experience in the Financial Services or Tax domain.
- AWS certifications (Associate/Professional).
- Familiarity with monitoring, logging, and tracing tools.
- Contributions to open source or the technical community.

Why OneTax? Be part of a global leader, building mission-critical tax solutions that impact millions. You'll work with cutting-edge technology and a talented team, and could shape the future of tax processing.
Posted 3 weeks ago
8.0 - 13.0 years
17 - 25 Lacs
Pune
Remote
Senior Java Developer
Job mode: Remote (EST working hrs)
Notice: 30 days

Job Description:
• Should have strong hands-on experience of 8-10 years in Java development.
• Should have strong knowledge of Java 11+, Spring, Spring Boot, Hibernate, and REST web services.
• Should have strong knowledge of J2EE design patterns and microservices design patterns.
• Should have strong hands-on knowledge of SQL/PostgreSQL; exposure to NoSQL databases is good to have.
• Should have strong knowledge of AWS services (Lambda, EC2, RDS, API Gateway, S3, CloudFront, Airflow).
• Good to have hands-on knowledge of React.
• Good to have exposure to Python and PySpark as a secondary skill.
• Implement solutions that are aligned with business/IT strategies and comply with Nuveen's architectural standards.
• Should have good knowledge of CI/CD pipelines.
• Should be strong in writing unit test cases and debugging Sonar issues.
• Should be able to lead/guide a team of junior developers.
• Should be able to collaborate with BAs and Solution Architects to create HLD and LLD documents.
• Should be able to create architecture flow diagrams, logical flow diagrams, and data flow diagrams.
• Good to have experience in the Asset Management domain.
Posted 3 weeks ago
10 - 17 years
37 - 55 Lacs
Bengaluru
Remote
Role & responsibilities:
1. Overall 12 to 16 years of C++ development experience
2. Experience managing a team of 10
3. Must have handled scrum calls
4. Experience helping the team resolve technical queries
5. Experience in resolving complex technical issues
6. Must have handled projects independently
7. Experience working with US/UK clients
8. Ability to install and configure additional software and packages on Linux, primarily those needed for coding, testing, dumping memory footprints, debugging, etc.
9. Agile/Scrum experience
10. Ability to develop and triage on Linux
11. Ability to set up a Linux IDE
12. Ability to integrate an IDE with a source-code system such as ClearCase
13. Ability to debug, test, compile, and rerun modified executables on Linux
14. Ability to code and test in C++
15. Nice to have: Docker knowledge and the ability to deploy containers and software in them
Posted 1 month ago
5 - 10 years
5 - 15 Lacs
Hyderabad
Hybrid
We are seeking an experienced Senior DevOps Engineer with deep expertise in building automation and CI/CD pipelines within a serverless AWS environment. The ideal candidate will have hands-on experience managing AWS Lambda at scale, designing infrastructure with AWS CDK, and implementing pipelines using GitHub Actions. This role will play a key part in scaling, securing, and optimizing our cloud-native architecture.

Key Responsibilities:
- Design, implement, and maintain robust CI/CD pipelines using GitHub Actions and AWS CDK.
- Build and manage serverless applications with a focus on scalability, performance, and reliability.
- Configure and maintain key AWS services including IAM, API Gateway, Lambda (600+ functions), SNS, SQS, EventBridge, CloudFront, S3, RDS, RDS Proxy, Secrets Manager, KMS, and CloudWatch.
- Develop infrastructure as code (IaC) using AWS CDK and CloudFormation templates.
- Code primarily in TypeScript, with additional scripting in Python as needed.
- Implement and optimize DynamoDB and other AWS-native databases.
- Enforce best practices for cloud security, monitoring, and cost optimization.
- Collaborate with development, QA, and architecture teams to enhance deployment workflows and reduce release cycles.

Required Skills & Experience:
- Strong expertise in AWS serverless technologies, including large-scale Lambda function management.
- Extensive experience with AWS CDK and GitHub Actions for pipeline automation.
- Hands-on with AWS services: IAM, API Gateway, Lambda, SQS, SNS, S3, CloudWatch, RDS, EventBridge, and others.
- Proficient in TypeScript; familiarity with Python is a plus.
- Solid understanding of CI/CD practices, infrastructure automation, and Git-based workflows.
- Experience building scalable and secure serverless systems in production.
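For illustration of the CDK-based IaC this role describes, here is a minimal stack sketch. The posting codes primarily in TypeScript; this uses the equivalent CDK Python bindings for consistency with the other sketches on this page, and the construct names and asset path are hypothetical.

```python
from aws_cdk import App, Duration, Stack
from aws_cdk import aws_apigateway as apigw
from aws_cdk import aws_lambda as _lambda
from constructs import Construct


class ApiStack(Stack):
    """One Lambda fronted by an API Gateway REST API."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        handler = _lambda.Function(
            self, "OrdersHandler",                      # hypothetical name
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="app.handler",
            code=_lambda.Code.from_asset("lambda"),     # placeholder asset dir
            timeout=Duration.seconds(10),
        )
        apigw.LambdaRestApi(self, "OrdersApi", handler=handler)


app = App()
ApiStack(app, "OrdersStack")
app.synth()
```

Deploying would follow the usual CDK flow (`cdk synth`, then `cdk deploy`), with a pipeline typically invoking those commands from GitHub Actions.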
Posted 1 month ago
8 - 13 years
20 - 30 Lacs
Gurgaon
Work from Office
Job Summary: We are seeking a highly skilled Tech Lead - Java with experience in Proof of Concept (PoC) development within a Center of Excellence (COE) environment. The ideal candidate will have a strong background in Java development, architecture, and solution prototyping, with a passion for exploring new technologies and implementing innovative solutions.

Key Responsibilities:
- Technical Leadership: Lead the design and development of Proof of Concept (PoC) projects within the COE.
- Java Development: Architect, develop, and optimize Java-based applications and microservices.
- Innovation & Research: Evaluate emerging technologies, frameworks, and tools to enhance existing systems and drive innovation.
- Solution Prototyping: Develop rapid prototypes and demonstrate feasibility for new technical solutions.
- Cloud & DevOps: Implement solutions using cloud platforms (AWS, Azure, GCP) and DevOps best practices (CI/CD, containerization).
- Architecture & Design: Define scalable and secure application architectures, including API design and integration strategies.
- Collaboration: Work with cross-functional teams, including architects, business stakeholders, and product managers, to align PoC initiatives with business goals.
- Code Quality & Best Practices: Ensure high-quality coding standards, design patterns, and best practices in software development.
- Mentorship & Knowledge Sharing: Guide junior developers, conduct technical training, and contribute to internal knowledge repositories.

Required Skills & Qualifications:
- Strong Java Expertise: 8+ years of experience in Java, Spring Boot, Hibernate, and microservices.
- PoC Development: Proven experience in developing Proof of Concepts and evaluating technical feasibility.
- Cloud & DevOps: Hands-on experience with AWS/GCP, Kubernetes, Docker, and CI/CD pipelines.
- Architecture & Design: Expertise in designing scalable, high-performance, and secure systems.
- Modern Tech Stack: Experience with REST APIs, GraphQL, Kafka/RabbitMQ, and distributed computing.
- Agile & CI/CD: Experience in Agile methodologies, test-driven development (TDD), and DevOps.
- Strong Problem-Solving Skills: Ability to analyze complex problems and devise efficient solutions.

Preferred Qualifications:
- Experience working in a COE (Center of Excellence) environment.
- Experience in performance tuning and optimizing Java applications.

Interested candidates can call directly on 9910710604 or email shilpee@monocept.com.
Posted 2 months ago
8 - 13 years
15 - 30 Lacs
Chennai
Hybrid
Position Overview: We are seeking a highly skilled Data Engineer to join the Cloud Data Hub (CDH) Team, collaborating closely with teams in Munich and BMW TechWorks India. The ideal candidate will have expertise in Python, AWS, Terraform, and Git, with a passion for building scalable, reliable, and efficient data pipelines and solutions. As a Data Engineer, you will play a crucial role in designing, developing, and optimizing the data infrastructure that underpins BMW's transformation into a data-driven organization.

About the project: The Cloud Data Hub (CDH) is a cloud-based, centralized data lake developed by BMW, serving as the organization's central data landing zone. Designed to democratize data usage across all departments, the CDH consolidates data into a single source of truth, enabling providing and consuming entities to leverage data effectively and efficiently. It plays a pivotal role in BMW's transformation into a truly data-driven organization, supporting data acquisition, integration, processing, and analysis across its value chain.

Key Responsibilities:
- Design and implement scalable, reliable, and efficient data pipelines and ETL/ELT processes using Python and AWS.
- Build and maintain cloud-native data infrastructure on AWS, leveraging services such as Lambda, S3, and Glue.
- Develop and manage infrastructure as code (IaC) using Terraform to ensure repeatable, automated deployment of cloud resources.
- Utilize Git for version control and collaborate with the team through code reviews and CI/CD pipelines.
- Apply Test-Driven Development (TDD) principles to ensure robust, maintainable, and high-quality data pipelines (see the sketch below).
- Monitor and troubleshoot data pipelines and infrastructure, ensuring high availability and reliability.
- Ensure compliance with data governance, security policies, and BMW's best practices.
- Document technical processes, data models, and architecture to ensure transparency and ease of maintenance.

Qualifications:
- Expert-level proficiency in building and managing data pipelines with Python.
- Strong experience in AWS cloud services, including Lambda, S3, Glue, and other data-focused services.
- Advanced skills in Terraform for provisioning and managing infrastructure-as-code on AWS.
- Proficiency in SQL for querying and modeling large-scale datasets.
- Hands-on experience with Git for version control and managing collaborative workflows.
- Familiarity with ETL/ELT processes and tools for data transformation.
- Strong understanding of data architecture, data modeling, and data lifecycle management.
- Excellent problem-solving and debugging skills.
- Strong communication and collaboration skills to work effectively in a global, distributed team environment.
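Given the posting's emphasis on Test-Driven Development for pipelines, here is a tiny hedged sketch of the pattern: keep transformations as pure Python functions so they can be unit-tested without touching AWS. The record fields are invented for illustration only.

```python
# transform.py -- a pure function keeps the pipeline logic unit-testable.
def normalize_record(record: dict) -> dict:
    """Lower-case keys and strip whitespace from string values."""
    return {
        key.lower(): value.strip() if isinstance(value, str) else value
        for key, value in record.items()
    }


# test_transform.py -- a TDD-style check, runnable with `pytest`.
def test_normalize_record():
    raw = {"VIN": "  WBA123  ", "Mileage": 42000}
    assert normalize_record(raw) == {"vin": "WBA123", "mileage": 42000}
```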
Posted 2 months ago
14 - 18 years
37 - 50 Lacs
Bengaluru
Work from Office
Responsibilities: Under limited supervision or general direction, and in accordance with all applicable government laws, regulations, and ASP policies, procedures, and guidelines, this position will:
- Design, develop, and implement software for our products and systems.
- Collaborate with cross-functional teams to define, design, and implement new software features.
- Debug and resolve software defects and issues.
- Conduct performance analysis and optimization of the software systems.
- Review code and design, and provide constructive feedback to team members.
- Lead and mentor junior engineers to promote knowledge growth and ensure project deliverables.
- Interface among multiple departments and teams, including Quality and Service.
- Identify and recommend opportunities for efficiency improvements in department processes.
- Conduct static analysis, code coverage analysis, and other verification techniques to ensure high-quality software.
- Communicate business-related issues or opportunities to the next management level.
- Perform other duties as assigned.

Requirements:
- Bachelor's degree in Computer Engineering/Science or other relevant technical experience is required.
- 12+ years of experience in REST API-based software development.
- 8+ years of working experience in React JS, Angular JS, MS SQL, PostgreSQL, RabbitMQ.
- Proficiency in Java Spring Boot and AWS (EC2, ECS, S3, IoT, Load Balancer).
- Proficiency in shell scripting.
- Experience in Docker containerization and orchestration.
- Experience in Linux is preferred.
- Working knowledge of real-time operating systems and hardware is preferred.
- Experience with software development tools, such as IDEs, debuggers, and version control systems.
- Strong understanding of data structures, algorithms, software architectures, and design principles.
- Familiarity with software configuration management tools, defect tracking tools, and peer review techniques.
- Knowledge of defect management tools such as JIRA is required.
- Prior work in a regulated environment like the medical device industry is preferred.
- Familiarity with Agile methodology is preferred.
- Good verbal and written communication skills.
Posted 2 months ago
6 - 11 years
20 - 35 Lacs
Bengaluru
Hybrid
We are looking for a Cloud DevOps Engineer to build, automate, and scale our product, a multi-cloud, event-driven AI system for real-time anomaly detection and alerting. You will be responsible for cloud infrastructure, Kubernetes, automation, CI/CD, and observability, ensuring high availability and performance. While this is a DevOps-heavy role, knowledge of event streaming, real-time data processing, and ML pipeline automation will be a plus.

Key Responsibilities: Building, automating, and maintaining robust and scalable cloud infrastructure, with a strong emphasis on automation, security, and observability.

1. Infrastructure & Cloud Automation:
- Multi-Cloud Management: Design, implement, and manage infrastructure across various cloud platforms (e.g., AWS, Azure, GCP) and on-premises environments.
- IaC Implementation: Utilize Infrastructure-as-Code (IaC) principles to automate infrastructure provisioning and management using Terraform or Pulumi.
- Configuration Management: Automate system configuration and application deployments using Ansible (or similar configuration management tools) to ensure consistency and reliability.
- Kubernetes Optimization: Optimize the performance and cost-effectiveness of both cloud-based (EKS, AKS, GKE) and self-hosted Kubernetes clusters.

2. CI/CD & Deployment Automation:
- Design and implement GitOps workflows using tools like Jenkins and GitHub Actions to streamline application deployments.
- Containerized Deployment: Automate the deployment of containerized applications across various Kubernetes distributions (EKS, AKS, GKE).
- API Management: Manage API gateways, service discovery, and ingress controllers using technologies like Nginx.

3. Event-Driven & Real-Time Processing (Supporting Role):
- Real-Time Data Streaming: Deploy, manage, and maintain Kafka-based solutions (Apache Kafka, Confluent, MSK, Event Hubs, Pub/Sub) for real-time data ingestion and processing (see the consumer sketch below).
- Database Optimization: Optimize the performance and scalability of various open-source and cloud-based databases (PostgreSQL, TimescaleDB, Elasticsearch, InfluxDB, etc.).
- MLOps Support: Support Machine Learning (ML) workloads by integrating Kubeflow, MLflow, Ray, Seldon, SageMaker, Vertex AI, and Azure ML with CI/CD pipelines.

4. Observability & Security:
- Monitoring & Logging: Implement comprehensive monitoring and logging solutions using tools like Prometheus, Grafana, OpenTelemetry, ELK, Loki, and Datadog to proactively identify and resolve issues.
- Security Implementation: Implement robust security measures, including RBAC, IAM, and secrets management (Vault, Sealed Secrets, AWS KMS, Azure Key Vault), to protect sensitive data and infrastructure.
- Compliance & Best Practices: Ensure adherence to SOC 2, GDPR, and other relevant compliance standards, while implementing security best practices across all cloud and open-source technologies.

Skills & Qualifications:

Must-Have:
- Strong expertise in Cloud DevOps (AWS, Azure, GCP) and open-source infrastructure.
- Experience with Terraform, Kubernetes (EKS, AKS, GKE, OpenShift, K3s), Docker, and CI/CD automation.
- Hands-on experience with Kafka, RabbitMQ, or event-driven microservices.
- Observability tools (Prometheus, OpenTelemetry, Grafana, ELK).
- Strong Python or scripting skills for automation.

Good-to-Have (Data & AI Integration):
- Knowledge of real-time data pipelines (Kafka, CDC, Flink, Spark Streaming, Debezium).
- Experience integrating MLOps tools (Kubeflow, MLflow, DASK, Ray, SageMaker, Vertex AI, Azure ML).
- Exposure to serverless computing (Lambda, Cloud Functions, Azure Functions).
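The consumer sketch referenced above: a minimal event-driven consumer in Python using the kafka-python client. The topic, broker address, and anomaly-score field are illustrative assumptions about such a system, not details from the posting.

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Topic, broker, and consumer group are illustrative placeholders.
consumer = KafkaConsumer(
    "anomaly-events",
    bootstrap_servers=["broker:9092"],
    group_id="alerting-service",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # In a real system this would feed the alerting pipeline; here we just
    # flag anything the upstream model scored as anomalous.
    if event.get("anomaly_score", 0.0) > 0.9:
        print(f"ALERT partition={message.partition} offset={message.offset}: {event}")
```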
Posted 3 months ago
10 - 17 years
15 - 30 Lacs
Bengaluru, Hyderabad
Work from Office
Required Skills:
- DevOps: IaC, Ansible, Kubernetes/OpenShift
- Cloud: IaC, EKS/AWS, RH ACM/Ansible
- K8s: administration and experience working with on-prem clusters

Role & responsibilities:
- At the direction of lead architects, develop and implement technical efforts to design, build, and deploy cloud infrastructure with the help of network and security services.
- Develop and maintain automation scripts for cloud deployments.
- Build applications and architect databases on the cloud.
- Manage cloud data storage.
- Establish a secure cloud environment.
- Ensure proper availability of services.
- Monitor and optimize cloud infrastructure performance.
- Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures.
- Develop strategies for disaster management and recovery.
- Demonstrate exceptional problem-solving skills, with an ability to see and solve issues before they affect business productivity.

Technical Skills: Python, Go, Ansible, Terraform, Packer, Kubernetes, Docker, CI/CD pipelines (GitHub Actions, BitBucket, Azure DevOps, TeamCity, AWS), AWS, GCP, Linux (RHEL and Ubuntu)
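As a small taste of the Kubernetes administration work listed here, a hedged Python sketch using the official kubernetes client to flag unhealthy pods follows; it assumes a kubeconfig is available, as it would be for the on-prem clusters mentioned.

```python
from kubernetes import client, config  # pip install kubernetes

# Load kubeconfig the same way kubectl does; works for on-prem clusters too.
config.load_kube_config()
v1 = client.CoreV1Api()

# Report any pod that is not Running or Succeeded, across all namespaces.
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")
```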
Posted 3 months ago
6 - 11 years
19 - 34 Lacs
Bengaluru, Hyderabad, Kolkata
Work from Office
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of an AWS Developer! We are looking for candidates who have a passion for cloud, with knowledge of different cloud environments. Ideal candidates should have technical experience in AWS platform services: IAM roles and policies, Glue, Lambda, EC2, S3, SNS, SQS, EKS, KMS, etc. This key role demands a highly motivated individual with a strong background in Computer Science/Software Engineering. You are meticulous, thorough, and possess excellent communication skills to engage with all levels of our stakeholders. A self-starter, you are up to speed with the latest developments in the tech world.

Responsibilities:
- Hands-on experience and good skills with AWS platform services: IAM roles and policies, Glue, Lambda, EC2, S3, SNS, SQS, EKS, KMS, etc.
- Must have good working knowledge of Kubernetes and Docker.
- Utilize AWS services such as AWS Glue, Amazon S3, AWS Lambda, and others to optimize performance, reliability, and cost-effectiveness.
- Develop scripts, utilities, and automation tools to facilitate the migration process and ensure compatibility with AWS services.
- Implement best practices for security, scalability, and fault tolerance in AWS-based solutions.
- Experience in AWS cost analysis and a thorough understanding of how to optimize AWS cost.
- Must have good working knowledge of deployment templates like Terraform/CloudFormation.
- Ability to multi-task and manage various project elements simultaneously.

Qualifications we seek in you!

Minimum Qualifications/Skills:
- Bachelor's degree with experience in Information Technology.
- Must have experience in AWS platform services.

Preferred Qualifications/Skills:
- Very good written and presentation/verbal communication skills, with experience in a customer-facing role.
- In-depth requirement-understanding skills with good analytical and problem-solving ability, interpersonal efficiency, and a positive attitude.
- Experience in ML/AI.
- Experience in the telecommunication industry.
- Experience with cloud providers (e.g., AWS, GCP).

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 3 months ago
10 - 15 years
35 - 45 Lacs
Pune
Hybrid
About Company: A leading cybersecurity organization specializes in securing and ensuring compliance for Generative AI (Gen AI) applications. It provides innovative solutions for AI security, offering visibility, policy enforcement, and risk mitigation. With a strong focus on data protection and governance, the company helps businesses adopt Gen AI securely while meeting compliance standards like the EU AI Act. Backed by industry experts, it collaborates with global leaders to drive AI security innovation.

Looking for a Principal Python Engineer to lead the design and development of cloud-native backend solutions. This role offers an exciting opportunity to work in a dynamic, agile environment on complex enterprise-grade products.

Location: Pune (Hybrid Model)
Experience: 12+ Years

Key Responsibilities:
- Develop and maintain Python-based microservices for data ingestion, querying, and LLM processing (see the FastAPI sketch below)
- Design and implement clean, efficient data models and RESTful APIs
- Write high-quality, testable code following best practices
- Collaborate with cross-functional teams (DevOps, QA, Product) for seamless integration
- Improve system architecture and performance using modern tools and frameworks

Required Qualifications:
- BE in Computer Science from a reputed institution
- 12+ years of software development experience, with 8+ years in Python backend development (asyncio, FastAPI, Celery)
- Expert-level knowledge of PostgreSQL, MongoDB, and/or vector databases
- Experience in scalable, highly available enterprise-grade products
- Hands-on experience with AWS cloud services (Lambda, EC2, S3, RDS)
- Strong expertise in software design patterns and architecture
- Excellent problem-solving and communication skills

Preferred Qualifications:
- Basic understanding of Generative AI and Machine Learning concepts
- Familiarity with CI/CD pipelines and containerization technologies (Docker, Kubernetes)

Why Consider This Opportunity?
- Work on cutting-edge AI-driven data governance solutions
- Join an innovative startup with a fast-paced and collaborative culture
- Engage with top engineers in a dynamic, agile environment
- Competitive compensation and benefits package
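The FastAPI sketch referenced above: a minimal async microservice of the kind the posting's stack (asyncio, FastAPI) implies. The endpoints and in-memory store are hypothetical stand-ins for the real ingestion and query services.

```python
import asyncio

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="ingest-service")  # hypothetical service name


class Document(BaseModel):
    doc_id: str
    text: str


# Stand-in for a real datastore (PostgreSQL/MongoDB in the posting's stack).
_store: dict[str, str] = {}


@app.post("/documents", status_code=201)
async def ingest(doc: Document):
    # Simulate awaited I/O (e.g., an async DB insert).
    await asyncio.sleep(0)
    _store[doc.doc_id] = doc.text
    return {"stored": doc.doc_id}


@app.get("/documents/{doc_id}")
async def fetch(doc_id: str):
    if doc_id not in _store:
        raise HTTPException(status_code=404, detail="not found")
    return {"doc_id": doc_id, "text": _store[doc_id]}
```

Run locally with `uvicorn module_name:app --reload` (assuming the file is importable as `module_name`).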
Posted 3 months ago
4 - 9 years
11 - 20 Lacs
Bengaluru
Remote
SRE profile with strong AWS expertise (CDK, serverless architecture, Lambda) and extensive monitoring-tools experience (Grafana, Prometheus, ELK).
Posted 3 months ago