4.0 - 6.0 years
4 - 8 Lacs
Mumbai, Bengaluru, Delhi / NCR
Work from Office
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote

What do you need for this opportunity?
Must-have skills: Cloud Infrastructure, Unit Testing, Microservices, Node.js, AWS, MongoDB, TypeScript

A Series B-funded innovative device trade-in company (Netherlands) is looking for:

About the role
As an ideal candidate, you must be a problem solver with solid experience and knowledge of Node.js and TypeScript. You'll be the brain behind crafting, developing, testing, launching, and maintaining the system. You must be passionate about understanding the business context for the features you build, to drive better customer experience and adoption.

Our tech stack
Node.js, TypeScript, MongoDB, AWS, AWS SQS, Microservices, and Kubernetes

Requirements
1. At least 4 years of experience with Node.js and TypeScript
2. In-depth knowledge of microservices architecture and unit testing
3. A deep understanding of the Node.js Event Loop
4. Expertise in document-oriented databases, especially MongoDB
5. Experience in designing, building, and scaling back-end systems on cloud infrastructure
6. Strong commitment to improving product experience and user satisfaction

Responsibilities
1. Consistently write high-quality, efficient code
2. Develop and maintain a comprehensive suite of automated tests, including unit, integration, E2E, and functional tests
3. Perform code reviews and ensure adherence to design patterns and the organization's coding standards
4. Mentor junior developers, contributing to their technical growth
5. Collaborate with product and design teams to build user-focused solutions
6. Identify, prioritize, and execute tasks in the software development life cycle
7. Develop tools and applications by producing clean, efficient code
8. Troubleshoot, debug, and upgrade existing software
9. Recommend and execute improvements
10. Collaborate with multidisciplinary teams to understand requirements and develop new solutions
Posted 1 month ago
7.0 - 11.0 years
9 - 12 Lacs
Mumbai, Bengaluru, Delhi
Work from Office
Experience: 7.00+ years
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote

Must-have skills: DevOps, PowerShell, CLI, Amazon AWS, Java, Scala, Go (Golang), Terraform

Opportunity Summary:
We are looking for an enthusiastic and dynamic individual to join Upland India as a DevOps Engineer in the Cloud Operations Team. The individual will manage and monitor our extensive set of cloud applications. The successful candidate will possess extensive experience with production systems and an excellent understanding of key SaaS technologies, and will exhibit a high degree of initiative and responsibility. The candidate will participate in technical/architectural discussions supporting Upland's products and influence decisions concerning solutions and techniques within their discipline.

What would you do?
- Be an engaged, active member of the team, contributing to driving greater efficiency and optimization across our environments.
- Automate manual tasks to improve performance and reliability.
- Build, install, and configure servers in physical and virtual environments.
- Participate in an on-call rotation to support customer-facing application environments.
- Monitor and optimize system performance, taking proactive measures to prevent issues and reactive measures to correct them.
- Participate in the Incident, Change, Problem, and Project Management programs and document details within prescribed guidelines.
- Advise technical and business teams on tactical and strategic improvements to enhance operational capabilities.
- Create and maintain documentation of enterprise infrastructure topology and system configurations.
- Serve as an escalation point for internal support staff to resolve issues.

What are we looking for?
Experience: 7-9 years total experience in DevOps: AWS (solutioning and operations), GitHub/Bitbucket, CI/CD, Jenkins, ArgoCD, Grafana, Prometheus, etc.
Technical Skills
To be a part of this journey, you should have 7-9 years of overall industry experience managing production systems, an excellent understanding of key SaaS technologies, and a high level of initiative and responsibility. The following skills are needed for this role.

Primary Skills:
- Public cloud providers. AWS: solutioning, introducing new services into existing infrastructure, and maintaining the infrastructure of a production 24x7 SaaS solution.
- Administer complex Linux-based web hosting configurations, including load balancers, web servers, and database servers.
- Develop and maintain CI/CD pipelines using GitHub Actions, ArgoCD, and Jenkins.
- EKS/Kubernetes, ECS, and Docker administration/deployment.
- Strong knowledge of AWS networking concepts, including Route53, VPC configuration and management, DHCP, VLANs, HTTP/HTTPS, and IPSec/SSL VPNs.
- Strong knowledge of AWS security concepts: IAM accounts, KMS-managed encryption, CloudTrail, and CloudWatch monitoring/alerting.
- Automating existing manual workloads, such as reporting and patching/updating servers, by writing scripts, Lambda functions, etc.
- Expertise in Infrastructure as Code technologies: Terraform is a must.
- Monitoring and alerting tools such as Prometheus, Grafana, PagerDuty, etc.
- Expertise in Windows and Linux OS is a must.

Secondary Skills:
It would be advantageous if the candidate also has:
- Strong knowledge of scripting/coding with Go, PowerShell, Bash, or Python.

Soft Skills:
- Strong written and verbal communication skills directed at technical and non-technical team members.
- Willingness to take ownership of problems and seek solutions.
- Ability to apply creative problem solving and manage through ambiguity.
- Ability to work under remote supervision with a minimum of direct oversight.

Qualifications:
- Bachelor's degree in computer science, engineering, or a related field.
- Proven experience as a DevOps Engineer with a focus on AWS.
- Experience with modernizing legacy applications and improving deployment processes.
- Excellent problem-solving skills and the ability to work under remote supervision.
- Strong written and verbal communication skills, with the ability to articulate technical information to non-technical team members.
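The "automating existing manual workloads like reporting" duty above can be sketched in plain Python; the server inventory and patch-status fields here are hypothetical stand-ins for whatever the real environment (e.g. an SSM or CMDB export) would provide:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Server:
    name: str
    last_patched: date  # hypothetical field; real data would come from an inventory API

def overdue_report(servers, as_of, max_age_days=30):
    """Return names of servers whose last patch is older than max_age_days."""
    return sorted(
        s.name for s in servers
        if (as_of - s.last_patched).days > max_age_days
    )

if __name__ == "__main__":
    fleet = [
        Server("web-1", date(2024, 1, 5)),
        Server("db-1", date(2024, 3, 1)),
    ]
    print(overdue_report(fleet, as_of=date(2024, 3, 10)))  # prints ['web-1']
```

In practice a script like this would be wrapped in a Lambda function or cron job and its output pushed to a dashboard or alerting channel.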
Posted 1 month ago
8.0 - 12.0 years
22 - 27 Lacs
Hyderabad, Ahmedabad, Gurugram
Work from Office
About the Role:
Grade Level (for internal use): 12

The Team
As a member of the EDO, Collection Platforms & AI Cognitive Engineering team, you will spearhead the design and delivery of robust, scalable ML infrastructure and pipelines that power natural language understanding, data extraction, information retrieval, and data sourcing solutions for S&P Global. You will define AI/ML engineering best practices, mentor fellow engineers and data scientists, and drive production-ready AI products from ideation through deployment. You'll thrive in a (truly) global team that values thoughtful risk-taking and self-initiative.

What's in it for you
- Be part of a global company and build solutions at enterprise scale
- Lead and grow a technically strong ML engineering function
- Collaborate on and solve high-complexity, high-impact problems
- Shape the engineering roadmap for emerging AI/ML capabilities (including GenAI integrations)

Key Responsibilities
- Architect, develop, and maintain production-ready data acquisition, transformation, and ML pipelines (batch & streaming)
- Serve as a hands-on lead: writing code, conducting reviews, and troubleshooting to extend and operate our data platforms
- Apply best practices in data modeling, ETL design, and pipeline orchestration using cloud-native solutions
- Establish CI/CD and MLOps workflows for model training, validation, deployment, monitoring, and rollback
- Integrate GenAI components (LLM inference endpoints, embedding stores, prompt services) into broader ML systems
- Mentor and guide engineers and data scientists; foster a culture of craftsmanship and continuous improvement
- Collaborate with cross-functional stakeholders (Data Science, Product, IT) to align on requirements, timelines, and SLAs

What We're Looking For
- 8-12 years' professional software engineering experience with a strong MLOps focus
- Expert in Python and Apache for large-scale data processing
- Deep experience deploying and operating ML pipelines on AWS or GCP
- Hands-on proficiency with container/orchestration tooling
- Solid understanding of the full ML model lifecycle and CI/CD principles
- Skilled in streaming and batch ETL design (e.g., Airflow, Dataflow)
- Strong OOP design patterns, test-driven development, and enterprise system architecture
- Advanced SQL skills (big-data variants a plus) and comfort with Linux/bash toolsets
- Familiarity with version control (Git, GitHub, or Azure DevOps) and code review processes
- Excellent problem-solving, debugging, and performance-tuning abilities
- Ability to communicate technical change clearly to non-technical audiences

Nice to have
- Redis, Celery, SQS, and Lambda-based event-driven pipelines
- Prior work integrating LLM services (OpenAI, Anthropic, etc.) at scale
- Experience with Apache Avro and Apache
- Familiarity with Java and/or .NET Core (C#)

What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits:
We take care of you, so you can take care of business. We care about our people.
That's why we provide everything you, and your career, need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert
If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- IFTECH103.2 - Middle Management Tier II (EEO Job Group)
Posted 1 month ago
5.0 - 10.0 years
10 - 15 Lacs
Chennai
Work from Office
Job Summary
Synechron is seeking a skilled and committed Full Stack Developer specializing in Node.js and React.js to join our innovative technology team. In this role, you will be pivotal in designing, developing, and deploying scalable, high-performance enterprise web applications. Your expertise will drive the creation of responsive interfaces and robust backend services, supporting business objectives across banking, retail, or related domains. This position offers an opportunity to work on cutting-edge technologies and collaborate with cross-functional teams to deliver impactful digital solutions.

Required Software Skills:
- React.js (TypeScript preferred)
- Node.js (v14+ recommended)
- JavaScript & TypeScript (ES6+)
- HTML5, CSS3, and CSS pre-processors (SASS, LESS) or CSS-in-JS (Styled Components, Emotion)
- NoSQL databases, primarily MongoDB
- Version control: Git, Bitbucket, or GitHub
- CI/CD tools such as Jenkins, GitLab CI, or equivalent
- Containerization: Docker, Kubernetes
- API Gateway platforms (e.g., 3scale or similar)
- Message queues: Kafka, AWS SQS, or similar
- Cloud platforms: AWS, Microsoft Azure (basic understanding)
- Testing frameworks: Jest, Enzyme, Jasmine
- Code quality tools: ESLint, Prettier, TSLint

Preferred Software Skills:
- GraphQL
- Advanced monitoring and performance profiling tools
- Experience with cloud-native development and serverless architectures

Overall Responsibilities
- Develop, test, and maintain scalable, reusable, and high-performance enterprise web applications using React.js and Node.js.
- Architect and implement RESTful APIs, microservices, and server-side logic to meet functional and technical specifications.
- Collaborate with cross-functional teams including product managers, UI/UX designers, and QA to deliver seamless user experiences.
- Automate build and deployment pipelines and infrastructure provisioning to ensure efficient production releases.
- Monitor application and front-end performance, optimize user interfaces, and troubleshoot issues proactively.
- Implement security best practices such as OAuth, JWT, and secure API design.
- Stay current with emerging technologies and industry best practices, incorporating improvements into projects.
- Document code, architecture decisions, and system workflows to maintain clarity and facilitate team knowledge sharing.
- Participate in Agile ceremonies (stand-ups, sprint planning, retrospectives) and contribute to continuous process improvements.

Technical Skills (By Category)
Programming Languages:
- Essential: JavaScript, TypeScript
- Preferred: ES6+ features, multi-threading techniques
Databases/Data Management:
- Essential: MongoDB, NoSQL principles
- Preferred: Data modeling, indexing, and performance tuning
Cloud & Infrastructure:
- Essential: Containerization with Docker, orchestration with Kubernetes
- Preferred: Cloud services (AWS, Azure), serverless frameworks
Frameworks & Libraries:
- Essential: React.js (hooks, context API), Node.js, Express.js
- Preferred: Redux, Flux, GraphQL, Tailwind CSS, Material UI, Styled Components
Development Tools & Methodologies:
- Essential: CI/CD pipelines, version control, automated testing
- Preferred: Monitoring dashboards, performance profiling tools
Security Protocols:
- Essential: Authentication & authorization (OAuth, JWT), secure API design
- Preferred: API Gateway security integrations

Experience
- Minimum 5 years of professional experience in full-stack development with React.js and Node.js
- Proven success in building responsive and enterprise-grade web applications
- Experience designing and consuming RESTful APIs and microservices architectures
- Hands-on experience with NoSQL databases, especially MongoDB
- Familiarity with cloud deployment, container orchestration, and CI/CD pipelines
- Knowledge of banking, retail, or financial services domains is a plus
- Experience with performance tuning, debugging, and troubleshooting complex distributed systems

Day-to-Day Activities
- Develop user interfaces and backend services aligned with project specifications
- Participate in Agile sprint planning, daily stand-ups, and code reviews
- Automate deployment processes, monitor application health, and optimize performance
- Troubleshoot and resolve technical issues promptly
- Collaborate with stakeholders to gather requirements and convert them into technical solutions
- Continuously review code for quality, security, and efficiency
- Document technical workflows, design decisions, and API specifications
- Stay updated on industry trends to incorporate best practices into development processes

Qualifications
Educational:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field, or equivalent professional experience in relevant roles
Certifications (Preferred):
- AWS Certified Developer, Kubernetes certifications, or similar
Training & Development:
- Ongoing learning in relevant tools, frameworks, and industry standards

Professional Competencies
- Strong analytical and problem-solving skills with attention to detail
- Ability to work autonomously and own deliverables, demonstrating initiative
- Effective collaboration across interdisciplinary teams
- Excellent communication skills for technical and non-technical audiences
- Flexibility to adapt to evolving technologies and project requirements
- Commitment to quality, security compliance, and continuous improvement
- Time management skills to prioritize tasks effectively

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, more successful businesses as a global company.
We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Candidate Application Notice
Posted 1 month ago
3.0 - 6.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Title: Cloud Engineer, Data Platform group
Department: Enterprise Engineering
Location: Bengaluru
Level: 5 (Technical Consultant)

We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together, and supporting each other, all over the world. So, join our Data Platform team and feel like you are part of something bigger.

About your team
The Data Platform team manages the products and technical infrastructure that underpin the use of data at Fidelity: databases (Oracle, SQL Server, PostgreSQL), data streaming (Kafka, SnapLogic), data security, data lake (Snowflake), data analytics (Power BI, Oracle Analytics Cloud), data management, and more. We provide both cloud-based and on-premises solutions as well as automation and self-service tools. The company predominantly uses AWS, but we also deploy on Azure and Oracle Cloud in certain situations.

About your role
Your role will be to use your skills (and the many skills you will acquire on the job) to develop Infrastructure as Code (IaC) solutions using Terraform and Python, with a strong focus on AWS and Azure environments. A significant aspect of this role involves creating and maintaining Terraform modules to standardise and optimise infrastructure provisioning across these platforms, with particular emphasis on the database components of the infrastructure. You will also develop CI/CD automation processes to enhance operational efficiency and reduce manual intervention, ensuring robust, scalable, and secure cloud infrastructure management.

About you
You will be a motivated, curious, and technically savvy person who is always collaborative and keeps the customer in mind in the work you perform.
Required skills:
- A strong development & infrastructure engineering background with hands-on experience in Infrastructure as Code solutions across multiple providers, focusing on delivering automated, scalable, and resilient infrastructure management.
- Proven practical experience in implementing effective automation solutions that meet infrastructure requirements, along with the ability to identify risks and mitigating actions.
- Practical experience of implementing simple & effective cloud and/or database solutions.
- Strong working knowledge of fundamental AWS concepts, such as IAM, networking, security, compute (Lambda, EC2), S3, SQS/SNS, and scheduling tools.
- Python, Bash, and SQL programming (PowerShell a bonus).
- Oracle or PostgreSQL database knowledge a bonus.
- Experience of delivering change through CI/CD using Terraform (GitHub Actions a bonus).
- Ability to work on tasks as a team player using the Kanban Agile methodology, share knowledge, and deal effectively with people from other company departments.
- Transparency of work with others to ensure maximum knowledge transfer & collaboration across global teams.
- Highly motivated team player who forms strong relationships with colleagues, with excellent interpersonal skills.
- Excellent verbal & written communication in English.
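The Terraform-plus-Python combination this role centres on often takes the form of Python tooling that emits Terraform-readable JSON for a reusable module. This is a minimal sketch only; the module path, input names, and supported engines are hypothetical, not Fidelity's actual setup:

```python
import json

def database_module_config(env: str, engine: str, instance_class: str) -> dict:
    """Build inputs for a hypothetical reusable 'database' Terraform module."""
    if engine not in {"oracle", "postgres"}:
        raise ValueError(f"unsupported engine: {engine}")
    return {
        "module": {
            f"{env}_{engine}_db": {
                "source": "./modules/database",  # hypothetical module path
                "environment": env,
                "engine": engine,
                "instance_class": instance_class,
            }
        }
    }

if __name__ == "__main__":
    cfg = database_module_config("dev", "postgres", "db.t3.medium")
    # Terraform accepts *.tf.json files with exactly this top-level shape
    print(json.dumps(cfg, indent=2))
```

Generating module inputs from one validated function is one way to standardise provisioning across environments, as the role description asks.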
Posted 1 month ago
5.0 - 9.0 years
3 - 7 Lacs
Noida
Work from Office
We are looking for a skilled Python Developer with 5 to 9 years of experience to design, develop, and maintain serverless applications using Python and AWS technologies. The ideal candidate will have extensive experience in building scalable, high-performance back-end systems and a deep understanding of AWS serverless services such as Lambda, DynamoDB, SNS, SQS, S3, and others. This role is based in Bangalore and Mumbai.

Roles and Responsibilities
- Design and implement robust, scalable, and secure back-end services using Python and AWS serverless technologies.
- Build and maintain serverless applications leveraging AWS Lambda, DynamoDB, API Gateway, S3, SNS, SQS, and other AWS services.
- Provide technical leadership and mentorship to a team of engineers, promoting best practices in software development, testing, and DevOps.
- Collaborate closely with cross-functional teams including front-end developers, product managers, and DevOps engineers to deliver high-quality solutions that meet business needs.
- Implement and manage CI/CD pipelines, automated testing, and monitoring to ensure high availability and rapid deployment of services.
- Optimize back-end services for performance, scalability, and cost-effectiveness, ensuring efficient use of AWS resources.
- Ensure all solutions adhere to industry best practices for security, including data protection, access controls, and encryption.
- Create and maintain comprehensive technical documentation, including architecture diagrams, API documentation, and deployment guides.
- Diagnose and resolve complex technical issues in production environments, ensuring minimal downtime and disruption.
- Stay updated with the latest trends and best practices in Python, AWS serverless technologies, and fintech/banking technology stacks, and apply this knowledge to improve our systems.

Requirements
- Minimum 7 years of experience in back-end software development, with at least 5 years of hands-on experience in Python.
- Extensive experience with AWS serverless technologies, including Lambda, DynamoDB, API Gateway, S3, SNS, SQS, ECS, EKS, and other related services.
- Proven experience in leading technical teams and delivering complex, scalable cloud-based solutions in the fintech or banking sectors.
- Strong proficiency in Python and related frameworks (e.g., Flask, Django).
- Deep understanding of AWS serverless architecture and best practices.
- Experience with Infrastructure as Code (IaC) tools such as AWS CloudFormation or Terraform.
- Familiarity with RESTful APIs, microservices architecture, and event-driven systems.
- Knowledge of DevOps practices, including CI/CD pipelines, automated testing, and monitoring using AWS services (e.g., CodePipeline, CloudWatch, X-Ray).
- Demonstrated ability to lead and mentor engineering teams, fostering a culture of collaboration, innovation, and continuous improvement.
- Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve complex technical issues in a fast-paced environment.
- Excellent verbal and written communication skills, with the ability to effectively convey technical concepts to both technical and non-technical stakeholders.
- Experience with other cloud platforms (e.g., Azure, GCP) and containerization technologies like Docker and Kubernetes.
- Familiarity with financial services industry regulations and compliance requirements.
- Relevant certifications such as AWS Certified Solutions Architect, AWS Certified Developer, or similar.
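As a rough illustration of the Lambda-plus-SQS pattern this role centres on, here is a minimal handler sketch. The event shape follows the standard SQS batch event format, and the `batchItemFailures` return follows Lambda's partial-batch-response convention; the `order_id` business field is a hypothetical placeholder:

```python
import json

def handler(event, context=None):
    """Sketch of an AWS Lambda entry point consuming an SQS batch event.

    Each record's body is parsed as JSON; failed records are reported via
    the partial-batch-response convention so only they are retried.
    """
    failures = []
    processed = 0
    for record in event.get("Records", []):
        try:
            payload = json.loads(record["body"])
            _ = payload["order_id"]  # hypothetical business field
            processed += 1
        except (KeyError, json.JSONDecodeError):
            failures.append({"itemIdentifier": record.get("messageId")})
    return {"processed": processed, "batchItemFailures": failures}
```

Locally the handler can be exercised with a hand-built event, e.g. `handler({"Records": [{"messageId": "1", "body": "{\"order_id\": 7}"}]})`.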
Posted 1 month ago
3.0 - 8.0 years
3 - 6 Lacs
Noida
Work from Office
We are looking for a skilled MLOps professional with 3 to 11 years of experience to join our team in Hyderabad. The ideal candidate will have a strong background in Machine Learning, Artificial Intelligence, and Computer Vision.

Roles and Responsibilities
- Design, build, and maintain efficient, reusable, and tested code in Python and other applicable languages and library tools.
- Understand stakeholder needs and convey them to developers.
- Work on automating and improving development and release processes.
- Deploy Machine Learning (ML) models to large production environments.
- Drive continuous learning in AI and computer vision.
- Test and examine code written by others and analyze results.
- Identify technical problems and develop software updates and fixes.
- Collaborate with software developers and engineers to ensure development follows established processes and works as intended.
- Plan out projects and participate in project management decisions.

Requirements
- Minimum 3 years of hands-on experience with AWS services and products (Batch, SageMaker, Step Functions, CloudFormation/CDK).
- Strong Python experience.
- Minimum 3 years of experience with Machine Learning/AI or Computer Vision development/engineering.
- Ability to provide technical leadership to developers for designing and securing solutions.
- Understanding of Linux utilities and Bash.
- Familiarity with containerization using Docker.
- Experience with data pipeline frameworks such as Metaflow is preferred.
- Experience with Lambda, SQS, ALB/NLBs, SNS, and S3 is preferred.
- Practical experience deploying Computer Vision/Machine Learning solutions at scale into production.
- Exposure to technologies/tools such as Keras, Pandas, TensorFlow, PyTorch, Caffe, NumPy, and DVC/CML.
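Pipeline frameworks like the Metaflow mentioned above structure ML work as ordered steps, each feeding the next. This dependency-free sketch only mimics that shape; the step names, data, and 0.5 threshold are illustrative, not a real model:

```python
def make_pipeline(*steps):
    """Chain step functions so each receives the previous step's output."""
    def run(data):
        for step in steps:
            data = step(data)
        return data
    return run

def load(_):
    return [0.2, 0.9, 0.4]                       # stand-in for feature loading

def preprocess(xs):
    return [min(max(x, 0.0), 1.0) for x in xs]   # clip scores into [0, 1]

def predict(xs):
    return [x > 0.5 for x in xs]                 # stand-in for model inference

pipeline = make_pipeline(load, preprocess, predict)
print(pipeline(None))  # [False, True, False]
```

Real frameworks add what this sketch omits: per-step artifact versioning, retries, and scheduled execution on AWS Batch or Step Functions.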
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
Company: Apptad Technologies Pvt Ltd.
Industry: Employment Firms/Recruitment Services Firms
Experience: 5 to 10 years

Job Title: Senior .NET Developer
Job Location: Hyderabad, Pune
Job Type: Full Time

Required Qualifications
- 5+ years of professional software development experience.
- Post-secondary degree in computer science, software engineering, or a related discipline, or equivalent working experience.
- Development of distributed applications with Microsoft technologies: C#, .NET/.NET Core, SQL Server, Entity Framework.
- Deep expertise with microservices architectures and design patterns.
- Cloud-native AWS experience with services such as Lambda, SQS, RDS/Aurora, S3, Lex, and Polly.
- Mastery of both Windows and Linux environments and their use in the development and management of complex distributed systems architectures.
- Git source code repositories and continuous integration tools.
- Proficient with debugging and profiling distributed systems.
- Practiced in unit testing and system integration testing, with an agile, test-driven development mindset.

Preferred Qualifications
- Strong programming experience in languages/frameworks outside of .NET, such as Java and Python.
- Experience with additional database engines (MySQL, PostgreSQL) and languages (PL/SQL).

(Ref: 6566512)
Posted 1 month ago
8.0 - 13.0 years
2 - 30 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Title: Senior Python Developer
Job Type: Full-time, Contractor

About Us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary:
We are seeking a highly skilled Senior Python Developer to join one of our top customers, committed to designing and implementing high-performance microservices. The ideal candidate will have extensive experience with Python, FastAPI, task queues, WebSockets, and Kubernetes to build scalable solutions for our platforms. This is an exciting opportunity for those who thrive in challenging environments and have a passion for technology and innovation.

Key Responsibilities:
- Design and develop backend services using Python, with an emphasis on FastAPI for high-performance applications.
- Architect and orchestrate microservices to handle high-concurrency I/O requests efficiently.
- Deploy and manage applications on AWS, ensuring robust and scalable solutions are delivered.
- Implement and maintain messaging queues using Celery, RabbitMQ, or AWS SQS.
- Utilize WebSockets and asynchronous programming to enhance system responsiveness and performance.
- Collaborate with cross-functional teams to ensure seamless integration of solutions.
- Continuously improve system reliability, scalability, and performance through innovative design and testing.

Required Skills and Qualifications:
- Proven experience in production deployments with user bases exceeding 10k.
- Expertise in Python and FastAPI, with strong knowledge of microservices architecture.
- Proficiency in working with queues and asynchronous programming.
- Hands-on experience with databases such as Postgres, MongoDB, or Databricks.
- Comprehensive knowledge of Kubernetes for running scalable microservices.
- Exceptional written and verbal communication skills.
- Consistent work history without overlapping roles or career gaps.

Preferred Qualifications:
- Experience with GoLang for microservice development.
- Familiarity with data lake technologies such as Iceberg.
- Understanding of deploying APIs in Kubernetes environments.
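The queue-driven, high-concurrency pattern this role describes (Celery/RabbitMQ/SQS feeding async workers) can be miniaturised with the standard library's asyncio; the worker logic and job payloads here are purely illustrative:

```python
import asyncio

async def worker(name, queue, results):
    """Drain jobs from the queue, simulating non-blocking I/O per job."""
    while True:
        job = await queue.get()
        await asyncio.sleep(0)          # stand-in for an awaited I/O call
        results.append((name, job))
        queue.task_done()

async def main(jobs, n_workers=3):
    queue, results = asyncio.Queue(), []
    tasks = [asyncio.create_task(worker(f"w{i}", queue, results))
             for i in range(n_workers)]
    for j in jobs:
        queue.put_nowait(j)
    await queue.join()                  # block until every job is processed
    for t in tasks:
        t.cancel()                      # workers loop forever; stop them
    return results

if __name__ == "__main__":
    done = asyncio.run(main(range(5)))
    print(len(done))  # 5
```

A real broker adds persistence, acknowledgement, and retry semantics on top of this shape, but the producer/consumer structure is the same.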
Posted 1 month ago
2.0 - 5.0 years
5 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Title: Senior Back End Engineer
Job Type: Full-time, Contractor
About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.
Job Summary: We are seeking a highly skilled Senior Back End Engineer to join one of our top customers, committed to designing and implementing high-performance microservices. The ideal candidate will have extensive experience with Python, FastAPI, task queues, WebSockets, and Kubernetes to build scalable solutions for our platforms. This is an exciting opportunity for those who thrive in challenging environments and have a passion for technology and innovation.
Key Responsibilities:
1. Design and develop backend services using Python, with an emphasis on FastAPI for high-performance applications
2. Architect and orchestrate microservices to handle high-concurrency I/O requests efficiently
3. Deploy and manage applications on AWS, ensuring robust and scalable solutions are delivered
4. Implement and maintain messaging queues using Celery, RabbitMQ, or AWS SQS
5. Utilize WebSockets and asynchronous programming to enhance system responsiveness and performance
6. Collaborate with cross-functional teams to ensure seamless integration of solutions
7. Continuously improve system reliability, scalability, and performance through innovative design and testing
Required Skills and Qualifications:
1. Proven experience in production deployments with user bases exceeding 10k
2. Expertise in Python and FastAPI, with strong knowledge of microservices architecture
3. Proficiency in working with queues and asynchronous programming
4. Hands-on experience with databases such as Postgres, MongoDB, or Databricks
5. Comprehensive knowledge of Kubernetes for running scalable microservices
6. Exceptional written and verbal communication skills
7. Consistent work history without overlapping roles or career gaps
Preferred Qualifications:
1. Experience with GoLang for microservice development
2. Familiarity with data lake technologies such as Iceberg
3. Understanding of deploying APIs in Kubernetes environments
Posted 1 month ago
8.0 - 10.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary:
Experience: 4 - 8 years
Location: Bangalore
The Data Engineer will contribute to building state-of-the-art data Lakehouse platforms in AWS, leveraging Python and Spark. You will be part of a dynamic team, building innovative and scalable data solutions in a supportive and hybrid work environment. You will design, implement, and optimize workflows using Python and Spark, contributing to our robust data Lakehouse architecture on AWS. Success in this role requires previous experience of building data products using AWS services, familiarity with Python and Spark, problem-solving skills, and the ability to collaborate effectively within an agile team.
Must Have Tech Skills: Demonstrable previous experience as a data engineer. Technical knowledge of data engineering solutions and practices. Implementation of data pipelines using tools like EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, Athena. Proficient in Python and Spark, with a focus on ETL data processing and data engineering practices.
Nice To Have Tech Skills: Familiar with data services in a Lakehouse architecture. Familiar with technical design practices, allowing for the creation of scalable, reliable data products that meet both technical and business requirements. A master's degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous.
Key Accountabilities: Writes high-quality code, ensuring solutions meet business requirements and technical standards. Works with architects, Product Owners, and Development leads to decompose solutions into Epics, assisting the design and planning of these components. Creates clear, comprehensive technical documentation that supports knowledge sharing and compliance. Experience in decomposing solutions into components (Epics, stories) to streamline development. Actively contributes to technical discussions, supporting a culture of continuous learning and innovation.
Key Skills: Proficient in Python and familiar with a variety of development technologies. Previous experience of implementing data pipelines, including use of ETL tools to streamline data ingestion, transformation, and loading. Solid understanding of AWS services and cloud solutions, particularly as they pertain to data engineering practices. Familiar with AWS solutions including IAM, Step Functions, Glue, Lambda, RDS, SQS, API Gateway, Athena. Proficient in quality assurance practices, including code reviews, automated testing, and best practices for data validation. Experienced in Agile development, including sprint planning, reviews, and retrospectives.
Educational Background: Bachelor's degree in Computer Science, Software Engineering, or a related field is essential.
Bonus Skills: Financial Services expertise preferred, working with Equity and Fixed Income asset classes and a working knowledge of Indices. Familiar with implementing and optimizing CI/CD pipelines. Understands the processes that enable rapid, reliable releases, minimizing manual effort and supporting agile development cycles.
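The ETL flow this posting describes (ingest, transform, load) can be illustrated with a stdlib-only sketch; in production the same shape would be a Glue job or Spark transformations, and the record fields and sample data here are invented for illustration.

```python
import csv
import io

# Extract: parse raw CSV (an in-memory sample; a real pipeline would
# read from S3 via Glue, EMR, or Spark readers).
RAW = "symbol,price\nABC,10.5\nXYZ,\nDEF,7.25\n"

def extract(raw: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    # Transform: drop rows failing a simple data-quality check, cast types.
    return [
        {"symbol": r["symbol"], "price": float(r["price"])}
        for r in rows
        if r["price"]  # validation: price must be present
    ]

def load(rows: list[dict]) -> dict:
    # Load: stand-in for a Lakehouse write; aggregate instead of persisting.
    return {r["symbol"]: r["price"] for r in rows}

table = load(transform(extract(RAW)))
print(table)  # → {'ABC': 10.5, 'DEF': 7.25}
```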
Posted 1 month ago
4.0 - 7.0 years
7 - 12 Lacs
Indore, Pune, Chennai
Work from Office
What will your role look like
Develop microservices using Spring/AWS technologies and deploy them on the AWS platform. Support Java/Angular enterprise applications with a multi-region setup. Perform unit and system testing of application code as well as execution of implementation activities. Design, build and test Java EE and Angular full-stack applications.
Why you will love this role
Besides a competitive package, an open workspace full of smart and pragmatic team members, with ever-growing opportunities for professional and personal growth. Be a part of a learning culture where teamwork and collaboration are encouraged, diversity is valued, and excellence, compassion, openness and ownership are rewarded.
We would like you to bring along
In-depth knowledge of popular Java frameworks like Spring Boot and Spring. Experience with Object-Oriented Design (OOD). Excellent knowledge of relational databases, MySQL, and ORM technologies (JPA2, Hibernate). Experience working in Agile (Scrum/Lean) with a DevSecOps focus. Experience with AWS, Kubernetes, Docker containers. AWS component usage, configuration and deployment: Elasticsearch, EC2, S3, SNS, SQS, API Gateway Service, Kinesis. AWS certification would be advantageous. Knowledge of Health and related technologies.
Posted 1 month ago
4.0 - 9.0 years
7 - 12 Lacs
Gurugram
Work from Office
Node Developer
The ISS Delivery team needs a highly motivated, self-driven Analyst Programmer/Senior Analyst Programmer to work on multiple new projects and modifications. The candidate will be responsible for delivering a feasible solution/design in consultation with the technical team and will work as a developer on assigned projects or product teams. The successful candidate will be expected to bridge the requirement gap with business and carry out impact analysis on the complete integrated system. In addition, candidates are expected to complete development as per the project and/or sprint plans.
Job Responsibilities
Seasoned IT software delivery professional with 4+ years of hands-on development experience. Expertise in the following technology sets: proficiency in Java, JEE, and Node.js platforms. Experience with frameworks, specifically NestJS and Spring Boot. Strong understanding of Spring, Apache, TypeORM, and Spring JPA libraries. High proficiency with development environments and tools such as IntelliJ IDEA or Visual Studio Code (VS Code). Familiarity with API development and testing tools including Swagger, REST, GraphQL, Insomnia, and Postman. Experience with messaging technologies like IBM MQ, ActiveMQ, RabbitMQ, AWS SQS, and Apache Kafka. Skilled in using testing frameworks like JUnit and Jest, and mocking frameworks like EasyMock. Familiarity with AWS technologies. Experience with agile/continuous-delivery paradigms of software development and with extreme programming (XP) techniques is key. Should be an expert craftsman in the test-first model of Test-Driven Development (TDD). Expert at putting in place continuous integration and build/release automation practices. Experience with at least one build server like Bamboo or TeamCity. Has the ability to envision designs for complex functional and technical problems while incrementally evolving the implementation in simple, progressive steps of deliveries.
You are adept at pair programming and can comfortably pair with other developers irrespective of maturity/experience levels. You are eager to coach/guide juniors to improve delivery capability. You are at ease with understanding and supporting a code base written by others, and with learning and trying the wider technology sets and problem-solving methods used in a project. Aware of and experienced with code and language/technology best practices.
Mandatory Skills: Java, JEE, and Node.js platforms. Spring, Apache, TypeORM, and Spring JPA. IntelliJ IDEA or Visual Studio Code (VS Code). Swagger, REST, GraphQL, Insomnia, and Postman. IBM MQ, ActiveMQ, RabbitMQ, AWS SQS, and Apache Kafka. Familiarity with AWS.
Secondary Skills: Specifically NestJS and Spring Boot. Diagnostic and analytical skills. Good communication skills. Displays a flexible attitude. Strong troubleshooting skills.
B.E./B.Tech. or M.C.A. in Computer Science with 8+ years of experience.
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Hyderabad
Work from Office
8 to 12 years of experience in information technology with an emphasis on application development; demonstrated experience with application development throughout the entire development lifecycle. In-depth knowledge of the services industry and its IT systems. Practical cloud-native experience. Experience in Computer Science, Engineering, Mathematics, or a related field and expertise in technology disciplines. Java Full Stack Development: ability to create medium-to-large-sized Java web applications from start to finish on their own. This includes, but is not limited to, the following: client interaction, validating requirements, system design, frontend/UI development, interaction with a Java EE application server, web services, experience with the various Java EE APIs, development builds, application deployments, integration/enterprise testing, and support of applications within a production environment. Experience with Java/J2EE with a deep understanding of the language and core APIs, web services, multi-threaded or concurrent programming, XML, design patterns, Service-Oriented Architecture. Experience in implementing microservices using Spring Boot and event-driven architecture. Work with a team that develops smart and scalable solutions and provides a solid experience for our users. Develop an understanding of our products and the problems we are attempting to solve. Analyze infrastructure problems/constraints, inefficiencies, process gaps, risk and regulatory issues, and engineer software or automation solutions. Work in partnership with infrastructure engineers and architects to understand and identify operational improvements.
Tech skills: Java API, microservices. UI: React, JavaScript. AWS: ECS, Postgres, Lambda, S3, Route53, SNS, SQS. Infrastructure-as-Code concepts. Testing: JUnit, AFT (Selenium/Cucumber/Gherkin), BlazeMeter performance testing. Python (for Lambda functions).
People skills: Ability to quickly absorb knowledge as it relates to our application (existing recorded KT sessions will be provided and a core SME team will be available for any questions or additional guidance). Good communication and partnership with others. Motivated in what they are working on. If not clear or unsure about something, immediately raise it. Ability to problem-solve issues that arise.
Posted 1 month ago
5.0 - 10.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Bachelor's degree in Computer Science or a related field, or equivalent practical experience. 3 to 5 years of hands-on experience in Java development, especially with Spring Boot. Experience working with AWS cloud services in a development environment. Knowledge of RESTful APIs, microservices, and distributed systems. Familiarity with CI/CD pipelines and version control tools such as Git. Strong problem-solving skills and a collaborative mindset. Design, develop, and deploy backend services and APIs using Java and Spring Boot. Develop and integrate applications with AWS services such as Lambda, S3, RDS, API Gateway, DynamoDB, and SQS. Write clean, maintainable, and testable code following best practices. Collaborate with cross-functional teams to define, design, and deliver new features. Participate in code reviews, unit testing, and CI/CD pipeline processes. Troubleshoot issues and improve the performance, reliability, and scalability of existing systems. Ensure security, compliance, and performance standards are met across deployments.
Posted 1 month ago
12.0 - 17.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Bachelor's degree in Computer Science, Engineering, or a related field. 10+ years of professional experience in Java development. 3+ years of experience designing and developing solutions in AWS cloud environments. Strong expertise in Java 8+, Spring Boot, RESTful API design, and microservices architecture. Hands-on experience with key AWS services: Lambda, API Gateway, S3, RDS, DynamoDB, ECS, SNS/SQS, CloudWatch. Solid understanding of infrastructure-as-code (IaC) tools like Terraform, AWS CloudFormation, or CDK. Experience with Agile/Scrum, version control (Git), and CI/CD pipelines. Strong communication and leadership skills, including leading distributed development teams. Lead end-to-end technical delivery of cloud-native applications built on Java and AWS. Design and architect secure, scalable, and resilient systems using microservices and serverless patterns. Guide the team in implementing solutions using Java (Spring Boot, REST APIs) and AWS services (e.g., Lambda, API Gateway, DynamoDB, S3, ECS, RDS, SNS/SQS). Participate in code reviews, ensure high code quality, and enforce clean architecture and design principles. Collaborate with DevOps engineers to define CI/CD pipelines using tools such as Jenkins, GitLab, or AWS CodePipeline. Mentor and coach developers on both technical skills and Agile best practices. Translate business and technical requirements into system designs and implementation plans. Ensure performance tuning, scalability, monitoring, and observability of deployed services. Stay current with new AWS offerings and Java development trends to drive innovation.
Posted 1 month ago
5.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Data Engineer
Must have 9+ years of experience in the skills mentioned below.
Must Have: Big Data concepts, Python (core Python; able to write code), SQL, shell scripting, AWS S3.
Good to Have: Event-driven/AWS SQS, microservices, API development, Kafka, Kubernetes, Argo, Amazon Redshift, Amazon Aurora.
Posted 1 month ago
8.0 - 13.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Responsibilities
Design and Develop Scalable Data Pipelines: Build and maintain robust data pipelines using Python to process, transform, and integrate large-scale data from diverse sources.
Orchestration and Automation: Implement and manage workflows using orchestration tools such as Apache Airflow to ensure reliable and efficient data operations.
Data Warehouse Management: Work extensively with Snowflake to design and optimize data models, schemas, and queries for analytics and reporting.
Queueing Systems: Leverage message queues like Kafka, SQS, or similar tools to enable real-time or batch data processing in distributed environments.
Collaboration: Partner with Data Science, Product, and Engineering teams to understand data requirements and deliver solutions that align with business objectives.
Performance Optimization: Optimize the performance of data pipelines and queries to handle large scales of data efficiently.
Data Governance and Security: Ensure compliance with data governance and security standards to maintain data integrity and privacy.
Documentation: Create and maintain clear, detailed documentation for data solutions, pipelines, and workflows.
Qualifications
Required Skills: 5+ years of experience in data engineering roles with a focus on building scalable data solutions. Proficiency in Python for ETL, data manipulation, and scripting. Hands-on experience with Snowflake or equivalent cloud-based data warehouses. Strong knowledge of orchestration tools such as Apache Airflow or similar. Expertise in implementing and managing messaging queues like Kafka, AWS SQS, or similar. Demonstrated ability to build and optimize data pipelines at scale, processing terabytes of data. Experience in data modeling, data warehousing, and database design. Proficiency in working with cloud platforms like AWS, Azure, or GCP. Strong understanding of CI/CD pipelines for data engineering workflows.
Experience working in an Agile development environment, collaborating with cross-functional teams. Preferred Skills: Familiarity with other programming languages like Scala or Java for data engineering tasks. Knowledge of containerization and orchestration technologies (Docker, Kubernetes). Experience with stream processing frameworks like Apache Flink. Experience with Apache Iceberg for data lake optimization and management. Exposure to machine learning workflows and integration with data pipelines. Soft Skills: Strong problem-solving skills with a passion for solving complex data challenges. Excellent communication and collaboration skills to work with cross-functional teams. Ability to thrive in a fast-paced, innovative environment.
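The orchestration responsibility above boils down to running tasks in dependency order, the way an Airflow DAG does. A stdlib-only sketch using `graphlib` illustrates the idea; the task names are invented, and a real deployment would express this as Airflow operators rather than a plain dict.

```python
from graphlib import TopologicalSorter

# Task dependency graph in the spirit of an Airflow DAG definition:
# each key lists the tasks that must complete before it may run.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load_snowflake": {"quality_check"},
    "notify": {"load_snowflake"},
}

# static_order yields a valid execution order for the whole pipeline.
order = list(TopologicalSorter(dag).static_order())
print(order)  # extract first, notify last
```

An orchestrator adds scheduling, retries, and backfills on top of exactly this ordering guarantee.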
Posted 1 month ago
10.0 - 15.0 years
12 - 16 Lacs
Hyderabad
Work from Office
JD for Data Engineering Lead - Python:
Data Engineering Lead with at least 7 to 10 years of experience in Python with the following AWS services: AWS SQS, AWS MSK, AWS RDS Aurora DB, Boto3, API Gateway, and CloudWatch. Provides architectural guidance to the offshore team, reviews code and troubleshoots errors. Very strong SQL knowledge is a must; should be able to understand and build complex queries. Familiar with GitLab (repos and CI/CD pipelines). He/she should work closely with the Virtusa onshore team as well as the enterprise architect and other client teams at onsite as needed. Experience in API development using Python is a plus. Experience in building MDM solutions is a plus.
Posted 1 month ago
4.0 - 9.0 years
4 - 8 Lacs
Gurugram
Work from Office
Data Engineer
Location: PAN India
Work mode: Hybrid
Work timing: 2 PM to 11 PM
Primary Skill: Data Engineer
Experience in data engineering, with a proven focus on data ingestion and extraction using Python/PySpark. Extensive AWS experience is mandatory, with proficiency in Glue, Lambda, SQS, SNS, AWS IAM, AWS Step Functions, S3, and RDS (Oracle, Aurora Postgres). 4+ years of experience working with both relational and non-relational/NoSQL databases is required. Strong SQL experience is necessary, demonstrating the ability to write complex queries from scratch. Experience in Redshift is also required, along with other SQL DB experience. Strong scripting experience with the ability to build intricate data pipelines using AWS serverless architecture. Understanding of building an end-to-end data pipeline. Strong understanding of Kinesis, Kafka, CDK. Experience with Kafka and ECS is also required. A strong understanding of data concepts related to data warehousing, business intelligence (BI), data security, data quality, and data profiling is required. Experience in Node.js and CDK.
Responsibilities
Lead the architectural design and development of a scalable, reliable, and flexible metadata-driven data ingestion and extraction framework on AWS using Python/PySpark. Design and implement a customizable data processing framework using Python/PySpark; this framework should be capable of handling diverse scenarios and evolving data processing requirements. Implement data pipelines for data ingestion, transformation and extraction leveraging AWS Cloud Services. Seamlessly integrate a variety of AWS services, including S3, Glue, Kafka, Lambda, SQL, SNS, Athena, EC2, RDS (Oracle, Postgres, MySQL), AWS Crawler, to construct a highly scalable and reliable data ingestion and extraction pipeline. Facilitate configuration and extensibility of the framework to adapt to evolving data needs and processing scenarios. Develop and maintain rigorous data quality checks and validation processes to safeguard the integrity of ingested data. Implement robust error handling, logging, monitoring, and alerting mechanisms to ensure the reliability of the entire data pipeline.
Qualifications
Must Have: Over 6 years of hands-on experience in data engineering, with a proven focus on data ingestion and extraction using Python/PySpark. Extensive AWS experience is mandatory, with proficiency in Glue, Lambda, SQS, SNS, AWS IAM, AWS Step Functions, S3, and RDS (Oracle, Aurora Postgres). 4+ years of experience working with both relational and non-relational/NoSQL databases is required. Strong SQL experience is necessary, demonstrating the ability to write complex queries from scratch. Strong working experience in Redshift is required, along with other SQL DB experience. Strong scripting experience with the ability to build intricate data pipelines using AWS serverless architecture. Complete understanding of building an end-to-end data pipeline.
Nice to Have: Strong understanding of Kinesis, Kafka, CDK. A strong understanding of data concepts related to data warehousing, business intelligence (BI), data security, data quality, and data profiling. Experience in Node.js and CDK. Experience with Kafka and ECS.
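The error-handling and logging requirement above can be sketched with the standard library alone; the flaky step, retry counts, and logger name below are illustrative, not a specific team's tooling.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def with_retries(fn, attempts: int = 3, delay: float = 0.0):
    # Re-run a flaky pipeline step, logging each failure before retrying;
    # re-raise once the attempt budget is exhausted so alerting can fire.
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            log.warning("attempt %d failed: %s", attempt, exc)
            if attempt == attempts:
                raise
            time.sleep(delay)

calls = {"n": 0}

def flaky_step():
    # Fails twice, then succeeds — simulating a transient source outage.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source unavailable")
    return "loaded"

print(with_retries(flaky_step))  # → loaded
```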
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Primary Skills: Java, React
Frontend: React
Backend Tech: Java, Spring, REST services, Spring JPA
AWS Cloud knowledge: Lambda, API Gateway, Step Functions, SQS, SNS, S3, CloudWatch
Database: MongoDB, RDS
Should have good experience working in an agile team. Should have decent experience in design/architecture. Should be able to contribute in stand-alone roles.
Secondary Skills: AWS
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Hyderabad
Work from Office
The candidate should have a strong academic background from a reputable institution, possess excellent communication skills, be capable of handling independent responsibilities, and demonstrate the ability to initiate and lead a team.
Technical skills
Primary Skills: Backend Tech: Java, Spring, REST services, Spring JPA. AWS Cloud knowledge: Lambda, API Gateway, Step Functions, SQS, SNS, S3, CloudWatch. Database: MongoDB, RDS.
Secondary Skills: Frontend Tech: React, Node.js. Backend Tech: Python, message queues.
Overall 15+ years of experience. Should have good experience working in an agile team. Should have decent experience in design/architecture. Should be able to explore new services on AWS and do POCs. Should be able to contribute in stand-alone roles.
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
JD for Data Engineer - Python
At least 5 to 8 years of experience in AWS Python programming; able to design, build, test and deploy code. Candidate should have worked on Lambda-based API development. Should have experience using the following AWS services: AWS SQS, AWS MSK, AWS RDS Aurora DB, Boto3. Very strong SQL knowledge is a must; should be able to understand and build complex queries. He/she should work closely with the enterprise architect and other client teams at onsite as needed. Experience in building solutions using Kafka would be a good value addition (optional).
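The Lambda-plus-SQS shape this JD describes can be sketched without boto3: AWS delivers SQS messages to a Lambda handler as an event dict with a `Records` list, so the handler itself is plain Python. The payload field (`order_id`) is invented for illustration.

```python
import json

def handler(event: dict, context=None) -> dict:
    # Process each SQS record delivered to this Lambda invocation.
    processed = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])  # SQS message body is a string
        processed.append(body["order_id"])
    # Empty batchItemFailures tells SQS every message succeeded.
    return {"batchItemFailures": [], "processed": processed}

# Minimal SQS-style event, shaped like what Lambda receives.
event = {"Records": [{"body": json.dumps({"order_id": 42})}]}
print(handler(event))  # → {'batchItemFailures': [], 'processed': [42]}
```

Deployed behind an SQS event source mapping, the same function would be invoked automatically with batches of real messages.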
Posted 1 month ago
7.0 - 12.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP Information Lifecycle Management (ILM)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: BE
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery.
Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Lead the application development process. Ensure timely project delivery. Provide guidance and support to team members.
Professional & Technical Skills: Must-have skills: proficiency in SAP Information Lifecycle Management (ILM). Strong understanding of data lifecycle management. Experience in data archiving and retention policies. Knowledge of SAP data management solutions. Hands-on experience in SAP data migration. Experience in SAP data governance.
Additional Information: The candidate should have a minimum of 7.5 years of experience in SAP Information Lifecycle Management (ILM). This position is based at our Hyderabad office. A BE degree is required.
Posted 1 month ago
2.0 - 5.0 years
12 - 16 Lacs
Noida
Work from Office
About The Role
We at Innovaccer are looking for a Security Engineer-I who will be a part of our "eyes on the glass" team, i.e., the individual will be responsible for performing real-time monitoring and analysis of security events. This role will encompass the use of a broad range of security domains (Event Monitoring, Endpoint Security, Incident Management). This role is not a typical monitoring environment; rather, it is a great opportunity to learn and grow, as you would be exposed to multiple security domains at a single time.
A Day in the Life
This role requires being available on call during weekends and off hours. Perform monitoring and incident response of cyber security events as part of a highly available Security Operations Center (SecOps). Familiarization with multi-cloud setups, i.e. AWS, Azure, GCP. Ability to read and interpret security-related logs from disparate sources. Perform real-time monitoring, vulnerability management, security incident handling, investigation, analysis, reporting and escalation of security events generated through various security solutions deployed, like SIEM, IDS/IPS, FIM etc. Administration of and contribution to the configuration and maintenance of security solutions such as XDR, Data Leak Prevention, Host Intrusion Detection Systems (HIDS), Network Intrusion Detection Systems (NIDS), and Security Information and Event Management (SIEM). Integration of devices like Linux and Windows machines, antivirus, firewalls, IDS/IPS, web servers etc. Triage, investigate, document, and report on information security events. Develop and follow detailed operational processes, procedures and playbooks to appropriately analyze, escalate and assist in the remediation of information-security-related incidents. Understanding of TCP/IP, IPSEC, Syslog and other network protocols.
Work closely with DevOps, SRE, Engineering, and Product departments to remediate security-related issues and incidents. Good to have: scripting and automation skills. CEH, AWS Cloud Practitioner, AZ-900 or a similar certification is desirable.
What You Need
Bachelor's degree in Information Technology or Computer Science Engineering preferred. Minimum of 2 to 5 years of prior experience as a Security Analyst. Prior experience with core security technologies (SIEM, firewalls, IDS/IPS, AV, DLP etc.). Understanding of NIST and CIS Benchmarks, OWASP Top 10 and SANS Top 25. Strong understanding of TCP/IP protocols, network analysis, security applications and devices, vulnerability management, and standard Internet protocols and applications. Experience in one or more security information and event monitoring (SIEM) tools. Familiar with AWS Security Hub, Azure Security Center, AWS S3, AWS Inspector, EKS, ECS, AKS, etc. Familiar with Amazon AWS/Microsoft Azure services as IaaS/PaaS and containers (Docker/Kubernetes). Able to work independently, be a team player, and work well under pressure. Able to multi-task, prioritize, and manage time effectively. Collaborates effectively and communicates efficiently. Proficient in open source tools and technologies. Works in a 24x7 environment and is willing to work all shifts. Ready to take up more responsibilities along with the existing role. Capable of understanding tools and their backend logic, and open to working with open source solutions.
We offer competitive benefits to set you up for success in and outside of work.
Here's What We Offer
Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days.
Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition.
Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury. Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. *Noida office only Creche Facility for children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. *India offices
Posted 1 month ago