
22 CDK Jobs

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

3.0 - 7.0 years

0 Lacs

kochi, kerala

On-site

The Max Maintenance team is seeking an experienced Principal Software Architect to lead the modernization and cloud transformation of a legacy .NET web application with a SQL Server backend. The role requires a deep understanding of AWS cloud services, including API Gateway, AWS Lambda, Step Functions, DynamoDB, and Neptune, to re-architect the system into a scalable, serverless, event-driven platform. The ideal candidate combines a strong architectural vision with hands-on technical proficiency and a commitment to mentoring and guiding development teams through digital transformation initiatives. Are you someone who thrives in a fast-paced, dynamic team environment? If so, we invite you to join our diverse and motivated team.

Key Responsibilities:
- Lead the comprehensive cloud transformation strategy for a legacy .NET/SQL Server web application.
- Design and deploy scalable, secure, serverless AWS-native architectures using services such as API Gateway, AWS Lambda, Step Functions, DynamoDB, and Neptune.
- Plan and execute data migrations, transitioning relational data models to NoSQL (DynamoDB) and graph-based (Neptune) storage paradigms.
- Set standards for infrastructure-as-code, CI/CD pipelines, and monitoring using AWS CloudFormation, CDK, or Terraform.
- Provide hands-on technical guidance to development teams, ensuring high code quality and adherence to cloud-native principles.
- Help teams adopt cloud technologies, service decomposition, and event-driven design patterns.
- Mentor engineers in AWS technologies, microservices architecture, and best practices in DevOps and modern software engineering.
- Develop and review code for critical services, APIs, and data access layers in appropriate languages (e.g., Python, Node.js).
- Design and implement secure, reliable APIs for both internal and external consumers.
- Conduct architecture reviews and threat modeling, and enforce rigorous testing practices, including automated unit, integration, and load testing.
- Collaborate closely with stakeholders, project managers, and cross-functional teams to define technical requirements and delivery milestones.
- Translate business objectives into technical roadmaps, prioritizing technical-debt reduction and performance improvements.
- Manage stakeholder expectations and communicate clearly on technical progress and risks.
- Stay current with AWS ecosystem updates, architectural trends, and emerging technologies.
- Evaluate and prototype new tools, services, and architectural approaches that can speed delivery and reduce operational complexity.
- Advocate for a DevOps culture emphasizing continuous delivery, observability, and security-first development.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8+ years of software development experience, with at least 3 years focused on architecting cloud-native solutions on AWS.
- Proficiency with AWS services such as API Gateway, Lambda, Step Functions, DynamoDB, Neptune, IAM, and CloudWatch.
- Experience with legacy application modernization and cloud migration.
- Strong familiarity with the .NET stack and the ability to map legacy components to cloud-native equivalents.
- Deep knowledge of distributed systems, serverless design, data modeling (relational, NoSQL, and graph), and security best practices.
- Demonstrated leadership and mentoring skills within agile software teams.
- Exceptional problem-solving, analytical, and decision-making capabilities.

The oil and gas industry's top professionals leverage over 150 years of combined experience every day to help customers achieve lasting success.
We Power the Industry that Powers the World: Our family of companies has delivered technical expertise, cutting-edge equipment, and operational assistance across every region and aspect of drilling and production, ensuring current and future success.

Global Family: We operate as a unified global family, thousands of individuals working together to make a lasting impact on ourselves, our customers, and the communities we serve.

Purposeful Innovation: Through intentional business innovation, product development, and service delivery, we are committed to enhancing the industry that powers the world.

Service Above All: Our commitment to anticipating and meeting customer needs drives us to deliver superior products and services promptly and within budget.
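For illustration, the API Gateway + Lambda pattern this posting describes can be sketched as a minimal, hypothetical proxy-integration handler; the field names (e.g., workOrderId) and response shape details beyond what API Gateway requires are invented for the example, not taken from the posting:

```python
import json

def lambda_handler(event, context):
    """Minimal sketch of an AWS Lambda handler behind API Gateway
    (proxy integration): parse the request body and return the
    statusCode/headers/body shape API Gateway expects."""
    body = json.loads(event.get("body") or "{}")
    work_order_id = body.get("workOrderId", "unknown")  # hypothetical field
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"workOrderId": work_order_id, "status": "queued"}),
    }

# Simulated invocation -- no AWS account needed to exercise the handler locally.
resp = lambda_handler({"body": json.dumps({"workOrderId": "WO-17"})}, None)
```

Locally simulating the event dict like this is a common way to unit-test handlers before wiring them to API Gateway.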

Posted 21 hours ago


3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

This role requires you to be adept at troubleshooting, debugging, and working within a cloud environment, and to be familiar with Agile and other development methodologies. Your responsibilities will include creating Lambda functions with all the necessary security measures in place using AWS Lambda. You must demonstrate proficiency in Java and Node.js by developing services and conducting unit and integration testing. A strong understanding of security best practices, such as using IAM roles, KMS, and pseudonymization, is essential. You should be able to define services on SwaggerHub and implement serverless approaches using AWS Lambda, including the Serverless Application Model (AWS SAM).

Hands-on experience with RDS, Kafka, ELB, Secrets Manager, S3, API Gateway, CloudWatch, and EventBridge is required. You should also be able to write unit tests using the Mocha framework and have experience with encryption and decryption of PII data and files, both in transit and at rest. Familiarity with the CDK (Cloud Development Kit), including creating SQS/SNS, DynamoDB, and API Gateway resources with CDK, is preferred. You will work on a serverless stack involving Lambda, API Gateway, and Step Functions, coding in Java or Node.js. Advanced networking concepts such as Transit Gateway, VPC endpoints, and multi-account connectivity are also part of the role. Strong troubleshooting and debugging skills are essential, along with excellent problem-solving abilities, attention to detail, effective communication skills, and the ability to work in a team-oriented, collaborative environment.

Virtusa is a company that values teamwork, quality of life, and professional and personal development. By joining Virtusa, you become part of a global team that focuses on your growth and provides exciting projects, opportunities, and exposure to state-of-the-art technologies throughout your career. Collaboration and fostering excellence are at the core of Virtusa's values, offering a dynamic environment for great minds to thrive and innovate.
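The pseudonymization requirement in this posting can be illustrated with a minimal sketch: a deterministic HMAC-SHA256 pseudonym for a PII field. The key value here is a placeholder; in practice it would be fetched from AWS Secrets Manager or wrapped by KMS, as the posting implies:

```python
import hashlib
import hmac

# Placeholder only -- in production this key would come from Secrets Manager/KMS.
SECRET_KEY = b"replace-with-a-key-from-secrets-manager"

def pseudonymize(pii_value: str) -> str:
    """Deterministic pseudonym via HMAC-SHA256: the same input always maps
    to the same token (so joins still work), but the original value cannot
    be recovered without the key."""
    return hmac.new(SECRET_KEY, pii_value.encode(), hashlib.sha256).hexdigest()

token = pseudonymize("jane.doe@example.com")
```

Unlike plain hashing, keying the digest prevents dictionary attacks on low-entropy PII such as email addresses, which is why HMAC is the usual choice for this pattern.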

Posted 2 days ago


2.0 - 6.0 years

0 Lacs

karnataka

On-site

You have a fantastic opportunity to join our team as a Software Development Engineer with a focus on implementing Identity and Access Management systems at scale. You will be utilizing your expertise in software development and modern enterprise architectures to contribute to exciting projects in industries such as high-tech, communication, media, healthcare, retail, and telecom. Your role will involve working with global brands and leaders to build innovative products and platforms for the modern world.

As a Software Development Engineer, you will need a minimum of 5 years of experience in software development with .NET, focusing on platform and API development. Additionally, you should possess at least 2 years of experience in software development using JavaScript, TypeScript, React, or Python. Experience with Identity and Access Management systems, CI/CD practices, cloud architectures, and containerization methodologies will be crucial for success in this role. You will have the opportunity to showcase your technical leadership skills by providing guidance to engineering teams as a team lead, technical architect, or subject matter expert. Strong written and oral communication skills, along with excellent analytical and problem-solving abilities, will be essential for effective collaboration within the team.

At GlobalLogic, we prioritize work-life balance and offer a collaborative environment where you can expand your skills by working with a diverse and talented team. We provide professional development opportunities, competitive salaries, excellent benefits including medical insurance and pension schemes, and fun perks such as sports events, cultural activities, and corporate parties. If you are passionate about digital engineering and want to be part of a dynamic team that accelerates innovation in the digital world, GlobalLogic is the place for you. Join us in designing and building cutting-edge products and experiences that shape the future of technology across various industries.

About GlobalLogic: GlobalLogic is a leader in digital engineering, helping brands worldwide create innovative products and platforms. With a focus on experience design, complex engineering, and data expertise, we enable our clients to envision new possibilities and drive their digital transformation. Headquartered in Silicon Valley, GlobalLogic operates globally, serving customers in industries such as automotive, communications, financial services, healthcare, manufacturing, media, semiconductor, and technology. Join GlobalLogic, a Hitachi Group Company, and be part of a team that drives innovation through data and technology, contributing to a sustainable society with a higher quality of life.

Posted 4 days ago


6.0 - 10.0 years

20 - 35 Lacs

Pune, Delhi / NCR

Hybrid

Responsibilities:
- Data Architecture: Develop and maintain the overall data architecture, ensuring scalability, performance, and data quality.
- AWS Data Services: Expertise with AWS data services such as AWS Glue, S3, SNS, SES, DynamoDB, Redshift, CloudFormation, CloudWatch, IAM, DMS, and EventBridge Scheduler.
- Data Warehousing: Design and implement data warehouses on AWS, leveraging AWS Redshift or other suitable options.
- Data Lakes: Build and manage data lakes on AWS using S3 and other relevant services.
- Data Pipelines: Design and develop efficient data pipelines to extract, transform, and load data from various sources.
- Data Quality: Implement data quality frameworks and best practices to ensure data accuracy, completeness, and consistency.
- Cloud Optimization: Optimize data engineering solutions for performance, cost-efficiency, and scalability on AWS.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6-7 years of experience in data engineering roles, with a focus on AWS cloud platforms.
- Strong understanding of data warehousing and data lake concepts.
- Proficiency in SQL and at least one programming language (Python/PySpark).
- Good to have: experience with big data technologies such as Hadoop, Spark, and Kafka.
- Knowledge of data modeling and data quality best practices.
- Excellent problem-solving, analytical, and communication skills; able to work independently and as part of a team.

Preferred Qualifications:
- AWS data developers with 6-10 years of experience; certified candidates (AWS Certified Data Engineer - Associate or AWS Certified Solutions Architect) are preferred.
- Skills required: SQL, AWS Glue, PySpark, Airflow, CDK, Redshift.
- Good communication skills; able to deliver independently.
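The extract-transform-load pipelines this posting describes can be sketched in miniature with plain Python; in production the same stages would run as AWS Glue/PySpark jobs, and the record shape and the completeness rule here are invented for illustration:

```python
def extract(rows):
    """Extract: yield raw records (stand-in for reading from S3 / the Glue catalog)."""
    yield from rows

def transform(records):
    """Transform: drop incomplete rows and normalize numeric types --
    a toy data-quality check standing in for a Glue/PySpark transform."""
    for r in records:
        if r.get("amount") is None:
            continue  # completeness check: skip rows missing the amount
        yield {"id": r["id"], "amount": round(float(r["amount"]), 2)}

def load(records):
    """Load: collect into a list (stand-in for a Redshift or S3 sink)."""
    return list(records)

raw = [{"id": 1, "amount": "10.25"}, {"id": 2, "amount": None}]
loaded = load(transform(extract(raw)))
```

Keeping each stage a generator mirrors how distributed engines stream records through a pipeline instead of materializing every intermediate result.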

Posted 5 days ago


3.0 - 7.0 years

0 Lacs

punjab

On-site

You have an exciting opportunity to join as a DevSecOps engineer in Sydney. The role requires 3+ years of extensive Python proficiency and 3+ years of Java experience, along with broad exposure to technologies such as JavaScript, Jenkins, CodePipeline, CodeBuild, and the AWS ecosystem, including the AWS Well-Architected Framework, Trusted Advisor, GuardDuty, SCPs, SSM, IAM, and WAF. A deep understanding of automation, quality engineering, architectural methodologies and principles, and solution design is essential. Hands-on experience with infrastructure-as-code tools such as CloudFormation and CDK for automating deployments in AWS is preferred. Familiarity with operational observability, including log aggregation, application performance monitoring, deploying auto-scaling and load-balanced/highly available applications, and managing certificates (client-server, mutual TLS, etc.), is crucial for this role.

Your responsibilities will include improving the automation of security controls, working closely with the consumer showback team to define processes and system requirements, and designing and implementing updates to the showback platform. You will collaborate with STO/account owners to uplift the security posture of consumer accounts, work with the onboarding team to ensure security standards and policies are correctly set up, and implement enterprise minimum security requirements from the Cloud Security LRP, including data masking, encryption monitoring, perimeter protections, ingress/egress uplift, and integration of SailPoint for SSO management.

If you have any questions or need further clarification, feel free to ask.

Posted 1 week ago


7.0 - 12.0 years

15 - 30 Lacs

Navi Mumbai, Pune, Bengaluru

Work from Office

Roles and Responsibilities:
- Design, develop, test, and deploy scalable cloud-based solutions on AWS using CDK8s.
- Collaborate with cross-functional teams to identify business requirements and design technical solutions that meet those needs.
- Implement automated testing frameworks using TypeScript to ensure high-quality code delivery.
- Participate in IaC (Infrastructure as Code) initiatives to manage infrastructure configuration using Terraform or CloudFormation.
- Troubleshoot issues related to application performance, scalability, and reliability.

Posted 1 week ago


3.0 - 6.0 years

12 - 16 Lacs

Thiruvananthapuram

Work from Office

AWS Cloud Services (Glue, Lambda, Athena, Lakehouse)
AWS CDK for Infrastructure-as-Code (IaC) with TypeScript
Data pipeline development & orchestration using AWS Glue
Strong programming skills in Python, PySpark, Spark SQL, TypeScript

Required Candidate Profile: 3 to 5 years of experience, including client-facing and team leadership experience. Candidates will work with UK clients; work timings will be aligned with the client's requirements and may follow UK time zones.

Posted 2 weeks ago


3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

You will be responsible for leading the modernization of our flagship data modeling desktop product into a scalable, cloud-native SaaS platform. This role requires a combination of deep technical expertise, architecture leadership, and domain knowledge in data modeling, databases, and platform scalability. Your primary responsibilities will include architecting and driving the cloud transformation of a legacy desktop-based data modeling product into a multi-tenant, cloud-native application. You will define the modern architecture, including microservices, APIs, cloud storage strategies, and user/data management. Additionally, you will lead the design and integration of collaborative, browser-based modeling features, version control, and real-time updates. As a Principal Engineer, you will create migration strategies for data-intensive features, schema management, model validation, and history tracking. Collaboration with Product, UX, DevOps, and QA teams is crucial to ensure platform scalability, resilience, and extensibility. Providing technical leadership across teams, mentoring engineers, and establishing best practices for data platform engineering will be part of your role. You will champion CI/CD, DevSecOps, and infrastructure-as-code (IaC) in a cloud environment (AWS/Azure/GCP) while ensuring compliance with enterprise security, governance, and privacy regulations. To qualify for this role, you should have at least 10 years of software engineering experience, with a minimum of 3 years leading cloud transformation or re-platforming initiatives. Deep expertise in modernizing data-centric desktop applications (.NET, Java, C++ or similar) to web/cloud platforms is required. Additionally, you should possess a strong understanding of data modeling concepts, ER modeling, schema design tools, and versioning mechanisms. 
Proficiency in cloud-native architecture, including microservices, serverless, containerization (Docker, Kubernetes), and cloud databases, is essential. Experience with real-time collaboration, diagramming tools, or similar interactive data tools will be beneficial. Strong skills in API-first design (REST/GraphQL), scalable backend frameworks, asynchronous processing, cloud platform services (Azure preferred), IaC tools (Terraform/CDK), CI/CD pipelines, and monitoring systems are also required. In this role, you will demonstrate exceptional problem-solving skills and the ability to lead by example in hands-on coding and system design.

Join us at Quest, where we fulfill promises by managing, modernizing, and securing business software. We offer a collaborative environment with dedicated professionals passionate about technology. Your health and wellness are our priority, and we invest in programs to help you pursue a fulfilling career. Visit Quest Careers | Where next meets now to learn more and join us in conquering challenges together.

Posted 2 weeks ago


8.0 - 12.0 years

30 - 40 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future.

Job Description: An AWS DevOps Architect designs and manages the DevOps environment for an organization, ensuring that software development and IT operations are integrated seamlessly.

- DevOps strategy: Develop and implement the DevOps strategy and roadmap.
- Automation: Automate the provisioning, configuration, and management of infrastructure components.
- Cloud architecture: Design and manage the cloud and infrastructure architecture.
- Security: Implement security measures and compliance controls.
- Collaboration: Foster collaboration between development, operations, and other cross-functional teams.
- Continuous improvement: Regularly review and analyze DevOps processes and practices.
- Reporting: Provide regular reports on infrastructure performance, costs, and security to management.

Skills and experience:
- Experience with AWS services like ECS, EKS, and Kubernetes
- Knowledge of scripting languages like Python
- Experience with DevOps tools and technologies like Jenkins, Terraform, and Ansible
- Experience with CI/CD pipelines
- Experience with cloud governance standards and best practices

Posted 2 weeks ago


8.0 - 12.0 years

0 Lacs

indore, madhya pradesh

On-site

You should possess expert-level proficiency in Python and Python frameworks, or in Java. You must also have hands-on experience with AWS development, PySpark, Lambda, CloudWatch (alerts), SNS, SQS, CloudFormation, Docker, ECS, Fargate, and ECR. Your experience should cover key AWS services across Compute (PySpark, Lambda, ECS), Storage (S3), Databases (DynamoDB, Snowflake), Networking (VPC, Route 53, CloudFront, API Gateway), DevOps/CI-CD (CloudFormation, CDK), Security (IAM, KMS, Secrets Manager), and Monitoring (CloudWatch, X-Ray, CloudTrail).

You should be proficient with NoSQL and relational databases such as Cassandra and PostgreSQL, and have strong hands-on knowledge of using Python for integrations between systems through different data formats. Your expertise should extend to deploying and maintaining applications in AWS, with hands-on experience in Kinesis streams and auto-scaling. Designing and implementing distributed systems and microservices, and applying best practices for scalability, high availability, and fault tolerance, are also key aspects of this role.

You should have strong problem-solving and debugging skills, with the ability to lead technical discussions and mentor junior engineers. Excellent written and verbal communication skills are essential. You should be comfortable working in agile teams with modern development practices, collaborating with business and other teams to understand requirements and deliver on project commitments. Participation in requirements gathering, designing solutions based on available frameworks and code, and experience with data engineering tools or ML platforms (e.g., Pandas, Airflow, SageMaker) are expected. An AWS certification (AWS Certified Solutions Architect or Developer) would be advantageous. This position is based in multiple locations in India, including Indore, Mumbai, Noida, Bangalore, and Chennai.

To qualify, you should hold a Bachelor's degree or a foreign equivalent from an accredited institution; alternatively, three years of progressive experience in the specialty may be considered in lieu of each year of education. A minimum of 8+ years of Information Technology experience is required for this role.
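The "integrations between systems through different data formats" requirement can be illustrated with a small stdlib sketch that bridges CSV output from one system to the JSON payload another system expects; the column names are invented for the example:

```python
import csv
import io
import json

def csv_to_json(csv_text: str) -> str:
    """Convert CSV rows (as exported by one system) into a JSON array
    of objects (as consumed by another) -- the kind of format bridging
    the posting describes."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows)

payload = csv_to_json("id,city\n1,Indore\n2,Mumbai\n")
```

`csv.DictReader` keys each row by the header line, so the mapping to JSON objects falls out directly; note that all values stay strings unless explicitly converted.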

Posted 2 weeks ago


3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

We are seeking a hands-on backend expert to take our FastAPI-based platform to the next level by developing production-grade model-inference services, agentic AI workflows, and seamless integrations with third-party LLMs and NLP tooling. The role covers several key areas:

1. Core Backend Enhancements:
- Building APIs
- Strengthening security with OAuth2/JWT, rate limiting, and Secrets Manager, and enhancing observability through structured logging and tracing
- Adding CI/CD, test automation, health checks, and SLO dashboards

2. UI Interfaces:
- Developing UI interfaces using React.js/Next.js, Redux/Context, and CSS frameworks such as Tailwind, MUI, custom CSS, and shadcn

3. LLM & Agentic Services:
- Designing micro/mini-services to host and route to platforms such as OpenAI, Anthropic, and local HF models, plus embeddings and RAG pipelines
- Implementing autonomous/recursive agents that orchestrate multi-step chains with tools, memory, and planning

4. Model-Inference Infrastructure:
- Setting up GPU/CPU inference servers behind an API gateway
- Optimizing throughput with techniques like batching, streaming, quantization, and caching, using tools such as Redis and pgvector

5. NLP & Data Services:
- Managing the NLP stack with Transformers for classification, extraction, and embedding generation
- Building data pipelines that combine aggregated business metrics with model telemetry for analytics

The tech stack includes Python, FastAPI, Starlette, Pydantic, async SQLAlchemy, Postgres, Docker, Kubernetes, AWS/GCP, Redis, RabbitMQ, Celery, Prometheus, Grafana, OpenTelemetry, and more. Experience building production Python REST APIs, SQL schema design in Postgres, async patterns and concurrency, UI application development, RAG and LLM/embedding workflows, cloud container orchestration, and CI/CD pipelines is essential for this role. Experience with streaming protocols, NGINX Ingress, SaaS security hardening, data privacy, event-sourced data models, and related technologies would be advantageous.

This role offers the opportunity to work on evolving products, tackle real challenges, and lead the scaling of AI services while working closely with the founder to shape the future of the platform. If you are looking for meaningful ownership and the chance to solve forward-looking problems, this role could be the right fit for you.
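The async patterns and concurrency this posting asks for can be sketched with plain asyncio: fan out several awaitable calls and gather the results in order, the pattern an async FastAPI endpoint relies on under the hood. The function and model names here are illustrative, not from the posting:

```python
import asyncio

async def fetch_model_score(model: str, delay: float) -> tuple:
    """Stand-in for an async call to a model-inference service;
    the fixed 0.9 score is a placeholder for a real response."""
    await asyncio.sleep(delay)  # simulates network/inference latency
    return (model, 0.9)

async def score_all():
    """Fan out inference calls concurrently; gather() preserves the
    order of its arguments regardless of completion order."""
    return await asyncio.gather(
        fetch_model_score("model-a", 0.01),
        fetch_model_score("model-b", 0.01),
    )

results = asyncio.run(score_all())
```

Because the two calls run concurrently, total latency is roughly the slowest single call rather than the sum, which is the point of the pattern for I/O-bound inference routing.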

Posted 2 weeks ago


3.0 - 5.0 years

8 - 12 Lacs

Nashik

Work from Office

Experience as a DevOps Engineer with AWS CDK. Experience with Node.js. Utilize AWS CodeBuild and AWS CodeDeploy for building and deploying applications in a scalable and reliable manner.

Required Candidate Profile: Proficient in AWS services, specifically AWS CodeBuild, AWS CodeDeploy, and AWS CDK.

Posted 1 month ago


3.0 - 5.0 years

8 - 12 Lacs

Nagpur

Work from Office

Experience as a DevOps Engineer with AWS CDK. Experience with Node.js. Utilize AWS CodeBuild and AWS CodeDeploy for building and deploying applications in a scalable and reliable manner.

Required Candidate Profile: Proficient in AWS services, specifically AWS CodeBuild, AWS CodeDeploy, and AWS CDK.

Posted 1 month ago


3.0 - 5.0 years

8 - 12 Lacs

Pune

Work from Office

Experience as a DevOps Engineer with AWS CDK. Experience with Node.js. Utilize AWS CodeBuild and AWS CodeDeploy for building and deploying applications in a scalable and reliable manner.

Required Candidate Profile: Proficient in AWS services, specifically AWS CodeBuild, AWS CodeDeploy, and AWS CDK.

Posted 1 month ago


7.0 - 11.0 years

25 - 40 Lacs

Bengaluru

Work from Office

Role: Tech Lead/Developer
Work Environment: Hybrid (4 days office + 1 day remote)
Engagement: Full-Time Permanent
Location: Madiwala (Bangalore)

Required: Senior Node.js developer with hands-on experience building serverless applications using AWS Lambda, API Gateway, RDS, SQS, SNS, and Timestream.
Database: Proficiency with the Sequelize ORM for database modelling, queries, and migrations in production environments.
Infrastructure: Must have experience with either SST (Serverless Stack Framework) or Pulumi for AWS infrastructure-as-code deployment and management.
Architecture (good to have): Understanding of event-driven serverless architecture, message queues, pub/sub patterns, and time-series data processing.

Posted 1 month ago


8.0 - 10.0 years

5 - 10 Lacs

Kolkata

Work from Office

Job Title: Job Controller - Automobile Service
Location: Ruby, Anandapur, Kolkata
Experience Required: 5-8 years in automotive service operations
Preferred Candidate: Male

About the Company: We are a leading luxury automobile dealership offering world-class vehicles and unmatched customer experiences. Our service department is the backbone of our promise to deliver premium after-sales support.

Role Summary: The Job Controller acts as the key coordinator between service advisors, workshop technicians, and customers. The role ensures efficient job allocation, repair quality, timely vehicle delivery, and overall workshop productivity.

Key Responsibilities:
- Allocate daily repair and service jobs to technicians based on skill set and workload.
- Monitor job progress and ensure timely completion of work as per service timelines.
- Review job cards for accuracy and completeness; ensure correct labor codes and parts usage.
- Coordinate with service advisors to prioritize urgent or VIP customer vehicles.
- Conduct quality checks on completed jobs before delivery to ensure service standards.
- Ensure workshop capacity is fully utilized without overloading technicians.
- Track individual technician performance and provide feedback to the service manager.
- Ensure compliance with safety and brand technical guidelines in the workshop.
- Collaborate with the spare parts team to ensure timely parts availability.
- Report daily productivity, efficiency, and WIP (work in progress) updates to the service manager.

Desired Candidate Profile:
- 5-8 years of experience in a car dealership service department, preferably with luxury or premium brands.
- Strong knowledge of mechanical/electrical vehicle systems and diagnostics.
- Proven experience in job card handling, technician supervision, and service planning.
- Good communication and coordination skills.
- Exposure to DMS (Dealer Management Systems) such as Autoline, CDK, or similar.

Educational Qualification: Diploma or Degree in Automobile Engineering / Mechanical Engineering; certification from OEM training will be an added advantage.

Posted 1 month ago


5.0 - 7.0 years

14 - 16 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Job Title: Data/ML Platform Engineer
Location: Gurgaon, Pune, Bangalore, Chennai, Bhopal, Jaipur, Hyderabad (work from office)
Notice Period: Immediate

iSource Services is hiring for one of their clients for the position of Data/ML Platform Engineer. As a Data Engineer, you will be relied on to independently develop and deliver high-quality features for our new ML Platform, refactor and translate our data products, and finish various tasks to a high standard. You'll be part of the Data Foundation Team, which focuses on creating and maintaining the Data Platform for Marktplaats.

- 5 years of hands-on experience using Python, Spark, and SQL.
- Experienced in AWS cloud usage and management.
- Experience with Databricks (Lakehouse, ML, Unity Catalog, MLflow).
- Experience using various ML models and frameworks such as XGBoost, LightGBM, and Torch.
- Experience with orchestrators such as Airflow and Kubeflow.
- Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Fundamental understanding of Parquet, Delta Lake, and other data file formats.
- Proficiency with an IaC tool such as Terraform, CDK, or CloudFormation.
- Strong written and verbal English communication skills; proficient in communicating with non-technical stakeholders.

Posted 1 month ago


5.0 - 8.0 years

20 - 25 Lacs

Pune

Hybrid

Software Engineer - Baner, Pune, Maharashtra
Department: Software & Automation
Employee Type: Permanent
Experience Range: 5 - 8 Years
Qualification: Bachelor's or master's degree in computer science, IT, or a related field.

Roles & Responsibilities:
- Architect and build scalable data pipelines using AWS and Databricks.
- Integrate data from sensors (cameras, lidars, radars).
- Deliver proof-of-concepts and support system improvements.
- Ensure data quality and scalable design in solutions.
- Strong Python, Databricks (SQL, PySpark, Workflows), and AWS skills.
- Solid leadership and mentoring ability.
- Agile development experience.

Good to Have:
- AWS/Databricks certifications.
- Experience with Infrastructure as Code (Terraform/CDK).
- Exposure to machine learning data workflows.

Software Skills: Python; Databricks (SQL, PySpark, Workflows); AWS (S3, EC2, Glue); Terraform/CDK (good to have)

Posted 1 month ago


7.0 - 12.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Summary: As a Senior Product Security Engineer, you will join our team of talented professionals dedicated to embedding continuous and seamless security into our engineering processes. You will contribute to the development and implementation of our Secure Software Development Lifecycle (S-SDLC), working across multiple technical teams to enhance our security posture.

About the role:
- Promote secure-by-design architectures and implementations across all phases of our S-SDLC.
- Define product security standards, best practices, and processes with built-in governance and metrics.
- Develop new security capabilities, patterns, and automation to integrate security throughout our development practices.
- Lead threat modeling sessions and secure code reviews (including of AI-based systems and products).
- Collaborate with cross-functional teams, including software engineering, platform engineering, QA, and operations.
- Accelerate security remediation through data analysis and support for product engineering teams.
This central role will give you maximum impact in ensuring our products and applications meet the highest security standards to protect our customers.

About you:
- Bachelor's degree in computer science or equivalent education/experience.
- 7+ years of hands-on experience in software engineering or application security.
- Experience conducting security-focused threat modeling and code reviews across multiple technology stacks and programming languages.
- Experience with security tools (SAST, SCA, DAST; fuzzers a plus) and analyzing their findings.
- Proven analytical skills, with the ability to develop innovative solutions to complex security challenges.
- A mindset that is both defensive and offensive.
- Strong understanding of security principles (cryptography, authentication, authorization, etc.) and common vulnerabilities applicable to applications (web, desktop, or mobile), APIs, and cloud environments.
- Ability to identify, analyze, and mitigate common security vulnerabilities at both the design and implementation levels.
- Knowledge of software engineering principles, with experience designing and implementing secure systems aligned with secure-by-design and secure-by-default principles.
- Proficiency in writing code, tests, deployment logic, and API integrations. Any language welcomed; Python, GoLang, and Java preferred.
- Excellent written and verbal communication skills, with the ability to articulate complex security concepts to diverse, cross-functional audiences.

Preferred qualifications:
- Experience with a major cloud provider (AWS, Azure, Oracle Cloud, or GCP).
- Experience with Infrastructure as Code (e.g., CDK, Terraform).
- Experience securing or developing systems using Large Language Models, RAG, and AI agents.
- Experience with common authentication and authorization standards (SAML and OAuth).
- Experience with containerized applications and container orchestration (e.g., Kubernetes, ECS).
- Knowledge of industry security frameworks and maturity models such as the OWASP Application Security Verification Standard, CIS Benchmarks, NIST Cybersecurity Framework, OWASP SAMM, or BSIMM.
- Relevant security certifications (e.g., OSCP, OSWE).
- Experience contributing to open-source security projects.
- Experience in security research, presenting at conferences, or publishing articles.
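Secure code review work of the kind described above often comes down to recognizing vulnerability patterns at the implementation level. As an illustration (not part of the posting), here is a minimal sketch of the classic SQL injection pattern and its parameterized fix, using a hypothetical in-memory `users` table:

```python
import sqlite3

# Hypothetical users table, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_unsafe(name: str):
    # VULNERABLE: attacker-controlled input is spliced into the SQL string.
    query = f"SELECT name, role FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_user_safe(name: str):
    # FIX: parameterized query; the driver handles escaping.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # leaks every row
print(find_user_safe(payload))    # matches nothing
```

The design-level counterpart of this review finding is to route all data access through a layer that only exposes parameterized or ORM-built queries.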

Posted 1 month ago

5.0 - 10.0 years

20 - 35 Lacs

Noida, Pune, Gurugram

Work from Office

We are hiring an AWS Developer with Banking or Financial domain experience.

AWS Developer
Location: Noida, Gurugram, and Pune
Shift timing: 1:00 PM - 10:00 PM

Job description, must-have skills:
- Domain: Financial or Banking.
- Expertise in AWS CDK or Terraform, AWS services (Lambda, ECS, S3), and PostgreSQL database management.
- Strong understanding of serverless architecture and event-driven design (SNS, SQS).
Nice to have:
- Knowledge of multi-account AWS setups and security best practices (IAM, VPC, etc.).
- Experience with cost optimization strategies in AWS.

Interested candidates, please share your updated resume at anu.c@irissoftware.com with the details below:
- Current company
- Current CTC
- Expected CTC
- Relevant experience in AWS
- Any experience in CDK or Terraform
- Notice period (if serving, please share your LWD)
- Current location
- Which of the locations (Noida, Gurugram, Pune) you are open to
- Whether you are open to the 1:00 PM - 10:00 PM shift

Regards, Anu
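For readers unfamiliar with the event-driven design (SNS, SQS) this posting asks for: the core idea is a topic fanning each published message out to independent consumer queues. Below is a toy, local stand-in for that pattern using plain Python queues (not the AWS APIs; the topic and queue names are illustrative):

```python
from queue import Queue

class Topic:
    """Local stand-in for an SNS topic fanning out to SQS queues."""

    def __init__(self):
        self.subscribers: list[Queue] = []

    def subscribe(self) -> Queue:
        # Each subscriber gets its own queue, so consumers are decoupled.
        q: Queue = Queue()
        self.subscribers.append(q)
        return q

    def publish(self, message: dict) -> None:
        # Fan-out: every subscribed queue receives its own copy.
        for q in self.subscribers:
            q.put(message)

orders = Topic()
billing_q = orders.subscribe()    # hypothetical downstream consumers
shipping_q = orders.subscribe()

orders.publish({"order_id": 42, "total": 99.5})
print(billing_q.get())   # each consumer processes its own copy
print(shipping_q.get())
```

In the real AWS setup, the same shape appears as one SNS topic with multiple SQS queue subscriptions, each typically draining into a Lambda consumer.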

Posted 1 month ago

5.0 - 10.0 years

15 - 18 Lacs

Pune

Work from Office

Design and manage CI/CD pipelines with Jenkins for automated builds, tests, and deployments. Optimize AWS infrastructure, automate workflows with scripting, ensure system reliability, and collaborate across teams for seamless software delivery.

Required candidate profile: 5-8 years in DevOps with strong AWS (EC2, S3, Lambda), Jenkins, Git, scripting (Shell/PowerShell), and programming (Python/Java). Skilled in CDK, monitoring tools, and troubleshooting in fast-paced setups.

Perks and benefits: Best in the industry.
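The Jenkins pipelines described above ultimately reduce to ordered stages that fail fast: a failed stage stops the pipeline and skips everything after it. A hypothetical sketch of that control flow in plain Python (illustrative stage names, not a Jenkins API):

```python
# Minimal fail-fast stage runner mirroring a build -> test -> deploy pipeline.
def run_pipeline(stages):
    results = []
    for name, step in stages:
        try:
            step()
            results.append((name, "SUCCESS"))
        except Exception as exc:
            results.append((name, f"FAILURE: {exc}"))
            break  # fail fast: later stages are skipped
    return results

def failing_deploy():
    raise RuntimeError("no credentials")  # simulated stage failure

stages = [
    ("build", lambda: None),   # pretend the build passes
    ("test", lambda: None),    # pretend the tests pass
    ("deploy", failing_deploy),
]
print(run_pipeline(stages))
```

A declarative Jenkinsfile expresses the same ordering with `stage` blocks; the value of the sketch is only to show why a broken test stage means deploy never runs.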

Posted 2 months ago

5 - 8 years

8 - 14 Lacs

Bengaluru

Work from Office

Responsibilities:
- Design, architecture, and development of our flagship product.
- Lead a team of 4-6 people and be responsible for getting work done from the team.
- Liaise with the product manager and technical architect to explore and suggest high-quality technical solutions that achieve the required product features, monitoring technical progress against plans while safeguarding functionality, scalability, and performance.
- Actively participate in code reviews to build robust applications and prototypes.
- Ensure high scalability and performance of the platform.
- Build proof-of-concept and early prototype systems and scale them to production.
- Advocate for good, clean, well-documented, and performant code; define and follow standards and best practices for front-end development.

We want you if you have:
- Over 5 years of proven work experience in a product-based company delivering mission-critical projects.
- The ability to design, code, test, debug, and document software according to functional requirements.
- The ability to solve complex performance problems and architectural challenges.
- The ability to provide cost-optimized solutions and approaches while addressing scalability and performance concerns.
- Expert knowledge of databases, language runtimes, multithreading, caches, and different types of architectures.
- Knowledge of setting up the right monitoring and alerts for your services.
- Extensive experience with AWS, particularly CDK, Lambda, ECS, etc.
- Extensive experience with message queue systems, including Celery, RabbitMQ, and Kafka.
- A strong sense of ownership and quality for your deliverables.

Good to have:
- Working experience with Python and its frameworks, or with at least 2 different languages.
- Experience in building analytics or reporting engines.
- Experience in full-stack development.

Even if you don't meet all the skills listed above, we'd still love to hear from you. Tell us in your cover letter about your unique qualifications and why we should consider your resume.
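The message-queue experience listed above (Celery, RabbitMQ, Kafka) centers on one pattern: producers enqueue tasks and decoupled workers consume them concurrently. A minimal local sketch of that pattern with the standard library (a toy stand-in, not those libraries' APIs):

```python
import threading
from queue import Queue

task_q: Queue = Queue()
results = []
lock = threading.Lock()

def worker():
    # Pull tasks until the None sentinel arrives, like a queue consumer.
    while True:
        task = task_q.get()
        if task is None:
            break
        with lock:
            results.append(task * task)  # toy "work": square the payload

# Two workers drain the same queue, as broker consumers would.
threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for n in range(5):
    task_q.put(n)          # producer side
for _ in threads:
    task_q.put(None)       # one sentinel per worker to shut down cleanly
for t in threads:
    t.join()

print(sorted(results))  # completion order varies per worker; values do not
```

Celery adds durability, retries, and routing on top of this shape, with RabbitMQ or Kafka playing the role of the shared queue.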

Posted 2 months ago
