8.0 - 13.0 years
10 - 20 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Work from Office
Role: AWS / Azure SME
Experience: 8+ years
Location: PAN India (preferably Mumbai), Work From Office

AWS SME:
- AWS infrastructure management.
- Understanding and troubleshooting AWS resources with the help of DevOps practices.
- Must have in-depth knowledge of, and hands-on experience with, AWS services such as EC2, S3, EBS, EKS, ECR, RDS, and IAM.

Azure SME:
- Infrastructure Engineering: design, implement, and manage IT infrastructure to support business applications and services.
- Azure Cloud Operations Support: provide operational support for Azure cloud environments; monitor system performance, troubleshoot issues, and optimize cloud infrastructure for reliability and efficiency.
- Storage Support: provide operational support for Azure storage solutions; monitor storage performance, troubleshoot issues, and optimize storage infrastructure.
- Develop and maintain documentation, including architecture diagrams, standard operating procedures, and incident response plans.
- Collaborate with internal teams and third-party vendors to ensure seamless operations and robust infrastructure solutions.
- Ensure compliance with security policies and best practices across all areas of responsibility.
- Stay updated with the latest Azure features, tools, and best practices.
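The EC2 troubleshooting work described above often starts with scanning `describe_instances` output for instances in an unexpected state. Below is a minimal, hedged sketch over boto3-shaped data; the instance IDs are invented and the input is mocked, since real calls would need AWS credentials.

```python
def find_non_running_instances(reservations):
    """Return (instance_id, state) pairs for instances not in the 'running' state.

    `reservations` mirrors the shape of boto3's
    ec2.describe_instances()["Reservations"] response.
    """
    problems = []
    for reservation in reservations:
        for inst in reservation.get("Instances", []):
            state = inst.get("State", {}).get("Name", "unknown")
            if state != "running":
                problems.append((inst["InstanceId"], state))
    return problems

# Mocked describe_instances output (hypothetical IDs).
sample = [
    {"Instances": [
        {"InstanceId": "i-0abc", "State": {"Name": "running"}},
        {"InstanceId": "i-0def", "State": {"Name": "stopped"}},
    ]},
]
print(find_non_running_instances(sample))  # [('i-0def', 'stopped')]
```

In practice the same filter would be fed from a paginated boto3 call rather than a literal list.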
Posted 1 month ago
7.0 - 12.0 years
15 - 25 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Hybrid
Job Summary: We are seeking a highly motivated and experienced Senior Data Engineer to join our team. This role requires a deep curiosity about our business and a passion for technology and innovation. You will be responsible for designing and developing robust, scalable data engineering solutions that drive our business intelligence and data-driven decision-making processes. If you thrive in a dynamic environment and have a strong desire to deliver top-notch data solutions, we want to hear from you.

Key Responsibilities:
- Collaborate with agile teams to design and develop cutting-edge data engineering solutions.
- Build and maintain distributed, low-latency, and reliable data pipelines, ensuring high availability and timely delivery of data.
- Design and implement optimized data engineering solutions for Big Data workloads to handle increasing data volumes and complexities.
- Develop high-performance, real-time data ingestion solutions for streaming workloads.
- Adhere to best practices and established design patterns across all data engineering initiatives.
- Ensure code quality through elegant design, efficient coding, and performance optimization.
- Focus on data quality and consistency by implementing monitoring processes and systems.
- Produce detailed design and test documentation, including Data Flow Diagrams, Technical Design Specs, and Source-to-Target Mapping documents.
- Perform data analysis to troubleshoot and resolve data-related issues.
- Automate data engineering pipelines and data validation processes to eliminate manual intervention.
- Implement data security and privacy measures, including access controls, key management, and encryption techniques.
- Stay updated on technology trends, experiment with new tools, and educate team members.
- Collaborate with analytics and business teams to improve data models and enhance data accessibility.
- Communicate effectively with both technical and non-technical stakeholders.
Qualifications:
- Education: Bachelor's degree in Computer Science, Computer Engineering, or a related field.
- Experience: Minimum of 5 years in architecting, designing, and building data engineering solutions and data platforms.
- Proven experience in building Lakehouse or Data Warehouse solutions on platforms like Databricks or Snowflake.
- Expertise in designing and building highly optimized batch/streaming data pipelines using Databricks.
- Proficiency with data acquisition and transformation tools such as Fivetran and dbt.
- Strong experience in building efficient data engineering pipelines using Python and PySpark.
- Experience with distributed data processing frameworks such as Apache Hadoop, Apache Spark, or Flink.
- Familiarity with real-time data stream processing using tools like Apache Kafka, Kinesis, or Spark Structured Streaming.
- Experience with various AWS services, including S3, EC2, EMR, Lambda, RDS, DynamoDB, Redshift, and Glue Catalog.
- Expertise in advanced SQL programming and performance tuning.

Key Skills:
- Strong problem-solving abilities and perseverance in the face of ambiguity.
- Excellent emotional intelligence and interpersonal skills.
- Ability to build and maintain productive relationships with internal and external stakeholders.
- A self-starter mentality with a focus on growth and quick learning.
- Passion for operational products and creating outstanding employee experiences.
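The "data quality and consistency" responsibility in this listing can be sketched as a validation gate run inside a pipeline before loading. This is a pure-Python illustration with invented column names, not a Databricks- or PySpark-specific implementation.

```python
def validate_batch(rows, required_fields, key_field):
    """Split a batch of row dicts into valid rows and rejects.

    A row is rejected if any required field is missing/None, or if its
    key duplicates an earlier row -- mimicking the null-check and dedup
    rules a data quality gate would enforce before a load step.
    """
    seen_keys = set()
    valid, rejects = [], []
    for row in rows:
        if any(row.get(f) is None for f in required_fields):
            rejects.append((row, "missing required field"))
        elif row[key_field] in seen_keys:
            rejects.append((row, "duplicate key"))
        else:
            seen_keys.add(row[key_field])
            valid.append(row)
    return valid, rejects

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 12.0},   # duplicate id -> rejected
    {"id": 2, "amount": None},   # null amount -> rejected
]
valid, rejects = validate_batch(batch, ["id", "amount"], "id")
print(len(valid), len(rejects))  # 1 2
```

A real pipeline would typically route the rejects to a quarantine table and emit a metric, rather than discarding them.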
Posted 1 month ago
5.0 - 10.0 years
16 - 20 Lacs
Mumbai, Goregaon
Work from Office
Role Overview
We are seeking a highly skilled Engineering Manager with deep expertise in the MERN stack (MongoDB, Express, React, Node.js), AWS infrastructure, and DevOps practices. This role requires both hands-on technical leadership and strong people management to lead a team of engineers building scalable, high-performance applications.

Key Responsibilities
- Lead, mentor, and manage a team of full-stack developers working primarily with MERN.
- Own architecture decisions, code quality, and engineering practices across multiple microservices.
- Collaborate with Product, Design, and QA teams to define and deliver on product roadmaps.
- Implement CI/CD pipelines, infrastructure as code, and automated testing strategies.
- Ensure system scalability, security, and performance optimization across services.
- Drive sprint planning, code reviews, and technical documentation standards.
- Work closely with DevOps to maintain uptime and operational excellence.

Required Skills
- 6+ years of experience with full-stack JavaScript development (MERN stack)
- 2+ years in a leadership/managerial role
- Strong understanding of Node.js backend and API development
- Hands-on with React.js, component design, and front-end state management
- Proficient in MongoDB and designing scalable NoSQL schemas
- Experience with AWS services (EC2, S3, RDS, Lambda, CloudWatch, IAM)
- Working knowledge of Docker, GitHub Actions, or similar CI/CD tools
- Familiarity with monitoring tools like New Relic, Datadog, or Prometheus
- Solid experience managing agile workflows and team velocity
Posted 1 month ago
2.0 - 4.0 years
4 - 6 Lacs
Gurugram
Work from Office
About the role: We are seeking a dynamic and visionary Node.js Developer to lead the data strategy at Fitelo, a fast-growing health and wellness platform. You'll collaborate with a team of innovative thinkers, front-end geniuses, and domain experts to design robust architectures, implement efficient APIs, and ensure our systems are lightning-fast and rock-solid. Whether it's building new features, solving complex challenges, or improving performance, you'll be at the core of it all. This isn't just about writing code; it's about shaping the future of health and wellness tech. If you love crafting elegant solutions, thinking outside the box, and making an impact, we'd love to have you onboard.

Specifically, this role will involve:
- Taking complete ownership of the design, development, deployment, and maintenance of server-side components and APIs using Node.js.
- Managing the entire lifecycle of database operations with MongoDB and PostgreSQL, from schema design to performance tuning and troubleshooting.
- Collaborating with front-end developers to ensure seamless integration of user-facing elements with server-side logic, delivering a flawless user experience.
- Optimizing application performance and scalability through proactive monitoring, debugging, and implementing best practices.
- Implementing and maintaining security protocols to safeguard data integrity and protect sensitive information, ensuring compliance with industry standards.
- Overseeing the entire development process, including requirement gathering, technical planning, and execution through final deployment.
- Conducting thorough code reviews to maintain high-quality standards, promote best practices, and guide team members in achieving excellence.
- Maintaining comprehensive documentation for APIs, codebases, and processes to support scalability and team collaboration.
- Continuously researching and integrating the latest technologies, ensuring the application architecture remains innovative and future-proof.
- Driving collaboration across teams to solve challenges, adapt to changing priorities, and ensure the successful delivery of projects from start to finish.

The ideal candidate will have:
- 3+ years of experience in backend development or a similar role, primarily with Node.js.
- Advanced proficiency in JavaScript and TypeScript, with experience in frameworks like Express.js or Nest.js.
- Strong grasp of asynchronous programming, event-driven architecture, and advanced concepts like streams and worker threads.
- In-depth experience with both SQL databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB), including query optimization and schema design.
- Expertise in building and consuming RESTful APIs and GraphQL services, with a solid understanding of API versioning and security best practices (e.g., OAuth2, JWT).
- Knowledge of microservices architecture and experience with tools like Docker, Kubernetes, and message brokers such as RabbitMQ or Kafka.
- Familiarity with front-end integration and technologies (e.g., HTML, CSS, and JavaScript frameworks like React or Angular).
- Proficiency in version control tools (e.g., Git) and familiarity with CI/CD pipelines using tools like Jenkins and GitLab CI/CD.
- Hands-on experience with cloud platforms (e.g., AWS, GCP, or Azure), including deployment and monitoring services like EC2, CloudWatch, or Kubernetes Engine.
- Strong problem-solving skills, with experience in debugging and performance tuning of backend systems using tools like New Relic, Datadog, or the ELK Stack.
- Understanding of testing frameworks (e.g., Mocha, Chai, Jest) and best practices for unit, integration, and performance testing.

Qualifications: Bachelor's degree in Technology
Posted 1 month ago
3.0 - 6.0 years
40 - 45 Lacs
Kochi, Kolkata, Bhubaneswar
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.

Key Responsibilities:
- Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark.
- Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data.
- Develop and optimize complex SQL queries for data extraction and reporting.
- Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics.
- Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs.
- Monitor data pipelines and troubleshoot any issues related to data integrity or system performance.

Required Skills:
- 3+ years of experience in data engineering or related fields.
- In-depth knowledge of Data Warehouses and Data Lakes.
- Proven experience in building data pipelines using PySpark.
- Strong expertise in SQL for data manipulation and extraction.
- Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, and Redshift, and with other cloud computing platforms.

Preferred Skills:
- Python programming experience is a plus.
- Experience working in Agile environments with tools like JIRA and GitHub.
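Data-lake pipelines of the kind this listing describes usually write date-partitioned objects that Athena and Glue can then discover. The helper below builds Hive-style `dt=` partition keys; the bucket and table names are made up for illustration.

```python
from datetime import date, timedelta

def partitioned_keys(bucket, table, start, days):
    """Generate s3:// object keys using Hive-style dt= date partitions."""
    keys = []
    for i in range(days):
        d = start + timedelta(days=i)
        keys.append(f"s3://{bucket}/{table}/dt={d.isoformat()}/part-0000.parquet")
    return keys

keys = partitioned_keys("example-lake", "orders", date(2024, 1, 1), 3)
print(keys[0])  # s3://example-lake/orders/dt=2024-01-01/part-0000.parquet
```

Keeping the partition column in the key, rather than only inside the files, is what lets query engines prune whole date ranges without scanning data.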
Posted 1 month ago
8.0 - 10.0 years
12 - 20 Lacs
Chennai
Hybrid
We are looking for a skilled Solutions Engineer with 8-10 years of experience in technical leadership, software architecture, and design. In this role, you'll have the opportunity to design and develop scalable, reliable software solutions, utilizing a range of AWS services and open-source technologies. If you're passionate about creating impactful solutions and bridging technical and business needs, we'd love to hear from you!

Key Responsibilities:
- Lead the architecture and design of complex, scalable software solutions.
- Develop and implement robust applications using AWS Lambda, Python, EC2, S3, PHP, Laravel, serverless microservices, and open-source software. (Experience with Drupal is a plus.)
- Apply Agile methodologies and DevOps principles to ensure efficient development cycles.
- Optimize AWS cloud solutions to balance performance and cost.
- Collaborate with non-technical stakeholders to explain technical concepts effectively.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related technical field.
- 8-10 years of experience in a technical leadership role focused on software architecture and design.
- Proficiency in software development, with strong experience in AWS cloud services and programming languages.
- Excellent problem-solving and critical-thinking skills.
- Effective communication and interpersonal skills.
Posted 1 month ago
5.0 - 10.0 years
14 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
- 5+ years of working experience in Python.
- 4+ years of hands-on experience with AWS development: PySpark, Lambda, CloudWatch (alerts), SNS, SQS, CloudFormation, Docker, ECS, Fargate, and ECR.
- Very strong hands-on knowledge of using Python for integrations between systems through different data formats.
- Expert in deploying and maintaining applications in AWS; hands-on experience with Kinesis streams and Auto Scaling.
- Team player with very good written and verbal communication skills.
- Strong problem-solving and decision-making skills; ability to solve complex software system issues.
- Collaborate with business and other teams to understand business requirements and work on project deliverables.
- Participate in requirements gathering and analysis.
- Design solutions based on the available framework and code.
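"Using Python for integrations between systems through different data formats," as this listing puts it, often boils down to small converters like the one below. This is a standard-library-only sketch; the field names are hypothetical.

```python
import csv
import io
import json

def csv_to_json_records(csv_text):
    """Convert CSV text exported by one system into a JSON string for another."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return json.dumps(list(reader))

csv_text = "id,status\n1,ok\n2,failed\n"
print(csv_to_json_records(csv_text))
# [{"id": "1", "status": "ok"}, {"id": "2", "status": "failed"}]
```

Note that `csv.DictReader` yields every value as a string; a real integration would add explicit type coercion and schema validation on top.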
Posted 1 month ago
11.0 - 20.0 years
25 - 40 Lacs
Hyderabad, Chennai, Greater Noida
Hybrid
Primary Skills
- Proficiency in AWS Services: Deep knowledge of EC2, S3, RDS, Lambda, VPC, IAM, AWS EventBridge, AWS B2Bi (EDI Generator), CloudFormation, and more.
- Cloud Architecture Design: Ability to design scalable, resilient, and cost-optimized architectures.
- Networking & Connectivity: Understanding of VPC peering, Direct Connect, Route 53, and load balancing.
- Security & Compliance: Implementing IAM policies, encryption, KMS, and compliance frameworks like HIPAA or GDPR.
- Infrastructure as Code (IaC): Using tools like AWS CloudFormation or Terraform to automate deployments.
- DevOps Integration: Familiarity with CI/CD pipelines, AWS CodePipeline, and container orchestration (ECS, EKS).
- Cloud Migration: Planning and executing lift-and-shift or re-architecting strategies for cloud adoption.
- Monitoring & Optimization: Using CloudWatch, X-Ray, and Trusted Advisor for performance tuning and cost control.

Secondary Skills
- Programming Skills: Python, Java, or Node.js for scripting and automation.
- Serverless Architecture: Designing with Lambda, API Gateway, and Step Functions.
- Cost Management: Understanding pricing models (On-Demand, Reserved, Spot) and using Cost Explorer.
- Disaster Recovery & High Availability: Multi-AZ deployments, backups, and failover strategies.
- Soft Skills: Communication, stakeholder management, and documentation.
- Team Collaboration: Working with DevOps, security, and development teams to align cloud goals.
- Certifications: AWS Certified Solutions Architect Associate/Professional, and optionally DevOps Engineer or Security Specialty.
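The Infrastructure as Code skill listed above amounts to declaring resources in a template instead of clicking them together. Below is a minimal CloudFormation template for a single S3 bucket, built as a Python dict; the bucket and logical resource names are illustrative, and a real deployment would hand this JSON to the CloudFormation service.

```python
import json

def s3_bucket_template(bucket_name):
    """Build a minimal CloudFormation template declaring one S3 bucket."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            # Logical ID chosen for this example; referenced by other
            # resources via Ref/Fn::GetAtt in larger templates.
            "DataBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {"BucketName": bucket_name},
            }
        },
    }

template = s3_bucket_template("example-data-bucket")
print(json.dumps(template)[:50])
```

Generating templates programmatically like this is also roughly what higher-level IaC tools do under the hood before submitting to the CloudFormation API.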
Posted 1 month ago
1.0 - 2.0 years
1 - 3 Lacs
Navi Mumbai
Work from Office
Job Title: CloudOps Engineer
Department: Information Technology
Reporting line: Manager - IT

Key Responsibilities
- Good understanding of the AWS/Azure cloud platform.
- Knowledge of cloud services, design, and configuration on enterprise systems.
- Good understanding of cloud administration using the Console and CLI.
- Understanding the needs of the business for defining cloud system specifications.
- Understanding architecture requirements and ensuring effective support activities.
- Familiarity with Windows and Linux platforms.
- Understanding of EC2, VPC, ELB, S3, CloudWatch, EventBridge, SNS, IAM, CloudFront, and Lambda.
- Manage a three-tier, highly scalable architecture across multiple DR/production client environments, using Load Balancers, DNS, WAF, EC2, VPC, Security Groups, Auto Scaling, and many other AWS services to manage the client infrastructure.
- Manage a bastion host to securely access instances in private and public subnets.
- Create and manage AMIs, snapshots, and volumes; upgrade/downgrade AWS resources (CPU, memory, EBS).
- Create CloudWatch alarms to monitor resources such as EC2 instances and load balancers, and configure alerts to Slack channels.
- Manage the AD server to add, remove, and modify user access.
- Configure lifecycle policies to transfer data from one storage class to another.
- Set up VPC peering between two VPCs and enable VPC flow logs to monitor network-related issues.
- Perform monthly patching on stage and production servers.
- Create or revoke users/roles (AWS accounts) for onboarding and offboarding team members.
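The responsibility "lifecycle policies to transfer data from one storage class to another" maps to the rule structure S3 expects (the shape boto3's `put_bucket_lifecycle_configuration` takes). Here it is built as a plain dict with no AWS call; the prefix and day counts are made-up example values.

```python
def lifecycle_rule(prefix, to_ia_days, to_glacier_days):
    """Build an S3 lifecycle rule that transitions objects between storage classes."""
    return {
        "ID": f"transition-{prefix}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [
            # Move to Infrequent Access first, then to Glacier.
            {"Days": to_ia_days, "StorageClass": "STANDARD_IA"},
            {"Days": to_glacier_days, "StorageClass": "GLACIER"},
        ],
    }

rule = lifecycle_rule("logs/", 30, 90)
print(rule["Transitions"][1]["StorageClass"])  # GLACIER
```

A list of such rules would go under the `"Rules"` key of the bucket's lifecycle configuration when applied for real.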
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Hyderabad, Bengaluru
Work from Office
About our team
DEX is the central data org for Kotak Bank, managing the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technologists to build things from scratch and create one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions (including real-time, micro-batch, batch, and analytics solutions) in a programmatic way; and help build future systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions like EMR optimizers; automations; and observability capabilities for Kotak's data platform.
The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and Branch Managers, and by all analytics use cases.

Data Governance
This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Driving business decisions with technical input and leading the team.
- Designing, implementing, and supporting a data infrastructure from scratch.
- Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies.
- Exploring and learning the latest AWS technologies to enhance capabilities and efficiency.
- Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Building data platforms, data pipelines, and data management and governance tools.
BASIC QUALIFICATIONS (Data Engineer)
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modelling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers:
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior work experience executing large-scale Data Engineering projects
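"Extracting, transforming, and loading data from various sources using SQL," as this listing describes, frequently means generating incremental-load statements. The sketch below builds an ANSI-style MERGE string; the table and column names are invented, and the exact MERGE dialect supported varies by warehouse.

```python
def merge_sql(target, staging, key, columns):
    """Build an ANSI-style MERGE statement for an incremental upsert load."""
    set_clause = ", ".join(f"{c} = s.{c}" for c in columns)
    cols = ", ".join([key] + columns)
    vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = merge_sql("dim_customer", "stg_customer", "customer_id", ["name", "city"])
print(sql)
```

In production such strings are usually rendered through a templating layer (and identifiers validated) rather than concatenated from untrusted input.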
Posted 1 month ago
3.0 - 7.0 years
5 - 10 Lacs
Pune
Work from Office
Your Role : We're looking for a Senior Java Engineer who is passionate about technology and eager to solve complex problems. You'll join a diverse team of product managers, engineers, and designers, working together to build scalable, robust backend solutions in the accounting and finance space. What You'll Do : - Innovate and Build : Design, build, and maintain a high-performance platform for accounting and finance with an uncompromising focus on data integrity. - Develop and Deploy : Create solutions in Java/Spring Boot that provide seamless experiences for our users and deploy them into production. - Collaborate and Scale : Work closely with Product, Frontend, and DevOps teams to ensure our solutions are scalable and extensible. - Quality Focus : Ensure high product quality and user experience by addressing performance bottlenecks and debugging issues quickly. - Contribute and Grow : Participate in solution design and code reviews while evangelizing best practices and engineering hygiene. What We're Looking For : - Educational Background : Bachelor's degree in Computer Science, Information Technology, or a related field. - Technical Expertise : Proven experience in building highly scalable, high-performance applications. - Java Proficiency : Extensive hands-on experience with Java and Spring Boot, particularly developing API-first solutions using GraphQL and REST. - Solution Design and Architecture : Strong understanding of software design patterns, microservices architecture, and designing scalable solutions. Experience with API design and best practices. - Database Knowledge : Proficient in database design and management for both SQL (PostgreSQL/MySQL) and NoSQL (Redis/MongoDB/Cassandra) databases. - Cloud Experience : Hands-on experience with cloud platforms such as AWS, including services like AWS Lambda, EC2, ECS, S3, and RDS. Experience with serverless architectures is a plus. 
- Messaging Systems : Familiarity with message streaming/queuing systems such as Apache Kafka, RabbitMQ, AWS SQS/SNS/Kinesis. - DevOps Skills : Experience with CI/CD pipelines, containerization (Docker), and orchestration tools (Kubernetes). - Security Best Practices : Knowledge of security principles and best practices for building secure applications, including authentication, authorization, and encryption. - Problem-Solving Skills : Strong analytical skills with the ability to troubleshoot and resolve complex issues efficiently. - Performance Optimization : Experience in identifying and addressing performance bottlenecks within applications and infrastructure. - Collaboration and Communication : Excellent interpersonal skills, with the ability to work effectively in a team environment and communicate technical concepts clearly to non-technical stakeholders. - Agile Methodologies : Familiarity with Agile methodologies and experience working in an Agile/Scrum environment. - Continuous Learning : A proactive mindset for continuous learning and staying updated with the latest industry trends and technologies.
Posted 1 month ago
10.0 - 15.0 years
20 - 35 Lacs
Bengaluru
Work from Office
The role of an AWS Senior Architect is a senior-level position focused on designing, implementing, and managing robust cloud solutions on Amazon Web Services (AWS). It requires deep expertise in a wide range of AWS services, including Compute (EC2, Lambda, ECS, EKS), Storage (S3, EFS, FSx), Databases (RDS, DynamoDB, Aurora), Networking (VPC, Route 53, CloudFront), Security (IAM, KMS, GuardDuty, WAF), and Monitoring.
Posted 1 month ago
8.0 - 13.0 years
25 - 40 Lacs
Bengaluru
Hybrid
We're seeking an experienced Azure and AWS Architect with a strong background in designing and implementing cloud infrastructure and automation services. The ideal candidate will have hands-on experience with Azure and AWS, as well as a deep understanding of SAP technologies and their integration points with cloud services. Additionally, the candidate should have experience in cloud migration and operations, with relevant Microsoft and AWS certifications.

Required Qualifications:
- Bachelor's degree or higher in Computer Science, Information Systems, Engineering, Mathematics, or a related field
- Minimum 10 years of infrastructure experience, with at least 5 years of cloud architect experience
- Hands-on experience with Azure and AWS, including creating EC2, EBS, and other infrastructure using Terraform, Ansible, or the Azure CLI
- Detailed understanding of Azure and AWS security features, including audit and compliance
- Experience with cloud migration, including developing detailed migration plans and executing migration activities
- Strong understanding of Unix/Linux OS, Bash scripting, and IP networking
- Azure certification (Azure Certified Solution Associate/Professional) and/or AWS cloud certification (Architect/Professional)
- Proficiency in SAP Basis administration and relevant technologies
- Good understanding of SAP/DB components
- Experience in cloud operations, including monitoring, logging, and incident management

Desired Qualifications:
- Programming skills in languages such as Java, Python, Scala, or Kotlin
- Experience with distributed cross-platform applications, orchestration, and security
- Azure/GCP certification
- Experience with cloud cost optimization and management
- Experience with containerization (Docker, Kubernetes)
- Experience with DevOps tools and pipelines (Jenkins, GitLab CI/CD)
- Experience with agile project management methodologies (Scrum, Kanban)

Key Responsibilities:
- Design and implement cloud infrastructure and automation services on Azure and AWS
- Develop and execute cloud migration plans, including resource allocation, timelines, and dependencies
- Collaborate with technical teams and stakeholders to ensure successful cloud migrations
- Ensure compliance with cloud security and audit requirements
- Develop and maintain technical documentation for cloud infrastructure and services
- Provide operational support for cloud-based systems, including monitoring, logging, and incident management

Nice to Have:
- Experience with cloud-based disaster recovery and business continuity planning
- Experience with cloud-based security and compliance frameworks (e.g., HIPAA, PCI-DSS)
- Experience with cloud-based data analytics and machine learning services (e.g., Azure Machine Learning, Amazon SageMaker)
Posted 1 month ago
5.0 - 8.0 years
5 - 9 Lacs
Kolkata
Work from Office
Seeking a results-driven Python Developer with expertise in API development, AWS services, SQL, and raw queries. Must have a basic grasp of backend architecture and system design principles. A Python Developer Lead plays a crucial role in the software development lifecycle, combining deep technical expertise in Python with strong leadership and project management skills. They are responsible for guiding a team of Python developers and ensuring the delivery of high-quality, scalable, and efficient software solutions.

Job Summary: The Python Developer Lead will oversee the design, development, and deployment of robust, scalable, and performant Python applications. This role requires a blend of hands-on coding, architectural design, team leadership, and cross-functional collaboration. The Lead will mentor junior developers, establish best practices, ensure code quality, and contribute significantly to the overall technical strategy and success of our projects.

Key Responsibilities:

Technical Leadership & Architecture:
- Lead the design and development of complex Python-based systems, ensuring scalability, reliability, and maintainability.
- Define and enforce coding standards, design patterns, and architectural principles across the team.
- Conduct code reviews, provide constructive feedback, and ensure adherence to best practices.
- Stay abreast of emerging technologies, tools, and trends in the Python ecosystem, and integrate relevant advancements.

Team Management & Mentorship:
- Manage and mentor a team of Python developers, fostering their technical growth and professional development.
- Assign tasks, monitor progress, and provide guidance to ensure efficient project execution.
- Facilitate knowledge sharing and encourage a collaborative team environment.
- Participate in the hiring process for new team members.

Software Development & Delivery:
- Develop, test, and deploy high-quality, efficient, and well-documented Python code for various applications and services.
- Work with cross-functional teams (Product, UI/UX, QA, DevOps) to translate business requirements into technical specifications and deliver effective solutions.
- Design and implement RESTful APIs, integrate with third-party services, and manage data pipelines.
- Troubleshoot and debug complex issues, ensuring low-latency and high-availability applications.
- Oversee the entire software development lifecycle, from conception to deployment and maintenance.
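Designing RESTful APIs, as this listing requires, usually includes details like pagination. The helper below shows one common offset/limit scheme as a framework-agnostic sketch; the parameter and response-field names are a convention, not a standard.

```python
def paginate(items, page, per_page):
    """Return one page of results plus the metadata a REST API would expose."""
    total = len(items)
    start = (page - 1) * per_page  # pages are 1-indexed by convention
    return {
        "data": items[start:start + per_page],
        "page": page,
        "per_page": per_page,
        "total": total,
        "has_next": start + per_page < total,
    }

resp = paginate(list(range(1, 11)), page=2, per_page=4)
print(resp["data"], resp["has_next"])  # [5, 6, 7, 8] True
```

The same dict would typically be serialized as the JSON body of a `GET /resource?page=2&per_page=4` response in whichever web framework is in use.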
Posted 1 month ago
5.0 - 10.0 years
13 - 16 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Title: Sr. Full Stack Engineer (React / Node.js / Nest.js / AWS / Fabric.js) Location: Remote (CET time zone overlap required) Notice Period: Immediate Type: Full-Time, Long-Term iSource Services is hiring for one of their USA based clients for the position of Sr. Full Stack Engineer (React / Node.js / Nest.js / AWS / Fabric.js). About the Role - We are seeking an experienced Senior Full Stack Engineer for a long-term engagement on a functioning SaaS product in the manufacturing domain. The ideal candidate will have a strong foundation in both frontend and backend development and be comfortable working independently within a remote setup. This position provides an exciting opportunity to contribute to real-world AI and computer vision applications while working with a funded product that has already onboarded a solid customer base. Project Overview A functional SaaS product operating in the manufacturing industry The platform has onboarded customers and secured initial funding Involvement in computer vision tasks integrated into real-world use cases Opportunity to work on a full-stack architecture and gain exposure to AI challenges Required Skills & Qualifications: Strong expertise in React.js (minimum 3 years in production-level projects) Advanced knowledge of Node.js and Nest.js (back-end frameworks) Practical experience with AWS (such as EC2, S3, Lambda, etc.) 
Relevant hands-on experience using Fabric.js for canvas-based UI Exposure to Python for computer vision (e.g., OpenCV, image processing) Strong version control practices using GitHub (GitHub profile required) Communication: Good command of English, both written and verbal Ability to communicate clearly in a remote environment Skills: Full Stack Engineer, React.js, Node.js, Nest.js, AWS (EC2, S3, Lambda, etc.), Fabric.js (canvas-based UI), Python, GitHub, frontend and backend development, SaaS product experience, AI applications, computer vision.
Posted 1 month ago
8.0 - 12.0 years
30 - 40 Lacs
Gurugram
Hybrid
Responsibilities: Mentor engineers and build automation tools for cloud infrastructure provisioning using AWS (EC2, S3, CloudFront, RDS, Route 53, etc.) Design and implement CI/CD pipelines and integration solutions Manage infrastructure services such as monitoring, orchestration, and continuous delivery Collaborate with development teams to build scalable and reliable services Troubleshoot and resolve networking and DNS issues using tools like Route 53 Skills Required: 7+ years of experience with Unix/Linux; proficient in Python, Shell, or Bash scripting Strong hands-on experience with AWS core services (VPC, EC2, S3, Route 53, RDS, EKS) Proficient in automation/configuration tools such as Ansible, Jenkins Solid experience in CI/CD processes Expertise in monitoring tools (Nagios, Zabbix, Dynatrace) and experience with SIEM/ELK Knowledge of Serverless Architecture and Kubernetes Proficiency in IaC tools like Terraform Good understanding of infrastructure security, audit, and compliance practices Strong background in scalability, reliability engineering, and Agile methodologies
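Cloud automation scripting of the kind listed above usually needs retry logic around throttled or flaky API calls. Below is a minimal, self-contained sketch of exponential backoff; `with_retries` and `flaky` are invented for the example and stand in for a real cloud-SDK call:

```python
import time

def with_retries(fn, attempts=4, base_delay=0.01):
    """Retry a flaky call with exponential backoff, re-raising on final failure."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts: surface the error to the caller
            time.sleep(base_delay * (2 ** i))  # 0.01s, 0.02s, 0.04s, ...

# Simulated cloud API that succeeds on the third call.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("simulated throttling error")
    return "ok"

result = with_retries(flaky)
print(result)  # ok
```

In production this pattern is typically combined with jitter and retried only on retryable error classes (throttling, timeouts), not on validation errors.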
Posted 1 month ago
3.0 - 4.0 years
20 - 25 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
3-4 years of hands-on experience with AWS services, ideally SaaS in the cloud. Experience developing solutions with a code/scripting language; must have Python experience (e.g., Python, Node.js). Experience in creating and configuring AWS resources like API Gateway, CloudWatch, CloudFormation, EC2, Lambda, Amazon Connect, SNS, Athena, Glue, VPC, etc. Sourcing & screening US profiles. Location: Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad, Remote
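The Lambda experience asked for above can be illustrated with a minimal handler in the standard Lambda-for-Python shape (an `event`/`context` function returning an API-Gateway-style response). The payload fields are invented for the sketch; a real deployment would be wired up through API Gateway:

```python
import json

def handler(event, context=None):
    """Minimal AWS Lambda-style handler: greet a caller named in the query string."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Handlers like this are plain functions, so they can be exercised locally
# without any AWS infrastructure:
resp = handler({"queryStringParameters": {"name": "dev"}})
print(resp["body"])  # {"message": "hello, dev"}
```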
Posted 1 month ago
5.0 - 8.0 years
20 - 25 Lacs
Pune
Hybrid
Software Engineer - Baner, Pune, Maharashtra. Department: Software & Automation. Employee Type: Permanent. Experience Range: 5 - 8 Years. Qualification: Bachelor's or master's degree in computer science, IT, or a related field. Roles & Responsibilities: Architect and build scalable data pipelines using AWS and Databricks. Integrate data from sensors (cameras, lidars, radars). Deliver proof-of-concepts and support system improvements. Ensure data quality and scalable design in solutions. Strong Python, Databricks (SQL, PySpark, Workflows), and AWS skills. Solid leadership and mentoring ability. Agile development experience. Additional Skills (Good to Have): AWS/Databricks certifications. Experience with Infrastructure as Code (Terraform/CDK). Exposure to machine learning data workflows. Software Skills: Python, Databricks (SQL, PySpark, Workflows), AWS (S3, EC2, Glue), Terraform/CDK (good to have)
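A core part of the "ensure data quality" responsibility above is gating a pipeline on missing-value rates. Here is a simplified pure-Python stand-in (the column names and data are invented); in a Databricks pipeline the same check would typically be a PySpark aggregation over a DataFrame:

```python
def null_rates(rows, columns):
    """Per-column missing-value rate over a batch -- a basic data-quality gate."""
    total = len(rows)
    return {
        col: sum(1 for r in rows if r.get(col) is None) / total
        for col in columns
    }

# Toy sensor batch: one missing lidar reading, one missing radar reading.
sensor_rows = [
    {"lidar": 1.2, "radar": 0.4},
    {"lidar": None, "radar": 0.5},
    {"lidar": 0.9, "radar": None},
    {"lidar": 1.1, "radar": 0.6},
]
rates = null_rates(sensor_rows, ["lidar", "radar"])
print(rates)  # {'lidar': 0.25, 'radar': 0.25}
```

A pipeline would usually compare these rates against a threshold and fail or quarantine the batch when exceeded.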
Posted 1 month ago
6.0 - 8.0 years
18 - 20 Lacs
Pune
Work from Office
Roles and Responsibilities: 6+ years of experience as a MongoDB DBA. Experience with migration to AWS is strongly preferred; understanding of other databases such as SQL Server with AOAG, MySQL, etc. will be an added advantage. Requirements: Expertise in MongoDB with in-depth understanding. Should have worked on setting up MongoDB clusters for high availability and replication, MongoDB distributed Ops Manager, and backups and restores (Linux environment). Able to set up listeners, resolve connectivity issues, and understand firewall setup needs. Experience setting up encryption and configuring certificates, understanding of AD-based authenticated access, and other DBA tasks.
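The high-availability replica-set setup mentioned above boils down to an initiation document like the one sketched below. The hostnames are invented; the config would be applied with `rs.initiate()` in mongosh, or via the `replSetInitiate` admin command through a driver such as pymongo:

```python
def replica_set_config(name, hosts):
    """Build a three-member MongoDB replica-set config document.
    The first member gets a higher priority so it is favored as primary."""
    return {
        "_id": name,
        "members": [
            {"_id": i, "host": h, "priority": 2 if i == 0 else 1}
            for i, h in enumerate(hosts)
        ],
    }

cfg = replica_set_config("rs0", ["db1:27017", "db2:27017", "db3:27017"])
print(cfg["members"][0])  # {'_id': 0, 'host': 'db1:27017', 'priority': 2}
```

Three data-bearing members is the usual minimum: it lets the set elect a new primary automatically if one node fails.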
Posted 1 month ago
4.0 - 7.0 years
0 - 1 Lacs
Bengaluru
Hybrid
Job Requirements Job Description: AWS Developer Quest Global, a leading global technology and engineering services company, is seeking an experienced AWS Developer to join our team. As an AWS Developer, you will play a key role in designing, developing, and maintaining cloud-based applications using Amazon Web Services (AWS) and Java development skills. Responsibilities: - Designing, developing, and deploying scalable and reliable cloud-based applications on AWS platform. - Collaborating with cross-functional teams to gather requirements and translate them into technical solutions. - Writing clean, efficient, and maintainable code using Java programming language. - Implementing best practices for security, scalability, and performance optimization. - Troubleshooting and resolving issues related to AWS infrastructure and applications. - Conducting code reviews and providing constructive feedback to ensure code quality. - Keeping up-to-date with the latest AWS services, tools, and best practices. Join our dynamic team at Quest Global and contribute to the development of cutting-edge cloud-based applications using AWS and Java. Apply now and take your career to new heights! Note: This job description is intended to provide a general overview of the position and does not encompass all the tasks and responsibilities that may be assigned to the role. Work Experience Requirements: - Bachelor's degree in Computer Science, Engineering, or a related field. - Minimum 5 years of experience as an AWS Developer or similar role. - Strong proficiency in Java programming language. - In-depth knowledge of AWS services such as EC2, S3, Lambda, RDS, DynamoDB, etc. - Experience with cloud-based application development and deployment. - Familiarity with DevOps practices and tools. - Excellent problem-solving and analytical skills. - Strong communication and collaboration abilities.
Posted 1 month ago
5.0 - 8.0 years
7 - 11 Lacs
Chennai, Malaysia, Malaysia
Work from Office
Responsibilities for Data Engineer Create and maintain the optimal data pipeline architecture. Assemble large, complex data sets that meet functional/non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs. Keep our data separated and secure across national boundaries through multiple data centers and AWS regions. Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader. Work with data and analytics experts to strive for greater functionality in our data systems. Qualifications for Data Engineer We are looking for a candidate with 5+ years of experience in a Data Engineer role who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools: Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Experience building and optimizing big data pipelines, architectures, and data sets. Experience performing root cause analysis of internal and external data and processes to answer specific business questions and identify opportunities for improvement. 
Strong analytic skills related to working with unstructured data sets. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. A successful history of manipulating, processing, and extracting value from large disconnected datasets. Working knowledge of message queuing, stream processing, and highly scalable big data stores. Strong project management and organizational skills. Experience supporting and working with cross-functional teams in a dynamic environment. Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including MongoDB, Postgres, Cassandra, AWS Redshift, and Snowflake. Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. Experience with AWS cloud services: EC2, EMR, ETL, Glue, RDS, Redshift. Experience with stream-processing systems: Storm, Spark Streaming, etc. Experience with object-oriented/functional scripting languages: Python, Java, etc. Location: Chennai, India / Kuala Lumpur, Malaysia
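The SQL-driven extract-transform-load work described above can be sketched end to end with the stdlib `sqlite3` module standing in for a warehouse such as Redshift or Snowflake. The table and data are invented for the example:

```python
import sqlite3

# Extract/load: stage raw rows into a table (in-memory DB as a warehouse stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(1, 10.0), (1, 5.5), (2, 7.25)],
)

# Transform: aggregate per user -- the kind of step an Airflow task might own.
totals = dict(conn.execute(
    "SELECT user_id, SUM(amount) FROM raw_events GROUP BY user_id"
).fetchall())
print(totals)  # {1: 15.5, 2: 7.25}
```

In a real pipeline the same SQL would run against the warehouse, with the orchestrator (Airflow, Luigi, Azkaban) handling scheduling, retries, and dependencies between steps.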
Posted 1 month ago
6.0 - 11.0 years
15 - 22 Lacs
Chennai
Hybrid
Role: Full Stack Java Developer Years of Experience: 4 - 15 yrs Location: Chennai What awaits you / Job Profile: Execute the full software development life cycle (SDLC). Lead and manage the development team, including assigning tasks and monitoring progress. Code complex parts of the software, ensuring it's robust and scalable. Produce specifications and determine operational feasibility. Integrate software components into a fully functional software system. Develop software verification plans and quality assurance procedures. Document and maintain software functionality. Troubleshoot, debug, and upgrade existing systems. Comply with project plans and industry standards. Ensure software is updated with the latest features. Interface with clients on technical matters and software requirements. What should you bring along: Perform requirement analyses. Conduct unit testing using automated unit test frameworks. Review the work of other developers and provide feedback. Use BMW coding standards and best practices to ensure quality. Develop and maintain BPMN workflows using Flowable. Integrate Flowable with existing enterprise systems and services. Optimize and troubleshoot performance issues in Java applications and Flowable processes. Strong understanding of BPMN 2.0, CMMN, and DMN standards. Work with application development teams to design and build front-end user interfaces. Maintain and review code written by other members of the team and outside consultants. Consult with business partners on requirements, organizing and synthesizing technical requirements and designs. Must-have technical skills: Strong experience with Java and Java frameworks (Java 7 and 17, microservices, JUnit). Excellent knowledge of relational databases, SQL, and ORM technologies (JPA2, Hibernate). Experience developing web applications using popular web frameworks (JSF, Wicket, GWT, Spring MVC). Experience with RESTful APIs, Maven/Gradle, Git, SQL, Jenkins, microservices architecture. 
Hands-on experience with Flowable BPM or similar BPM engines. Experience with test-driven development. Proficiency in AWS services such as EC2, S3, RDS, Lambda, AWS Load Balancer, CloudWatch, Autoscaling, ECS, EKS, ECR. Proficiency in using GitHub Copilot or similar AI coding assistants. Understanding of Agile methodologies and hands-on experience using Jira, Confluence, or similar agile project management tools. PostgreSQL, Oracle. JavaFX and/or Eclipse RCP. GitHub, CI/CD pipelines. Apache Kafka. UML structural diagrams. Cloud/on-prem deployment. Excellent communication and collaboration skills. Domain background in Financial Services/banking. Good to have technical skills: Certification in Java. Utilize GitHub Copilot to streamline coding, testing, and documentation tasks.
Posted 1 month ago
5.0 - 8.0 years
66 - 108 Lacs
Kolkata
Work from Office
Seeking a results-driven Python Developer with expertise in API development, AWS services, SQL, and raw queries. Must have a basic grasp of backend architecture and system design principles.
Posted 1 month ago
7.0 - 10.0 years
45 - 50 Lacs
Pune
Work from Office
Requirements: Our client is seeking a highly skilled Technical Project Manager (TPM) with strong hands-on experience in full-stack development and cloud infrastructure to lead the successful planning, execution, and delivery of technical projects. The ideal candidate will have a strong background in React, Java, Spring Boot, Python, and AWS, and will work closely with cross-functional teams including developers, QA, DevOps, and product stakeholders. As a TPM, you will play a critical role in bridging technical and business objectives, ensuring timelines, quality, and scalability across complex software projects. Responsibilities: - Own and drive the end-to-end lifecycle of technical projects, from initiation to deployment and post-launch support. - Collaborate with development teams and stakeholders to define project scope, goals, deliverables, and timelines. - Act as a hands-on contributor when needed, with the ability to guide and review code and architecture decisions. - Coordinate cross-functional teams across front-end (React), back-end (Java/Spring Boot, Python), and AWS cloud infrastructure. - Manage risk, change, and issue resolution in a fast-paced agile environment. - Ensure projects follow best practices around version control, CI/CD, testing, deployment, and monitoring. - Deliver detailed status updates, sprint reports, and retrospectives to leadership and stakeholders. Required Qualifications: - IIT/NIT graduate with 5+ years of experience in software engineering, with at least 2 years in a technical project management role. - Hands-on expertise in: React, Java & Spring Boot, Python, AWS (EC2, S3, Lambda, CloudWatch, etc.) - Experience leading agile/Scrum teams with a strong understanding of software development lifecycles. - Excellent communication, organizational, and interpersonal skills. Desired Profile: - Experience designing and managing microservices architectures. - Familiarity with Kafka or other messaging systems. 
- Knowledge of CI/CD pipelines, deployment strategies, and application monitoring tools (e.g., Prometheus, Grafana, CloudWatch). - Experience with containerization tools like Docker and orchestration platforms like Kubernetes.
Posted 1 month ago
2.0 - 5.0 years
2 - 7 Lacs
Bengaluru
Work from Office
- Deploy applications on AWS using services such as EC2, ECS, S3, RDS, or Lambda
- Implement CI/CD pipelines using GitHub Actions, Jenkins, or CodePipeline
- Apply DevSecOps best practices including container security (Docker, ECR), infrastructure as code (Terraform), and runtime monitoring
Team Collaboration & Agility
- Participate in Agile ceremonies (stand-ups, sprint planning, retros)
- Work closely with product, design, and AI engineers to build secure and intelligent systems
Posted 1 month ago