4.0 - 9.0 years
12 - 16 Lacs
Kochi
Work from Office
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. Responsibilities: Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases. Process data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies built on the platform. Experience in developing streaming pipelines. Experience working with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, cloud computing, etc. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala; minimum 3 years of experience on Cloud Data Platforms on AWS; experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB. Good to excellent SQL skills. Exposure to streaming solutions and message brokers like Kafka. Preferred technical and professional experience: Certification in AWS and Databricks, or Cloudera Spark certified developers.
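For a concrete sense of the day-to-day work, a minimal PySpark batch pipeline of the kind described here might look like the sketch below; the bucket names, paths, and columns are hypothetical.

```python
# Hypothetical ingest-and-transform job: CSV files from S3 in, partitioned
# Parquet out for downstream Hive/Athena queries.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

# Ingest raw CSV files landed in S3 (placeholder path)
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Basic cleansing and transformation
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write partitioned Parquet to the curated zone
(orders.write.mode("overwrite")
       .partitionBy("order_date")
       .parquet("s3://example-curated-bucket/orders/"))
```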
Posted 5 days ago
4.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
As a Senior Cloud Platform Back-End Engineer with a strong background in AWS tools and services, you will join the Data & AI Solutions - Engineering team in our Healthcare R&D business. Your expertise will enhance the development and continuous improvement of a critical AWS-Cloud-based analytics platform, supporting our R&D efforts in drug discovery. This role involves implementing the technical roadmap and maintaining existing functionalities. You will adapt to evolving technologies, manage infrastructure and security, design and implement new features, and oversee seamless deployment of updates. Additionally, you will implement strategies for data archival and optimize data lifecycle processes for efficient storage management in compliance with regulations. Join a multicultural team working in agile methodologies with high autonomy. The role requires office presence at our Bangalore location. Who You Are: University degree in Computer Science, Engineering, or a related field. Proficiency in Python, especially with the boto3 library to interact with AWS services programmatically, and in infrastructure as code with AWS CDK and AWS Lambdas. Experience with API development & management, designing, developing, and managing APIs using AWS API Gateway and other relevant API frameworks. Strong understanding of AWS security best practices, IAM policies, encryption, auditing and regulatory compliance (e.g., GDPR). Experience with application performance monitoring and tracing solutions like AWS CloudWatch, X-Ray, and OpenTelemetry. Proficiency in navigating and utilizing various AWS tools and services. System design skills in a cloud environment. Experience with SQL and data integration into Snowflake. Familiarity with Microsoft Entra ID for identity and access management. Willingness to work in a multinational environment and cross-functional teams distributed between the US, Europe (mostly Germany) and India. Sense of accountability and ownership; fast learner. Fluency in English and excellent communication skills.
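As an illustration of the boto3 scripting this role calls for, here is a hedged sketch of programmatic AWS access; the bucket and function names are placeholders, not part of the posting.

```python
# Minimal boto3 sketch: list objects in S3 and fire an async Lambda invocation.
import boto3

s3 = boto3.client("s3")
lam = boto3.client("lambda")

# List objects under a prefix (placeholder bucket)
resp = s3.list_objects_v2(Bucket="example-analytics-bucket", Prefix="exports/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Kick off a hypothetical archival Lambda asynchronously
lam.invoke(FunctionName="example-archival-job", InvocationType="Event")
```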
Posted 5 days ago
6.0 - 11.0 years
8 - 18 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Hi, we have an excellent opportunity with a top MNC company for a permanent position. Skills: Java, Microservices, JavaScript, Spring Boot; experience with NoSQL databases (MongoDB). Experience: 5+ years. Location: Bangalore, Hyderabad, Pune, Kerala. 5+ years of experience developing backend/API applications and software • Expert working experience in Java, Microservices, JavaScript, Spring Boot • Experience with NoSQL databases - MongoDB • Strong understanding of the entire Software Development Life Cycle (SDLC) and Agile (Scrum) • Experience with web services (consuming or creating) with REST, MQTT, Web Sockets • Good experience with microservices architecture, pub-sub model, cloud architecture, QA automation, CI/CD pipelines, application security, load testing, and 3rd-party integration and management (Docker, Kubernetes) • Experience managing cloud infrastructure (resources and services) in AWS, Azure and/or GCP • Strong knowledge of SOA, object-oriented programming, design patterns, multi-threaded application development • Experience in reporting and analytics, queuing and real-time streaming systems • Experience developing, maintaining and innovating large-scale web or mobile applications • BE / B.Tech / MCA / M.Tech in computer programming, computer science, or a related field. If you are interested, kindly revert with an updated resume. Thanks for applying. Regards, Jamuna 9916995347 s.jamuna@randstaddigital.com
Posted 5 days ago
7.0 - 12.0 years
25 - 35 Lacs
Hyderabad
Hybrid
We are currently hiring for a Senior Software Engineer role with strong experience in Node.js, AWS services, and JavaScript/TypeScript. This is a high-priority position, and we are looking for candidates who can join immediately or on a very short notice period. Senior Software Engineer. Required experience: 7+ years. Technical skills in most of the following areas: Node.js, ReactJS and Redux, Saga, GraphQL, REST APIs, HTML, CSS, CSS3, JavaScript, TypeScript, and the Serverless.com framework (or knowledge of / experience with AWS SAM, Lambda, S3, CloudWatch and DynamoDB/Neptune). Knowledge of or experience with Cassandra and MySQL databases. 6-10 years of experience in software web development using NodeJS and TypeScript/JavaScript. Good understanding of the software development life cycle: requirements gathering, requirements analysis, execution and defect tracking. Experience in building REST APIs, GraphQL APIs and web services using Node.js, Express.js/Apollo, AWS Lambda and API Gateway. Extensive knowledge of SOA principles, design patterns, and application and integration architectures. Experience in Agile methodology with tools like JIRA, Git, GitLab, SVN, Bitbucket as an active scrum member. Strong in Object-Oriented Analysis & Design (OOAD). Strong experience with PaaS and IaaS cloud computing. Developing secure, high-performance Web APIs that others rely on. Good hands-on experience with serverless frameworks, the AWS JS SDK, and AWS services like Lambda, SNS, SES, SQS, SSM, S3, EC2, IAM, CloudWatch, Kinesis and CloudFormation. Solid understanding of SQL/NoSQL databases such as MySQL, DynamoDB, Neptune and Amazon Timestream. Experience with TDD, i.e., writing unit test cases and following coding standards.
Posted 5 days ago
3.0 - 8.0 years
25 - 40 Lacs
Hyderabad
Work from Office
1. Automation of Processes: Automate trading system deployments, configuration, and monitoring to minimize manual errors and ensure rapid, consistent updates across environments. Develop scripts and tools to automate repetitive tasks, such as environment provisioning, software deployments, and database updates, using tools like Ansible, Jenkins, or Terraform.
2. High-Frequency Trading (HFT) System Optimization: Optimize CI/CD pipelines for ultra-low-latency, high-throughput trading systems to support continuous delivery of trading algorithms and infrastructure updates. Ensure that deployment and testing processes do not impact the performance of trading operations.
3. Infrastructure Management and Scalability: Manage cloud and on-premises infrastructure tailored for trading environments, focusing on low latency, high availability, and failover strategies. Use Infrastructure as Code (IaC) to provision scalable and secure environments that can handle the fluctuating loads typical in trading operations.
4. Monitoring and Real-Time Logging: Implement monitoring tools to track system performance, trade execution times, and infrastructure health in real time. Set up sophisticated logging mechanisms for trade data, errors, and performance metrics, ensuring traceability and quick troubleshooting during incidents (see the monitoring sketch after this list).
5. Security and Compliance: Integrate security best practices into the DevOps pipeline, including real-time security scans, vulnerability assessments, and access control tailored for financial data protection. Ensure that all systems comply with financial regulations such as GDPR, MiFID II, and SEC rules, including managing audit logs and data retention policies.
6. Disaster Recovery and High Availability: Design and maintain disaster recovery solutions to ensure continuity of trading operations during outages or data breaches. Implement redundancy and failover strategies to maintain trading platform uptime, minimizing the risk of costly downtime.
7. Performance Optimization for Trading Systems: Fine-tune infrastructure and CI/CD pipelines to reduce deployment times and latency, crucial for real-time trading environments. Work on system performance to support the rapid execution of trades, data feeds, and order-matching systems.
8. Incident Management and Troubleshooting: Rapidly respond to incidents affecting trading operations, performing root cause analysis and implementing corrective measures to prevent recurrence. Ensure detailed incident reporting and documentation to support regulatory requirements.
9. Configuration Management: Maintain configuration consistency across multiple environments (dev, test, prod) using tools like Puppet, Chef, or SaltStack. Ensure configurations meet the stringent security and performance standards required for trading platforms.
10. Collaboration with Development and Trading Teams: Work closely with developers, quants, and traders to ensure smooth deployment of new trading algorithms and updates to trading platforms. Facilitate communication between development, trading desks, and compliance teams to ensure that changes are in line with business requirements and regulations.
11. Risk Management: Implement risk management controls within the DevOps pipeline to minimize the impact of potential system failures on trading operations. Work with risk and compliance teams to ensure that deployment and infrastructure changes do not expose trading systems to unnecessary risks.
12. Cloud Services and Cost Optimization: Deploy, manage, and scale trading applications on cloud platforms like AWS, Azure, or Google Cloud, with a focus on minimizing costs without compromising performance. Utilize cloud-native services such as AWS Lambda or Azure Functions for event-driven processes in trading workflows.
13. Version Control and Code Management: Manage the versioning of trading algorithms and platform updates using Git or similar tools, ensuring traceability and quick rollback capabilities if issues arise. Establish rigorous code review processes to ensure that changes align with the performance and security standards specific to trading systems.
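To make the monitoring item concrete, here is a hedged sketch of a latency check against CloudWatch; the namespace, metric name, and threshold are invented for illustration.

```python
# Pull a hypothetical p99 latency metric and flag threshold breaches.
import datetime

import boto3

cw = boto3.client("cloudwatch")
now = datetime.datetime.utcnow()

stats = cw.get_metric_statistics(
    Namespace="Trading/Gateway",          # hypothetical custom namespace
    MetricName="OrderRoundTripLatency",   # hypothetical metric
    StartTime=now - datetime.timedelta(minutes=5),
    EndTime=now,
    Period=60,
    ExtendedStatistics=["p99"],           # percentiles go here, not Statistics
)

for point in stats["Datapoints"]:
    if point["ExtendedStatistics"]["p99"] > 5.0:  # illustrative ms threshold
        print("p99 latency breach:", point["Timestamp"])
```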
Posted 5 days ago
2.0 - 5.0 years
1 - 7 Lacs
Kolkata
Work from Office
Responsibilities: * Should be able to learn new technologies quickly and implement them as needed. * Should be able to develop applications in Golang and NestJS / Node.js (TypeScript/JavaScript). * Ability to develop React applications is preferred.
Posted 5 days ago
6.0 - 11.0 years
15 - 25 Lacs
Hyderabad
Work from Office
6–9 years of hands-on experience in MEAN/MERN stacks. Team handling experience is a must. DevOps: CI/CD pipelines (Jenkins/GitLab), Docker, Kubernetes. AWS: Lambda, SQS, S3, EC2, CloudFormation. Experience upgrading Angular, Node.js, and MongoDB in production environments.
Posted 5 days ago
4.0 - 6.0 years
72 - 96 Lacs
Ahmedabad
Work from Office
Responsibilities: * Lead technology strategy & roadmap * Ensure scalability, security & reliability * Collaborate with cross-functional teams on system design * Oversee tech team's delivery & optimization
Posted 6 days ago
7.0 - 12.0 years
20 - 35 Lacs
Pune
Hybrid
Job Duties and Responsibilities: We are looking for a self-starter to join our Data Engineering team. You will work in a fast-paced environment where you will get the opportunity to build and contribute to the full-lifecycle development and maintenance of the data engineering platform. With the Data Engineering team you will get the opportunity to: Design and implement data engineering solutions that are scalable, reliable and secure in the cloud environment. Understand and translate business needs into data engineering solutions. Build large-scale data pipelines that can handle big data sets using distributed data processing techniques, supporting the efforts of the data science and data application teams. Partner with cross-functional stakeholders including product managers, architects, data quality engineers, and application and quantitative science end users to deliver engineering solutions. Contribute to defining data governance across the data platform. Basic Requirements: A minimum of a BS degree in computer science, software engineering, or a related scientific discipline is desired. 3+ years of work experience building scalable and robust data engineering solutions. Strong understanding of object-oriented programming and proficiency in Python (TDD) and PySpark to build scalable algorithms. 3+ years of experience in distributed computing and big data processing using the Apache Spark framework, including Spark optimization techniques. 2+ years of experience with Databricks, Delta tables, Unity Catalog, Delta Sharing, Delta Live Tables (DLT) and incremental data processing. Experience with Delta Lake and Unity Catalog. Advanced SQL coding and query optimization experience, including the ability to write analytical and nested queries. 3+ years of experience building scalable ETL/ELT data pipelines on Databricks and AWS (EMR). 2+ years of experience orchestrating data pipelines using Apache Airflow / MWAA. Understanding of and experience with AWS services including ADX, EC2, and S3. 3+ years of experience with data modeling techniques for structured/unstructured datasets. Experience with relational/columnar databases (Redshift, RDS) and interactive querying services (Athena / Redshift Spectrum). Passion for healthcare and improving patient outcomes. Demonstrated analytical thinking with strong problem-solving skills. Stays on top of emerging technologies and possesses a willingness to learn. Bonus Experience (optional): Experience with an Agile environment. Experience operating in a CI/CD environment. Experience building HTTP/REST APIs using popular frameworks. Healthcare experience.
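As a sketch of the incremental Delta-table processing this posting emphasizes, a MERGE-based upsert on Databricks might look like the following; the paths and key column are hypothetical.

```python
# Upsert a batch of changed rows into a Delta table (assumes a Spark session
# with the delta-spark package available, as on a Databricks cluster).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# New/changed rows landed by an upstream job (placeholder path)
updates = spark.read.format("delta").load("/mnt/bronze/patients_updates")

target = DeltaTable.forPath(spark, "/mnt/silver/patients")
(
    target.alias("t")
    .merge(updates.alias("u"), "t.patient_id = u.patient_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```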
Posted 6 days ago
10.0 - 15.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Hybrid
Key Responsibilities: Design, develop, and maintain backend services using Python and AWS serverless technologies. Implement event-driven architectures to ensure efficient and scalable solutions. Utilize Terraform for infrastructure as code to manage and provision AWS resources. Configure and manage AWS networking components to ensure secure and reliable communication between services. Develop and maintain serverless applications using AWS Lambda functions, DynamoDB, and other AWS services. Collaborate with cross-functional teams to define, design, and ship new features. Write clean, maintainable, and efficient code while following best practices. Troubleshoot and resolve issues in a timely manner. Stay up to date with the latest industry trends and technologies to ensure our solutions remain cutting-edge. Required Qualifications: 9 to 15 years of experience in backend development with a strong focus on Python. Proven experience with AWS serverless technologies, including Lambda, DynamoDB, and other related services. Strong understanding of event-driven architecture and its implementation. Hands-on experience with Terraform for infrastructure as code. In-depth knowledge of AWS networking components and best practices. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Preferred Qualifications: AWS certifications (e.g., AWS Certified Developer, AWS Certified Solutions Architect). Experience with other programming languages and frameworks. Familiarity with CI/CD pipelines and DevOps practices.
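A minimal sketch of the event-driven serverless pattern described above, assuming an SQS-triggered Lambda that persists messages to DynamoDB; the table name and payload shape are placeholders.

```python
# SQS-triggered Lambda handler: drain the batch and write each message
# to a DynamoDB table.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-orders")  # hypothetical table name

def handler(event, context):
    # SQS delivers messages in batches under the "Records" key
    for record in event["Records"]:
        payload = json.loads(record["body"])
        table.put_item(Item=payload)
    return {"processed": len(event["Records"])}
```

In the Terraform-based setup the posting describes, the queue, the table, and the event-source mapping would all be declared as code, with this handler deployed as the function entry point.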
Posted 6 days ago
5.0 - 10.0 years
12 - 15 Lacs
Mohali
Work from Office
Role Overview: The Enhancements & Automation Lead is responsible for driving AI-driven automation, optimizing workflows, and integrating intelligent solutions to improve the efficiency of Amazon Connect Managed Services. Key Responsibilities: Identify automation opportunities to streamline Amazon Connect operations. Design and implement AI-driven enhancements for call routing, sentiment analysis, and chatbots. Integrate third-party automation tools and AI models for intelligent ticketing and incident classification. Collaborate with NOC teams and developers to optimize IVR workflows and customer experience. Evaluate and recommend automation platforms, RPA (Robotic Process Automation), and AI frameworks. Develop custom scripts, serverless automation workflows, and self-healing mechanisms. Provide expert guidance on DevOps, CI/CD, and cloud automation best practices. Required Skills & Qualifications: 5+ years of experience in automation, AI/ML, or cloud-based service optimization. Hands-on experience with AWS Lambda, AI/ML services, Python, and API integrations. Strong understanding of RPA, low-code/no-code automation tools, and workflow orchestration. Proficiency in Amazon Connect architecture, IVR scripting, and customer interaction analytics. AWS certifications in Machine Learning or DevOps are highly preferred.
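As a small illustration of Amazon Connect automation scripting, a hedged boto3 sketch that inventories contact flows; the instance ID is a placeholder.

```python
# Enumerate contact flows on a Connect instance - a typical starting point
# for auditing or automating IVR flow updates.
import boto3

connect = boto3.client("connect")

resp = connect.list_contact_flows(
    InstanceId="11111111-2222-3333-4444-555555555555"  # placeholder
)
for flow in resp["ContactFlowSummaryList"]:
    print(flow["Name"], flow["ContactFlowType"])
```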
Posted 6 days ago
5.0 - 10.0 years
12 - 15 Lacs
Chandigarh
Work from Office
Role Overview: The Enhancements & Automation Lead is responsible for driving AI-driven automation, optimizing workflows, and integrating intelligent solutions to improve the efficiency of Amazon Connect Managed Services. Key Responsibilities: Identify automation opportunities to streamline Amazon Connect operations. Design and implement AI-driven enhancements for call routing, sentiment analysis, and chatbots. Integrate third-party automation tools and AI models for intelligent ticketing and incident classification. Collaborate with NOC teams and developers to optimize IVR workflows and customer experience. Evaluate and recommend automation platforms, RPA (Robotic Process Automation), and AI frameworks. Develop custom scripts, serverless automation workflows, and self-healing mechanisms. Provide expert guidance on DevOps, CI/CD, and cloud automation best practices. Required Skills & Qualifications: 5+ years of experience in automation, AI/ML, or cloud-based service optimization. Hands-on experience with AWS Lambda, AI/ML services, Python, and API integrations. Strong understanding of RPA, low-code/no-code automation tools, and workflow orchestration. Proficiency in Amazon Connect architecture, IVR scripting, and customer interaction analytics. AWS certifications in Machine Learning or DevOps are highly preferred.
Posted 6 days ago
9.0 - 13.0 years
32 - 40 Lacs
Ahmedabad
Remote
About the Role: We are looking for a hands-on AWS Data Architect or Lead Engineer to design and implement scalable, secure, and high-performing data solutions. This is an individual contributor role where you will work closely with data engineers, analysts, and stakeholders to build modern, cloud-native data architectures across real-time and batch pipelines. Experience: 7-15 years. Location: Fully remote. Company: Armakuni India. Key Responsibilities: Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape. Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics. Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases. Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes. Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security. Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes. Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions. Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces. Performance Tuning: Optimize database performance through tuning, indexing, and query optimization. Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA). Required Skills: Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance. Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server). Minimum 7 to 15 years of experience in data architecture or related roles. Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow). Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services. Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano). Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery). Experience with data governance frameworks and tools.
Posted 6 days ago
4.0 - 7.0 years
25 - 27 Lacs
Bengaluru
Remote
4+ years of experience as a Data Engineer/Scientist, with hands-on experience in data warehousing, data ingestion, data processing, and data lakes. Must have strong development experience using Python and SQL, and an understanding of data orchestration tools like Airflow. Required candidate profile: Experience with data extraction techniques (CDC, batch-based) such as Debezium, Kafka Connect, and AWS DMS; queuing/messaging systems such as SQS, RabbitMQ, Kinesis; and AWS data/ML services such as AWS Glue, MWAA, Athena, Redshift.
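A minimal Airflow DAG sketch for the batch extract-and-load work listed above (assumes Airflow 2.x; the DAG ID, schedule, and task body are illustrative):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load():
    # Placeholder for the real work: drain a CDC topic (Debezium / Kafka
    # Connect) or poll an AWS DMS task, then load the batch into Redshift.
    print("extracting batch and loading to warehouse")

with DAG(
    dag_id="orders_batch_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```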
Posted 6 days ago
10.0 - 15.0 years
0 - 3 Lacs
Chennai, Bengaluru, Delhi / NCR
Work from Office
Required skills: AWS developer, Agile, AWS, AWS Lambda, C#, performance optimization, RESTful APIs, web services. Location: any location. Notice period: 0 to 90 days.
Posted 6 days ago
6.0 - 10.0 years
16 - 30 Lacs
Bengaluru
Hybrid
We are looking for immediate joiners ONLY, so please apply accordingly. Current CTC: / Expected CTC: / Notice period (if resigned, LWD): Years of experience: 5-10 years. Mode of work: Hybrid, 2 days work from office. Location: Bangalore, Bellandur. Responsibilities: Design, develop, test, and deploy high-quality Python applications. Write clean, maintainable, and efficient code following best practices. Develop RESTful APIs and integrate with third-party services. Work with databases (SQL & NoSQL) to design efficient data storage solutions. Implement security, authentication, and authorization mechanisms. Optimize application performance and scalability. Collaborate with cross-functional teams, including frontend developers and DevOps. Debug, troubleshoot, and resolve software issues. Automate repetitive tasks using scripts and tools. Requirements: Experience: 5+ years of hands-on experience in Python development. Frameworks: Proficiency in Django, Flask, or FastAPI. Database: Strong knowledge of MySQL (or any RDBMS) and MongoDB. APIs: Experience in developing RESTful APIs and working with API documentation tools like Swagger/Postman. Cloud & DevOps: Familiarity with AWS, Docker, Kubernetes, and CI/CD pipelines. Other services include API Gateway, …
Posted 6 days ago
2.0 - 5.0 years
4 - 7 Lacs
Ahmedabad
Work from Office
Roles and Responsibilities: Collaborate with stakeholders to understand business requirements and data needs. Translate business requirements into scalable and efficient data engineering solutions. Design, develop, and maintain data pipelines using AWS serverless technologies. Implement data modeling techniques to optimize data storage and retrieval processes. Develop and deploy data processing and transformation frameworks for real-time and batch processing. Ensure data pipelines are scalable, reliable, and performant for large-scale data sizes. Implement data documentation and observability tools and practices to monitor...
Posted 6 days ago
5.0 - 8.0 years
15 - 25 Lacs
Noida, Pune, Gurugram
Hybrid
Cloud Engineer | Hybrid | Noida / Gurugram. Role & responsibilities: Working at Iris: Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about Being Your Best as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We're a place where everyone can discover and be their best version. Job Description: 1. Design and manage cloud-based systems on AWS. 2. Develop and maintain backend services and APIs using Java. 3. Basic knowledge of SQL and the ability to write SQL queries. 4. Good hands-on knowledge of Dockerfiles and multi-stage Docker builds. 5. Implement containerization using Docker and orchestration with ECS/Kubernetes. 6. Monitor and troubleshoot cloud infrastructure and application performance. 7. Collaborate with cross-functional teams to integrate systems seamlessly. 8. Document system architecture, configurations, and operational procedures. Need strong hands-on knowledge: ECS, ECR, NLB, ALB, ACM, IAM, S3, Lambda, RDS, KMS, API Gateway, Cognito, CloudFormation. Good to have: Experience with AWS CDK for infrastructure as code. AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Developer). Python. Mandatory competencies: Cloud - AWS, AWS Lambda; Behavioral - Communication; Database - PostgreSQL.
Posted 6 days ago
6.0 - 9.0 years
15 - 30 Lacs
Chennai
Work from Office
Roles and Responsibilities: Design, develop, test, deploy and maintain scalable Python applications using the FastAPI framework. Collaborate with cross-functional teams to identify requirements and implement solutions that meet business needs. Develop unit tests for the microservices architecture using PyTest and ensure high-quality code delivery through continuous integration pipelines. Troubleshoot issues in production environments on the AWS cloud platform. Job Requirements: 6-9 years of experience in software development with expertise in the Python programming language. Strong understanding of the FastAPI framework for building web services. Experience with AWS Lambda functions for serverless computing and API Gateway for building RESTful APIs. Proficiency in writing unit tests using PyTest.
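A short sketch matching the stack named here: a FastAPI endpoint with a PyTest-style unit test; the route and payload are illustrative.

```python
# FastAPI app plus a unit test using the built-in TestClient.
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()

@app.get("/health")
def health():
    return {"status": "ok"}

client = TestClient(app)

def test_health():
    response = client.get("/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}
```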
Posted 1 week ago
2.0 - 5.0 years
4 - 7 Lacs
Hyderabad, Bengaluru
Hybrid
Key Skills & Responsibilities: Hands-on experience with AWS services: S3, Lambda, Glue, API Gateway, and SQS. Strong data engineering expertise on AWS, with proficiency in Python, PySpark, and SQL. Experience in batch job scheduling and managing data dependencies across pipelines. Familiarity with data processing tools such as Apache Spark and Airflow. Ability to automate repetitive tasks and build reusable frameworks for improved efficiency. Provide RunOps/DevOps support, and manage the ongoing operation and monitoring of data services. Ensure high performance, scalability, and reliability of data workflows in cloud environments. Skills: AWS, S3, Lambda, Glue, API Gateway, SQS, Apache Spark, Airflow, PySpark, Python, SQL, DevOps support.
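For context, a skeleton AWS Glue PySpark job of the sort this role maintains; the catalog database, table, and S3 path are placeholders.

```python
# Standard Glue job boilerplate: read from the Data Catalog, dedupe,
# write Parquet to S3.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

dyf = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_events"  # placeholders
)
df = dyf.toDF().dropDuplicates(["event_id"])
df.write.mode("append").parquet("s3://example-curated/events/")

job.commit()
```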
Posted 1 week ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary... Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Creative problem-solving skills and superb communication skills. Should have worked on at least 3 engagements modernizing client applications with container-based solutions. Should be an expert in one of the programming languages like Java, .NET... Preferred technical and professional experience: Experience in distributed/scalable systems. Knowledge of standard tools for optimizing and testing code. Knowledge/experience of the development/build/deploy/test life cycle.
Posted 1 week ago
5.0 - 10.0 years
9 - 18 Lacs
Mumbai
Work from Office
About the Role: We are hiring a Backend Developer with strong expertise in Node.js (Express) and AWS serverless infrastructure to join our high-impact mobile application development team. This is a hands-on development role where you will be responsible for architecting and implementing robust APIs, managing integrations with wearables and third-party health data platforms, and enabling AI/ML-powered features through seamless data pipelines. You will be a key player in delivering a cutting-edge mobile solution that helps users monitor and improve their health while also managing carers and accessing essential community services, all in one place. Key Responsibilities: Design, build, and maintain scalable and secure RESTful APIs using Node.js (Express) hosted on AWS Lambda or EC2. Integrate data ingestion from smart devices (Apple HealthKit, Google Fit, Fitbit, etc.), healthcare platforms, and manual user inputs. Work with AWS services including Lambda, API Gateway, S3, Cognito, and CloudWatch to develop and deploy backend infrastructure. Implement real-time data sync, user authentication, and role-based access control. Develop and manage database schemas using Amazon Aurora (PostgreSQL) or DynamoDB for optimal performance and scalability. Collaborate with the AI/ML team to enable integration with Amazon SageMaker for predictive analytics and personalised health insights. Support OCR integrations using Amazon Textract and Rekognition to process uploaded health documents. Ensure high levels of data security and compliance, especially for sensitive healthcare data. Work closely with the Flutter frontend developers, Technical Architect, and DevOps team to ensure seamless integration and deployment. Participate in code reviews, maintain documentation, and contribute to continuous integration and deployment (CI/CD) pipelines. Initial Phase Deliverables: Phase 1: API integrations for wearable data, manual data input, and external healthcare sources. Core backend for user dashboard, carer management, and service booking. Emergency assistance backend and real-time alerts. Secure deployment using AWS infrastructure. Phase 2: AI/ML integration for intelligent insights and recommendations. Life Expectancy Indicator logic and data aggregation. OCR pipeline and document management features. Ongoing scalability and performance enhancements. Required Skills & Experience: 3-5 years of experience in Node.js (Express) backend development. Solid understanding of AWS architecture and services like Lambda, API Gateway, S3, CloudWatch, and EC2. Strong experience with REST APIs, serverless frameworks, and API security (OAuth2/JWT). Familiarity with PostgreSQL (Amazon Aurora) or DynamoDB database design and optimisation. Hands-on experience integrating third-party APIs, especially health-related SDKs and platforms. Familiarity with mobile-first backend development, working closely with Flutter or mobile frontend teams. Excellent debugging, performance tuning, and documentation skills. Strong understanding of asynchronous programming, error handling, and data validation. Nice to Have: Experience working with OCR (Textract, Rekognition) or AI/ML services (SageMaker). Prior work in healthcare tech, caregiver platforms, or IoT integrations. Exposure to WebSockets, push notifications, and real-time service booking systems. Knowledge of DevOps tools for deployment, CI/CD, and monitoring (GitHub Actions, CodePipeline, etc.).
AWS Certification (Developer Associate / Solutions Architect) What We Offer: Opportunity to work on an innovative health-focused mobile platform impacting real lives A dynamic and collaborative startup environment Competitive salary and performance-based incentives Growth opportunities into DevOps, AI integration, or technical leadership Work with cutting-edge AWS technology stack and AI-driven architecture How to Apply: Send your resume and a short cover letter to careers@a1disabilitysupportcare.com.au Subject: Backend Developer – Mobile Health App – Mumbai (Very Important: You NEED TO copy-paste this subject above in the subject field of the email)
Posted 1 week ago
7.0 - 10.0 years
8 - 15 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Mandatory Skills: AWS, Databricks. Detailed job description - Skill Set: Looking for a 10+ years, highly experienced and deeply hands-on Data Architect to lead the design, build, and optimization of our data platforms on AWS and Databricks. This role requires a strong blend of architectural vision and direct implementation expertise, ensuring scalable, secure, and performant data solutions from concept to production. Strong hands-on experience in data engineering/architecture, hands-on architectural and implementation experience on AWS and Databricks, and schema modeling. AWS: Deep hands-on expertise with key AWS data services and infrastructure. Databricks: Expert-level hands-on development with Databricks (Spark SQL, PySpark), Delta Lake, and Unity Catalog. Coding: Exceptional proficiency in Python, PySpark, Spark, AWS services and SQL. Architectural: Strong data modeling and architectural design skills with a focus on practical implementation. Preferred: AWS/Databricks certifications, experience with streaming technologies, and other data tools. Design & Build: Lead and personally execute the design, development, and deployment of complex data architectures and pipelines on AWS (S3, Glue, Lambda, Redshift, etc.) and Databricks (PySpark/Spark SQL, Delta Lake, Unity Catalog). Databricks Expertise: Own the hands-on development, optimization, and performance tuning of Databricks jobs, clusters, and notebooks.
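As one hands-on example of routine Delta Lake maintenance an architect in this role would automate on Databricks, a hedged sketch; the three-part Unity Catalog table name is a placeholder.

```python
# Compaction plus data-skipping optimization, then cleanup of stale files.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided implicitly on Databricks

spark.sql("OPTIMIZE example_catalog.sales.orders ZORDER BY (customer_id)")
spark.sql("VACUUM example_catalog.sales.orders RETAIN 168 HOURS")  # 7-day retention
```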
Posted 1 week ago
7.0 - 10.0 years
8 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Mandatory Skills: AWS, Python. Detailed job description - Skill Set: Hands-on experience with programming languages, Python being mandatory. Thorough understanding of AWS from a data engineering and tools standpoint; experience in another cloud is also beneficial. Experience in AWS Glue, Spark, and Python with Airflow for designing and developing data pipelines. Expertise in Informatica Cloud is advantageous. Data Modeling: Advanced/intermediate data modeling skills (Master/Ref/ODS/DW/DM) to enable analytics on the platform. Traditional data warehousing and ETL skill set, including strong SQL and PL/SQL skills. Experience with inbound and outbound integrations on the cloud platform. Design and development of Data APIs (Python, Flask/FastAPI) to expose data on the platform. Partner with SA to identify data inputs and related data sources, review sample data, identify gaps, and perform quality checks. Experience loading and querying cloud-hosted databases like Redshift, Snowflake, and BigQuery. Preferred: Knowledge of system-to-system integration, messaging/queuing, and managed file transfer. Preferred: Building and maintaining REST APIs, ensuring security and scalability. Preferred: DevOps/DataOps experience with Infrastructure as Code and setting up CI/CD pipelines. Preferred: Building real-time streaming data ingestion.
Posted 1 week ago
6.0 - 9.0 years
10 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Mandatory skills: AWS, Kafka, ETL, Glue, Lambda. Tech stack experience required: Python, SQL.
Posted 1 week ago