6.0 - 10.0 years
30 - 35 Lacs
Hyderabad, Coimbatore, Bengaluru
Work from Office
Job Overview: We are seeking an experienced Senior Developer with strong expertise in Python, Node.js, and Azure, and a proven track record of migrating cloud applications from AWS to Azure. This role requires hands-on experience in PySpark, Databricks, and Azure data services such as ADF and Synapse Spark. The ideal candidate will lead end-to-end modernization and migration initiatives, code remediation, and deployment of serverless and microservices-based applications in Azure.

Key Responsibilities:
- Lead the migration of Python and Node.js applications from AWS to Azure.
- Analyze legacy AWS architecture, source code, and cloud service dependencies to identify and implement code refactoring and remediation.
- Develop and modernize applications using PySpark (Python API), Databricks, ADF Mapping Data Flows, and Synapse Spark.
- Implement and deploy serverless solutions using Azure Functions, replacing AWS Lambda where applicable.
- Handle migration of storage and data connectors (e.g., S3 to Azure Blob, Confluent Kafka AWS S3 Sink Connector).
- Convert AWS SDK usage to corresponding Azure SDK implementations (a sketch of this kind of conversion follows below).
- Design and implement CI/CD pipelines, deployment scripts, and configuration for containerized applications using Kubernetes, Helm charts, App Services, APIM, and AKS.
- Perform unit testing, application troubleshooting, and support within Azure environments.

Technical Skills:
Must-Have:
- Python and Node.js development (8+ years total experience)
- PySpark (Python API)
- Azure Functions, AKS, App Services, Azure Blob Storage
- AWS Lambda to Azure Functions migration (serverless architecture)
- AWS to Azure SDK conversion
- ADF (Azure Data Factory): Mapping Data Flows
- Synapse Spark, Azure Databricks
- Containerization: Docker, Kubernetes, Helm charts
- CI/CD pipelines and deployment scripting
- Unit testing and application debugging on Azure
- Proven AWS to Azure application migration experience

Nice-to-Have:
- Confluent Kafka AWS S3 Sink Connector
- APIM (Azure API Management)
- Experience working with both PaaS and serverless Azure infrastructures

Tech Stack Highlights:
- Programming: Python, Node.js, PySpark
- Cloud Platforms: AWS, Azure
- Data Services: Azure Blob Storage, ADF, Synapse Spark, Databricks
- Serverless: AWS Lambda, Azure Functions
- Migration Tools: AWS SDK to Azure SDK conversion
- DevOps: CI/CD, Azure DevOps, Helm, Kubernetes
- Other: App Services, APIM, Confluent Kafka

Location: Hyderabad / Bangalore / Coimbatore / Pune
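As a hedged illustration of the SDK-conversion work this role describes (not the employer's actual codebase), the sketch below migrates a simple boto3 S3 upload to the Azure Blob Storage SDK. Bucket, container, and connection-string names are hypothetical.

```python
# Hypothetical before/after sketch of an AWS-to-Azure SDK conversion,
# the kind of code remediation this role describes. Names are illustrative.

# --- Before: AWS (boto3) ---
import boto3

def upload_report_aws(path: str) -> None:
    s3 = boto3.client("s3")
    s3.upload_file(path, "reports-bucket", "daily/report.csv")

# --- After: Azure (azure-storage-blob) ---
from azure.storage.blob import BlobServiceClient

def upload_report_azure(path: str, conn_str: str) -> None:
    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container="reports", blob="daily/report.csv")
    with open(path, "rb") as data:
        blob.upload_blob(data, overwrite=True)
```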
Posted 1 week ago
4.0 - 8.0 years
20 - 32 Lacs
Hyderabad, Gurugram
Work from Office
- Designing, developing & deploying cloud-based data platforms using AWS
- Integrating & processing structured & unstructured data from various sources
- Troubleshooting data platform issues
WhatsApp ANUJ (8249759636) for more details.
Posted 1 week ago
10.0 - 14.0 years
15 - 20 Lacs
Mohali, Kharar, S.A.S. Nagar
Work from Office
Role & responsibilities
We are looking for a highly experienced Software Architect with expertise in Node.js, Python, PHP-Laravel, and React.js to design and implement enterprise-grade, high-performance applications. The ideal candidate will be responsible for defining architectural solutions, analyzing the existing code base, and helping the team optimize the product. They will be accountable for ensuring scalability and optimizing systems to support high-intensity user loads.

Key Responsibilities
- Define and develop scalable, high-availability system architectures for enterprise applications.
- Break down functional requirements into technical components.
- Define data models and system interactions.
- Present the technical solutions to be implemented to the client.
- Identify potential challenges and risks, along with mitigation strategies.
- Optimize backend and frontend components for low-latency, high-throughput performance.
- Develop strategies for efficient data processing, storage, and retrieval to handle large datasets.
- Identify and resolve bottlenecks in application logic, database queries, and API response times.
- Architect solutions that support multi-system integration, ensuring seamless data exchange.
- Implement high-performance caching strategies (Redis, Memcached) and database optimizations.
- Work closely with software engineers, DevOps teams, and business stakeholders to align technical solutions with business needs.
- Provide technical mentorship to developers and conduct code and architecture reviews.
- Define best practices and standards for software development, testing, and deployment.

Preferred Qualifications
- Backend: PHP-Laravel, Node.js, Python
- Frontend: React.js, Angular
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra, Elasticsearch
- Messaging & Streaming: Kafka, RabbitMQ, Pulsar
- Cloud & DevOps: AWS (Lambda, S3, RDS), Kubernetes, Docker
- Security & Compliance: OAuth 2.0, JWT, GDPR

Experience in:
- Building enterprise-grade, high-performance applications with large user bases.
- Designing microservices, event-driven architectures, and serverless solutions.
- Previous work on similar enterprise products and experience handling a team of 20 developers.
- Knowledge of new technologies.
- B.Tech/BE in CSE from a good-tier college/university.
- Previous work in a similar role, with extensive experience on complex products.
- Passionate about technology and innovation, continuously exploring new trends and contributing to the development of cutting-edge solutions.
- Driven by a deep passion for software architecture and system design, continuously improving and implementing state-of-the-art solutions in complex environments.
Posted 1 week ago
3.0 - 5.0 years
0 - 3 Lacs
Bengaluru
Work from Office
Mid-Level Python Engineer (Pipeline, Elasticsearch, DynamoDB)
Location: Bangalore
Experience: 3+ years
Designation: Member of Technical Staff

There's the typical job. Then there's a career at Alphastream. Where we'll challenge you to defy routine. To explore the far reaches of the possible. To travel uncharted paths. And to be a part of something far bigger than yourself. Because around here, changing the world just comes with the job description.

Job Summary: We are looking for a Mid-Level Python Engineer with 3-5 years of experience to support and enhance our existing data pipelines and backend systems. The ideal candidate will have strong hands-on expertise in Python, Elasticsearch/OpenSearch, and DynamoDB, and will be responsible for maintaining and scaling our ETL workflows and search infrastructure. You should be comfortable working in a fast-paced environment, supporting production systems, and collaborating with cross-functional teams to build reliable and scalable data solutions.

Responsibilities:
Pipeline Development & Maintenance
- Build, extend, and maintain Python-based data ingestion and transformation pipelines (Airflow, AWS Lambda, or similar).
- Ensure pipelines are performant, well-tested, and monitored end-to-end.
DynamoDB Engineering
- Design and optimize table schemas, GSIs, and data access patterns (a sketch of this kind of work follows below).
- Implement CRUD operations and batch processes using the AWS SDK (boto3).
- Monitor capacity, handle throttling, and optimize cost/performance trade-offs.
Elasticsearch/Search Engineering
- Define index mappings, analyzers, and ingest pipelines.
- Write and optimize search queries (DSL, aggregations).
- Implement relevance tuning and performance optimizations.
- Integrate with Python applications (e.g., using the official Elasticsearch client).
Production Support & Monitoring
- Troubleshoot pipeline failures, search performance issues, and data inconsistencies.
- Instrument services with logging, metrics (CloudWatch, Prometheus), and alerting.
- Drive continuous improvement: automate manual tasks, improve runbooks, and share learnings.
Collaboration & Documentation
- Work closely with cross-functional teams to gather requirements and iterate on solutions.
- Write clear, concise documentation for pipeline workflows, data models, and search configurations.

Requirements:
Experience:
- 3-5 years of professional software engineering experience.
- Minimum 2 years working with Python in production environments.
Technical Skills:
- Python: Strong skills in core language features, packaging, virtual environments, and testing frameworks (pytest/unittest).
- DynamoDB: Design, operation, performance tuning, and AWS SDK (boto3).
- Elasticsearch/OpenSearch: Index design, query DSL, performance tuning, and Python client integration.
- AWS: Familiarity with AWS services (Lambda, S3, IAM, CloudWatch).
- ETL/Orchestration: Experience with batch and streaming pipelines (Airflow, AWS Glue, Lambda + Kinesis, etc.).
Soft Skills:
- Strong problem-solving and debugging skills.
- Clear verbal and written communication.
- Self-starter who can work independently and collaboratively.

Benefits:
- Competitive salary and benefits package.
- Opportunities for professional growth and career development.
- Dynamic and collaborative work environment.
- Cutting-edge technologies and projects.

If you are a talented Mid-Level Python Engineer looking for an exciting opportunity to make a difference, we'd love to hear from you!
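As flagged in the DynamoDB responsibilities above, here is a minimal, hedged sketch of the access patterns described, using boto3. The table name, attributes, and GSI name are hypothetical, not Alphastream's actual schema.

```python
# Minimal sketch of the DynamoDB access patterns this role covers,
# using boto3. Table, attribute, and index names are hypothetical.
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("documents")

def put_document(doc_id: str, source: str, title: str) -> None:
    # Simple CRUD write; "source" backs a hypothetical GSI for lookups by origin.
    table.put_item(Item={"doc_id": doc_id, "source": source, "title": title})

def documents_by_source(source: str) -> list[dict]:
    # Query a hypothetical GSI instead of scanning the whole table.
    resp = table.query(
        IndexName="source-index",
        KeyConditionExpression=Key("source").eq(source),
    )
    return resp["Items"]
```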
Apply now to join our team.

Attitude:
- Fail-fast mentality and drive to succeed based on the learnings
- Collaboration with business stakeholders, Product Management, Business Analysts, Development (UI & Engineering) teams, domain experts, and Senior Financial Analysts
- Quick learner with minimal guidance

Good to have:
- Good interpersonal skills
- Finance background
- Preferably from a data-driven company

Who We Are
Alphastream.ai envisions a dynamic future for the financial world, where innovation is propelled by state-of-the-art AI technology and enriched by a profound understanding of credit and fixed-income research. Our mission is to empower asset managers, research firms, hedge funds, banks, and investors with smarter, faster, and curated data. We provide accurate, timely information, analytics, and tools across simple to complex financial and non-financial data, enhancing decision-making. With a focus on bonds, loans, financials, and sustainability, we offer near real-time data via APIs and PaaS (Platform as a Service) solutions that act as the bridge between our offerings and seamless workflow integration. To learn more about us: https://alphastream.ai/

What we offer
"At Alphastream.ai we offer a dynamic and inclusive workplace where your skills are valued and your career can flourish. Enjoy competitive compensation, a comprehensive benefits package, and opportunities for professional growth. Immerse yourself in an innovative work environment, maintain a healthy work-life balance, and contribute to a diverse and inclusive culture. Join us to work with cutting-edge technology, and be part of a team that recognizes and rewards your achievements, all while fostering a fun and engaging workplace culture."

Disclaimer: Alphastream.ai is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross-section of all communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents.
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
- Experience in modernizing applications to container-based platforms using EKS, ECS, Fargate
- Proven experience using DevOps tools during modernization
- Solid experience with NoSQL databases
- Should have used an orchestration engine like Kubernetes or Mesos
- Java 8, Spring Boot, SQL, Postgres DB, and AWS
Secondary Skills: React, Redux, JavaScript
- Expert-level knowledge of AWS deployment services (AWS Elastic Beanstalk, AWS tools & SDK, AWS Cloud9, AWS CodeStar, AWS Command Line Interface, etc.) and hands-on experience with AWS ECS, AWS ECR, AWS EKS, AWS Fargate, AWS Lambda functions, ElastiCache, S3 objects, API Gateway, AWS CloudWatch, and AWS SNS

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Creative problem-solving skills and superb communication skills
- Should have worked on at least 3 engagements modernizing client applications to container-based solutions
- Should be an expert in at least one programming language such as Java, .NET, Node.js, Python, Ruby, or Angular.js

Preferred technical and professional experience:
- Experience in distributed/scalable systems
- Knowledge of standard tools for optimizing and testing code
- Knowledge/experience of the Development/Build/Deploy/Test life cycle
Posted 1 week ago
8.0 - 12.0 years
25 - 40 Lacs
Hyderabad
Remote
Job Title: Senior Backend Developer - JavaScript & Node.js
Location: Remote
Job Type: Full-time
Role: Individual Contributor
Experience: Minimum 8+ years

Key Responsibilities:
- Develop, maintain, and optimize backend services using Node.js and JavaScript.
- Architect and deploy applications using AWS Lambda and the Serverless Framework.
- Ensure efficient integration of AWS services such as Cognito, DynamoDB, RDS, ECS, ECR, EC2, and IAM.
- Implement and manage containerized environments using Docker.
- Collaborate with cross-functional teams to ensure seamless application performance.
- Design and optimize database interactions, ensuring high availability and performance.
- Troubleshoot and resolve technical issues related to backend services.
- Implement best security practices for cloud-based applications.

Required Skills & Experience:
- Strong expertise in Node.js & JavaScript.
- Deep understanding of AWS Lambda and the Serverless Framework.
- Hands-on experience with Docker and container orchestration tools.
- Proven ability to work with AWS services (Cognito, DynamoDB, RDS, ECS, ECR, EC2, IAM).
- Strong knowledge of RESTful APIs and microservices architecture.
- Hands-on experience writing SQL.
- Experience with CI/CD pipelines for efficient deployment.
- Ability to optimize backend performance and scalability.
- Solid understanding of security and compliance in cloud environments.

Preferred Qualifications:
- Experience with monitoring and logging tools (AWS CloudWatch, AWS X-Ray).
- Familiarity with Terraform or Infrastructure-as-Code (IaC) concepts.
- Previous experience with high-traffic applications and scalable systems.

Interested candidates can share their resume (or refer a friend) to Pavithra.tr@enabledata.com for a quick response.
Posted 1 week ago
2.0 - 6.0 years
7 - 17 Lacs
Noida
Hybrid
Must have a minimum of 2 years' working experience in: Vue.js, JavaScript, TypeScript
Overall 2-7 years of experience in software development working in a Linux environment
- HTML, CSS, Bootstrap
- Tools: Jira, Confluence, Git, IntelliJ, OSX
- JSON, Agile development

Good to have knowledge of:
- React or Angular
- Node.js
- MySQL, MongoDB, or DynamoDB
- RESTful services
- AWS cloud services

Role & responsibilities: Code - Eat - Repeat

Other Details:
- Excellent logical and problem-solving skills
- BE/B.Tech/BCA/MCA from a recognized institute with a good academic score
- Mandatory retention period: 24 months for candidates with up to 5 years' experience / 18 months for candidates with 5+ years' experience
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Hyderabad
Work from Office
- Experience in modernizing applications to container-based platforms using EKS, ECS, Fargate
- Proven experience using DevOps tools during modernization
- Solid experience with NoSQL databases
- Should have used an orchestration engine like Kubernetes or Mesos
- Java 8, Spring Boot, SQL, Postgres DB, and AWS
Secondary Skills: React, Redux, JavaScript
- Expert-level knowledge of AWS deployment services (AWS Elastic Beanstalk, AWS tools & SDK, AWS Cloud9, AWS CodeStar, AWS Command Line Interface, etc.) and hands-on experience with AWS ECS, AWS ECR, AWS EKS, AWS Fargate, AWS Lambda functions, ElastiCache, S3 objects, API Gateway, AWS CloudWatch, and AWS SNS

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Creative problem-solving skills and superb communication skills
- Should have worked on at least 3 engagements modernizing client applications to container-based solutions
- Should be an expert in at least one programming language such as Java, .NET, Node.js, Python, Ruby, or Angular.js

Preferred technical and professional experience:
- Experience in distributed/scalable systems
- Knowledge of standard tools for optimizing and testing code
- Knowledge/experience of the Development/Build/Deploy/Test life cycle
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Pune, Gurugram
Work from Office
In one sentence
We are seeking a highly skilled and adaptable Senior Python Developer to join our fast-paced and dynamic team. The ideal candidate is a hands-on technologist with deep expertise in Python and a strong background in data engineering, cloud platforms, and modern development practices. You will play a key role in building scalable, high-performance applications and data pipelines that power critical business functions. You will be instrumental in designing and developing high-performance data pipelines from relational to graph databases, and in leveraging Agentic AI for orchestration. You'll also define APIs using AWS Lambda and containerised services on AWS ECS. Join us on an exciting journey where you'll work with cutting-edge technologies including Generative AI, Agentic AI, and modern cloud-native architectures, while continuously learning and growing alongside a passionate team.

What will your job look like?
Key Attributes: Adaptability & Agility
- Thrive in a fast-paced, ever-evolving environment with shifting priorities.
- Demonstrated ability to quickly learn and integrate new technologies and frameworks.
- Strong problem-solving mindset with the ability to juggle multiple priorities effectively.

Core Responsibilities
- Design, develop, test, and maintain robust Python applications and data pipelines using Python/PySpark.
- Define and implement smart data pipelines from RDBMS to graph databases.
- Build and expose APIs using AWS Lambda and ECS-based microservices (a minimal sketch follows below).
- Collaborate with cross-functional teams to define, design, and deliver new features.
- Write clean, efficient, and scalable code following best practices.
- Troubleshoot, debug, and optimise applications for performance and reliability.
- Contribute to the setup and maintenance of CI/CD pipelines and deployment workflows if required.
- Ensure security, compliance, and observability across all development activities.

All you need is...
Required Skills & Experience
- Expert-level proficiency in Python with a strong grasp of object-oriented and functional programming.
- Solid experience with SQL and graph databases (e.g., Neo4j, Amazon Neptune).
- Hands-on experience with cloud platforms; AWS and/or Azure is a must.
- Proficiency in PySpark or similar data ingestion and processing frameworks.
- Familiarity with DevOps tools such as Docker, Kubernetes, Jenkins, and Git.
- Strong understanding of CI/CD, version control, and agile development practices.
- Excellent communication and collaboration skills.

Desirable Skills
- Experience with Agentic AI, machine learning, or LLM-based systems.
- Familiarity with Apache Iceberg or similar modern data lakehouse formats.
- Knowledge of Infrastructure as Code (IaC) tools like Terraform or Ansible.
- Understanding of microservices architecture and distributed systems.
- Exposure to observability tools (e.g., Prometheus, Grafana, ELK stack).
- Experience working in Agile/Scrum environments.

Minimum Qualifications
- 6 to 8 years of hands-on experience in Python development and data engineering.
- Demonstrated success in delivering production-grade software and scalable data solutions.
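To make the "expose APIs using AWS Lambda" responsibility above concrete, here is a minimal, hedged sketch of a Python Lambda handler behind API Gateway; the route and payload shape are hypothetical, not this employer's actual API.

```python
# Minimal sketch of a Python AWS Lambda handler exposed via API Gateway.
# The query parameter and response payload are hypothetical.
import json

def handler(event: dict, context) -> dict:
    # API Gateway (HTTP API) passes query-string details in the event.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```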
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Senior SAP BTP DevOps Engineer.

Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange – discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore.

You'll make a difference by:
- Designing and implementing CI/CD pipelines using GitLab for SAP BTP and CAP applications
- Establishing infrastructure-as-code practices using Terraform/Terragrunt
- Automating deployment processes, ensuring zero-downtime deployments
- Managing and optimizing cloud infrastructure on SAP BTP and AWS
- Implementing monitoring, logging, and alerting solutions
- Providing technical leadership in DevOps best practices
- Collaborating with development teams to improve delivery processes
- Collaborating with infrastructure teams on automation of infrastructure provisioning
- Collaborating with product teams to develop, maintain, and create new processes, procedures, and concepts
- Maintaining tools and technologies utilized in DevOps processes
- Troubleshooting DevOps systems and solving problems across platforms and application domains
- Suggesting architectural, procedural, and systematic improvements based on empirical evidence
- Taking strong initiative and being highly result-oriented

Job / Skills:
- 5-8 years of professional experience in software development and DevOps
- Strong expertise in: SAP BTP administration and deployment, GitLab CI/CD pipelines, Terraform/Terragrunt, Infrastructure as Code (IaC), cloud platforms (SAP BTP, AWS), PowerShell and CLI tools, REST APIs, monitoring and logging tools
- Experience in implementing and operating AWS solutions and services with high availability, scalability, and performance
- Experience with building, deploying, and configuring SAP BTP, Python, Java, and Angular applications
- Experience with CI/CD tools that build, package, and deploy applications (e.g., GitLab, Jenkins, Octopus Deploy, NuGet, Sonar)
- Experience with infrastructure automation tools preferred (e.g., Ansible, Terraform, PowerShell DSC)
- Experience in architecting serverless applications with AWS Lambda and Python
- Experience administering Windows Servers in production environments
- Excellent command of English in written and spoken communication, and strong presentation skills
- Experience in Jira, Confluence, or any other ALM tool will be an added advantage
- Good at communicating within the team as well as with all the stakeholders
- Strong customer focus and a good learner; highly proactive team player

Create a better #TomorrowWithUs! This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries – and the craft of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time, in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow.
At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds
Posted 1 week ago
10.0 - 13.0 years
25 - 40 Lacs
Gurugram
Work from Office
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

REQUIREMENTS:
- Total experience: 10+ years.
- Strong working experience in Big Data engineering.
- Expertise in AWS Glue (including Crawlers and the Data Catalog).
- Hands-on experience in Python and PySpark for data engineering tasks.
- Strong working experience with Terraform and/or CloudFormation.
- Strong experience with Snowflake, including data loading, transformations, and querying.
- Expertise in CI/CD pipelines, preferably using GitHub Actions.
- Strong working knowledge of AWS services like S3, SNS, Secrets Manager, Athena, and Lambda.
- Familiarity with JIRA and GitHub for Agile project management and version control.
- Excellent problem-solving skills and ability to resolve complex functional issues independently.
- Strong documentation skills, including creation of configuration guides, test scripts, and user manuals.

RESPONSIBILITIES:
- Writing and reviewing great quality code.
- Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements.
- Mapping decisions with requirements and translating the same to developers.
- Identifying different solutions and narrowing down the best option that meets the clients' requirements.
- Defining guidelines and benchmarks for NFR considerations during project implementation.
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it.
- Understanding and relating technology integration scenarios and applying these learnings in projects.
Posted 1 week ago
5.0 - 7.0 years
7 - 9 Lacs
Hyderabad, Ahmedabad
Work from Office
- 5-7+ years of experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Create and manage cloud resources in AWS
- Data ingestion from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data based on various proprietary systems
- Implement data ingestion and processing with the help of Big Data technologies
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie/Airflow, Amazon Web Services (AWS), Docker/Kubernetes, Snowflake
- Data processing/transformation using various technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it using the language supported by the base data platform (a PySpark sketch follows below)
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of the calculations
- Develop an infrastructure to collect, transform, combine, and publish/distribute customer data
- Define process improvement opportunities to optimize data collection, insights, and displays
- Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports, and presenting findings
- Mentor junior members and bring best industry practices
- Sound knowledge of AWS Glue, AWS Lambda, Python, PySpark
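As a hedged illustration of the Spark-based transformation work listed above (file paths and column names are hypothetical, not from the posting), a minimal PySpark job might look like this:

```python
# Minimal PySpark sketch: read raw loan records, aggregate, write Parquet.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("loan-aggregation").getOrCreate()

loans = spark.read.option("header", True).csv("s3://raw-bucket/loans/*.csv")

# Basic data-quality gate: drop rows missing required fields.
clean = loans.dropna(subset=["loan_id", "balance"])

summary = (
    clean.groupBy("product_type")
    .agg(
        F.count("loan_id").alias("n_loans"),
        F.sum(F.col("balance").cast("double")).alias("total_balance"),
    )
)

summary.write.mode("overwrite").parquet("s3://curated-bucket/loan_summary/")
```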
Posted 1 week ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Must have:
- 5+ years of experience in designing, developing, and deploying AI/ML solutions, with at least 3+ years focused on AWS AI/ML services.
- Deep hands-on experience with Amazon SageMaker for building, training, tuning, and deploying ML models.
- Proven ability to work with AWS data services, such as: Amazon S3 (data storage), AWS Glue or AWS Data Wrangler (data processing), Amazon Athena or Redshift (querying/analytics).
- Familiarity with AWS AI services, like: Amazon Rekognition (computer vision), Amazon Comprehend (NLP), Amazon Transcribe/Polly (speech), Amazon Lex (chatbots).
- Experience building end-to-end ML pipelines using AWS-native tools or integrating with tools like Step Functions, Lambda, and CloudWatch for automation and monitoring.
- Solid understanding of model versioning, deployment strategies (real-time, batch, A/B testing), and model monitoring on AWS (a sketch of invoking a deployed endpoint follows below).
- Proficiency in Python for ML model development and deployment.

Good to have:
- Hands-on experience with MLOps practices using AWS tools (e.g., SageMaker Pipelines, Model Registry, CodePipeline, CloudFormation).
- Familiarity with data lake architecture and tools like AWS Lake Formation.
- AWS certifications (e.g., AWS Certified Machine Learning - Specialty, Solutions Architect - Associate/Professional).
- Experience with application performance tuning.
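As a small, hedged illustration of the real-time deployment strategy mentioned above, this sketch invokes an already-deployed SageMaker endpoint via boto3; the endpoint name and payload schema are hypothetical.

```python
# Invoke a deployed SageMaker endpoint for real-time inference via boto3.
# The endpoint name and payload schema are hypothetical.
import json
import boto3

runtime = boto3.client("sagemaker-runtime")

def predict(features: list[float]) -> dict:
    resp = runtime.invoke_endpoint(
        EndpointName="churn-model-prod",  # hypothetical endpoint
        ContentType="application/json",
        Body=json.dumps({"instances": [features]}),
    )
    return json.loads(resp["Body"].read())

print(predict([0.2, 1.5, 3.0]))
```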
Posted 1 week ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
Requirements
- Back-end and API development experience in Python
- Front-end experience with React.js
- Experience with AWS technologies (RDS, Lambda, DynamoDB)
- Solid experience with SQL and NoSQL databases
- Familiarity with Agile methodologies (Scrum)
- Experience working on Linux-based platforms
- Good spoken English

Nice to have
- Experience with Node.js or other programming languages
- Experience with Terraform
Posted 1 week ago
6.0 - 8.0 years
8 - 11 Lacs
Hyderabad
Work from Office
What you will do
In this vital role, we are seeking a highly skilled and hands-on Senior Software Engineer - Search to drive the development of intelligent, scalable search systems across our pharmaceutical organization. You'll work at the intersection of software engineering, AI, and life sciences to enable seamless access to structured and unstructured content spanning research papers, clinical trial data, regulatory documents, and internal scientific knowledge. This is a high-impact role where your code directly accelerates innovation and decision-making in drug development and healthcare delivery.

- Design, implement, and optimize search services using technologies such as Elasticsearch, OpenSearch, Solr, or vector search frameworks (a search sketch follows after this posting's qualifications).
- Collaborate with data scientists and analysts to deliver data models and insights.
- Develop custom ranking algorithms, relevancy tuning, and semantic search capabilities tailored to scientific and medical content.
- Support the development of intelligent search features like query understanding, question answering, summarization, and entity recognition.
- Build and maintain robust, cloud-native APIs and backend services to support high-availability search infrastructure (e.g., AWS, GCP, Azure).
- Implement CI/CD pipelines, observability, and monitoring for production-grade search systems.
- Work closely with Product Owners and the Tech Architect.
- Enable indexing of both structured (e.g., clinical trial metadata) and unstructured (e.g., PDFs, research papers) content.
- Design & develop modern data management tools to curate our most important data sets, models, and processes, while identifying areas for process automation and further efficiencies.
- Expertise in programming languages such as Python, Java, React, TypeScript, or similar.
- Strong experience with data storage and processing technologies (e.g., Hadoop, Spark, Kafka, Airflow, SQL/NoSQL databases).
- Demonstrate strong initiative and ability to work with minimal supervision or direction.
- Strong experience with cloud infrastructure (AWS, Azure, or GCP) and infrastructure as code like Terraform.
- In-depth knowledge of relational and columnar SQL databases, including database design.
- Expertise in data warehousing concepts (e.g., star schema, entitlement implementations, SQL vs. NoSQL modeling, milestoning, indexing, partitioning).
- Experience in REST and/or GraphQL.
- Experience in creating Spark jobs for data transformation and aggregation.
- Experience with distributed, multi-tiered systems, algorithms, and relational databases.
- Possesses strong rapid prototyping skills and can quickly translate concepts into working code.
- Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software.
- Analyze and understand the functional and technical requirements of applications.
- Identify and resolve software bugs and performance issues.
- Work closely with multi-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.

Basic Qualifications:
- Degree in computer science & engineering preferred, with 6-8 years of software development experience.
- Proficient in Databricks, data engineering, Python, search algorithms using NLP/AI models, GCP cloud services, GraphQL.
- Hands-on experience with search technologies (Elasticsearch, Solr, OpenSearch, or Lucene).
- Hands-on experience with full-stack software development.
- Proficient in programming languages: Java, Python, Fast Python, Databricks/RDS, data engineering, S3 buckets, ETL, Hadoop, Spark, Airflow, AWS Lambda.
- Experience with data streaming frameworks (Apache Kafka, Flink).
- Experience with cloud platforms (AWS, Azure, Google Cloud) and related services (e.g., S3, Redshift, BigQuery, Databricks).
- Hands-on experience with various cloud services; understands the pros and cons of various cloud services under well-architected cloud design principles.
- Working knowledge of open-source tools such as AWS Lambda.
- Strong problem-solving and analytical skills; ability to learn quickly; excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience in Python, Java, React, Fast Python, TypeScript, JavaScript, CSS, HTML is desirable.
- Experienced with API integration, serverless, and microservices architecture.
- Experience in Databricks, PySpark, Spark, SQL, ETL, Kafka.
- Solid understanding of data governance, data security, and data quality best practices.
- Experience with unit testing, building, and debugging code.
- Experienced with the AWS/Azure platform, building and deploying code.
- Experience with vector databases for large language models, Databricks, or RDS.
- Experience with DevOps CI/CD build and deployment pipelines.
- Experience in Agile software development methodologies.
- Experience in end-to-end testing.
- Experience in additional modern database terminologies.

Good to Have Skills:
- Willingness to work on AI applications.
- Experience in MLOps, React, JavaScript, Java, GCP search engines.
- Experience with popular large language models.
- Experience with the LangChain or LlamaIndex framework for language models.
- Experience with prompt engineering and model fine-tuning.
- Knowledge of NLP techniques for text analysis and sentiment analysis.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global teams.
- High degree of initiative and self-motivation.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
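As referenced in the responsibilities above, here is a minimal, hedged sketch of the search-service work this posting describes, using the official Python Elasticsearch client (v8-style API); the index name, fields, and query are hypothetical.

```python
# Index a document and run a relevance-tuned search with the official
# Elasticsearch Python client. Index name and fields are hypothetical.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

es.index(index="papers", id="p1", document={
    "title": "Phase II trial results",
    "abstract": "Efficacy and safety outcomes of the candidate compound...",
})

resp = es.search(
    index="papers",
    query={
        "multi_match": {
            "query": "trial efficacy",
            "fields": ["title^2", "abstract"],  # boost title matches
        }
    },
)
for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```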
Posted 1 week ago
7.0 - 12.0 years
5 - 15 Lacs
Pune, Bengaluru
Work from Office
Role & responsibilities
- 7+ years of experience in application development using Python, AWS Lambda, microservices architecture, and PostgreSQL databases.
- Deep understanding of application development using advanced Python programming and object-oriented concepts & design patterns.
- ORM tools knowledge, preferably SQLAlchemy (a brief sketch follows below).
- GraphQL API experience.
- Working in a SCRUM team, using Git and branching mechanisms.
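As a hedged illustration of the ORM work named above (SQLAlchemy 2.0 style against PostgreSQL; the model, table, and connection string are hypothetical):

```python
# Minimal SQLAlchemy 2.0-style sketch: a mapped model plus one query.
# Connection string and table are hypothetical.
from sqlalchemy import String, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column

class Base(DeclarativeBase):
    pass

class Customer(Base):
    __tablename__ = "customers"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(100))

engine = create_engine("postgresql+psycopg2://user:pass@localhost/appdb")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Customer(name="Acme Ltd"))
    session.commit()
    for c in session.scalars(select(Customer).where(Customer.name.like("A%"))):
        print(c.id, c.name)
```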
Posted 1 week ago
12.0 - 18.0 years
35 - 45 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Job Summary
We are seeking an experienced Amazon Connect Architect with 12 to 15 years of experience to design, develop, and implement scalable and reliable cloud-based contact center solutions using Amazon Connect and AWS ecosystem services. You will play a key role in translating business needs into technical solutions and will lead implementation across clients or business units.

Key Responsibilities
- Architect and design contact center solutions using Amazon Connect and AWS services like Lambda, Lex, DynamoDB, S3, and CloudWatch (a Lambda sketch follows below).
- Lead the end-to-end implementation and configuration of Amazon Connect.
- Integrate Amazon Connect with CRMs (Salesforce, ServiceNow, etc.), ticketing systems, and third-party tools.
- Define call flows, IVR designs, routing profiles, and queue configurations.
- Implement Contact Lens, real-time metrics, and historical reporting.
- Collaborate with cross-functional teams (developers, business analysts, project managers).
- Create technical documentation, diagrams, and handoff materials.
- Stay updated on AWS best practices and new Amazon Connect features.
- Provide technical leadership and mentorship to development and support teams.

Required Skills
- Proven experience designing and deploying Amazon Connect solutions.
- Strong hands-on knowledge of AWS Lambda, IAM, S3, DynamoDB, Kinesis, and CloudFormation.
- Experience with Amazon Lex and AI/ML for voice bots.
- Proficiency in programming/scripting (JavaScript, Node.js).
- Familiarity with CRM integrations, especially Salesforce Service Cloud Voice.
- Understanding of telephony concepts (SIP, DID, ACD, IVR, CTI).
- Experience with CI/CD pipelines and version control (Git).
- Strong documentation and communication skills.

Preferred Skills
- AWS Certified Solutions Architect or Amazon Connect accreditation.
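To ground the Lambda-in-a-contact-flow pattern mentioned above, here is a hedged Python sketch of a function Amazon Connect could invoke from a contact flow; the lookup logic and attribute names are hypothetical. (Connect passes caller context under event["Details"] and expects the response to be a flat key/value map, which the flow can then read as external attributes.)

```python
# Sketch of an AWS Lambda function invoked from an Amazon Connect contact
# flow. The customer lookup here is hypothetical; a real flow might query
# DynamoDB instead of an in-memory dict.

def handler(event: dict, context) -> dict:
    contact = event["Details"]["ContactData"]
    phone = contact["CustomerEndpoint"]["Address"]  # caller's number, E.164

    known_customers = {"+911234567890": "Priya"}  # hypothetical lookup table
    name = known_customers.get(phone, "")

    # Amazon Connect expects a flat key/value map in the response.
    return {"customerName": name, "isKnownCustomer": str(bool(name)).lower()}
```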
Posted 1 week ago
8.0 - 13.0 years
22 - 30 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
LOCATION: PAN INDIA
Experience: 8+ years
Support Model: 24x7 rotational

Role Overview:
- Handle service delivery and ensure performance across all Amazon Connect support areas.
- Oversee overall support operations, enhancements, and system updates.
- Act as the primary escalation point for incidents.
- Manage SLAs and ensure service standards are met.
- Identify process gaps and implement improvements.
- Lead and mentor junior engineers.
- Maintain relationships with internal and external stakeholders.

Skills Required:
- Deep hands-on experience with Amazon Connect
- Strong knowledge of AWS Lambda, DynamoDB, S3
- In-depth understanding of contact flows, queues, routing profiles, quick connects, telephony config, and call routing
- Strong troubleshooting skills in WebRTC and voice issues
- Experience with CloudWatch, Connect Metrics, CI/CD pipelines
- Experience integrating with Salesforce (Service Cloud Voice)
- Good documentation and process improvement capability
- Strong leadership and communication skills
Posted 1 week ago
3.0 - 8.0 years
15 - 25 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
LOCATION: PAN INDIA

Experience: 3-5 years
Support Model: 24x7 rotational
Role Overview:
- Provide support on Amazon Connect-related incidents and user issues.
- Handle basic troubleshooting of voice, call routing, and UI-based configurations.
- Support change announcements and basic deployment activities.
- Coordinate with L2/L3 engineers for escalation.
- Maintain documentation and update the knowledge base.
Skills Required:
- Hands-on experience with Amazon Connect (basic flows, routing, and settings)
- Exposure to AWS Lambda, S3, DynamoDB
- Basic understanding of WebRTC and voice troubleshooting
- Familiarity with CloudWatch, Connect Metrics
- Willingness to learn Salesforce integration (Service Cloud Voice)
- Strong willingness to work in a support model and take ownership

Experience: 5-8 years
Support Model: 24x7 rotational
Role Overview:
- Provide L2-level support for Amazon Connect and associated AWS services.
- Address incidents and troubleshoot system or telephony-related issues.
- Support service delivery and ensure announced changes are implemented.
- Maintain SLAs and escalate where required.
- Contribute to documentation and improvement plans.
- Support deployment through the CI/CD pipeline.
Skills Required:
- Strong hands-on experience with Amazon Connect
- Working knowledge of Lambda, DynamoDB, S3
- Good understanding of call flows, routing, and WebRTC troubleshooting
- Familiarity with CloudWatch, Connect Metrics, CI/CD
- Exposure to Salesforce integration helpful (Service Cloud Voice)
- Ability to work independently on issue resolution
- Good communication and support handling
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Data Engineer (AWS + Python, Spark, Kafka for ETL)!

Responsibilities
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka (a Kafka-ingestion sketch follows at the end of this posting).
- Integrate structured and unstructured data from various data sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark, with appropriate cloud-based services like Amazon AWS.
- Build data pipelines by building ETL processes (Extract-Transform-Load).
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories at business meetings and strategize the impact of requirements on different platforms/applications; convert the business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability with improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionalities continue to work in the same way.
- Coordinate with release management and other supporting teams to deploy changes in the production environment.

Qualifications we seek in you!
Minimum Qualifications
- Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
- Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks will be an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills
- Master's Degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering & Cloud certifications, Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of Change & Incident Management processes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
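As promised in the responsibilities above, a hedged sketch of real-time ingestion: consume events from Kafka and land them in S3. The broker address, topic, and bucket are hypothetical, and confluent-kafka plus boto3 are assumed to be available.

```python
# Consume events from a Kafka topic and land raw batches in S3.
# Broker, topic, and bucket names are hypothetical.
import json

import boto3
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker:9092",
    "group.id": "etl-landing",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["transactions"])
s3 = boto3.client("s3")

batch, batch_no = [], 0
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        batch.append(json.loads(msg.value()))
        if len(batch) >= 500:  # flush a batch to the data lake
            s3.put_object(
                Bucket="raw-events",
                Key=f"transactions/batch-{batch_no:06d}.json",
                Body=json.dumps(batch),
            )
            batch, batch_no = [], batch_no + 1
finally:
    consumer.close()
```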
Posted 1 week ago
5.0 - 8.0 years
7 - 17 Lacs
Chennai
Work from Office
Hiring Now: Exciting Opportunity with a Leading MNC!
Location: Chennai
Position: .NET Fullstack Developer
Experience: 5-8 years
Joining: Immediate / 15 days preferred
Posted 1 week ago
8.0 - 12.0 years
20 - 30 Lacs
Chennai, Bengaluru
Hybrid
We are hiring a Sr. .NET Developer / Lead Python & AWS Developer with 8+ years' experience (Chennai), preferably joining within 15 days or immediately!

About us
We are a rapidly growing NASSCOM 2009-listed IT-enabling company with a footprint in four countries. For two decades now, we have provided our global clientele with a range of high-end IT services including migration support, system integration, and infrastructure management solutions. Equipped with two development centres located at Chennai, Kumaran Systems can be that one-stop-shop for every IT solution that your business demands. Kumaran has been ranked among the Top 500 global software companies (16th annual ranking by Software Magazine) and among the Top 200 Indian software companies (3rd annual ranking by Dataquest).

Website: www.kumaran.com

Job Description:
Required Skills & Experience
- 5+ years of hands-on experience in software development with strong expertise in .NET and Python.
- Deep experience with AWS services, particularly Lambda, API Gateway, Step Functions, and Bedrock.
- Proven track record in API and service development, including RESTful and event-driven architectures.
- Practical experience with DevOps practices, especially using GitLab for CI/CD and automation.
- Familiarity with Docker for containerization and Kafka for event streaming and messaging.
- Solid understanding of AI concepts and their practical application in enterprise solutions.
- Ability to self-manage, prioritize tasks, and handle multiple projects in a fast-paced environment.
- Excellent communication skills and a proactive, positive attitude.

Secondary Skills:
- AWS certifications (e.g., AWS Certified Developer, AWS Certified Solutions Architect).
- Experience with consuming LLM model APIs (GenAI).
- Exposure to frontend development for rapid prototyping.

Interested applicants, share profiles to lakshmi.prasuna@kumaran.com
Posted 1 week ago
1.0 - 4.0 years
5 - 9 Lacs
Noida, Mohali
Work from Office
- Support the development of internal web applications and tools. - Help build and maintain backend services. - Contribute to frontend development using React.js or Vue.js. - Assist in setting up and managing cloud-based infrastructure.
Posted 1 week ago
6.0 - 10.0 years
8 - 12 Lacs
Hyderabad, Bengaluru
Work from Office
Role & responsibilities
Key Skills: Java microservices, Spring Boot, Kafka, AWS (ECS Fargate containers, S3, Lambda, Postgres/Mongo/Aurora, Redis cache, NLB, ALB, AWS Route 53), Terraform
- Experience in Java/J2EE and Spring Boot
- Experience in designing Kubernetes; AWS EKS and EC2 experience is needed (mandatory)
- Experience with AWS cloud monitoring tools like Datadog, CloudWatch, and Lambda is needed
- Experience with XACML authorization policies
- Experience in NoSQL/SQL databases such as Cassandra, Aurora, Oracle
- Experience with Web Services and SOA: SOAP as well as RESTful with JSON formats, with messaging via Kafka
- Hands-on with development and test automation tools or frameworks (e.g., BDD and Cucumber)

Interested candidates, share your updated CV to: Sanchit@mounttalent.com
Posted 1 week ago
3.0 - 7.0 years
4 - 9 Lacs
Chennai
Hybrid
Responsibilities:
- Coordinate, collaborate, and communicate effectively with the client
- Research and evaluate emerging AI tools and technologies relevant to our industry and project needs
- Implement and integrate AI tools into existing workflows and systems to improve productivity and performance
- Gather project requirements and prepare system requirements, including workflow and architecture
- Expected to be up to date with industry-standard best practices/techniques related to application development
- Should be able to handle the project, share best practices across teams, and cross-utilise functions and requirements
- Take ownership of project tasks and modules and contribute independently with minimal supervision
- Perform analysis of tasks/issues reported and provide technical solutions

Candidate's desired profile:
- Rich experience with .NET tools, Visual Studio, and MVC architecture
- Thorough understanding of ASP.NET Core, including its core principles
- Expertise in project tools like DevOps, JIRA, and project management tools is required
- Experience in Agile, iterative, and parallel development models is desirable
- Knowledge of AWS, Azure, or a similar cloud environment is nice to have
- SaaS-based product management experience is an advantage
- Good exposure to managing ERP and CRM projects

You can reach me at hr@expsoltechs.com
Posted 1 week ago