
87 Test-Driven Development Jobs - Page 4

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8 - 10 years

10 - 14 Lacs

Gurugram

Work from Office

Practice Overview
Practice: Data and Analytics (DNA) - Analytics Consulting

The Role and Responsibilities
We have open positions ranging from Data Engineer to Lead Data Engineer, providing talented and motivated professionals with excellent career and growth opportunities. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner who quickly picks up new technologies whenever required. Most of the projects involve handling big data, so you will work with related technologies extensively. You will work closely with other team members to support project delivery and ensure client satisfaction.

Your responsibilities will include:
- Working alongside Oliver Wyman consulting teams and partners, engaging directly with clients to understand their business challenges
- Exploring large-scale data and designing, developing, and maintaining data/software pipelines and ETL processes for internal and external stakeholders
- Explaining, refining, and developing the necessary architecture to guide stakeholders through the journey of model building
- Advocating the application of best practices in data engineering, code hygiene, and code reviews
- Leading the development of proprietary data engineering assets, ML algorithms, and analytical tools on varied projects
- Creating and maintaining documentation to support stakeholders, and runbooks for operational excellence
- Working with partners and principals to shape proposals that showcase our data engineering and analytics capabilities
- Travelling to clients' locations across the globe, when required, to understand their problems and deliver appropriate solutions in collaboration with them
- Keeping up with emerging state-of-the-art data engineering techniques in your domain

Your Attributes, Experience & Qualifications
- Bachelor's or master's degree in a computational or quantitative discipline from a top academic program (Computer Science, Informatics, Data Science, or related)
- Exposure to building cloud-ready applications
- Exposure to test-driven development and integration
- Pragmatic and methodical approach to solutions and delivery with a focus on impact
- Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment
- Collaborative team player
- Excellent verbal and written communication skills and command of English
- Willingness to travel
- Respect for confidentiality

Technical Background
- Prior experience in designing and deploying large-scale technical solutions
- Fluency in modern programming languages (Python is mandatory; R and SAS are desirable)
- Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, and Glue
- Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle
- Experience with big data tools such as Hadoop, Spark, and Kafka
- Demonstrated knowledge of data structures and algorithms
- Familiarity with version control systems such as GitHub or Bitbucket
- Familiarity with modern storage and computational frameworks
- Basic understanding of agile practices such as CI/CD, application resiliency, and security

Valued but not required:
- Compelling side projects or contributions to the open-source community
- Prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MXNet)
- Familiarity with containerization technologies such as Docker and Kubernetes
- Experience with UI development using frameworks such as Angular, Vue, or React
- Experience with NoSQL databases such as MongoDB or Cassandra
- Experience presenting at data science conferences and connections within the data science community
- Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence

Posted 4 months ago

Apply

15 - 20 years

45 - 50 Lacs

Hyderabad

Work from Office

About the Role: Director, Data Engineering Manager (Solution Architect)

S&P Global Ratings is looking for a Java/Angular full-stack Solution Architect to join the Ingestion Pipelines Engineering team within the Data Services group, a team of data and technology professionals who define and execute the strategic data roadmap for S&P Global Ratings. The successful candidate will participate in the design and build of S&P Ratings' cloud-based analytics platform to help develop and deploy advanced analytics/machine learning solutions.

The Team: Join the Rating Organization's Data Services Product Engineering Team, known for its expertise in critical data domains and technology stacks. This team values knowledge sharing, collaboration, and a unified strategy to build S&P Ratings' next-gen analytics platform. Members provide leadership and innovation and articulate business value, contributing to a unique opportunity to evolve the platform.

Responsibilities and Impact:
- Architect, design, and implement innovative software solutions to enhance S&P Ratings' cloud-based analytics platform.
- Lead and mentor a team of engineers, fostering a culture of trust, continuous growth, and collaborative problem-solving.
- Collaborate with business partners to understand requirements, ensuring technical solutions align with business goals.
- Manage and improve existing software solutions, ensuring high performance and scalability.
- Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes.
- Produce comprehensive technical design documents and conduct technical walkthroughs.

What We're Looking For:

Basic Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or Engineering
- Proficiency with software development lifecycle (SDLC) methodologies such as Agile and test-driven development
- 15+ years of experience, with 8+ years designing enterprise products, modern data stacks, and analytics platforms
- 8+ years of hands-on experience contributing to application architecture and design, proven software/enterprise integration design patterns, and full-stack knowledge including modern distributed front-end and back-end technology stacks
- 8+ years of full-stack development experience in modern web technologies: Java/J2EE; UI frameworks such as Angular and React; data lake systems such as Databricks on AWS cloud technologies; plus PySpark, SQL, Oracle, and NoSQL databases such as MongoDB
- Thorough understanding of distributed computing
- Experience designing transactional, data warehouse, and data lake systems and data integrations with the big data ecosystem, leveraging AWS cloud technologies
- Passionate, smart, and articulate developer
- Experience with frameworks such as Angular, React, Durandal.js, Knockout.js, and Bootstrap
- Quality-first mindset with a strong background in developing products for a global audience at scale
- Excellent analytical thinking, interpersonal, and oral and written communication skills, with a strong ability to influence both IT and business partners
- Superior knowledge of system architecture, object-oriented design, and design patterns
- Good work ethic, self-starter, and results-oriented
- Excellent communication skills, with strong verbal and writing proficiencies

Additional Preferred Qualifications:
- Experience working with AWS
- Experience with the SAFe Agile framework
- Bachelor's/PG degree in Computer Science, Information Systems, or equivalent
- Hands-on experience contributing to application architecture and design, with proven software/enterprise integration design principles
- Ability to prioritize and manage work to critical project timelines in a fast-paced environment
- Excellent analytical and communication skills, with strong verbal and writing proficiencies
- Ability to train and mentor

Posted 4 months ago

Apply

6 - 10 years

8 - 12 Lacs

Hyderabad, Ahmedabad

Work from Office

About the Role: Grade Level (for internal use): 11

The Team: We are looking for a highly motivated, enthusiastic, and skilled software engineer with experience architecting and building solutions to join an agile scrum team developing technology solutions for S&P Global Market Intelligence. The team is responsible for developing and ingesting various datasets into the product platforms using the latest technologies.

The Impact: Contribute significantly to the growth of the firm by:
- Developing innovative functionality in existing and new products
- Supporting and maintaining high-revenue products
Achieve the above intelligently and economically using best practices.

What's in it for you:
- Build a career with a global company.
- Work on products that fuel the global financial markets.
- Grow and improve your skills by working on enterprise-level products and new technologies.

Responsibilities:
- Architect, design, and implement software projects.
- Perform analysis and articulate solutions.
- Manage and improve existing solutions.
- Solve a variety of complex problems and work out possible solutions, weighing the costs and benefits.
- Collaborate effectively with technical and non-technical stakeholders.
- Participate actively in all scrum ceremonies, following Agile principles and best practices.

What We're Looking For:

Basic Qualifications:
- Bachelor's degree in Computer Science or equivalent
- 6 to 10 years of experience in application development
- Willingness to learn and apply new technologies
- Excellent communication skills, with strong verbal and writing proficiencies
- Good work ethic, self-starter, and results-oriented
- Excellent problem-solving and troubleshooting skills
- Ability to manage multiple priorities efficiently and effectively within specific timeframes
- Strong hands-on development experience in C# and Python
- Strong hands-on experience building large-scale solutions using a big data technology stack such as Spark, a microservice architecture, and tools like Docker and Kubernetes
- Experience conducting application design and code reviews
- Demonstrated strong OOP skills
- Proficiency with software development lifecycle (SDLC) methodologies such as Agile and test-driven development
- Experience implementing web services
- Experience working with SQL Server, including the ability to write stored procedures, triggers, and performance tuning
- Experience working in cloud computing environments such as AWS

Preferred Qualifications:
- Experience with large-scale messaging systems such as Kafka is a plus
- Experience with big data technologies such as Elasticsearch and Spark is a plus
- Experience working with Snowflake is a plus
- Experience with Linux-based environments is a plus

Posted 4 months ago

Apply

4 - 7 years

8 - 12 Lacs

Pune

Hybrid

EDUCATION AND EXPERIENCE
- A professional degree in Computer Science from a reputable institution, backed by a consistent academic record.
- A knack for problem-solving, data structures, and algorithms.
- Proficiency in Elasticsearch.
- 5-10 years of hands-on development experience, primarily in building products for large enterprises.
- Exceptional communication skills.
- Mastery of Java programming; familiarity with Python is a plus.
- Experience with Spring Boot.
- Practical knowledge of one or more cloud-based technologies (e.g., Elasticsearch, Storm, Hazelcast, MongoDB, Ceph, Kafka) is highly desirable.
- Expertise in building concurrent and/or parallelized, highly performant, scalable applications.
- A track record of identifying and addressing complex issues in scalable deployments.
- Exposure to Service-Oriented Architecture (SOA) and Test-Driven Development (TDD) is an added advantage.

ROLES & RESPONSIBILITIES:
- Dive deep into technical aspects (analysis, design, and implementation) as required.
- Take complete ownership of features within the product.
- Engage in debates and detailed discussions about functional and non-functional requirements with our Product Management team.
- Collaborate with the team to design solutions, seeking stakeholder input before implementation.
- Create essential artifacts such as functional specifications and detailed designs for your assigned features.
- Implement intricate features with an unwavering commitment to quality, following the Test-Driven Development (TDD) process.
- Maintain open lines of communication, promptly reporting risks and progress to your supervising manager.
- Share your expertise and mentor team members.
- Provide support by troubleshooting and creating Root Cause Analysis (RCA) reports for production issues, then working on short-term and long-term solutions.

Posted 4 months ago

Apply

3 - 5 years

5 - 8 Lacs

Bengaluru

Work from Office

Job Description

Job Title: Design, develop, deploy, and operate distributed, scalable, multi-tenant cloud services on Oracle Cloud Infrastructure (OCI) for Oracle Health, integrating with FHIR-based healthcare systems and leveraging large language models (LLMs) for advanced data processing and decision support.

Brief Summary: Oracle Integration Cloud provides a comprehensive, no-code, modern platform for designing, deploying, and running enterprise-scale integrations spanning SaaS and on-premises applications as well as heterogeneous information systems. With the integration of FHIR standards for healthcare data and the platform's AI/ML capabilities, including large language models (LLMs), pre-built accelerators, and process automation, Oracle dramatically simplifies healthcare application integration, particularly for Oracle Health.

Team and Product Description: Oracle Integration Cloud (OIC) is a cloud-managed service running natively on Oracle Cloud Infrastructure (OCI). The Connectivity component is a critical part of OIC, providing integration with a wide array of data sources, including healthcare data systems that follow the FHIR (Fast Healthcare Interoperability Resources) standard. This component enables seamless connectivity across Oracle and non-Oracle SaaS and enterprise applications, databases, messaging systems, and generic protocols such as REST, SOAP, OData, FTP, GraphQL, and JDBC, whether in public, private, or on-premises networks. In this role, you will work closely with the healthcare systems of Oracle Health, utilizing FHIR mappings and advanced AI/LLM capabilities to ensure smooth integration of clinical data with enterprise applications, supporting decision-making and predictive analytics in healthcare. The OIC Connectivity development team is responsible for gathering requirements, system design, architecture, implementation, and support for cloud-native integrations, including FHIR-based applications and LLM-driven services. Our team builds the future of cloud integration with a focus on healthcare interoperability, solving data and application integration challenges across various sectors, including healthcare.

Description of the Role:
- Design and develop cloud-native enterprise software products and services, focusing on distributed, scalable, fault-tolerant, and multi-tenant cloud services for healthcare applications using FHIR and LLM-based models for natural language processing (NLP).
- Build FHIR-compliant integrations with Oracle Health's healthcare data systems, ensuring adherence to FHIR standards in data exchange, patient record management, and clinical workflows, while integrating LLMs to provide contextual insights and automated processing of clinical data.
- Develop healthcare cloud applications following microservices and twelve-factor application principles, optimizing for data integration with FHIR-based services and LLMs for advanced text-based analysis and healthcare recommendations.
- Focus on API-first design using OpenAPI, Swagger, and test-driven development, ensuring integration with FHIR-based healthcare data through RESTful APIs and leveraging LLMs for real-time patient data analytics.
- Code in Java, leveraging RESTful APIs, microservices, Docker, and Kubernetes, with integration into healthcare data systems at Oracle Health that use FHIR, and incorporating LLMs for scalable NLP tasks such as summarizing patient information or automating documentation.
- Work with prominent healthcare APIs and systems such as AWS, Microsoft Azure, GCP, and Salesforce to facilitate healthcare data exchange, using programming languages such as Java, Go, Python, JavaScript, and Node.js, and integrating AI-driven solutions and LLMs for intelligent data extraction.
- Implement message interchange formats such as JSON/JSON Schema, XML, Avro, and FHIR resources, while integrating LLM-based AI models to enhance decision-making, predictive analytics, and patient engagement.
- Collaborate with JavaScript frameworks such as OJET, ReactJS, and AngularJS to create dynamic, healthcare-driven user interfaces, integrating with FHIR-based patient data systems and enhancing the user experience with LLM-driven insights and contextual support.
- Provide technical evaluations, optimize applications, and implement best practices in developing cloud-native healthcare applications with FHIR and LLM capabilities for Oracle Health.
- Create reusable solutions that accelerate the creation of healthcare applications using FHIR and LLMs, streamlining integration with EHR and clinical systems and delivering smarter healthcare insights.

Preferred Qualifications:
- B.E./B.Tech/M.S./M.Tech in Computer Science.
- 3+ years of experience in cloud service development.
- Experience with LLMs, natural language processing (NLP), and integrating AI models into enterprise applications.
- Knowledge of healthcare standards such as FHIR, and experience integrating healthcare data across enterprise applications, would be beneficial.
- Strong ability to innovate and excel in a fast-paced environment, especially in the healthcare, AI, and cloud integration space.

In this role, you'll be contributing to healthcare innovation at Oracle Health, focusing on integrating critical clinical data using FHIR, while leveraging LLM-based AI solutions and the power of Oracle Cloud Infrastructure to drive scalable and reliable healthcare solutions.

Career Level: IC3

Key Responsibilities:
- Design and Development: Create and implement scalable, multi-tenant cloud services on Oracle Cloud Infrastructure (OCI) tailored for Oracle Health.
- Integration: Develop and integrate healthcare systems using FHIR standards to ensure smooth data exchange in clinical workflows.
- Cloud-Native Applications: Build cloud-native applications and ensure compliance with FHIR standards for healthcare integrations.
- Decision-Making and Analytics: Leverage large language models (LLMs) and AI-driven solutions to enhance decision-making processes, predictive analytics, and automation in healthcare.
- API Development: Design and develop RESTful APIs to facilitate seamless integration and interaction with various healthcare data formats (a minimal FHIR REST sketch follows this listing).
- Microservices Architecture: Implement and maintain a robust microservices architecture to support scalable and efficient cloud services.
- Collaborative Efforts: Work closely with cross-functional teams to create reusable, FHIR-based solutions focusing on healthcare interoperability and AI integration.
- Operational Excellence: Monitor, operate, and maintain cloud services to ensure optimal performance, reliability, and scalability.
- Compliance and Standards: Ensure all cloud services and integrations comply with industry standards and regulations, particularly in healthcare.
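Since the role revolves around REST-based FHIR integration, here is a rough, hedged sketch of what reading a FHIR R4 Patient resource over plain REST might look like. It is not Oracle's OIC adapter API: the base URL and patient ID are hypothetical placeholders, and TypeScript is used purely for illustration even though the posting's primary language is Java.

```typescript
// Minimal sketch: reading a FHIR R4 Patient resource over REST.
// Uses the global fetch available in Node 18+ / browsers.
// The base URL and patient ID are hypothetical placeholders.
interface FhirPatient {
  resourceType: "Patient";
  id: string;
  name?: { family?: string; given?: string[] }[];
  birthDate?: string;
}

async function getPatient(baseUrl: string, id: string): Promise<FhirPatient> {
  const res = await fetch(`${baseUrl}/Patient/${id}`, {
    // FHIR servers conventionally serve JSON under this media type.
    headers: { Accept: "application/fhir+json" },
  });
  if (!res.ok) {
    throw new Error(`FHIR request failed: ${res.status} ${res.statusText}`);
  }
  return (await res.json()) as FhirPatient;
}

// Example usage with placeholder values:
getPatient("https://example.org/fhir", "12345")
  .then((p) => console.log(p.name?.[0]?.family ?? "unknown"))
  .catch(console.error);
```

In a real integration, authentication, FHIR search parameters, and error handling for OperationOutcome responses would sit on top of this basic read.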

Posted 4 months ago

Apply

5 - 7 years

8 - 14 Lacs

Hyderabad

Work from Office

About the Role: We are seeking a highly motivated and experienced Senior Backend Engineer to join our growing team. You will play a key role in building, improving, and maintaining our software solutions while collaborating effectively with US-based clients. Your strong technical skills will be complemented by your excellent communication abilities, allowing you to bridge the gap between technical development and client needs.

Responsibilities:
- Design, develop, and implement robust and scalable backend features using Node.js and TypeScript.
- Collaborate with product managers and designers to understand client requirements and translate them into technical specifications.
- Advocate for and implement test-driven development (TDD) practices to ensure code quality, maintainability, and testability (a minimal TDD sketch follows this listing).
- Write clean, maintainable, and well-documented code adhering to best practices.
- Troubleshoot and resolve complex technical problems related to the backend.
- Continuously learn and apply new technologies, methodologies, and languages.
- Effectively communicate technical concepts, progress updates, and solutions to both technical and non-technical audiences, including US-based clients.
- Identify areas for improvement in development efficiency and propose solutions to reduce technical debt.
- Work independently on smaller features and collaboratively with the team on larger projects.

Technical Skills & Experience:
- 5+ years of experience in backend development with a strong track record of delivering high-quality software.
- Proficiency in Node.js and TypeScript.
- Experience with relational databases (e.g., Postgres, MySQL) and/or NoSQL databases (e.g., MongoDB).
- Working knowledge of containerization technologies such as Docker and Docker Compose.
- Experience with CI/CD pipelines using tools such as Jenkins or GitHub Actions.
- Familiarity with unit testing frameworks such as Jest or similar tools.
- Understanding of Agile methodologies for software development.
- Experience with AWS Cloud, preferably with serverless architecture concepts (a plus).
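A minimal sketch of the TDD style this role calls for, assuming a Jest-based TypeScript setup; the `calculateOrderTotal` function and its rules are invented for illustration, and in a real project the test and the production code would live in separate files.

```typescript
import { describe, expect, it } from "@jest/globals";

interface LineItem {
  unitPrice: number;
  quantity: number;
}

// Production code, written after the failing tests below: the simplest
// implementation that makes them pass. (Hypothetical example function.)
export function calculateOrderTotal(items: LineItem[]): number {
  return items.reduce((sum, item) => sum + item.unitPrice * item.quantity, 0);
}

// Tests written first ("red"), describing the behaviour we want.
describe("calculateOrderTotal", () => {
  it("returns 0 for an empty order", () => {
    expect(calculateOrderTotal([])).toBe(0);
  });

  it("sums price times quantity across line items", () => {
    const items = [
      { unitPrice: 100, quantity: 2 },
      { unitPrice: 50, quantity: 1 },
    ];
    expect(calculateOrderTotal(items)).toBe(250);
  });
});
```

The rhythm the listing refers to is: write a small failing test, write just enough code to pass it, then refactor with the tests as a safety net.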

Posted Date not available

Apply

6 - 10 years

8 - 18 Lacs

Kolkata

Work from Office

Lead a team of blockchain developers; write and review high-quality code. Design and build blockchain frameworks, accelerators, and assets. Design and deploy smart contracts on Ethereum and Layer 2 sidechains. Collaborate on decentralized finance projects using TDD.

Required Candidate Profile: B.Tech/MCA with 6+ years of experience in Solidity smart contract programming. Hands-on experience with blockchain APIs, Ethereum standards (ERC-20, ERC-721), DeFi projects, Docker, Kubernetes, Node.js, open-source tools, and React/Angular.

Posted Date not available

Apply

12 - 15 years

12 - 16 Lacs

Bengaluru

Work from Office

The role of the Supply Chain Management (SCM) Senior Engineering Manager

As a Senior Engineering Manager on the SCM team, you will be a key driver and enabler of a fundamental shift in how we design and build software throughout its lifecycle. The ideal candidate has experience building and leading development teams and can help drive best practices in the software delivery lifecycle.

Key Responsibilities:
- Own deliverables end to end per plan, working with multiple stakeholders.
- Manage, mentor, and lead two Scrum teams; run the scrum ceremonies and build Agile teams.
- Empower, challenge, and coach team members to grow their impact as individual technical leaders by providing them with context and continuous feedback.
- Foster a culture of mutual respect, collaboration, and data-driven, consensus-based decision-making.
- Mentor and lead other engineers on the team to deliver features and enhancements.
- Define and own high-level architecture designs and the technology roadmap with the associated release goals.
- Define and help develop POCs for new and upcoming strategic technology solutions.
- Define team-wide engineering best practices and mechanisms to measure their adoption.
- Drive technical architecture conversations with platform teams.
- Plan for team capacity and help drive recruitment of high-quality colleagues.
- Represent the team at internal and external technology events and conferences.
- Define and drive engineering metrics adoption across the teams.

Key Tech Skills:
- 12+ years of experience with any UI stack (React, Angular, and/or Vue) and .NET
- Experience analysing, building, and implementing open-source technologies
- Experience driving UX improvements, efficient UI development, and metrics-driven improvements
- Experience with datastores (SQL, NoSQL, etc.), APIs, and event-based technologies (Kafka)
- Strong understanding of microservice-based architectures and how to achieve composability
- Experience with test-driven development approaches
- Experience delivering solutions that operate with large volumes of data and strict non-functional requirements around performance, flexibility, and scalability
- Experience delivering solutions leveraging at least one leading cloud platform
- Broad knowledge of programming languages, operating system principles, and software development best practices

Who you are:
- Experience building and leading high-performing remote engineering teams with high degrees of psychological safety.
- Proactively gives and receives feedback; is not afraid to have difficult conversations.
- Ability to seek different perspectives and relevant context to effectively navigate ambiguity.
- Ability to foster an inclusive team culture that enables team members of all backgrounds to succeed and thrive.
- Experience developing and reviewing complex technical requirements and translating them into actionable tasks for engineers.
- Experience leading cross-functional projects and serving as the main point of contact for their team across a large organization.
- Experience driving end-to-end architectural discussions across teams and organizations.
- Experience helping lead the development lifecycle process and best practices, and operating distributed cloud-based microservices.
- Experience designing systems or components on large projects (5M).
- Strong preference for open-source experience.
- Strong communication skills, both verbal and written, with the ability to drive conversations across multiple teams to a conclusion.

Posted Date not available

Apply

2 - 7 years

4 - 9 Lacs

Gurugram

Work from Office

About the Role: Grade Level (for internal use): 09

The Team: As a member of the Data Transformation team, you will build ML-powered products and capabilities that power natural language understanding, data extraction, information retrieval, and data sourcing solutions for S&P Global Market Intelligence and our clients. You will spearhead development of production-ready AI products and pipelines while leading by example in a highly engaging work environment. You will work in a (truly) global team and be encouraged toward thoughtful risk-taking and self-initiative.

The Impact: The Data Transformation team has already delivered breakthrough products and significant business value over the last three years. In this role you will develop our next generation of products while enhancing existing ones, aiming to solve high-impact business problems.

What's in it for you:
- Be part of a global company and build solutions at enterprise scale
- Collaborate with a highly skilled and technically strong team
- Contribute to solving high-complexity, high-impact problems

Key Responsibilities:
- Design, develop, and deploy ML-powered products and pipelines
- Play a central role in all stages of the data science project life cycle, including identifying suitable data science project opportunities; partnering with business leaders, domain experts, and end-users to gain business understanding and data understanding and to collect requirements; and evaluating/interpreting results and presenting them to business leaders
- Perform exploratory data analysis, proof-of-concept modelling, and model benchmarking, and set up model validation experiments
- Train large models, both for experimentation and for production
- Develop production-ready pipelines for enterprise-scale projects
- Perform code reviews and optimization for your projects and team
- Spearhead deployment and model scaling strategies
- Manage stakeholders and represent the team in front of our leadership
- Lead and mentor by example, including in project scrums

What We're Looking For:
- 2+ years of professional experience in the data science domain
- Expertise in Python (NumPy, Pandas, spaCy, scikit-learn, PyTorch/TF2, Hugging Face, etc.)
- Experience with state-of-the-art NLP models and expertise in text matching techniques, including sentence transformers, word embeddings, and similarity measures (a short similarity sketch follows this listing)
- Expertise in probabilistic machine learning models for classification, regression, and clustering
- Strong experience in feature engineering, data preprocessing, and building machine learning models for large datasets
- Exposure to information retrieval, web scraping, and data extraction at scale
- OOP design patterns, test-driven development, and enterprise system design
- SQL (any variant; a bonus if it is a big data variant)
- Linux OS (e.g., the bash toolset and other utilities)
- Version control experience with Git, GitHub, or Azure DevOps
- Problem-solving and debugging skills
- Software craftsmanship, adherence to Agile principles, and pride in writing good code
- Techniques to communicate change to non-technical people

Nice to have:
- Prior work to show on GitHub, Kaggle, Stack Overflow, etc.
- Cloud expertise (preferably AWS and GCP)
- Expertise in deploying machine learning models in cloud environments
- Familiarity with working with LLMs
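The listing centres on embedding-based text matching: sentence transformers produce vectors, and a similarity measure compares them. As a rough, dependency-free sketch of the similarity half of that pipeline only (the team's stack is Python; TypeScript is used here purely for illustration, and the sample vectors are made-up numbers rather than real embeddings):

```typescript
// Cosine similarity between two embedding vectors: 1 = same direction,
// 0 = orthogonal, -1 = opposite. Real vectors would come from a
// sentence-transformer model; these toy 4-dimensional ones are made up.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length || a.length === 0) {
    throw new Error("Vectors must be non-empty and the same length");
  }
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

const docEmbedding = [0.12, 0.48, -0.33, 0.81];
const queryEmbedding = [0.1, 0.52, -0.3, 0.79];
console.log(cosineSimilarity(docEmbedding, queryEmbedding).toFixed(3));
```

Ranking documents by this score against a query embedding is the core of the retrieval and text-matching work the posting describes.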

Posted Date not available

Apply

2 - 4 years

8 - 12 Lacs

Bengaluru

Work from Office

We are excited to invite Java developers with sound knowledge of industry best practices and good organizational and leadership skills to work on our product platform, Nuacare. Nuacare is a cutting-edge automation solution designed to drive digital transformation in hospitals. As part of our dynamic team, you'll work with the latest web and mobile technologies. This is a work-from-office position, and you will be required to work from our Bangalore office. If you are not currently in Bangalore or unable to join us soon at our Bangalore office, we kindly request you to skip this opportunity.

Key Responsibilities:
- Perform development tasks, including software design and coding.
- Collaborate with teams in other departments to identify and prioritize requirements.
- Participate in code reviews with the team and other stakeholders.
- Conduct technical analysis to arrive at solutions and create technological artifacts in response to production issues.

Required skills and experience:
- Self-motivated developer with at least 2 years of application development experience, able to take the lead in implementing new features and components.
- Experience working with Java and Spring Boot.
- Experience working in small, fast-paced, highly technical teams employing Agile/Scrum practices.
- Knowledge of software best practices such as Test-Driven Development (TDD), pair programming, continuous integration, and continuous deployment.
- Strong SQL skills and the ability to write and understand complex queries.
- Exposure to Linux environments.
- BE/BTech/MSc Computer Science or MCA graduate.

Why Join Us:
- Innovative Environment: Be part of a forward-thinking team building cutting-edge web applications.
- Career Growth: Opportunities to mentor junior developers and expand your technical skills.
- Collaborative Culture: Work in a collaborative, inclusive, and diverse team environment where your ideas are valued.

Posted Date not available

Apply

5 - 10 years

10 - 18 Lacs

Thiruvananthapuram

Remote

Position: Senior Backend Engineer (Node)
Location: Remote

About Us: Amazing Life is a well-funded and dynamic product startup, headquartered in Dallas, Texas, with a mission to revolutionize the sharing of life-transforming content. We aim to achieve this through a next-generation platform that empowers creators to build thriving communities around content using diverse media formats, such as blog posts, short videos, long-form podcasts, and courses. TechMission Solutions Private Limited, India, is the engineering wing of Amazing Life.

We are looking for a Senior Backend Engineer with hands-on experience in Node, Postgres, and/or Cassandra to join our Product and Engineering team in India. As a key member of our engineering department, you will be responsible for designing, building, and maintaining scalable backend systems with a focus on developing OpenAPI-based RESTful endpoints. The ideal candidate will have a deep understanding of backend services, data modeling, and database interaction, while following best practices in Test-Driven Development (TDD).

Desired Qualities:
- Passion for excellence: We are looking for someone who is dedicated to delivering high-quality software products and continuously improving their skills.
- Motivated by team culture: We value collaboration, teamwork, and a supportive work environment. We want someone who is motivated by working with others and contributing to a positive team culture.
- Keen attention to detail: Our platform and apps aim to provide the best content consumption experience for our users. We want someone who takes pride in their work and strives for perfection.
- Strong desire to serve: We'd like to work with someone who is motivated by serving people and helping them grow.

Responsibilities:
- Design, implement, and maintain scalable and efficient RESTful APIs using Node.js and TypeScript.
- Develop and document REST endpoints using OpenAPI specifications to ensure clarity and consistency (a minimal sketch follows this listing).
- Ensure all endpoints are well tested using the Test-Driven Development (TDD) methodology.
- Collaborate with frontend developers, product managers, and other stakeholders to define API requirements.
- Design database schemas, optimize data models, and write queries for Postgres or Cassandra/ScyllaDB. (Yes, we use ScyllaDB!)
- Create generalized and reusable solutions for different backend needs to ensure scalability and maintainability.
- Design decoupled backend solutions using event-driven architecture.
- Maintain code quality, performance, and security through regular code reviews, unit testing, and integration testing.
- Optimize existing systems and recommend improvements to ensure system performance and reliability.
- Mentor junior developers and foster a collaborative team environment.

Qualifications:
- 5+ years of hands-on experience in backend development using Node and TypeScript.
- Strong experience with REST API design and development using OpenAPI specifications.
- Proven expertise in Test-Driven Development (TDD), with a focus on writing clean, testable, and maintainable code.
- Experience designing database schemas and data models, and writing complex queries, for relational databases such as PostgreSQL or NoSQL databases like Cassandra.
- Experience with AWS, serverless, and Docker is a plus.
- Excellent problem-solving skills and the ability to provide robust solutions.
- Strong communication skills and the ability to collaborate across teams.
- Experience with Git version control (we use GitLab).
- Strong understanding of software development principles such as object-oriented programming, design patterns, and software testing.
- Passionate about learning new technologies and sharing knowledge with others.
- Experience working in a startup or fast-paced environment is a plus.

Benefits:
- Competitive salary
- Flexible work arrangements
- A collaborative and supportive team culture
- Opportunities for growth and career development

If you are a talented and passionate backend developer looking for an opportunity to make a difference in people's lives, we want to hear from you! Join our team of like-minded professionals and help us create software products that inspire, educate, and help people flourish. Apply now and let's work together to build a better future!

To Apply: Please submit your resume to aneesha.elizabath@techmissionsolutions.com
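The listing pairs OpenAPI-documented REST endpoints with Node.js and TypeScript. Here is a hedged sketch under assumed choices (Express as the framework, an invented /posts/{id} route and Post shape, and an in-memory map standing in for Postgres/ScyllaDB), showing an endpoint alongside the OpenAPI path fragment that would describe it:

```typescript
import express, { Request, Response } from "express";

// OpenAPI 3.0 fragment describing the endpoint below. In a real project this
// would live in the API spec (YAML/JSON) or be generated from annotations;
// the /posts/{id} route and Post shape are invented for illustration.
export const postPathSpec = {
  "/posts/{id}": {
    get: {
      summary: "Fetch a single post by id",
      parameters: [
        { name: "id", in: "path", required: true, schema: { type: "string" } },
      ],
      responses: {
        "200": { description: "The requested post" },
        "404": { description: "Post not found" },
      },
    },
  },
};

interface Post {
  id: string;
  title: string;
}

// Hypothetical in-memory store standing in for Postgres or ScyllaDB.
const posts = new Map<string, Post>([["1", { id: "1", title: "Hello" }]]);

const app = express();

app.get("/posts/:id", (req: Request, res: Response) => {
  const post = posts.get(req.params.id);
  if (!post) {
    res.status(404).json({ error: "Post not found" });
    return;
  }
  res.json(post);
});

app.listen(3000, () => console.log("listening on :3000"));
```

In a TDD workflow, the 200 and 404 behaviours above would each be pinned down by a failing test (for example with Jest and supertest) before the handler is written.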

Posted Date not available

Apply

0 - 5 years

5 - 9 Lacs

Gurugram

Work from Office

Oliver Wyman - Lead Data Engineer - Data and Analytics (DNA) - Gurugram, India

Practice Overview
Practice: Data and Analytics (DNA) - Analytics Consulting
Role: Lead Data Engineer
Location: Gurugram, India

At Oliver Wyman DNA, we partner with clients to solve tough strategic business challenges with the power of analytics, technology, and industry expertise. We drive digital transformation, create customer-focused solutions, and optimize operations for the future. Our goal is to achieve lasting results in collaboration with our clients and stakeholders. We value and offer opportunities for personal and professional growth. Join our entrepreneurial team focused on delivering impact globally.

Our Mission and Purpose
Mission: Leverage India's high-quality talent to provide exceptional analytics-driven management consulting services that empower clients globally to achieve their business goals and drive sustainable growth, by working alongside Oliver Wyman consulting teams.
Purpose: Our purpose is to bring together a diverse team of the highest-quality talent, equipped with innovative analytical tools and techniques, to deliver insights that drive meaningful impact for our global client base. We strive to build long-lasting partnerships with clients based on trust, mutual respect, and a commitment to deliver results. We aim to build a dynamic and inclusive organization that attracts and retains the top analytics talent in India and provides opportunities for professional growth and development. Our goal is to provide a sustainable work environment while fostering a culture of innovation and continuous learning for our team members.

The Role and Responsibilities
We have open positions ranging from Data Engineer to Lead Data Engineer, providing talented and motivated professionals with excellent career and growth opportunities. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner who quickly picks up new technologies whenever required. Most of the projects involve handling big data, so you will work with related technologies extensively. You will work closely with other team members to support project delivery and ensure client satisfaction.

Your responsibilities will include:
- Working alongside Oliver Wyman consulting teams and partners, engaging directly with clients to understand their business challenges
- Exploring large-scale data and designing, developing, and maintaining data/software pipelines and ETL processes for internal and external stakeholders
- Explaining, refining, and developing the necessary architecture to guide stakeholders through the journey of model building
- Advocating the application of best practices in data engineering, code hygiene, and code reviews
- Leading the development of proprietary data engineering assets, ML algorithms, and analytical tools on varied projects
- Creating and maintaining documentation to support stakeholders, and runbooks for operational excellence
- Working with partners and principals to shape proposals that showcase our data engineering and analytics capabilities
- Travelling to clients' locations across the globe, when required, to understand their problems and deliver appropriate solutions in collaboration with them
- Keeping up with emerging state-of-the-art data engineering techniques in your domain

Your Attributes, Experience & Qualifications
- Bachelor's or master's degree in a computational or quantitative discipline from a top academic program (Computer Science, Informatics, Data Science, or related)
- Exposure to building cloud-ready applications
- Exposure to test-driven development and integration
- Pragmatic and methodical approach to solutions and delivery with a focus on impact
- Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment
- Collaborative team player
- Excellent verbal and written communication skills and command of English
- Willingness to travel
- Respect for confidentiality

Technical Background
- Prior experience in designing and deploying large-scale technical solutions
- Fluency in modern programming languages (Python is mandatory; R and SAS are desirable)
- Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, and Glue
- Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle
- Experience with big data tools such as Hadoop, Spark, and Kafka
- Demonstrated knowledge of data structures and algorithms
- Familiarity with version control systems such as GitHub or Bitbucket
- Familiarity with modern storage and computational frameworks
- Basic understanding of agile practices such as CI/CD, application resiliency, and security

Valued but not required:
- Compelling side projects or contributions to the open-source community
- Prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MXNet)
- Familiarity with containerization technologies such as Docker and Kubernetes
- Experience with UI development using frameworks such as Angular, Vue, or React
- Experience with NoSQL databases such as MongoDB or Cassandra
- Experience presenting at data science conferences and connections within the data science community
- Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence

Interview Process
The application process will include tests of technical proficiency, a case study, and team-fit interviews. Please include a brief note introducing yourself, what you're looking for when applying for the role, and your potential value-add to our team.

Roles and levels
We are hiring for engineering roles across levels, from Data Engineer to Lead Data Engineer, for experience ranging from 0-8 years.
In addition to the base salary, this position may be eligible for performance-based incentives. We offer a competitive total rewards package that includes comprehensive health and welfare benefits as well as employee assistance programs.

Posted Date not available

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
