
909 NoSQL Databases Jobs - Page 30

Set up a Job Alert
JobPe aggregates job listings for easy access; you apply directly on the original job portal.

12.0 - 18.0 years

12 - 17 Lacs

Pune

Hybrid

What's the role all about? You will be a key contributor to developing a multi-region, multi-tenant SaaS product. You will collaborate with the core R&D team, using technologies like .NET/C#, AWS, and data technologies to build scalable, high-performance products within a cloud-first, microservices-driven environment. How will you make an impact? Take ownership of the software development lifecycle, including design, development, unit testing, and deployment, working closely with QA teams. Ensure that architectural concepts are consistently implemented across the product. Act as a product expert within R&D, understanding the product's requirements and its market positioning. Work closely with cross-functional teams (Product Managers, Sales, Customer Support, and Services) to ensure successful product delivery. Key Responsibilities: Lead the design and implementation of software features in alignment with product specifications and adhere to High-Level Design (HLD) and Low-Level Design (LLD) standards. Lead the development of scalable, multi-tenant SaaS solutions. Collaborate with Product Management, R&D, UX, and DevOps teams to deliver seamless, end-to-end solutions. Advocate for and implement Continuous Integration and Delivery (CI/CD) practices to improve development efficiency and product quality. Mentor junior engineers, share knowledge, and promote best practices within the team. Assist in solving complex technical problems and enhance product functionality through innovative solutions. Conduct code reviews to ensure adherence to design principles and maintain high-quality standards. Plan and execute unit testing to verify functionality and ensure automation coverage. Contribute to the ongoing support of software features, ensuring complete quality coverage and responsiveness to any issues during the software lifecycle. Qualifications & Experience: Bachelor's or Master's degree in Computer Science, Electronics Engineering, or a related field from a reputed institute. More than 11 years of experience in software development with a strong focus on backend technologies and a track record of delivering complex projects. Expertise in C# and .NET for back-end development. Angular, JavaScript, and TypeScript experience is an added advantage. Experience in developing high-performance, highly available, and scalable systems. Working knowledge of RESTful APIs. Solid understanding of scalable and microservices architectures, performance optimization, and secure coding practices. Exceptional problem-solving skills and the ability to work on multiple concurrent projects. Experience working with public cloud platforms like AWS (preferred), Azure, and GCP. Proficiency in developing solutions that leverage both SQL and NoSQL databases. Hands-on experience with Continuous Integration and Delivery (CI/CD) practices using tools like Docker, Kubernetes, and other modern pipelines. What's in it for you? Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr! Enjoy NICE-FLEX!
At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere. Reporting into: Tech Manager, Engineering, CX. Role Type: Individual Contributor

Posted 2 months ago

Apply

8.0 - 13.0 years

8 - 13 Lacs

Mumbai, Maharashtra, India

On-site

Design, build, and maintain various front-end and corresponding back-end platform components, working with Product and Program Managers. Implement new user interfaces and business functionalities to meet evolving business and customer requirements, working with end users, with clear and concise documentation. Analyze and improve the performance of applications and related operational workflows to improve efficiency and throughput. Diagnose, research, and resolve software defects. Ensure software stability through documentation, code reviews, regression, unit, and user acceptance testing for smooth production operations. Lead all aspects of level 2 & 3 application support, ensuring smooth operation of existing processes and meeting new business opportunities. Be a self-starter and work with minimal direction in a globally distributed team. Role Essentials: A passion for engineering highly available, performant full-stack applications with a Student of Markets and Technology attitude. Bachelor's or Master's degree or equivalent experience in computer science or engineering with 8+ years of relevant experience. 3+ years of professional experience working in teams. VP-level candidates should have experience leading teams delivering critical applications. Experience in full-stack user-facing application development using web technologies (Angular, React, JavaScript) and Java-based REST APIs (Spring framework). Experience in testing frameworks such as Protractor, TestCafe, Jest. Knowledge in relational database development and at least one NoSQL database (e.g., Apache Cassandra, MongoDB, etc.). Knowledge of software development methodologies (analysis, design, development, testing) and a basic understanding of Agile/Scrum methodology and practices.

Posted 2 months ago

Apply

5.0 - 10.0 years

25 - 27 Lacs

Bengaluru

Work from Office

Job Description: We are looking for an experienced Backend Engineer to join our engineering team and contribute to building highly scalable, low-latency, and high-concurrency SaaS applications. The ideal candidate should have deep expertise in Java, cloud technologies (preferably AWS), and experience working with both RDBMS and NoSQL databases. You will be involved in the full software development lifecycle, from design to deployment, ensuring performance, security, and maintainability. Key Responsibilities: Design, develop, and maintain high-performance, scalable, and secure backend services. Build and maintain RESTful APIs to support client-side applications and services. Work on transactional and concurrent systems that serve large-scale SaaS platforms. Collaborate with frontend engineers, architects, and DevOps to build robust cloud-based systems. Ensure code quality and performance by writing unit/integration tests and performing code reviews. Troubleshoot, debug, and optimize existing systems for reliability and efficiency. Follow Agile development practices and participate in daily stand-ups and sprint planning. Mandatory Skills: 5-8 years of backend development experience in building SaaS or transactional web applications. Strong hands-on experience with Java and web application frameworks like Spring and Spring Boot. Experience working on high-concurrency, low-latency, and high-availability systems. Solid experience with at least one RDBMS (e.g., PostgreSQL, MySQL) and one NoSQL DB (e.g., MongoDB, Cassandra). Expertise in cloud platforms – preferably AWS (e.g., EC2, S3, RDS, Lambda). Familiarity with application servers like Tomcat. Strong knowledge of system design, data structures, and multithreading. Experience with RESTful APIs and microservice architectures. Nice to Have: Knowledge of Big Data technologies (e.g., Kafka, Hadoop, Spark). Familiarity with containerization tools like Docker and Kubernetes. Experience with CI/CD tools and cloud deployment pipelines. Exposure to other cloud platforms like GCP or Azure.

Posted 2 months ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

o Bachelor's degree in Computer Science, Engineering, or a related field. o 3 to 5 years of professional software development experience with a strong focus on the MERN stack. o Design, develop, and maintain full-stack web applications using the MERN stack (MongoDB, Express.js, React.js, Node.js, TypeScript). o Proficiency in JavaScript, HTML, CSS, and front-end technologies, with in-depth knowledge of React.js, Node.js, and Express.js. o Familiarity with testing frameworks like Jest, Mocha, or Cypress. o Solid understanding of object-oriented programming principles and design patterns. o Experience with microservices architecture. o Familiarity with containerization (e.g., Docker, Kubernetes). o Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud). o Knowledge of database technologies such as SQL and NoSQL databases (MongoDB). o Knowledge of any other programming language (Python, GoLang, or others). o Experience with Agile methodologies and DevOps practices. o Strong analytical and problem-solving skills. o Excellent communication and collaboration skills.

Posted 2 months ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Chennai, Bengaluru

Work from Office

Job requisition ID: JR1027450. Job Summary: We are seeking a skilled Full Stack Developer to join our dynamic development team. The ideal candidate will have extensive experience in building scalable backend applications, microservices architecture, and working with NoSQL databases. The role requires a strong command of JavaScript and TypeScript, with a focus on delivering high-quality, maintainable code. Excellent communication skills are essential for collaborating effectively within cross-functional teams. Overall Responsibilities: Design, develop, and maintain efficient, reliable, and scalable server-side applications using Node.js. Architect and implement microservices-based solutions to support business growth and flexibility. Collaborate with front-end developers, product managers, and QA teams to deliver seamless integrations and features. Write clean, well-documented code following best practices and coding standards. Optimize application performance, troubleshoot issues, and resolve bugs proactively. Participate in code reviews, contribute to technical documentation, and continuously improve development processes. Stay updated with the latest industry trends, tools, and technologies. Category-wise Technical Skills: Node.js: Strong expertise in developing server-side applications using Node.js frameworks and libraries. JavaScript & TypeScript: Proficient in JavaScript; comfortable working with TypeScript for type safety and better code maintainability. Microservices Architecture: Experience designing, developing, and deploying microservices-based systems. MongoDB: In-depth knowledge of NoSQL databases, particularly MongoDB, including schema design, indexing, and query optimization. Experience: 5 to 12 years of relevant industry experience in backend development using Node.js and related technologies. Day-to-Day Activities: Developing and maintaining backend APIs and services. Engaging in daily stand-ups, sprint planning, and Agile ceremonies. Collaborating with UI/UX designers, product teams, and other developers. Conducting code reviews and participating in testing procedures. Monitoring application health and performance metrics. Continuously exploring and integrating new tools and best practices. Qualifications & Soft Skills: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Excellent verbal and written communication skills to articulate technical concepts clearly. Strong problem-solving skills with attention to detail. Ability to work effectively both independently and as part of a team. Adaptability to fast-paced environments and willingness to learn new technologies. Good time management and organizational skills.

Posted 2 months ago

Apply

8.0 - 10.0 years

15 - 20 Lacs

Pune, Hinjewadi

Work from Office

Job requisition ID: JR1027363. Job Summary: We are looking for a skilled Senior Software Engineer with 8-10 years of experience in developing robust applications using Java 17 and related technologies. The ideal candidate will have a strong background in Microservices architecture, cloud-native design principles, and a deep understanding of both SQL and NoSQL databases along with Azure Cloud. This role requires a proactive individual who can work collaboratively in an Agile environment and contribute to the design and implementation of scalable solutions. Key Responsibilities: 1. Software Development: Design, develop, and maintain applications using Java 17, Spring Boot, Spring Batch, and Spring Data JPA. Implement Hibernate for ORM and ensure efficient database interactions. 2. Architectural Implementation: Develop applications following Microservices, Service-Oriented Architecture (SOA), and Event-Driven Architecture patterns. Apply design principles such as the 12-Factor App methodology and Cloud Native Architecture in application development. 3. Database Management: Design and optimize SQL databases (e.g., MS SQL Server, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra) to meet application requirements. Implement in-memory caching strategies (e.g., Redis, Hazelcast) to enhance application performance. 4. DevOps and Deployment: Collaborate with DevOps teams to build CI/CD pipelines and automate deployment processes. Implement deployment strategies such as blue-green deployments to ensure minimal downtime and high availability. 5. Integration and Messaging: Work with Pub/Sub models to facilitate communication between microservices and enhance event-driven capabilities. Integrate APIs and services according to best practices and design specifications. 6. Collaboration and Agile Practices: Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives, to ensure smooth project execution. Collaborate with cross-functional teams to gather requirements and provide technical expertise. 7. Quality Assurance: Write unit and integration tests to ensure code quality and application stability. Conduct code reviews and provide constructive feedback to peers. Technical Skills and Qualifications: Core Technologies: Proficient in Java 17, Spring Boot, Spring Batch, Spring Data JPA, and Hibernate. Experience with Azure Cloud services and deployment practices. Architecture Patterns: Strong understanding of Microservices, SOA, and Event-Driven Architecture. Database Proficiency: Experience with SQL and NoSQL databases, including performance optimization techniques. Design Principles: Familiarity with design principles such as the 12-Factor App methodology and Cloud Native Architecture. DevOps Knowledge: Knowledge of DevOps tools and practices, including CI/CD pipelines and deployment strategies like blue-green deployment. Caching and Messaging: Experience with in-memory caching solutions and Pub/Sub messaging systems. Soft Skills: Strong analytical and problem-solving skills. Excellent communication and interpersonal skills, with the ability to work well in a team-oriented environment.

Posted 2 months ago

Apply

6.0 - 11.0 years

15 - 19 Lacs

Chennai

Work from Office

Overview & Responsibilities: Designing and architecting web solutions that deliver a seamless omni-channel experience. Evaluating and selecting the most suitable technologies and tools to improve omni-channel experience and conversions with the latest techniques such as Micro Frontends, AEM/AEP, AI, Machine Learning, and Cloud Native platforms. Must-have skills: Strong experience in Java/J2EE, Microservices architecture & development, NextJS, Micro frontend with ReactJS, NoSQL databases (Cassandra), session cache (Redis), and ElasticSearch. Experience in designing & development for large enterprise applications. Ability to translate business requirements into innovative and effective solutions/designs and ensure successful implementation. Ability to analyze and report on project delivery metrics (planned vs. actual) to identify performance trends, risks, and opportunities for improvement. Strong communication and collaboration skills, with a talent for explaining complex concepts simply. Lead the requirements, design & development with Agile implementation methodology, and work hand in hand with product area stakeholders. Collaborate with cross-functional teams to design and build, and act as the primary point of contact. Excellent problem-solving and troubleshooting skills with respect to real-time customer challenges. Ability to design complex conversation flows and create engaging user experiences. Able to drive process improvement initiatives by partnering with cross-functional teams to implement best practices and resolve strategic and organizational issues. Stay updated and keep the team updated with the latest advancements in web technologies and AI, integrating innovative approaches for sustained competitive advantage. Good to have: Hands-on experience in PEGA (Case Management, NBX) and GenAI & Personalization capabilities. Hands-on experience on AWS & Google Cloud. Experience with ForgeRock access gateway & identity management platforms.

Posted 2 months ago

Apply

5.0 - 8.0 years

22 - 30 Lacs

Noida, Hyderabad, Bengaluru

Hybrid

Role: Data Engineer. Exp: 5 to 8 Years. Location: Bangalore, Noida, and Hyderabad (Hybrid, 2 days per week in office). NP: Immediate to 15 Days (only immediate joiners will be considered). Note: Candidates must have experience in Python, Kafka Streams, PySpark, and Azure Databricks; candidates with experience only in PySpark and not in Python will not be considered. Job Title: SSE – Kafka, Python, and Azure Databricks (Healthcare Data Project). Experience: 5 to 8 years. Role Overview: We are looking for a highly skilled engineer with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams. Key Responsibilities: Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks. Architect scalable data streaming and processing solutions to support healthcare data workflows. Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data. Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.). Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions. Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows. Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering. Stay updated with the latest cloud technologies, big data frameworks, and industry trends. Required Skills & Qualifications: 4+ years of experience in data engineering, with strong proficiency in Kafka and Python. Expertise in Kafka Streams, Kafka Connect, and Schema Registry for real-time data processing. Experience with Azure Databricks (or willingness to learn and adopt it quickly). Hands-on experience with cloud platforms (Azure preferred; AWS or GCP is a plus). Proficiency in SQL, NoSQL databases, and data modeling for big data processing. Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for data applications. Experience working with healthcare data (EHR, claims, HL7, FHIR, etc.) is a plus. Strong analytical skills, a problem-solving mindset, and the ability to lead complex data projects. Excellent communication and stakeholder management skills. Email: Sam@hiresquad.in
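For candidates gauging the kind of work described above, the sketch below shows a minimal PySpark Structured Streaming job of the sort such a pipeline might start from: it reads JSON events from a Kafka topic and appends them to a Delta table. It is an illustration only, not part of the job posting; the broker address, topic name, schema, and storage paths are placeholders, and it assumes a Spark/Databricks runtime with the Kafka connector and Delta Lake available.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("claims-stream-ingest").getOrCreate()

# Hypothetical schema for an incoming claims event; real healthcare feeds differ.
event_schema = StructType([
    StructField("claim_id", StringType()),
    StructField("member_id", StringType()),
    StructField("status", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw event stream from Kafka; broker and topic names are placeholders.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "claims-events")
       .option("startingOffsets", "latest")
       .load())

# Kafka delivers the value as bytes; parse the JSON payload into columns.
parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", event_schema).alias("e"))
          .select("e.*"))

# Append the parsed stream to a Delta table, with checkpointing for recovery.
query = (parsed.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/claims")
         .outputMode("append")
         .start("/mnt/delta/claims_events"))

query.awaitTermination()
```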

Posted 2 months ago

Apply

5.0 - 10.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Job Area: Information Technology Group, Information Technology Group > IT Data Engineer. General Summary: The developer will play an integral role in the PTEIT Machine Learning Data Engineering team. Design, develop and support data pipelines in a hybrid cloud environment to enable advanced analytics. Design, develop and support CI/CD of data pipelines and services. - 5+ years of experience with Python or equivalent programming using OOPS, Data Structures and Algorithms - Develop new services in AWS using server-less and container-based services. - 3+ years of hands-on experience with the AWS suite of services (EC2, IAM, S3, CDK, Glue, Athena, Lambda, Redshift, Snowflake, RDS) - 3+ years of expertise in scheduling data flows using Apache Airflow - 3+ years of strong data modelling (Functional, Logical and Physical) and data architecture experience in Data Lake and/or Data Warehouse - 3+ years of experience with SQL databases - 3+ years of experience with CI/CD and DevOps using Jenkins - 3+ years of experience with event-driven architecture, especially Change Data Capture - 3+ years of experience in Apache Spark, SQL, Redshift (or BigQuery or Snowflake), Databricks - Deep understanding of building efficient data pipelines with data observability, data quality, schema drift, alerting and monitoring. - Good understanding of Data Catalogs, Data Governance, Compliance, Security, Data sharing - Experience in building reusable services across data processing systems. - Should have the ability to work and contribute beyond defined responsibilities - Excellent communication and interpersonal skills with deep problem-solving skills. Minimum Qualifications: 3+ years of IT-related work experience with a Bachelor's degree in Computer Engineering, Computer Science, Information Systems or a related field, OR 5+ years of IT-related work experience without a Bachelor's degree. 2+ years of any combination of academic or work experience with programming (e.g., Java, Python). 1+ year of any combination of academic or work experience with SQL or NoSQL databases. 1+ year of any combination of academic or work experience with Data Structures and algorithms. 5 years of industry experience and a minimum of 3 years of experience in Data Engineering development with highly reputed organizations. - Proficiency in Python and AWS - Excellent problem-solving skills - Deep understanding of data structures and algorithms - Proven experience in building cloud-native software, preferably with the AWS suite of services - Proven experience in designing and developing data models using RDBMS (Oracle, MySQL, etc.) Desirable: - Exposure or experience in other cloud platforms (Azure and GCP) - Experience working on internals of large-scale distributed systems and databases such as Hadoop, Spark - Working experience on Data Lakehouse platforms (Onehouse, Databricks Lakehouse) - Working experience on Data Lakehouse file formats (Delta Lake, Iceberg, Hudi) Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
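As an illustration of the Airflow-based scheduling this role calls for, here is a minimal DAG with a single PythonOperator task that copies newly landed S3 objects into a raw zone using boto3. It is a hedged sketch, not the employer's actual pipeline; the bucket names, prefixes, and schedule are assumptions.

```python
from datetime import datetime, timedelta

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def land_raw_files(**context):
    """Copy newly arrived objects from a landing bucket to the raw zone.

    Bucket names and prefixes are placeholders for illustration only.
    """
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket="example-landing", Prefix="incoming/")
    for obj in resp.get("Contents", []):
        s3.copy_object(
            Bucket="example-datalake",
            Key=f"raw/{obj['Key']}",
            CopySource={"Bucket": "example-landing", "Key": obj["Key"]},
        )


default_args = {"retries": 2, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="raw_zone_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    default_args=default_args,
) as dag:
    # A single hourly ingestion task; real pipelines chain many such tasks.
    ingest = PythonOperator(
        task_id="land_raw_files",
        python_callable=land_raw_files,
    )
```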

Posted 2 months ago

Apply

10.0 - 12.0 years

30 - 37 Lacs

Bengaluru

Work from Office

We need immediate joiners or those who are serving notice period and can join within 10-15 days; candidates on the bench or with an official 2-3 month notice period will not be considered. Strong working experience in design and development of RESTful APIs using Java, Spring Boot and Spring Cloud. Technical hands-on experience to support development, automated testing, infrastructure and operations. Fluency with relational databases or alternatively NoSQL databases. Excellent pull request review skills and attention to detail. Experience with streaming platforms (real-time data at massive scale, like Confluent Kafka). Working experience in AWS services like EC2, ECS, RDS, S3 etc. Understanding of DevOps as well as experience with CI/CD pipelines. Industry experience in the Retail domain is a plus. Exposure to Agile methodology and project tools: Jira, Confluence, SharePoint. Working knowledge of Docker containers/Kubernetes. Excellent team player, with the ability to work independently and as part of a team. Experience in mentoring junior developers and providing technical leadership. Familiarity with monitoring & reporting tools (Prometheus, Grafana, PagerDuty etc.). Ability to learn, understand, and work quickly with new emerging technologies, methodologies, and solutions in the Cloud/IT technology space. Knowledge of a front-end framework such as React or Angular and any other programming languages like JavaScript/TypeScript or Python is a plus.

Posted 2 months ago

Apply

4.0 - 9.0 years

27 - 42 Lacs

Chennai

Work from Office

Job Summary: We are seeking a Sr. Developer with 6 to 10 years of experience specializing in Microsoft Azure IoT to join our team in a hybrid work model. The ideal candidate will have a strong background in Engineering & Design and Industrial Manufacturing. This role involves developing innovative IoT solutions that enhance our manufacturing processes and contribute to our company's growth and societal impact. Job Description: We are seeking a Senior Full Stack Developer with extensive experience in both frontend and backend technologies. The ideal candidate will have proficiency in Knockout.js and Vue 2 with TypeScript, enabling them to develop dynamic and responsive web applications. On the backend, strong experience with C#/.NET, .NET Framework, .NET Core, WCF Services, C++, SQL, and NoSQL databases such as Postgres is essential for building robust and scalable server-side solutions. Expertise in Azure cloud services, including Azure IoT, Azure Edge Runtime, Azure PaaS Services, and Azure Kubernetes Service (AKS), is required for deploying cloud-native applications. A deep understanding of design principles and patterns, as well as microservice architecture, is necessary to create modular, maintainable, and resilient solutions. The candidate should be proficient with testing tools such as JUnit, Postman, Apache JMeter, Cucumber, and SonarQube to ensure application reliability and performance. Experience with CI/CD pipelines, including Jenkins and GitLab, as well as Docker and Kubernetes, is crucial for streamlining development workflows and managing containerized applications. Qualifications: Possess strong expertise in Microsoft Azure IoT and its application in industrial settings. Demonstrate experience in Engineering & Design and Industrial Manufacturing domains. Exhibit proficiency in developing scalable and secure IoT architectures. Show ability to analyze complex data sets and derive meaningful insights. Have excellent problem-solving skills and attention to detail. Display strong communication skills to collaborate effectively with diverse teams. Be adaptable to evolving technologies and industry trends.

Posted 2 months ago

Apply

4.0 - 6.0 years

4 - 8 Lacs

Pune

Work from Office

Responsibilities: Development: Design, develop, and maintain scalable web applications using Next.js and the .NET 8.0 framework. Collaborate with cross-functional teams to implement new features and enhance existing ones. Testing & Deployment: Write, review, and maintain automated unit test cases to ensure high code quality. Deploy and manage applications in Azure, leveraging Azure DevOps for CI/CD pipelines. Database Management: Work with SQL Server Management Studio (SSMS) and NoSQL databases for efficient data storage and retrieval. Design and optimize database schemas, queries, and indexing. Version Control: Manage and track code versions using Git, adhering to best practices for branching, tagging, and versioning. Fundamental Concepts: Apply solid knowledge of Object-Oriented Programming (OOP) principles. Leverage a strong foundation in design patterns and coding best practices. Cloud Integration: Utilize Azure services such as Azure Functions, App Services, and Azure SQL. Optimize cloud-based solutions for performance, scalability, and cost-efficiency. Technical Skills Required: Mandatory Skills: Proficiency in Next.js and .NET 8.0. Strong knowledge of automated unit testing frameworks. Experience with Azure services and Azure DevOps pipelines. Hands-on experience with both relational (SSMS) and NoSQL databases (e.g., MongoDB, Cosmos DB). Deep understanding of OOP principles and design patterns. Familiarity with branching, tagging, and versioning practices in Git. Nice-to-Have Qualifications: Experience in front-end development with React.js. Knowledge of RESTful APIs. Familiarity with performance optimization and secure coding practices. Certifications in Azure or related technologies (preferred but not mandatory).

Posted 2 months ago

Apply

12.0 - 15.0 years

10 - 14 Lacs

Surat

Work from Office

Responsibilities: Architectural Leadership: Lead the design and evolution of enterprise-level Java applications, ensuring scalability, reliability, security, and maintainability. Define and enforce architectural standards, patterns, and best practices. Solution Design & Strategy: Translate complex business requirements into technical specifications and architectural blueprints. Evaluate and recommend appropriate technologies, frameworks, and tools to meet project goals. Technical Ownership: Take ownership of the technical design and implementation across the full software development lifecycle, from conception to deployment and ongoing support. Framework Expertise: Leverage extensive experience with the Spring Framework (Spring Boot, Spring Cloud, Spring Data, Spring Security, etc.) and Hibernate/JPA for data persistence. Database Design: Collaborate with database administrators to design efficient and scalable database schemas for both relational (e.g., PostgreSQL, MySQL, Oracle) and NoSQL databases (e.g., MongoDB, Cassandra). Microservices & APIs: Design and implement robust microservices architectures, RESTful APIs, and event-driven systems. Performance & Scalability: Identify and resolve performance bottlenecks, ensuring optimal application performance and scalability under heavy loads. Conduct performance tuning and optimization. Security: Incorporate robust security practices and design patterns into application architecture, ensuring compliance with industry standards and best practices. Code Quality & Standards: Establish and enforce coding standards, perform regular code reviews, and ensure high code quality, testability, and adherence to architectural guidelines. Mentorship & Guidance: Mentor and guide senior and junior developers, fostering a culture of technical excellence, continuous learning, and innovation. Stakeholder Collaboration: Effectively communicate complex technical concepts to non-technical stakeholders, product owners, and other teams. Collaborate closely with cross-functional teams. Technology Evaluation: Research and evaluate new technologies, tools, and frameworks to keep the technology stack modern and competitive. DevOps & CI/CD: Work closely with DevOps teams to define and implement CI/CD pipelines, automated testing, and deployment strategies. Troubleshooting & Support: Provide expert-level technical support and troubleshooting for complex production issues.

Posted 2 months ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Pune

Work from Office

Key Responsibilities: Design and develop scalable backend services using Go (Golang) Collaborate with cross-functional teams to understand project requirements Write efficient, maintainable, and reusable code Debug and resolve production issues as they arise Ensure code quality through unit and integration testing Participate in code reviews and technical discussions Required Skills: Minimum 4+ years of experience in software development At least 2+ years of strong hands-on experience with Golang Solid understanding of REST APIs and microservices architecture Experience with relational and NoSQL databases Familiarity with Docker, Kubernetes, and CI/CD pipelines is a plus Excellent problem-solving and communication skills

Posted 2 months ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Hyderabad

Work from Office

What you will do: In this vital role you will be responsible for developing and maintaining software applications, components, and solutions that meet business needs and ensuring the availability and performance of critical systems and applications. This role requires experience in, and a deep understanding of, both front-end and back-end development. The Full Stack Software Engineer will work closely with product managers, designers, and other engineers to create high-quality, scalable software solutions while automating operations, monitoring system health, and responding to incidents to minimize downtime. The Full Stack Software Engineer will also contribute to design discussions and provide guidance on technical feasibility and best practices. Roles & Responsibilities: Develop complex software projects from conception to deployment, including delivery scope, risk, and timeline. Conduct code reviews to ensure code quality and adherence to best practices. Contribute to both front-end and back-end development using cloud technology. Provide ongoing support and maintenance for the design system and applications, ensuring reliability, reuse and scalability while meeting accessibility and quality standards. Develop innovative solutions using generative AI technologies. Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations. Identify and resolve technical challenges, software bugs and performance issues effectively. Stay updated with the latest trends and advancements. Analyze and understand the functional and technical requirements of applications, solutions, and systems and translate them into software architecture and design specifications. Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software. Work closely with cross-functional teams, including product management, stakeholders, design, and QA, to deliver high-quality software on time. Maintain detailed documentation of software designs, code, and development processes. Work on integrating with other systems and platforms to ensure seamless data flow and functionality. What we expect of you: We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master's degree and 1 to 3 years of experience in Computer Science, IT or a related field, OR Bachelor's degree and 3 to 5 years of experience in Computer Science, IT or a related field, OR Diploma and 7 to 9 years of experience in Computer Science, IT or a related field. Must-Have Skills: Hands-on experience with various cloud services, understanding the pros and cons of various cloud services in well-architected cloud design principles. Experience with developing and maintaining design systems across teams. Hands-on experience with full stack software development. Proficient in programming languages such as JavaScript, Python, SQL/NoSQL. Familiarity with frameworks such as React JS and visualization libraries. Strong problem-solving and analytical skills; ability to learn quickly; excellent communication and interpersonal skills. Experience with API integration, serverless, and microservices architecture. Experience with SQL/NoSQL databases and vector databases for large language models. Experience with website development and an understanding of website localization processes, which involve adapting content to fit cultural and linguistic contexts.
Preferred Qualifications: Good-to-Have Skills: Strong understanding of cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker, Kubernetes). Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk). Experience with data processing tools like Hadoop, Spark, or similar. Experience with popular large language models. Experience with the LangChain or LlamaIndex frameworks for language models; experience with prompt engineering and model fine-tuning. Professional Certifications: Relevant certifications such as CISSP, AWS Developer certification, CompTIA Network+, or MCSE (preferred). Any SAFe Agile certification (preferred). Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.
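To give a flavour of the serverless, API-driven work mentioned above, the following is a minimal Python Lambda-style handler that serves a GET lookup against a NoSQL (DynamoDB) table. It is purely illustrative and not the employer's codebase; the table name, key schema, and the API Gateway proxy event shape are assumptions.

```python
import json
import os

import boto3

# Table name comes from configuration; "DocumentMetadata" is a placeholder default.
TABLE_NAME = os.environ.get("TABLE_NAME", "DocumentMetadata")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


def lambda_handler(event, context):
    """Serverless GET endpoint: look up one record by id from a NoSQL table."""
    record_id = (event.get("pathParameters") or {}).get("id")
    if not record_id:
        return {"statusCode": 400, "body": json.dumps({"error": "id is required"})}

    resp = table.get_item(Key={"id": record_id})
    item = resp.get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    # default=str handles DynamoDB Decimal values during JSON serialization.
    return {"statusCode": 200, "body": json.dumps(item, default=str)}
```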

Posted 2 months ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai, Maharashtra

Work from Office

Grade Level (for internal use): 10. The Team: You will be an expert contributor and part of the Rating Organization's Data Services Product Engineering Team. This team, which has broad and expert knowledge of the Ratings organization's critical data domains, technology stacks and architectural patterns, fosters knowledge sharing and collaboration that results in a unified strategy. All Data Services team members provide leadership, innovation, timely delivery, and the ability to articulate business value. Be a part of a unique opportunity to build and evolve S&P Ratings' next-gen analytics platform. Responsibilities: Design and implement innovative software solutions to enhance S&P Ratings' cloud-based data platforms. Mentor a team of engineers, fostering a culture of trust, continuous growth, and collaborative problem-solving. Collaborate with business partners to understand requirements, ensuring technical solutions align with business goals. Manage and improve existing software solutions, ensuring high performance and scalability. Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes. Produce comprehensive technical design documents and conduct technical walkthroughs. Experience & Qualifications: Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent is required. Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development. 7+ years of development experience in enterprise products and modern web development technologies: Java/J2EE, UI frameworks like Angular and React, SQL, Oracle, and NoSQL databases like MongoDB. Experience designing transactional/data warehouse/data lake systems and data integrations with the Big Data ecosystem leveraging AWS cloud technologies. Experience with Delta Lake systems like Databricks using AWS cloud technologies and PySpark is a plus. Thorough understanding of distributed computing. Passionate, smart, and articulate developer. Quality-first mindset with a strong background and experience in developing products for a global audience at scale. Excellent analytical thinking, interpersonal, oral and written communication skills with a strong ability to influence both IT and business partners. Superior knowledge of system architecture, object-oriented design, and design patterns. Good work ethic, self-starter, and results-oriented. Excellent communication skills are essential, with strong verbal and writing proficiencies. Additional Preferred Qualifications: Experience working with AWS. Experience with the SAFe Agile Framework. Bachelor's/PG degree in Computer Science, Information Systems or equivalent. Hands-on experience contributing to application architecture & designs, with proven software/enterprise integration design principles. Ability to prioritize and manage work to critical project timelines in a fast-paced environment. Excellent analytical and communication skills are essential, with strong verbal and writing proficiencies. Ability to train and mentor. Benefits: Health & Wellness: Health care coverage designed for the mind and body. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you.
S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

Posted 2 months ago

Apply

4.0 - 6.0 years

4 - 6 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Job Title: Data Platform Developer. Key Responsibilities: As a Data Platform Developer, you will: Solution Design & Development: Design, build, and unit test applications on the Spark framework using Python (PySpark). Translate requirements into full-fledged and scalable PySpark-based applications for both batch and streaming requirements. Data Pipeline Development: Develop and execute data pipeline testing processes, and validate business rules and policies. Build integrated solutions leveraging Unix shell scripting, RDBMS, Hive, the HDFS file system, HDFS file types, and HDFS compression codecs. Big Data Ecosystem Management: Apply in-depth knowledge of various Hadoop and NoSQL databases. Automation & CI/CD: Create and maintain integration and regression testing frameworks on Jenkins, integrated with Bitbucket and/or Git repositories. Agile Collaboration: Participate actively in the Agile development process, documenting and communicating issues and bugs related to data standards in scrum meetings. Work collaboratively with both onsite and offshore teams. Technical Documentation: Develop and review technical documentation for delivered artifacts. Problem Solving & Triage: Solve complex data-driven scenarios and triage defects and production issues. Deployment & Release: Participate in code release and production deployment. Continuous Learning: Demonstrate an ability to learn-unlearn-relearn concepts with an open and analytical mindset, and be comfortable tackling new challenges and ways of working. Mandatory Skills & Experience: Technical Proficiency: PySpark Expertise: Extensive experience in design, build, and deployment of PySpark-based applications (minimum 3 years). Hadoop Ecosystem: Minimum 3 years of experience in Hive, YARN, HDFS. Spark: Ability to design, build, and unit test applications on the Spark framework in Python. Scripting & Databases: Proficiency in Unix shell scripting and experience with RDBMS. Strong hands-on experience writing complex SQL queries and exporting and importing large amounts of data using utilities. NoSQL Databases: In-depth knowledge of various NoSQL databases. CI/CD Tools: Experience in creating and maintaining integration and regression testing frameworks on Jenkins integrated with Bitbucket and/or Git repositories. Code Quality: Ability to build abstracted, modularized, and reusable code components. Big Data Environment: Expertise in handling complex large-scale Big Data environments (preferably 20TB+). Experience & Qualifications: Minimum 3 years of extensive experience in design, build, and deployment of PySpark-based applications. BE/B.Tech/B.Sc. in Computer Science/Statistics from an accredited college or university (preferred). Prior experience with ETL tools, preferably Informatica PowerCenter, is advantageous. Essential Professional Skills: Problem Solving: Ability to solve complex data-driven scenarios and triage defects and production issues. Adaptability & Learning: Able to quickly adapt and learn, comfortable tackling new challenges and new ways of working, and ready to move from traditional methods to agile ones. Communication & Collaboration: Excellent communication skills, strong collaboration and coordination across various teams, and comfort challenging peers and leadership. Customer Centricity: Good customer centricity, with strong target and solution orientation. Initiative: Able to jump into an ambiguous situation and take the lead on resolution.
Agile Mindset: Comfortable moving from traditional methods and adapting to agile ones. Proactiveness: Able to prove themselves quickly and decisively.
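For reference, a PySpark batch job of the kind described above might look like the sketch below: it reads a Hive staging table, applies simple business-rule validation, and writes clean and rejected rows to separate HDFS paths. The table names, columns, and rule thresholds are made up for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("claims-batch-validate")
         .enableHiveSupport()
         .getOrCreate())

# Source table and validation rules are placeholders for illustration.
src = spark.table("staging.claims_daily")

# Business-rule validation: flag rows that violate simple policies.
validated = src.withColumn(
    "is_valid",
    F.col("claim_amount").isNotNull()
    & (F.col("claim_amount") >= 0)
    & F.col("member_id").rlike("^[A-Z0-9]{8}$"),
)

good = validated.filter("is_valid").drop("is_valid")
bad = validated.filter("NOT is_valid").drop("is_valid")

# Write clean rows to the curated zone as compressed Parquet, rejects to quarantine.
good.write.mode("overwrite").option("compression", "snappy") \
    .parquet("hdfs:///data/curated/claims_daily")
bad.write.mode("overwrite").parquet("hdfs:///data/quarantine/claims_daily")
```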

Posted 2 months ago

Apply

6.0 - 11.0 years

4 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Description We are seeking a skilled Azure Data Engineer with 6-11 years of experience to join our dynamic team. The ideal candidate will be responsible for designing, building, and maintaining data solutions on the Azure platform. You will work closely with cross-functional teams to ensure the efficient processing and management of data, while also driving data-driven decision-making within the organization. If you are passionate about data and have a strong technical background in Azure services, we would love to hear from you. Responsibilities Design and implement data solutions using Azure services such as Azure Data Factory, Azure Databricks, and Azure SQL Database. Develop and maintain data pipelines for data ingestion, transformation, and storage. Ensure data quality and integrity by implementing data validation and cleansing processes. Collaborate with data scientists and analysts to understand data requirements and provide data access. Optimize performance of data processing and storage solutions in Azure. Monitor and troubleshoot data workflows and pipelines to ensure reliability and efficiency. Implement security and compliance measures for data handling and storage. Document data architecture and processes for future reference and onboarding. Skills and Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field. 6-11 years of experience in data engineering or a related field. Strong experience with Azure Data services (Azure Data Factory, Azure Databricks, Azure Synapse Analytics). Proficiency in SQL and experience with relational databases (e.g., Azure SQL Database, SQL Server). Knowledge of programming languages such as Python or Scala for data processing and ETL tasks. Experience with data modeling and database design principles. Familiarity with big data technologies (e.g., Apache Spark, Hadoop) and data warehousing concepts. Understanding of data governance and best practices in data security. Experience with CI/CD processes and DevOps practices for data solutions.
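As a rough illustration of the Databricks-style cleansing work this role involves, the sketch below trims and deduplicates a raw dataset with PySpark and persists it as a Delta table. It is an assumption-laden example, not a prescribed solution: the storage path, column names, and table name are placeholders, and on Databricks the SparkSession is normally provided for you.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

# On Databricks a SparkSession named `spark` already exists in notebooks.
spark = SparkSession.builder.getOrCreate()

# Landing path and columns below are placeholders for illustration.
raw = spark.read.format("json").load(
    "abfss://landing@examplelake.dfs.core.windows.net/orders/")

# Cleansing: trim strings, normalise empty values to null, drop duplicate orders.
cleaned = (raw
    .withColumn("customer_id", F.trim("customer_id"))
    .withColumn("customer_id",
                F.when(F.col("customer_id") == "", F.lit(None))
                 .otherwise(F.col("customer_id")))
    .dropDuplicates(["order_id"]))

# Keep only the latest record per order_id when late updates arrive.
latest = (cleaned
    .withColumn("rn", F.row_number().over(
        Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())))
    .filter("rn = 1")
    .drop("rn"))

# Persist to a Delta table that downstream analysts and Synapse queries can read.
latest.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```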

Posted 2 months ago

Apply

3.0 - 6.0 years

2 - 5 Lacs

Chennai, Tamil Nadu, India

On-site

The role involves developing scalable microservices with Spring Boot and Java 8, integrating NoSQL databases, and implementing messaging systems like RabbitMQ or Kafka. You'll enhance observability with Grafana and Prometheus, optimize performance using Hazelcast and Eureka, and utilize Docker and Kubernetes for deployment. Additionally, you'll focus on distributed systems best practices and fault tolerance. HOW YOU WILL CONTRIBUTE AND WHAT YOU WILL LEARN Develop robust and scalable microservices using Spring Boot and Java 8 advanced features. Integrate and maintain MongoDB or other NoSQL databases for efficient data management. Implement messaging systems (RabbitMQ, Kafka, VerneMQ) to enable seamless communication between services. Contribute to improving system observability by configuring Grafana and Prometheus for real-time monitoring. Enhance application performance and fault tolerance by utilizing Hazelcast and Eureka. Troubleshoot JVM performance issues and optimize resource utilization. Leverage Docker and Kubernetes for containerization, deployment, and orchestration of applications. Learn best practices for distributed systems, fault tolerance, and resilience at scale. KEY SKILLS AND EXPERIENCE You have : Graduate or Postgraduate in Engineering stream with 3+ years of relevant experience in Java 8. Expertise in building enterprise-grade applications using Spring Boot and applying advanced Java 8 features like streams, lambdas, and the Java time API. Experience with MongoDB or similar NoSQL databases for handling large-scale, unstructured data. Practical experience in integrating and working with messaging systems to enable asynchronous communication and scalability. It would be nice if you also had: Knowledge in setting up and configuring monitoring tools. Troubleshooting and optimization of JVM performance, including garbage collection tuning and memory management. Knowledge in Docker for containerization and Kubernetes for orchestration, ensuring efficient deployment and scaling.

Posted 2 months ago

Apply

5.0 - 10.0 years

25 - 35 Lacs

Hyderabad, Bengaluru

Work from Office

**URGENT hiring** Note: This is a work from office opportunity; apply only if you are okay with it. Must-have skills: MySQL, PostgreSQL, NoSQL, and Redshift. Location: Bangalore/Hyderabad. Years of experience: 5+ years. Notice period: immediate to 15 days. Role Overview: The Database Engineer (DBE) is responsible for the design, implementation, maintenance, and optimization of databases to ensure high availability, security, and performance. The role involves working with relational and NoSQL databases, managing backups, monitoring performance, and ensuring data integrity. Key Responsibilities: Database Administration & Maintenance • Install, configure, and maintain database management systems (DBMS) such as MySQL, PostgreSQL, SQL Server, Oracle, or MongoDB. • Ensure database security, backup, and disaster recovery strategies are in place. • Monitor database performance and optimize queries, indexing, and storage. • Apply patches, updates, and upgrades to ensure system stability and security. Database Design & Development • Design and implement database schemas, tables, and relationships based on business requirements. • Develop and optimize stored procedures, functions, and triggers. • Implement data partitioning, replication, and sharding strategies for scalability. Performance Tuning & Optimization • Analyze slow queries and optimize database performance using indexing, caching, and tuning techniques. • Conduct database capacity planning and resource allocation. • Monitor and troubleshoot database-related issues, ensuring minimal downtime. Security & Compliance • Implement role-based access control (RBAC) and manage user permissions. • Ensure databases comply with security policies, including encryption, auditing, and GDPR/HIPAA regulations. • Conduct regular security assessments and vulnerability scans. Collaboration & Automation • Work closely with developers, system administrators, and DevOps teams to integrate databases with applications. • Automate database management tasks using scripts and tools. • Document database configurations, processes, and best practices. Required Skills & Qualifications: • Experience: 4+ years of experience in database administration, engineering, or related fields. • Education: Bachelor's or Master's degree in Computer Science, Information Technology, or related disciplines. • Technical Skills: • Strong knowledge of SQL and database optimization techniques. • Hands-on experience with at least one major RDBMS (MySQL, PostgreSQL, SQL Server, Oracle). • Experience with NoSQL databases (MongoDB, Cassandra, DynamoDB) is a plus. • Proficiency in database backup, recovery, and high availability solutions (replication, clustering, mirroring). • Familiarity with scripting languages (Python, Bash, PowerShell) for automation. • Experience with cloud-based database solutions (AWS RDS, Azure SQL, Google Cloud Spanner). Preferred Qualifications: • Experience with database migration and cloud transformation projects. • Knowledge of CI/CD pipelines and DevOps methodologies for database management. • Familiarity with big data technologies like Hadoop, Spark, or Elasticsearch.
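As an example of the scripted automation this role mentions, the sketch below wraps pg_dump in a small Python script that takes a nightly logical backup and compresses it. It is illustrative only; the host, database name, and backup directory are placeholders, and credentials are assumed to come from a .pgpass file or environment variables rather than the script itself.

```python
"""Nightly logical backup for a PostgreSQL database (illustrative sketch)."""
import datetime
import gzip
import pathlib
import shutil
import subprocess

# Placeholder connection and storage settings for the example.
BACKUP_DIR = pathlib.Path("/var/backups/postgres")
DB_NAME = "appdb"
HOST = "db.internal.example.com"


def run_backup() -> pathlib.Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    dump_path = BACKUP_DIR / f"{DB_NAME}_{stamp}.sql"

    # pg_dump writes a plain-SQL dump; --no-password forces non-interactive auth.
    subprocess.run(
        ["pg_dump", "--host", HOST, "--no-password",
         "--format", "plain", "--file", str(dump_path), DB_NAME],
        check=True,
    )

    # Compress the dump to save space before shipping it to offsite storage.
    gz_path = dump_path.with_suffix(".sql.gz")
    with open(dump_path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
    dump_path.unlink()
    return gz_path


if __name__ == "__main__":
    print(f"Backup written to {run_backup()}")
```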

Posted 2 months ago

Apply

3.0 - 5.0 years

8 - 12 Lacs

Noida

Work from Office

About the Role: Grade Level (for internal use): 09. The Role: Platform Engineer. Department overview: PVR DevOps is a global team that provides specialized technical builds across a suite of products. DevOps members work closely with the Development, Testing and Client Services teams to build and develop applications using the latest technologies to ensure the highest availability and resilience of all services. Our work helps ensure that PVR continues to provide high quality service and maintain client satisfaction. Position Summary: S&P Global is seeking a highly motivated engineer to join our PVR DevOps team in Noida. DevOps is a rapidly growing team at the heart of ensuring the availability and correct operation of our valuations, market and trade data applications. The team prides itself on its flexibility and technical diversity to maintain service availability and contribute improvements through design and development. Duties & accountabilities: The role of Principal DevOps Engineer is primarily focused on building functional systems that improve our customer experience. Responsibilities include: Creating infrastructure and environments to support our platforms and applications using Terraform and related technologies to ensure all our environments are controlled and consistent. Implementing DevOps technologies and processes, e.g. containerisation, CI/CD, infrastructure as code, metrics, monitoring etc. Automating wherever possible. Supporting, monitoring, maintaining and improving our infrastructure and the live running of our applications. Maintaining the health of cloud accounts for security, cost and best practices. Providing assistance to other functional areas such as development, test and client services. Knowledge, Skills & Experience: Strong background of at least 3 to 5 years of experience in Linux/Unix administration in IaaS / PaaS / SaaS models. Deployment, maintenance and support of enterprise applications into AWS including (but not limited to) Route53, ELB, VPC, EC2, S3, ECS, SQS. Good understanding of Terraform and similar Infrastructure as Code technologies. Strong experience with SQL and NoSQL databases such as MySQL, PostgreSQL, DB2, MongoDB, DynamoDB. Experience with automation/configuration management using toolsets such as Chef, Puppet or equivalent. Experience of enterprise systems deployed as micro-services through code pipelines utilizing containerization (Docker). Working knowledge, understanding and ability to write scripts using languages including Bash and Python, and an ability to understand Java, JavaScript and PHP. Personal competencies: Personal Impact: Confident individual able to represent the team at various levels. Strong analytical and problem-solving skills. Demonstrated ability to work independently with minimal supervision. Highly organised with very good attention to detail. Takes ownership of issues and drives through the resolution. Flexible and willing to adapt to changing situations in a fast-moving environment. Communication: Demonstrates a global mindset, respects cultural differences and is open to new ideas and approaches. Able to build relationships with all teams, identifying and focusing on their needs. Ability to communicate effectively at business and technical level is essential. Experience working in a global team. Teamwork: An effective team player and strong collaborator across technology and all relevant areas of the business. Enthusiastic with a drive to succeed. Thrives in a pressurized environment with a can-do attitude. Must be able to work under own initiative.
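To illustrate the account-hygiene scripting referred to above ("maintaining the health of cloud accounts"), here is a small boto3 sketch that reports EC2 instances missing required cost-allocation tags. The required tag keys, region, and output format are assumptions made for the example, not an S&P Global standard.

```python
"""Report EC2 instances that are missing required cost-allocation tags."""
import boto3

# Hypothetical tag policy used only for this illustration.
REQUIRED_TAGS = {"Owner", "CostCentre", "Environment"}


def find_untagged_instances(region: str = "eu-west-1"):
    ec2 = boto3.client("ec2", region_name=region)
    offenders = []
    # Paginate so accounts with many instances are fully scanned.
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                missing = REQUIRED_TAGS - tags
                if missing:
                    offenders.append((instance["InstanceId"], sorted(missing)))
    return offenders


if __name__ == "__main__":
    for instance_id, missing in find_untagged_instances():
        print(f"{instance_id} is missing tags: {', '.join(missing)}")
```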

Posted 2 months ago

Apply

8.0 - 13.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Req ID: 327855
We are currently seeking a Python Django Microservices Lead to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties / Responsibilities:
Lead the development of backend systems using Django.
Design and implement scalable and secure APIs.
Integrate Azure Cloud services for application deployment and management.
Utilize Azure Databricks for big data processing and analytics.
Implement data processing pipelines using PySpark (an illustrative sketch follows this listing).
Collaborate with front-end developers, product managers, and other stakeholders to deliver comprehensive solutions.
Conduct code reviews and ensure adherence to best practices.
Mentor and guide junior developers.
Optimize database performance and manage data storage solutions.
Ensure high performance and security standards for applications.
Participate in architecture design and technical decision-making.

Minimum Skills Required / Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
8+ years of experience in backend development.
8+ years of experience with Django.
Proven experience with Azure Cloud services.
Experience with Azure Databricks and PySpark.
Strong understanding of RESTful APIs and web services.
Excellent communication and problem-solving skills.
Familiarity with Agile methodologies.
Experience with database management (SQL and NoSQL).

Skills: Django, Python, Azure Cloud, Azure Databricks, Delta Lake and Delta tables, PySpark, SQL/NoSQL databases, RESTful APIs, Git, and Agile methodologies
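
As a rough illustration of the Databricks/PySpark and Delta Lake skills listed above (not code from the employer), the sketch below reads raw JSON events, derives a date column, and writes a partitioned Delta table. The paths and column names are hypothetical, and it assumes a Databricks-style environment where the Delta format is available.

```python
# Minimal PySpark sketch, assuming a Databricks/Delta Lake environment.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-to-delta").getOrCreate()

raw = spark.read.json("/mnt/raw/events/")              # hypothetical input path
cleaned = (
    raw.filter(F.col("event_type").isNotNull())        # drop malformed rows
       .withColumn("event_date", F.to_date("event_ts"))
)

(cleaned.write.format("delta")
        .mode("append")
        .partitionBy("event_date")
        .save("/mnt/curated/events_delta"))            # hypothetical output path
```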

Posted 2 months ago

Apply

5.0 - 8.0 years

22 - 30 Lacs

Noida, Hyderabad, Bengaluru

Hybrid

Role: Data Engineer
Exp: 5 to 8 Years
Location: Bangalore, Noida, and Hyderabad (Hybrid; 2 days per week in the office is a must)
NP: Immediate to 15 Days (immediate joiners strongly preferred)
Note: Candidates must have experience in Python, Kafka Streams, PySpark, and Azure Databricks. We are not considering candidates who have experience only in PySpark and not in Python.

Job Title: SSE - Kafka, Python, and Azure Databricks (Healthcare Data Project)
Experience: 5 to 8 years

Role Overview:
We are looking for a highly skilled engineer with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams.

Key Responsibilities:
Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks (an illustrative consumer sketch follows this listing).
Architect scalable data streaming and processing solutions to support healthcare data workflows.
Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data.
Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.).
Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions.
Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows.
Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering.
Stay updated with the latest cloud technologies, big data frameworks, and industry trends.

Required Skills & Qualifications:
4+ years of experience in data engineering, with strong proficiency in Kafka and Python.
Expertise in Kafka Streams, Kafka Connect, and Schema Registry for real-time data processing.
Experience with Azure Databricks (or willingness to learn and adopt it quickly).
Hands-on experience with cloud platforms (Azure preferred; AWS or GCP is a plus).
Proficiency in SQL, NoSQL databases, and data modeling for big data processing.
Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for data applications.
Experience working with healthcare data (EHR, claims, HL7, FHIR, etc.) is a plus.
Strong analytical skills, a problem-solving mindset, and the ability to lead complex data projects.
Excellent communication and stakeholder management skills.

Email: Sam@hiresquad.in
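
To make the streaming requirement above concrete, here is a minimal consumer sketch using the kafka-python package; it is not the employer's pipeline. It reads hypothetical claim events, performs a basic validation, and would hand valid records to a downstream Databricks/PySpark job. The topic name, broker address, and field names are assumptions.

```python
# Illustrative Kafka consumer sketch (kafka-python); all names are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "claims-events",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",                # hypothetical broker
    group_id="claims-pipeline",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    claim = message.value
    # Basic validation before handing the record to a downstream batch job.
    if claim.get("claim_id") and claim.get("member_id"):
        print("accepted claim", claim["claim_id"])
    else:
        print("rejected malformed record at offset", message.offset)
```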

Posted 2 months ago

Apply

12.0 - 15.0 years

22 - 25 Lacs

Hyderabad, Chennai

Work from Office

Full-stack developer with expertise in Java, Spring Boot, React JS, Kafka, NoSQL (Cosmos, Cassandra), Azure Cloud, AKS, and Azure SQL. Domain focus: Retail-CPG, Logistics, and Supply Chain. Mail: kowsalya.k@srsinfoway.com

Posted 2 months ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Pune, Maharashtra, India

On-site

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements (an illustrative indexing sketch follows this listing).
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education
Bachelor's Degree

Preferred education
Master's Degree

Required technical and professional expertise
4+ years of experience in data modelling and data architecture.
Proficiency in data modelling tools (Erwin, IBM InfoSphere Data Architect) and database management systems.
Familiarity with different data models such as relational, dimensional and NoSQL databases.
Understanding of business processes and how data supports business decision making.
Strong understanding of database design principles, data warehousing concepts, and data governance practices.

Preferred technical and professional experience
Excellent analytical and problem-solving skills with a keen attention to detail.
Ability to work collaboratively in a team environment and manage multiple projects simultaneously.
Knowledge of programming languages such as SQL.
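
As a small, hypothetical illustration of the enterprise-search responsibility mentioned above (not IBM's code), the sketch below indexes a document and runs a match query with the official Python Elasticsearch client. The index name, fields, and local cluster URL are assumptions.

```python
# Illustrative Elasticsearch sketch; index name, fields and URL are hypothetical.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")            # hypothetical local cluster

es.index(index="customer-events", document={
    "customer_id": "C-1001",
    "channel": "web",
    "message": "checkout failed with a gateway timeout",
})

es.indices.refresh(index="customer-events")            # make the document searchable

response = es.search(index="customer-events",
                     query={"match": {"message": "timeout"}})
for hit in response["hits"]["hits"]:
    print(hit["_source"]["customer_id"], "-", hit["_source"]["message"])
```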

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies