3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
You will be responsible for developing and maintaining Node.js applications, utilizing your expertise in SQL and Kubernetes. Your role will involve deploying and managing applications on Kubernetes, working with Google Cloud Platform (GCP) services, and building and maintaining RESTful APIs or backend services using Node.js. You should also be familiar with Apache Kafka for message production and consumption, as well as PostgreSQL or similar relational databases for writing queries and basic schema design. Proficiency in Git and GitHub workflows for version control, and experience with development tools such as Visual Studio Code (VSCode) or similar IDEs, is essential. In addition to technical skills, clear communication in English, the ability to work effectively in distributed teams, and a strong problem-solving mindset are crucial for this role. You should be willing to learn new technologies and adapt to changing requirements. Experience with CI/CD pipelines, Agile methodologies, and monitoring/logging tools like Prometheus, Grafana, and the ELK stack is preferred but not mandatory. This is a full-time position with a 2 PM to 11 PM shift, located in Hyderabad, Chennai, Bengaluru, or Pune. The annual budget for this role is 18 LPA, including variable components. Health insurance is among the benefits offered, and the work schedule follows a UK shift pattern.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Full Stack Developer located in Pune, you will be responsible for designing, developing, and maintaining scalable Java-based backend services using Spring Boot and Microservices architecture. You will also build rich and dynamic front-end applications with React.js, integrating them with RESTful APIs. Leveraging Apache Kafka for building event-driven, real-time microservices and ensuring asynchronous communication between services will be a crucial part of your role. Your responsibilities will also include participating in the design and implementation of secure, scalable, and cloud-native solutions on platforms like AWS or similar. You must apply best practices for performance, reliability, scalability, and security throughout the software development lifecycle. Working with DevOps tools and containerization technologies such as Docker and Kubernetes to streamline deployment and CI/CD processes will be essential. Collaboration with cross-functional teams to gather and evaluate user requirements and translate them into technical specifications is a key aspect of the role. You will be expected to write efficient, clean, and testable code, perform thorough code reviews, and document application components. Supporting the development of training materials for QA and end users to ensure seamless integration of frontend and backend systems using JSON-based REST APIs is also part of your responsibilities. To excel in this role, you must possess strong communication skills, both verbal and written, along with relationship-building, collaborative, and organizational skills. Working effectively as a member of a matrix-based, diverse, and geographically distributed project team is crucial. Demonstrating ethics and values to foster high team trust is highly valued in this position. 
In return, we offer you the opportunity to drive impactful projects in a dynamic environment, continuous learning and career advancement opportunities, acknowledgment for innovative contributions, and the chance to lead initiatives with global impact. Our benefits package includes flexible schedules prioritizing well-being, relocation support, global opportunities for seamless transitions and international exposure, performance-based bonuses, annual rewards, and comprehensive well-being benefits such as Provident Fund and health insurance. Come join our team and make a difference with your expertise and skills.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a DevOps engineer at C1X AdTech Private Limited, a global technology company, your primary responsibility will be to manage the infrastructure, support development pipelines, and ensure system reliability. You will play a crucial role in automating deployment processes, maintaining server environments, monitoring system performance, and supporting engineering operations throughout the development lifecycle. Our objective is to design and manage scalable, cloud-native infrastructure using GCP services, Kubernetes, and Argo CD for high-availability applications. Additionally, you will implement and monitor observability tools such as Elasticsearch, Logstash, and Kibana to ensure full system visibility and support performance tuning. Enabling real-time data streaming and processing pipelines using Apache Kafka and GCP DataProc will be a key aspect of your role. You will also be responsible for automating CI/CD pipelines using GitHub Actions and Argo CD to facilitate faster, secure, and auditable releases across development and production environments. Your responsibilities will include building, managing, and monitoring Kubernetes clusters and containerized workloads using GKE and Argo CD, designing and maintaining CI/CD pipelines using GitHub Actions integrated with GitOps practices, configuring and maintaining real-time data pipelines using Apache Kafka and GCP DataProc, managing logging and observability infrastructure using Elasticsearch, Logstash, and Kibana (ELK stack), setting up and securing GCP services including Artifact Registry, Compute Engine, Cloud Storage, VPC, and IAM, implementing caching and session stores using Redis for performance optimization, and monitoring system health, availability, and performance with tools like Prometheus, Grafana, and ELK. 
Collaboration with development and QA teams to streamline deployment processes and ensure environment stability, as well as automating infrastructure provisioning and configuration using Bash, Python, or Terraform will be essential aspects of your role. You will also be responsible for maintaining backup, failover, and recovery strategies for production environments. To qualify for this position, you should hold a Bachelor's degree in Computer Science, Engineering, or a related technical field with 4-8 years of experience in DevOps, Cloud Infrastructure, or Site Reliability Engineering. Strong experience with Google Cloud Platform (GCP) services including GKE, IAM, VPC, Artifact Registry, and DataProc is required. Hands-on experience with Kubernetes, Argo CD, and GitHub Actions for CI/CD workflows, proficiency with Apache Kafka for real-time data streaming, experience managing ELK Stack (Elasticsearch, Logstash, Kibana) in production, working knowledge of Redis for distributed caching and session management, scripting/automation skills using Bash, Python, Terraform, etc., solid understanding of containerization, infrastructure-as-code, and system monitoring, and familiarity with cloud security, IAM policies, and audit/compliance best practices are also essential qualifications for this role.
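The listing names Bash, Python, or Terraform for infrastructure automation. As a rough illustration of the idempotent "converge to desired state" pattern such tools apply (a sketch only, not the employer's tooling; the file name and config keys below are invented for the demo):

```python
import json
import pathlib

def ensure_config(path: pathlib.Path, desired: dict) -> bool:
    """Idempotent provisioning step: only write when current state differs
    from desired. Returns True when a change was applied (plan-then-apply style)."""
    current = json.loads(path.read_text()) if path.exists() else None
    if current == desired:
        return False          # already converged, no-op
    path.write_text(json.dumps(desired, indent=2))
    return True

cfg = pathlib.Path("app-config.json")                      # illustrative file name
desired = {"replicas": 3, "image": "registry.example/app:1.4"}  # illustrative keys
print(ensure_config(cfg, desired))  # True  (first run applies the change)
print(ensure_config(cfg, desired))  # False (second run is a no-op)
cfg.unlink()  # clean up the demo file
```

Running the same step twice changes nothing the second time, which is what makes automated re-runs safe.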
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Performance Tester with 6-9 years of experience, you will play a crucial role in ensuring the responsiveness, scalability, and stability of applications under various load conditions. Your expertise in performance testing tools and methodologies, combined with a strong understanding of Java, Apache Kafka, and Microsoft technologies, will be instrumental in analyzing and improving application performance. Your key responsibilities will include designing, developing, and executing performance test plans, scripts, and scenarios. You will conduct load, stress, scalability, and endurance tests using industry-standard tools and analyze the results to collaborate with development teams in resolving bottlenecks. Your role will also involve collaborating with cross-functional teams to understand system architecture and performance requirements, monitoring system performance during test execution, and contributing to continuous improvement in performance testing strategy. To excel in this role, you must possess strong hands-on experience with performance testing tools like JMeter, LoadRunner, or Gatling. Proficiency in Java for scripting and backend logic analysis is essential, along with experience in Apache Kafka, including knowledge of producers, consumers, and message flow. Familiarity with Microsoft technologies such as .NET applications, Azure environments, or SQL Server will be advantageous. A good understanding of software development life cycles (Agile/Scrum) and CI/CD pipelines, along with the ability to identify performance bottlenecks across different system layers, will be key to your success in this role.
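As a rough sketch of what a load test measures (not a substitute for JMeter, LoadRunner, or Gatling), the following stdlib-only Python snippet drives a target function from concurrent virtual users and reports the mean and p95 latency; the user and request counts are arbitrary:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def measure(target, users: int = 10, requests_per_user: int = 20):
    """Run `target` from `users` concurrent workers and collect per-call latencies."""
    def worker(_):
        samples = []
        for _ in range(requests_per_user):
            start = time.perf_counter()
            target()
            samples.append(time.perf_counter() - start)  # seconds per call
        return samples

    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(worker, range(users)))
    latencies = sorted(s for r in results for s in r)
    p95 = latencies[int(0.95 * (len(latencies) - 1))]  # 95th-percentile latency
    return {"count": len(latencies), "mean": statistics.mean(latencies), "p95": p95}

# Stand-in for a real request: a 1 ms simulated backend call.
report = measure(lambda: time.sleep(0.001))
print(report["count"])  # 200 samples collected
```

Real tools add ramp-up schedules, assertions, and distributed load generation on top of this core idea.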
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Technical Security Architect specializing in Confluent Cloud and AWS Data Platform, you will be responsible for supporting security design, governance, and assurance activities for a project hosted on Confluent Cloud and AWS. Your role will involve collaborating with platform engineers, data architects, and application teams to ensure security is integrated into the solution design and deployment process. Additionally, you will lead threat modeling and risk assessments for key components and integrations. Your key responsibilities will include defining and documenting secure architecture patterns and guardrails for Confluent Cloud, Kafka pipelines, and associated AWS infrastructure. You will review and write technical design and security documentation, ensuring compliance with internal security policies, industry best practices, and regulatory requirements. Furthermore, you will map security control requirements to organizational policies and contribute to the development of platform-specific security controls and operational playbooks. In terms of skills and experience, you should have at least 10 years of overall experience in cloud technologies, with a minimum of 5 years in a security architect role. You must have proven experience as a Security Architect on cloud-native platforms, preferably AWS, and familiarity with AWS security services such as IAM, KMS, VPC, GuardDuty, and Security Hub. A strong understanding of security principles across identity, network, data, and application layers is essential, along with knowledge of Confluent Cloud, Apache Kafka, and secure data streaming practices. Experience in writing and reviewing technical documentation clearly and concisely is required, as well as working with cross-functional DevOps, data, and infrastructure teams. Desirable skills include experience in Agile project environments and tools like Jira and Confluence, as well as previous involvement in enterprise data platform or streaming data projects. 
NTT DATA is a global innovator of business and technology services, serving 75% of the Fortune Global 100. Committed to helping clients innovate, optimize, and transform for long-term success, NTT DATA has diverse experts in more than 50 countries and a robust partner ecosystem. Their services include business and technology consulting, data and artificial intelligence, industry solutions, and application, infrastructure, and connectivity management. As one of the leading providers of digital and AI infrastructure, NTT DATA is part of the NTT Group, investing significantly in R&D to support organizations and society in confidently moving into the digital future.
Posted 1 week ago
10.0 - 12.0 years
15 - 20 Lacs
Hyderabad
Work from Office
Budget: As per industry standards. Please share Current CTC, Expected CTC, and Notice Period when applying.

We are seeking an experienced Java Backend Developer / Technical Lead with strong hands-on expertise in designing and implementing scalable backend services, RESTful APIs, and microservices using the Spring ecosystem.

What You Will Do:
- Design and develop scalable RESTful Web Services.
- Lead development using Spring Boot, Spring Framework, and Microservices architecture.
- Work with databases such as Oracle, PostgreSQL, SQL Server, or MySQL.
- Integrate real-time messaging solutions like Apache Kafka.
- Collaborate with DevOps for CI/CD using tools like Jenkins, GitLab, etc.
- Leverage cloud platforms (AWS, Azure) for scalable deployments.
- Follow Agile practices and ensure quality-first development and clean architecture.
- Architect, design, and implement complex backend systems and mentor junior engineers.

Primary Skills Required:
- Java/J2EE, Spring Boot, Spring AOP/DI
- Kafka or similar messaging frameworks
- REST API Development
- Oracle / SQL Server / PostgreSQL / MySQL
- CI/CD tools (Jenkins, GitLab)
- Cloud experience: AWS or Azure
- Strong understanding of Agile methodologies
- Proven experience leading or mentoring software teams

Qualifications:
- B.Tech in Computer Science or equivalent
- Minimum 10+ years of relevant backend development experience
- Excellent communication, leadership, and technical design skills

Apply Now: If interested, please share your resume along with your Current CTC, Expected CTC, Notice Period, and Current Location / Willingness to Relocate to Hyderabad.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
maharashtra
On-site
You will be working with MCX, where your career growth path will be sharpened to help you excel. At MCX, success is attributed to employees' domain expertise and commitment. The recruitment process at MCX focuses on finding the right fit between roles and profiles, offering exciting and challenging career opportunities for ambitious and result-oriented professionals. It provides a platform to realize your potential in your chosen area of expertise. As a Manager - Java Full Stack Developer, based in Mumbai, you are required to have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 8 to 10 years of overall experience in software development, with at least 6 years as a full stack developer. You should have proficiency in React for frontend development, strong experience in backend development using Java Spring Boot, expertise in designing and consuming RESTful APIs, hands-on experience with Apache Kafka, and knowledge of relational and non-relational databases. Additionally, you should possess strong problem-solving skills, attention to detail, and domain knowledge in trading exchanges, financial markets, and surveillance systems. Your responsibilities will include full stack development, building responsive and interactive frontend interfaces using React, designing and implementing secure, scalable, and efficient backend services using Java Spring Boot, developing RESTful APIs for seamless frontend-backend communication, integrating APIs with third-party services and trading exchange systems, managing real-time data streaming solutions using Apache Kafka, and incorporating domain knowledge to enhance features like fraud detection, trading behavior monitoring, and risk management.
You will collaborate with cross-functional teams, provide technical guidance to team members, monitor system performance, resolve application issues, optimize performance for both frontend and backend components, implement secure coding practices, and ensure compliance with financial industry standards and regulatory guidelines. If you need further assistance, you can contact MCX at 022-67318888 / 66494000 or email at careers@mcxindia.com.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
We are searching for a talented Java Kafka Developer to join our dynamic engineering team. As a Senior Java Kafka Developer, your responsibilities will include designing, developing, and maintaining real-time data streaming applications using Apache Kafka and Java technologies. Your role will be vital in constructing scalable and fault-tolerant event-driven systems that cater to our business requirements. This is a full-time employment opportunity based in Hyderabad. In this role, you will be expected to design, develop, and maintain robust Java applications utilizing the Spring Framework. You will also be responsible for developing and managing event-driven systems with Apache Kafka, as well as implementing enterprise integration solutions using Apache Camel. Collaborating with cross-functional teams to gather and analyze requirements, optimizing performance, conducting code reviews, writing unit/integration tests, and mentoring junior developers are also key aspects of this role. To be successful in this position, you should possess 6 to 10 years of hands-on experience in Java development. Strong expertise in the Spring Framework (Spring Boot, Spring MVC, Spring Cloud), solid experience with Apache Kafka, hands-on experience with Apache Camel, understanding of RESTful APIs, microservices architecture, messaging systems, build tools, version control, CI/CD pipelines, and database technologies are essential. Additionally, strong problem-solving skills, the ability to work in a fast-paced environment, excellent communication, and interpersonal skills are required. In return, we offer a competitive salary and benefits package, a culture focused on talent development with quarterly promotion cycles, opportunities to work with cutting-edge technologies, employee engagement initiatives, annual health check-ups, and various insurance coverages. 
We are committed to fostering diversity and inclusion in the workplace, providing hybrid work options, flexible hours, and ensuring accessible facilities for employees with disabilities. At Persistent, we strive to create a values-driven and people-centric work environment where employees can accelerate growth, impact the world positively, enjoy collaborative innovation, and unlock global opportunities. If you are ready to unleash your full potential, join us at Persistent, an Equal Opportunity Employer that values diversity and prohibits discrimination and harassment.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As an experienced Integration Architect, your primary responsibility will be to define and implement integration architecture and strategies that align with the product ecosystem. You will lead technical discussions with customer teams to understand their business, functional, and non-functional requirements, and then translate them into integration designs. Your expertise will be crucial in architecting, designing, and developing system integrations using API-first, event-driven, and publish-subscribe models. In this role, you will provide technical guidance on best practices for API management, security, scalability, and data governance. Working closely with cross-functional teams, such as product engineering, professional services, and customer success, you will ensure smooth integration implementations. It will be your responsibility to implement data validation, transformation, and normalization processes to maintain data integrity and comply with the data model. Your role will also involve troubleshooting, debugging, and optimizing integration performance to ensure scalability and resilience. You will stay updated with emerging technologies in cloud integration, API management, and middleware solutions, integrating them into the existing framework. Additionally, you will develop and maintain integration documentation, best practices, and coding standards. To excel in this role, you should have over 10 years of experience in integration architecture, middleware solutions, and API management. A strong understanding of cloud architecture, hybrid integration, and microservices-based integrations is essential. Hands-on experience with API design, RESTful services, SOAP, Webhooks, and event-driven architectures will be required. Expertise in data formats like JSON, XML, and EDI, along with integration security protocols, is crucial. Experience with integration frameworks such as MuleSoft, Boomi, Apache Kafka, or equivalent is preferred. 
Knowledge of integrating with enterprise applications like SAP, Oracle NetSuite, Salesforce, Oracle Utilities, and other industry-specific platforms will be beneficial. You should possess strong problem-solving skills and the ability to collaborate effectively with internal teams and customers to overcome technical challenges. The role requires you to manage multiple projects with competing priorities and deliver within deadlines. Excellent communication skills are necessary to engage with both technical and non-technical stakeholders effectively. Your contribution in ensuring seamless connectivity and data flow across the ecosystem will enhance the overall customer experience and platform efficiency.
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
NTT DATA is looking for a Technical Security Architect with expertise in Confluent Cloud and AWS Data Platform to join their team in Bangalore, Karnataka, India. As a Technical Security Architect, you will be responsible for supporting security design, governance, and assurance activities for a Confluent Cloud and AWS-hosted data platform project. Your role will involve defining secure architecture patterns, collaborating with various teams, and ensuring security is integrated into solution design and deployment. Key Responsibilities: - Define and document secure architecture patterns and guardrails for Confluent Cloud, Kafka pipelines, and associated AWS infrastructure. - Collaborate with platform engineers, data architects, and application teams to embed security in solution design. - Lead threat modeling and risk assessments for key components and integrations. - Review and write technical design and security documentation, ensuring adherence to internal security policies and compliance requirements. - Map security control requirements to organizational policies and regulatory frameworks. - Support project management by acting as a key security stakeholder, tracking security deliverables, and communicating effectively with project managers and stakeholders. Skills & Experience: Mandatory: - 10+ years of experience in cloud technologies with at least 5 years in a security architect role. - Proven experience as a Security Architect on cloud-native platforms, preferably AWS. - Familiarity with AWS security services such as IAM, KMS, VPC, GuardDuty, Security Hub. - Strong understanding of security principles across identity, network, data, and application layers. - Knowledge of Confluent Cloud, Apache Kafka, and secure data streaming practices. - Ability to write and review technical documentation clearly and concisely. - Experience working with cross-functional DevOps, data, and infrastructure teams. 
Desirable: - Experience in Agile project environments and tools like Jira and Confluence. - Previous involvement in enterprise data platform or streaming data projects. Join NTT DATA, a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team of experts in over 50 countries, NTT DATA offers services in business and technology consulting, data and artificial intelligence, industry solutions, and application development and management. Be part of a leading provider of digital and AI infrastructure and contribute to a sustainable digital future. Visit us at us.nttdata.com.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Python Back End Developer, you will be responsible for various day-to-day tasks related to back-end web development, software development, and object-oriented programming (OOP). This contract role offers a hybrid work model, with the primary location being Hyderabad and some flexibility for work-from-home. You should possess proficiency in Back-End Web Development and Software Development, along with a strong understanding of Object-Oriented Programming (OOP). Basic skills in Front-End Development, solid programming skills, and experience with Cloud platforms such as GCP/AWS are essential for this role. Additionally, excellent problem-solving and analytical skills are required, enabling you to work effectively both independently and as part of a team. Experience with Python frameworks like Django or Flask would be advantageous. It would also be beneficial if you have worked on data pipelines using tools like Airflow, Netflix Conductor, etc., and have experience with Apache Spark/Beam and Kafka. This role offers an exciting opportunity for a skilled Python Back End Developer to contribute to a dynamic team and work on challenging projects in a collaborative environment.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
You will be responsible for developing and maintaining high-performance server-side applications in Python following SOLID design principles. You will design, build, and optimize low-latency, scalable applications and integrate user-facing elements with server-side logic via RESTful APIs. Maintaining ETL and Data pipelines, implementing secure data handling protocols, and managing authentication and authorization across systems will be crucial aspects of your role. Additionally, you will ensure security measures and set up efficient deployment practices using Docker and Kubernetes. Leveraging caching solutions for enhanced performance and scalability will also be part of your responsibilities. To excel in this role, you should have strong experience in Python and proficiency in at least one Python web framework such as FastAPI or Flask. Familiarity with ORM libraries, asynchronous programming, event-driven architecture, and messaging tools like Apache Kafka or RabbitMQ is required. Experience with NoSQL and Vector databases, Docker, Kubernetes, and caching tools like Redis will be beneficial. Additionally, you should possess strong unit testing and debugging skills and the ability to utilize Monitoring and Logging frameworks effectively. You should have a minimum of 1.5 years of professional experience in backend development roles with Python. Your expertise in setting up efficient deployment practices, handling data securely, and optimizing application performance will be essential for success in this position.
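To illustrate the caching pattern the role mentions (an in-process stand-in for Redis-style TTL semantics, not a Redis client), here is a minimal sketch in pure Python; the key names are invented for the example:

```python
import time

class TTLCache:
    """In-process stand-in for a Redis-style cache with per-key TTLs."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl_seconds: float):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy expiry on read, similar to passive eviction
            return default
        return value

cache = TTLCache()
cache.set("session:42", {"user": "alice"}, ttl_seconds=0.05)  # hypothetical key
print(cache.get("session:42"))  # {'user': 'alice'}
time.sleep(0.06)
print(cache.get("session:42"))  # None (expired)
```

A real deployment would use Redis for the same get/set-with-TTL contract so cached state survives process restarts and is shared across instances.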
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As an experienced Java Developer with over 5 years of expertise, you have a strong background in building scalable, distributed, and high-performance microservices utilizing Spring Boot and Apache Kafka. Your proficiency lies in designing and developing event-driven architectures, RESTful APIs, and integrating real-time data pipelines. You are well-versed in the full software development life cycle (SDLC), CI/CD practices, and Agile methodologies. Your key skills include Java (8/11/17), Spring Boot, Spring Cloud, Apache Kafka (Producer, Consumer, Streams, Kafka Connect), Microservices Architecture, RESTful Web Services, Docker, Kubernetes (basic knowledge), CI/CD (Jenkins, Git, Maven), Relational and NoSQL Databases (MySQL, PostgreSQL, MongoDB), Monitoring (ELK Stack, Prometheus, Grafana - basic), Agile/Scrum methodology, and Unit and Integration Testing (JUnit, Mockito). In your professional journey, you have developed and maintained multiple Kafka-based microservices handling real-time data ingestion and processing for high-volume applications. Your expertise extends to implementing Kafka consumers/producers with error-handling, retries, and idempotency for robust message processing. Additionally, you have designed and deployed Spring Boot microservices integrated with Kafka, PostgreSQL, Redis, and external APIs, showcasing your leadership in performance tuning and optimization to ensure low-latency and fault-tolerant behavior. If you are passionate about leveraging your skills in Java, Spring Boot, Apache Kafka, and microservices architecture to drive impactful projects and contribute to cutting-edge technologies, this opportunity might be the perfect match for you. Thank you for considering this role. Best regards, Renuka Thakur renuka.thakur@eminds.ai
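The error-handling, retries, and idempotency the listing describes can be sketched independently of Kafka itself. This pure-Python snippet simulates at-least-once delivery: redelivered messages are skipped by key, and failures are retried with exponential backoff (the message keys, payloads, and in-memory de-dup set are invented for illustration; production systems would keep seen keys in a durable store):

```python
import time

class IdempotentProcessor:
    """Sketch of at-least-once consumption: retry with backoff, de-dup by key."""

    def __init__(self, handler, max_retries: int = 3):
        self.handler = handler
        self.max_retries = max_retries
        self.seen_keys = set()   # production: Redis/DB, not process memory
        self.processed = []

    def consume(self, key, payload):
        if key in self.seen_keys:          # duplicate delivery -> skip side effects
            return "skipped"
        for attempt in range(self.max_retries):
            try:
                self.handler(payload)
                self.seen_keys.add(key)    # mark done only after success
                self.processed.append(key)
                return "ok"
            except Exception:
                time.sleep(0.001 * 2 ** attempt)  # exponential backoff
        return "dead-letter"               # give up; route to a DLQ topic

proc = IdempotentProcessor(handler=lambda payload: None)
print(proc.consume("order-1", {"qty": 2}))   # ok
print(proc.consume("order-1", {"qty": 2}))   # skipped (redelivery)
```

Marking a key as seen only after the handler succeeds is what keeps retries and duplicate deliveries from producing double side effects.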
Posted 1 week ago
11.0 - 15.0 years
0 Lacs
haryana
On-site
You are an experienced software engineer with over 11 years of total experience. You have a strong background in architecture and development using Java 8 or higher. Your expertise includes working with Spring Boot, Spring Cloud, and related frameworks. You possess a deep understanding of Object-Oriented Programming and various Design Patterns. You have hands-on experience in building and maintaining microservices architecture in cloud or hybrid environments. Your skills include working with REST APIs, Caching systems like Redis, multithreading, and cloud development. You are proficient in Apache Kafka, including Kafka Streams, Kafka Connect, and Kafka clients in Java. Additionally, you have worked with SQL and NoSQL databases such as MySQL, PostgreSQL, and MongoDB. You are familiar with DevOps tools and technologies like Ansible, Docker, Kubernetes, Puppet, Jenkins, and Chef. You have proven expertise in CI/CD pipelines using Azure DevOps, Jenkins, or GitLab CI/CD. Your experience includes using build automation tools like Maven, Ant, and Gradle. You have hands-on experience with cloud technologies such as AWS, Azure, or GCP, and have knowledge of Snowflake or equivalent cloud data platforms. Understanding predictive analytics and basic ML/NLP workflows is part of your skill set. You have a strong grasp of UML and design patterns. With excellent problem-solving skills and a passion for continuous improvement, you can communicate effectively and collaborate with cross-functional teams. Your responsibilities include writing and reviewing high-quality code, analyzing clients' needs, and envisioning solutions for functional and non-functional requirements. You will implement design methodologies, coordinate application development activities, and lead/support UAT and production rollouts. Additionally, you will estimate effort for tasks, address issues promptly, and provide constructive feedback to team members.
You will troubleshoot and resolve complex bugs, propose solutions during code/design reviews, and conduct POCs to validate design/technologies. You hold a bachelor's or master's degree in Computer Science, Information Technology, or a related field.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
indore, madhya pradesh
On-site
Are you passionate about building scalable data pipelines and working with real-time streaming platforms? Join our growing team as a Data Engineer and help power next-gen data solutions! As a Data Engineer, you will be responsible for designing and maintaining real-time data pipelines using Apache Kafka. You will write efficient and optimized SQL queries for data extraction and transformation. Additionally, you will build robust ETL/ELT processes for structured & unstructured data and collaborate with analysts, data scientists & developers to deliver insights. Ensuring data quality, security & performance optimization will be a crucial part of your role. Integration with tools like Spark, Airflow, or Snowflake (as applicable) will also be part of your responsibilities. We value proficiency in Apache Kafka, Kafka Streams or Kafka Connect, strong skills in SQL, Python/Scala, and cloud platforms (AWS/GCP/Azure), experience with data lakes, message queues, and large-scale systems, as well as a problem-solving mindset and a passion for clean, efficient code. Working with us will involve exciting projects with global clients, a collaborative and innovation-driven environment, flexible working options, and competitive compensation. If you are excited about this opportunity, apply now at yukta.sengar@in.spundan.com or tag someone perfect for this role!
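As a toy illustration of the ETL/ELT responsibilities described above (a sketch only — a real pipeline would stream from Kafka and orchestrate with Airflow rather than use in-memory SQLite; table and column names are invented), here is extract-transform-load in a few lines of Python:

```python
import sqlite3

# Extract: raw events land in a staging table (in-memory DB for the demo).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_events VALUES (?, ?)",
                 [("u1", 10.0), ("u1", 5.5), ("u2", 3.0)])

# Transform: aggregate per user with a plain SQL query.
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total FROM raw_events "
    "GROUP BY user_id ORDER BY user_id"
).fetchall()

# Load: write the aggregated result into a reporting table.
conn.execute("CREATE TABLE user_totals (user_id TEXT, total REAL)")
conn.executemany("INSERT INTO user_totals VALUES (?, ?)", rows)

totals = dict(conn.execute("SELECT user_id, total FROM user_totals").fetchall())
print(totals)  # {'u1': 15.5, 'u2': 3.0}
```

The same extract/transform/load shape scales up by swapping the staging table for a Kafka topic or data lake and the SQL step for Spark or a warehouse query.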
Posted 1 week ago
7.0 - 12.0 years
10 - 20 Lacs
Bengaluru
Work from Office
8+ years of experience in database technologies: AWS Aurora (PostgreSQL), NoSQL, DynamoDB, MongoDB, and Erwin data modeling. Experience with pg_stat_statements and query execution plans. Experience with Apache Kafka, AWS Kinesis, Airflow, and Talend. Experience with AWS CloudWatch, Prometheus, and Grafana. Required candidate profile: experience with GDPR, SOC2, Role-Based Access Control (RBAC), and encryption standards. Experience with AWS Multi-AZ, read replicas, failover strategies, and backup automation. Experience with Erwin, Lucidchart, Confluence, and JIRA.
Posted 1 week ago
5.0 - 6.0 years
5 - 6 Lacs
Pune
Work from Office
• Design, develop, and maintain scalable applications using .NET Core and C#
• Work with Entity Framework Core and LINQ
• Develop and optimize complex SQL queries
• Integrate and manage Apache Kafka
• Develop and maintain Windows Services
Posted 1 week ago
2.0 - 5.0 years
4 - 8 Lacs
Kolkata
Work from Office
Responsibilities: - Hands-on development in Golang to deliver trustworthy and smooth functionality to our users - Monitor, debug, and fix production issues at high velocity based on user impact - Maintain good code coverage for all new development, with well-written and testable code - Write and maintain clean documentation for software services - Integrate software components into a fully functional software system - Comply with project plans with a sharp focus on delivery timelines Requirements: - Bachelor's degree in computer science, information technology, or a similar field - 3+ years of experience in developing highly scalable, performant web applications - Strong problem-solving skills and experience in application debugging - Hands-on experience developing RESTful services in Golang - Hands-on working experience with databases: SQL (PostgreSQL/MySQL) - Working experience with message streaming/queuing systems such as Apache Kafka or RabbitMQ - Cloud experience with Amazon Web Services (AWS) - Experience with serverless architectures (AWS) would be a plus - Hands-on experience with the Echo API framework.
Posted 1 week ago
2.0 - 5.0 years
4 - 8 Lacs
Nashik
Work from Office
Responsibilities: - Hands-on development in Golang to deliver trustworthy and smooth functionality to our users - Monitor, debug, and fix production issues at high velocity based on user impact - Maintain good code coverage for all new development, with well-written and testable code - Write and maintain clean documentation for software services - Integrate software components into a fully functional software system - Comply with project plans with a sharp focus on delivery timelines Requirements: - Bachelor's degree in computer science, information technology, or a similar field - 3+ years of experience in developing highly scalable, performant web applications - Strong problem-solving skills and experience in application debugging - Hands-on experience developing RESTful services in Golang - Hands-on working experience with databases: SQL (PostgreSQL/MySQL) - Working experience with message streaming/queuing systems such as Apache Kafka or RabbitMQ - Cloud experience with Amazon Web Services (AWS) - Experience with serverless architectures (AWS) would be a plus - Hands-on experience with the Echo API framework.
Posted 1 week ago
2.0 - 5.0 years
4 - 8 Lacs
Nagpur
Work from Office
Responsibilities: - Hands-on development in Golang to deliver trustworthy and smooth functionality to our users - Monitor, debug, and fix production issues at high velocity based on user impact - Maintain good code coverage for all new development, with well-written and testable code - Write and maintain clean documentation for software services - Integrate software components into a fully functional software system - Comply with project plans with a sharp focus on delivery timelines Requirements: - Bachelor's degree in computer science, information technology, or a similar field - 3+ years of experience in developing highly scalable, performant web applications - Strong problem-solving skills and experience in application debugging - Hands-on experience developing RESTful services in Golang - Hands-on working experience with databases: SQL (PostgreSQL/MySQL) - Working experience with message streaming/queuing systems such as Apache Kafka or RabbitMQ - Cloud experience with Amazon Web Services (AWS) - Experience with serverless architectures (AWS) would be a plus - Hands-on experience with the Echo API framework.
Posted 1 week ago
5.0 - 9.0 years
7 - 14 Lacs
Lucknow, Bengaluru
Hybrid
Work Timings: 6:30 PM to 3:30 AM. Java/JEE/Jakarta EE: Core Java, multithreading, concurrency, collections, OOP. Microservices: MicroProfile, Open Liberty, RESTful APIs (JAX-RS), JSON-B/JSON-P. Messaging: Apache Kafka (producers, consumers, streams). Caching: Redis (cache management, data structures). Database: JDBC, SQL, data source configuration, transaction management. Web technologies: WebSockets, Servlets, JSP. Frontend development: JavaScript, JSP. Frameworks and libraries: ReactJS, React Native, Bootstrap, jQuery. Web fundamentals: HTML5, CSS3, JSON, XML. DevOps & Cloud: containerization/orchestration with Docker, Kubernetes, OpenShift; CI/CD with QuickBuild or similar.
Posted 1 week ago
2.0 - 4.0 years
4 - 6 Lacs
Chennai
Work from Office
Job Description/Preferred Qualifications We are seeking a highly skilled and motivated MLOps Site Reliability Engineer (SRE) to join our team. In this role, you will be responsible for ensuring the reliability, scalability, and performance of our machine learning infrastructure. You will work closely with data scientists, machine learning engineers, and software developers to build and maintain robust and efficient systems that support our machine learning workflows. This position offers an exciting opportunity to work on cutting-edge technologies and make a significant impact on our organization's success. Responsibilities : Design, implement, and maintain scalable and reliable machine learning infrastructure. Collaborate with data scientists and machine learning engineers to deploy and manage machine learning models in production. Develop and maintain CI/CD pipelines for machine learning workflows. Monitor and optimize the performance of machine learning systems and infrastructure. Implement and manage automated testing and validation processes for machine learning models. Ensure the security and compliance of machine learning systems and data. Troubleshoot and resolve issues related to machine learning infrastructure and workflows. Document processes, procedures, and best practices for machine learning operations. Stay up-to-date with the latest developments in MLOps and related technologies. Required Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. Proven experience as a Site Reliability Engineer (SRE) or in a similar role. Strong knowledge of machine learning concepts and workflows. Proficiency in programming languages such as Python, Java, or Go. Experience with cloud platforms such as AWS, Azure, or Google Cloud. Familiarity with containerization technologies like Docker and Kubernetes. Experience with CI/CD tools such as Jenkins, GitLab CI, or CircleCI. 
Strong problem-solving skills and the ability to troubleshoot complex issues. Excellent communication and collaboration skills. Preferred Qualifications: Master's degree in Computer Science, Engineering, or a related field. Experience with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn. Knowledge of data engineering and data pipeline tools such as Apache Spark, Apache Kafka, or Airflow. Experience with monitoring and logging tools such as Prometheus, Grafana, or the ELK stack. Familiarity with infrastructure as code (IaC) tools like Terraform or Ansible. Experience with automated testing frameworks for machine learning models. Knowledge of security best practices for machine learning systems and data. Minimum Qualifications: Master's or Bachelor's degree and 2 years of related work experience.
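One recurring piece of the MLOps work described above — automated validation for models in a CI/CD pipeline — can be sketched as a gate that blocks deployment when a candidate model regresses. The metric names and thresholds below are hypothetical, not from any particular pipeline:

```python
# Hedged sketch of a CI/CD validation gate for a candidate ML model:
# deployment proceeds only if every evaluation metric clears its threshold.
def validation_gate(metrics: dict, thresholds: dict):
    """Return (deployable, failures) for a candidate model's eval metrics."""
    failures = [
        name for name, minimum in thresholds.items()
        if metrics.get(name, float("-inf")) < minimum  # missing metric fails
    ]
    return (not failures, failures)

ok, failed = validation_gate(
    metrics={"accuracy": 0.93, "recall": 0.81},
    thresholds={"accuracy": 0.90, "recall": 0.85},
)
print(ok, failed)  # False ['recall']
```

In a real pipeline this check would run as a CI step (Jenkins, GitLab CI) after model evaluation, with a nonzero exit code on failure stopping the deploy stage.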
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
You are a skilled Java Developer with over 5 years of experience in building integration solutions using Apache Camel and Apache Kafka. Your main responsibility will be designing, developing, and maintaining scalable and reliable backend systems. Your key responsibilities include designing and developing robust Java applications and microservices, implementing and maintaining integration flows using Apache Camel, and building and managing real-time data pipelines using Apache Kafka. You will collaborate with cross-functional teams to deliver high-quality solutions, ensure code quality through unit testing, integration testing, and code reviews, and optimize application performance and scalability. To excel in this role, you must possess strong programming skills in Java (version 8 or above), hands-on experience with Apache Camel for integration and routing, solid knowledge of Apache Kafka for real-time messaging and streaming, experience with REST APIs, Spring Boot, and microservices architecture, familiarity with CI/CD tools, version control (Git), and agile methodologies, as well as excellent problem-solving and communication skills.
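The Camel work above centers on routing messages between endpoints. A real Camel route is written in the Java DSL (`from("kafka:...").choice()...`); the following pure-Python sketch only mirrors the content-based-router idea behind it, with hypothetical endpoint names:

```python
# Illustrative content-based router: pick a destination endpoint from a
# message header, with a dead-letter fallback for unmatched messages.
# Endpoint names ("kafka:orders" etc.) are hypothetical.
def route(message: dict) -> str:
    """Pick a destination endpoint from the message's type header."""
    header = message.get("type")
    if header == "order":
        return "kafka:orders"
    if header == "invoice":
        return "kafka:invoices"
    return "kafka:dead-letter"  # unmatched messages go to a dead-letter queue

routed = [route(m) for m in (
    {"type": "order"}, {"type": "invoice"}, {"type": "refund"})]
print(routed)  # ['kafka:orders', 'kafka:invoices', 'kafka:dead-letter']
```

The dead-letter fallback is the design point worth noting: routing every unrecognized message somewhere observable, rather than dropping it, is what makes integration flows like these debuggable in production.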
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Java Full Stack Lead, your primary responsibility will be to lead the development, architecture, and delivery of scalable, cloud-native enterprise applications. Your expertise in backend technologies using Java and Spring Boot, along with proficiency in modern frontend frameworks like Angular or React, will be crucial for success in this role. You will be expected to have a strong foundation in microservices, event-driven systems using Apache Kafka, and deployment in containerized environments on AWS. In this role, you will lead full-stack development efforts across backend, frontend, and integration layers. Designing and implementing robust, scalable microservices using Java 8+, Spring Boot 3+, and REST APIs will be a key part of your responsibilities. You will also be required to architect and maintain event-driven systems using Apache Kafka for real-time data pipelines and asynchronous communication, as well as oversee containerization and orchestration workflows using Docker and Kubernetes. Building intuitive and responsive UI components using Angular (v2+) or React.js, and integrating with RESTful services will be essential tasks. Collaboration with cross-functional teams (DevOps, QA, Product Management) in an Agile environment to ensure timely delivery of high-quality software is also expected. Additionally, optimizing application performance, scalability, and security across the stack and mentoring a team of developers across the software development lifecycle are integral aspects of this role. Your technical skills should include expertise in Java 8+, Spring Boot 3+, REST APIs, Microservices Architecture, Apache Kafka, SQL and NoSQL databases, Maven or Gradle for build automation, Angular (v2+) or React.js, Node.js, HTML5, CSS3, JavaScript, jQuery, XML, Docker, Kubernetes, and basic knowledge of AWS services like EC2, S3, IAM, and ECS. 
Strong problem-solving skills, experience applying design patterns, familiarity with Git and CI/CD tools like Jenkins or GitLab CI, and test-driven development with automated testing frameworks are also essential for success in this position. By adhering to best practices around design patterns, clean code principles, code reviews, and continuous integration, you will contribute to the optimization and efficiency of software engineering practices within the team.
Posted 1 week ago