Jobs
Interviews

6 Elasticsearch Engine Jobs

Set Up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 6.0 years

10 - 15 Lacs

Noida

Work from Office

Title: Senior Software Engineer / Lead Engineer - Elasticsearch
Desired Experience: 4-6 years
Required Educational Qualification: B.E. / B.Tech / M.Tech / MCA

Job Objective: We are seeking a skilled and experienced Java and Spring Boot Elasticsearch developer to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining high-performance Java applications with a focus on Elasticsearch integration, and should have a strong background in Java development along with expertise in implementing and optimizing Elasticsearch solutions.

Job Description:
- Java and Spring Boot development: Design, develop, and maintain robust and scalable Java applications. Collaborate with cross-functional teams to define, design, and ship new features. Ensure the performance, quality, and responsiveness of applications.
- Elasticsearch integration: Implement Elasticsearch solutions for efficient data indexing, searching, and retrieval. Develop and optimize Elasticsearch queries to meet performance and scalability requirements. Troubleshoot and resolve issues related to Elasticsearch functionality.
- Code review and optimization: Conduct code reviews to ensure code quality and adherence to best practices. Identify and address performance bottlenecks and optimize code for maximum efficiency.
- Collaboration and communication: Work closely with other developers, product managers, and stakeholders to deliver high-quality solutions. Communicate effectively with team members and provide technical guidance as needed.

Required Skills:
- Proven experience in Java development, with a minimum of 4 years of hands-on experience, including 2 years (or 2 recent projects) of strong hands-on work on full implementations of Elasticsearch and Spring Boot.
- Strong knowledge of Spring Boot and its ecosystem.
- Significant experience designing and implementing Elasticsearch solutions.
- Strong expertise in Elasticsearch, including indexing, querying, and performance optimization.
- Experience with microservices architecture and RESTful API design.
- Experience with Spring Boot and RabbitMQ.
- Strong skills in in-memory applications, database design, and data integration.
- Excellent relationship-building and communication skills; ability to interact and work effectively with all levels.

Mandatory Skills:
- Proficiency in the Java programming language.
- Proficiency in Spring Boot.
- Experience with RESTful APIs and web services.
- Familiarity with relevant tools and frameworks.
- Strong in any SQL database.
- Strong knowledge of Elasticsearch, including indexing, querying, and performance tuning.
- Familiarity with Git and version control.

Additional Preferred Skills:
- Experience with containerization technologies (e.g., Docker, Kubernetes).
- Knowledge of microservices and any API gateway.
- Knowledge of cloud platforms (e.g., AWS, Azure, or GCP).
- Familiarity with S3 buckets.
- Familiarity with message brokers (e.g., RabbitMQ).
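The indexing, querying, and tuning work this posting describes is done from Java/Spring Boot; as a minimal, language-agnostic illustration (not part of the posting), the sketch below indexes a document and runs a match query with the elasticsearch-py 8.x client. The host URL, index name, and document fields are assumptions.

```python
# Minimal Elasticsearch indexing and querying sketch (elasticsearch-py 8.x).
# The host, index name, and document fields are illustrative assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

# Index a document into a hypothetical "products" index.
es.index(
    index="products",
    id="1",
    document={"name": "wireless keyboard", "price": 1499, "in_stock": True},
)
es.indices.refresh(index="products")  # make the document searchable immediately

# Full-text match query, the kind of query the role tunes for performance.
resp = es.search(index="products", query={"match": {"name": "keyboard"}})
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_source"]["name"], hit["_score"])
```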

Posted 2 weeks ago

Apply

3.0 - 5.0 years

3 - 6 Lacs

Mumbai, Navi Mumbai, Mumbai (All Areas)

Work from Office

Job Title: ELK Stack Developer (Elasticsearch, Logstash, Kibana)
Experience: 3+ years
Location: Mumbai / Navi Mumbai (Work from Office)
Job Type: Full-time

Job Summary: We are seeking a skilled ELK Stack Developer with at least 3 years of hands-on experience implementing and managing Elasticsearch, Logstash, and Kibana. The ideal candidate will be responsible for developing scalable logging and monitoring solutions, integrating log sources, designing visual dashboards, and optimizing search and analytics capabilities.

Key Responsibilities:
- Develop and maintain scalable ELK (Elasticsearch, Logstash, Kibana) solutions.
- Design and implement centralized logging and monitoring solutions.
- Integrate various data sources into Elasticsearch using Logstash or Beats.
- Create Kibana dashboards and visualizations to monitor systems, applications, and business KPIs.
- Write efficient Elasticsearch queries and tune clusters for performance and reliability.
- Handle data parsing, indexing, and transformation using Logstash filters and pipelines.
- Implement alerting and reporting solutions using Kibana or third-party integrations.
- Collaborate with DevOps, security, and development teams to define logging requirements.
- Troubleshoot issues across the ELK stack and ensure high availability.

Required Skills:
- Strong experience in Elasticsearch, Logstash, and Kibana (minimum 3 years).
- Experience with Beats (Filebeat, Metricbeat) for data ingestion.
- Strong understanding of Elasticsearch indexing, mapping, query DSL, and performance tuning.
- Proficiency in creating custom Kibana dashboards and visualizations.
- Hands-on experience with Logstash pipelines, grok filters, and data enrichment.
- Understanding of REST APIs and JSON-based data formats.
- Familiarity with Linux, Bash, and scripting.
- Knowledge of monitoring tools (e.g., Prometheus, Grafana) is a plus.
- Experience with cloud platforms (AWS, Azure, GCP) is desirable.

Please share and refer resumes to ajay.kurlekar@cloverinfotech.com
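To illustrate the parse-and-index flow that the Logstash pipelines and grok filters above perform, here is a minimal Python sketch (not part of the posting) that parses a web-access log line with a regular expression and bulk-indexes the result using elasticsearch-py; the host, index name, and sample log format are assumptions.

```python
# Illustrative sketch: parse a web-access log line and bulk-index it into
# Elasticsearch, mirroring what a Logstash grok filter + output would do.
# Host, index name, and the sample log line are assumptions.
import re
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")

# Simplified combined-log-format pattern (analogous to grok's COMMONAPACHELOG).
LOG_PATTERN = re.compile(
    r'(?P<client_ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+)'
)

sample_lines = [
    '203.0.113.7 - - [12/May/2025:10:15:32 +0530] "GET /health HTTP/1.1" 200 512',
]

def to_actions(lines):
    """Yield bulk-index actions for lines that match the pattern."""
    for line in lines:
        match = LOG_PATTERN.match(line)
        if match:
            doc = match.groupdict()
            doc["status"] = int(doc["status"])
            doc["bytes"] = int(doc["bytes"])
            yield {"_index": "web-logs", "_source": doc}

ok, errors = helpers.bulk(es, to_actions(sample_lines), raise_on_error=False)
print(f"indexed={ok} errors={errors}")
```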

Posted 3 weeks ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Responsibilities:
- Develop and maintain Kafka-based data pipelines for real-time processing.
- Implement Kafka producer and consumer applications for efficient data flow.
- Optimize Kafka clusters for performance, scalability, and reliability.
- Design and manage Grafana dashboards for monitoring Kafka metrics.
- Integrate Grafana with Elasticsearch or other data sources.
- Set up alerting mechanisms in Grafana for Kafka system health monitoring.
- Collaborate with DevOps, data engineers, and software teams.
- Ensure security and compliance in Kafka and Grafana implementations.

Requirements:
- 8+ years of experience configuring Kafka, Elasticsearch, and Grafana.
- Strong understanding of Apache Kafka architecture and Grafana visualization.
- Proficiency in .NET or Python for Kafka development.
- Experience with distributed systems and message-oriented middleware.
- Knowledge of time-series databases and monitoring tools.
- Familiarity with data serialization formats like JSON.
- Expertise in Azure platforms and Kafka monitoring tools.
- Good problem-solving and communication skills.

Mandate: Creation of Kafka dashboards; Python/.NET.
Note: Candidates must be immediate joiners.
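As a minimal illustration of the producer/consumer work listed above (not part of the posting), the sketch below uses the confluent-kafka Python client; the broker address, topic name, and consumer group are assumptions.

```python
# Minimal Kafka producer/consumer sketch using the confluent-kafka client.
# Broker address, topic, and group id are illustrative assumptions.
import json
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"
TOPIC = "events"  # hypothetical topic

# Produce one JSON-encoded event.
producer = Producer({"bootstrap.servers": BROKER})
producer.produce(TOPIC, value=json.dumps({"order_id": 42, "status": "created"}))
producer.flush()  # block until the message is delivered

# Consume it back.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "demo-group",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
try:
    msg = consumer.poll(timeout=5.0)  # wait up to 5 s for a message
    if msg is not None and msg.error() is None:
        print("received:", json.loads(msg.value()))
finally:
    consumer.close()
```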

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Pune

Work from Office

Job Description: We are looking for a highly skilled and experienced Senior Technical Lead to join our engineering team. In this role, you will lead the design, development, and deployment of scalable backend systems, while mentoring a team of engineers and collaborating with cross-functional stakeholders. You'll play a key role in shaping technical direction, improving system architecture, and ensuring high standards in code quality and performance.

Key Responsibilities:
1. Lead the architecture and development of backend systems using Node.js, Python, MySQL, and Elasticsearch.
2. Design and implement scalable microservices deployed on Kubernetes.
3. Provide technical leadership and mentorship to a team of engineers.
4. Collaborate with Product Managers, Designers, and other Tech Leads to deliver impactful solutions.
5. Conduct code reviews, define best practices, and ensure adherence to high-quality engineering standards.
6. Troubleshoot and resolve technical issues, ensuring system reliability and performance.
7. Drive continuous improvement in infrastructure, CI/CD, and system monitoring.

Requirements:
1. Strong hands-on experience with Node.js and backend API development.
2. Proficiency in MySQL database design, optimization, and query performance.
3. Experience with Elasticsearch for search and analytics use cases.
4. Working knowledge of Python for automation, scripting, or service development.
5. Deep understanding of Kubernetes, containerization, and cloud-native deployments.
6. Excellent problem-solving, communication, and team-leadership skills.
7. Experience with distributed systems, high-availability architectures, and system scaling.
8. Prior experience mentoring or leading a team of developers.

Nice to Have:
1. Experience with cloud platforms (AWS, GCP, or Azure).
2. Familiarity with message queues like Kafka or RabbitMQ.
3. Exposure to CI/CD pipelines and infrastructure as code (e.g., Terraform, Helm).
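The "Elasticsearch for analytics" and "Python for automation" requirements above often meet in small operational scripts. The sketch below is illustrative only (the host, index, and field names are assumptions): it checks cluster health and runs a terms aggregation of the kind a reliability or usage dashboard would use.

```python
# Illustrative operations script: check Elasticsearch cluster health and run a
# simple analytics aggregation. Host, index, and field names are assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# 1. Cluster health: a quick reliability signal (green / yellow / red).
health = es.cluster.health()
print(f"cluster={health['cluster_name']} status={health['status']} "
      f"unassigned_shards={health['unassigned_shards']}")

# 2. Analytics: top API endpoints by request count in a hypothetical
#    "api-logs" index, the kind of query behind a usage dashboard.
resp = es.search(
    index="api-logs",
    size=0,  # aggregations only, no hits
    aggs={"top_endpoints": {"terms": {"field": "endpoint.keyword", "size": 5}}},
)
for bucket in resp["aggregations"]["top_endpoints"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```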

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 15 Lacs

Chennai

Work from Office

Responsibilities: Infrastructure as Code, CI/CD, system administration, coding, monitoring, security, and cross-team communication. Skills: Docker, Kubernetes (K8s), ArgoCD, Ansible, Jenkins, AWS, Linux/macOS, Prometheus, databases (SQL/NoSQL), Python, Git. Include a GitHub/GitLab link in your resume.
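For the monitoring side of this role, a minimal custom Prometheus exporter in Python can look like the sketch below (illustrative only; the metric name, port, and the condition being measured are assumptions), using the prometheus_client library.

```python
# Minimal custom Prometheus exporter sketch using prometheus_client.
# Metric name, port, and the "work" being measured are illustrative assumptions.
import random
import time

from prometheus_client import Gauge, start_http_server

# Gauge exposed at http://localhost:8000/metrics for Prometheus to scrape.
queue_depth = Gauge("demo_queue_depth", "Current depth of a hypothetical work queue")

def read_queue_depth() -> int:
    """Stand-in for a real check (e.g., querying a broker or database)."""
    return random.randint(0, 100)

if __name__ == "__main__":
    start_http_server(8000)           # serve /metrics on port 8000
    while True:
        queue_depth.set(read_queue_depth())
        time.sleep(15)                # update roughly every scrape interval
```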

Posted 1 month ago

Apply

2.0 - 7.0 years

0 - 1 Lacs

Raipur

Work from Office

Job Title: Python Developer (Django & AI/ML)
Location: Raipur, Chhattisgarh
Job Type: Full-Time | On-site

Job Overview: We are hiring a skilled Python Developer with expertise in Django and AI/ML technologies to join our growing team in Raipur. The ideal candidate will be responsible for developing robust web applications, designing APIs, and implementing intelligent machine learning solutions.

Key Responsibilities:
- Develop scalable web applications using Python and Django
- Design and build RESTful APIs and integrate with frontend frameworks
- Work with relational databases (PostgreSQL, MySQL) and ORM tools
- Implement asynchronous task processing with Celery
- Develop, train, and deploy machine learning models
- Handle NLP, computer vision, and AI-driven tasks
- Visualize data and insights using tools like Matplotlib and Seaborn
- Collaborate with DevOps for deployment and monitoring

Core Requirements:
- Strong proficiency in Python and object-oriented programming
- Experience with Django and Django REST Framework
- Solid understanding of web development principles and MVC architecture
- Hands-on experience with relational databases and ORM tools

Technical Skills:
- Frontend technologies: HTML5, CSS3, JavaScript, Bootstrap
- Knowledge of React, Vue.js, or Angular
- API development using REST or GraphQL
- Version control using Git (GitHub/GitLab)
- Testing with pytest, unittest, or the Django test framework
- Asynchronous processing with Celery

AI/ML Skills:
- Machine learning: scikit-learn, pandas, NumPy
- Deep learning: TensorFlow, PyTorch, Keras
- Data analysis and visualization: pandas, Matplotlib, Seaborn
- Experience in model development, training, and deployment
- Exposure to NLP, computer vision, and MLOps
- Familiarity with Jupyter Notebooks

Preferred / Additional Skills:
- Cloud experience (AWS, Azure, Google Cloud Platform)
- Docker and Kubernetes for containerization
- Big Data technologies (Apache Spark, Hadoop)
- NoSQL databases (MongoDB, Redis, Elasticsearch)
- CI/CD pipelines and DevOps practices
- Experience with FastAPI and microservices architecture

Why Join Us:
- Work on innovative AI/ML and intelligent software solutions
- Collaborative, innovation-driven work culture
- Exposure to the latest tools and technologies
- Structured career growth and learning opportunities
- Competitive salary and benefits package

How to Apply: Interested candidates may send their updated resume to career@srfcnbfc.in
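To ground the "develop, train, and deploy machine learning models" responsibility above, here is a minimal scikit-learn sketch (illustrative only, using the bundled Iris dataset rather than anything from this role): a preprocessing-plus-classifier pipeline is trained, evaluated, and persisted so it could later be served, for example behind a Django REST Framework view.

```python
# Minimal model training sketch with scikit-learn: build a pipeline, evaluate
# it, and persist it for serving (e.g., behind a Django REST Framework view).
# The dataset and output file name are illustrative assumptions.
import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Scaling + classifier in one object, so serving code applies both consistently.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")

# Persist the fitted pipeline; an API view would load it with joblib.load().
joblib.dump(model, "iris_model.joblib")
```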

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies