
1245 Elasticsearch Jobs - Page 39

JobPe aggregates these listings for easy access; you apply directly on the original job portal.

4.0 years

0 Lacs

Greater Ahmedabad Area

On-site


Sr. Fullstack Developer Experience: 4 - 8 Years Exp Salary : Competitive Preferred Notice Period: Within 30 Days Shift: 10:00AM to 7:00PM IST Opportunity Type: Onsite (Ahmedabad) Placement Type: Permanent (*Note: This is a requirement for one of Uplers' Clients) Must have skills required : Python , Python Programming Attri (One of Uplers' Clients) is Looking for: Senior DevOps Engineer who is passionate about their work, eager to learn and grow, and who is committed to delivering exceptional results. If you are a team player, with a positive attitude and a desire to make a difference, then we want to hear from you. Role Overview Description About Attri Attri is an AI organization that helps businesses initiate and accelerate their AI efforts. We offer the industry’s first end-to-end enterprise machine learning platform, empowering teams to focus on ML development rather than infrastructure. From ideation to execution, our global team of AI experts supports organizations in building scalable, state-of-the-art ML solutions. Our mission is to redefine businesses by harnessing cutting-edge technology and a unique, value-driven approach. With team members across continents, we celebrate diversity, curiosity, and innovation. About The Role: We are a global team with our people spread out across different countries. We strive to build a diverse team of passionate people who believe in bringing change through their work. At Attri, we are seeking a talented Frontend Engineer to join our dynamic team. We are a cutting-edge company, and we're looking for an individual who is passionate, inquisitive, and a self-learner, to contribute to the success of our projects. Responsibilities: Modern Web Development: Proficiency in HTML5, CSS3, ES6+, Typescript, and Node.js, with a strong emphasis on staying up-to-date with the latest technologies. TypeScript: Hands on with Generics, Template Literals, Mapped Types, Conditional Types Flexible Approach: Based on problem at hand apply appropriate solution while considering all the risks Frontend React.js and Flux Architecture: Extensive experience in React.js and Flux Architecture, along with external state management to build robust and performant web applications. JS Event Loop: Understanding of event loop, criticality of not blocking main thread, cooperative scheduling in react. State Management: Hands on with more than one state management library Ecosystem: Ability to leverage the vast JS ecosystem and hands on with non-typical libraries. Backend SQL - Extensive hands on with Postgres with comfortable with json_agg, json_build_object, WITH CLAUSE, CTE, View/Materialized View, Transactions Redis - Hands-on with different data structures and usage. Architectural Patterns - Backend for Frontend, Background Workers, CQRS, Event Sourcing, Orchestration/Choreography, etc Transport Protocols, such as HTTP(S), SSE, and WS(S), to optimize data transfer and enhance application performance Serialization Protocols - JSON and at least one more protocol Authentication/Authorization - Comfortable with OAuth, JWT and other mechanisms for different use cases Comfortable with reading open source code of libraries in use and understanding of internals Able to fork the library to either improve, fix bug, or redesign Tooling: Knowledge of essential frontend tools like Prettier, ESLint, and Conventional Commit to maintain code quality and consistency. 
Dependency management and versioning Familiarity with CI/CD Testing: Utilize Jest/Vitest and React Testing Library for comprehensive testing of your code, ensuring high code quality and reliability. Collaboration: Collaborate closely with our design team to craft responsive and themable components for data-intensive applications, ensuring a seamless user experience. Programming Paradigms: Solid grasp of both Object-Oriented Programming and Functional Programming concepts to create clean and maintainable code. Design/Architectural Patterns: Identifying suitable design and architectural pattern to solve the problem at hand. Comfortable with tailoring the pattern to fit the problem optimally Modular and Reusable Code: Write modular, reusable, and testable code that enhances codebase maintainability. DSA: Basic understanding of DSA when required to optimize hot paths. Good To Have: Python: Django Rest Framework, Celery, Pandas/Numpy, Langchain, Ollama Storybook: Storybook to develop components in isolation, streamlining the UI design and development process. Charting and Visualization: Experience with charting and visualization libraries, especially ECharts by Apache, to create compelling data representations. Tailwind CSS: Understanding of Tailwind CSS for efficient and responsive UI development. NoSQL Stores - ElasticSearch, Neo4j, Cassandra, Qdrant, etc. Functional Reactive Programming RabbitMQ/Kafka Great To Have: Open Source Contribution: Experience in contributing to open-source projects (not limited to personal projects or forks) that showcases your commitment to the development community. Renderless/Headless React Components: Developing renderless or headless React components to provide flexible and reusable UI solutions. End-to-End Testing: Experience with Cypress or any other end-to-end (E2E) testing framework, ensuring the robustness and quality of the entire application. Deployment: Being target agnostic and understanding the nuances of application in operation. What You Bring: Bachelor's degree in Computer Science, Information Technology, or a related field. 5+ years of relevant experience in frontend web development, including proficiency in HTML5, CSS3, ES6+, Typescript, React.js, and related technologies. Solid understanding of Object-Oriented Programming, Functional Programming, SOLID principles, and Design Patterns. Proven experience in developing modular, reusable, and testable code. Prior work on data-intensive applications and collaboration with design teams to create responsive and themable components. Experience with testing frameworks like Jest/Vitest and React Testing Library. Benefits : Competitive Salary 💸 Support for continual learning (free books and online courses) 📚 Leveling Up Opportunities 🌱 Diverse team environment 🌍 How to apply for this opportunity: Easy 3-Step Process: 1. Click On Apply! And Register or log in on our portal 2. Upload updated Resume & Complete the Screening Form 3. Increase your chances to get shortlisted & meet the client for the Interview! About Our Client: Attri, an AI organization, leads the way in enterprise AI, offering advanced solutions and services driven by AI agents and powered by Foundation Models. Our comprehensive suite of AI-enabled tools drives business impact, enhances quality, mitigates risk, and also helps unlock growth opportunities. About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. 
Our role is to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: there are many more opportunities on the portal apart from this one.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
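For readers weighing their own background against the backend expectations in this listing (Postgres with json_agg, json_build_object, and WITH clauses/CTEs), here is a minimal illustrative sketch in Python using psycopg2. The connection settings and the customers/orders tables are hypothetical, not part of the listing.

```python
import psycopg2

# Hypothetical connection settings; adjust for a real environment.
conn = psycopg2.connect(dbname="shop", user="app", password="secret", host="localhost")

# A CTE plus json_agg/json_build_object: one JSON array of recent orders per customer.
QUERY = """
WITH recent_orders AS (
    SELECT customer_id, id, total
    FROM orders
    WHERE created_at > now() - interval '30 days'
)
SELECT c.name,
       json_agg(json_build_object('order_id', r.id, 'total', r.total)) AS orders
FROM customers c
JOIN recent_orders r ON r.customer_id = c.id
GROUP BY c.name;
"""

with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for name, orders in cur.fetchall():
        print(name, orders)  # json_agg output arrives as a Python list of dicts
```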

Posted 3 weeks ago


5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


We are accepting applications for an experienced Data Engineer with a strong background in data scraping, cleaning, transformation, and automation. The ideal candidate will be responsible for building robust data pipelines, maintaining data integrity, and generating actionable dashboards and reports to support business decision-making.

Key Responsibilities:
- Develop and maintain scripts for scraping data from various sources, including APIs, websites, and databases.
- Perform data cleaning, transformation, and normalization to ensure consistency and usability across all data sets.
- Design and implement relational and non-relational data tables and frames for scalable data storage and analysis.
- Build automated data pipelines to ensure timely and accurate data availability.
- Create and manage interactive dashboards and reports using tools such as Power BI, Tableau, or similar platforms.
- Write and maintain data automation scripts to streamline ETL (Extract, Transform, Load) processes.
- Ensure data quality, governance, and compliance with internal and external regulations.
- Monitor and optimize the performance of data workflows and pipelines.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Minimum of 5 years of experience in a data engineering or similar role.
- Proficient in Python (especially for data scraping and automation), with strong hands-on experience in Pandas, NumPy, and other data manipulation libraries.
- Experience with web scraping tools and techniques (e.g., BeautifulSoup, Scrapy, Selenium).
- Strong SQL skills and experience working with relational databases (e.g., PostgreSQL, MySQL) and data warehouses (e.g., Redshift, Snowflake, BigQuery).
- Familiarity with data visualization tools like Power BI, Tableau, or Looker.
- Knowledge of ETL tools and orchestration frameworks such as Apache Airflow, Luigi, or Prefect.
- Experience with version control systems like Git and collaborative platforms like Jira or Confluence.
- Strong understanding of data security, privacy, and governance best practices.
- Excellent problem-solving skills and attention to detail.

Preferred Qualifications:
- Experience with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with NoSQL databases like MongoDB, Cassandra, or Elasticsearch.
- Understanding of CI/CD pipelines and DevOps practices related to data engineering.
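As a rough illustration of the scrape-clean-load workflow this listing describes, here is a minimal Python sketch using requests, BeautifulSoup, pandas, and SQLAlchemy. The source page, CSS selectors, database DSN, and table name are all hypothetical; a production pipeline would add retries, validation, and orchestration (e.g. Airflow).

```python
import pandas as pd
import requests
from bs4 import BeautifulSoup
from sqlalchemy import create_engine

URL = "https://example.com/products"                # hypothetical source page
DB = "postgresql://user:pass@localhost/analytics"   # hypothetical warehouse DSN

# Extract: scrape rows from an HTML table.
html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")
rows = [
    {"name": tr.find("td", class_="name").text.strip(),
     "price": tr.find("td", class_="price").text.strip()}
    for tr in soup.select("table#products tr")[1:]   # skip the header row
]

# Transform: clean and normalize with pandas.
df = pd.DataFrame(rows)
df["price"] = pd.to_numeric(df["price"].str.replace(r"[^\d.]", "", regex=True),
                            errors="coerce")
df = df.dropna(subset=["price"]).drop_duplicates(subset=["name"])

# Load: append into a warehouse table that dashboards can read.
engine = create_engine(DB)
df.to_sql("product_prices", engine, if_exists="append", index=False)
```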

Posted 3 weeks ago


4.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site


We are looking for a Senior Java Backend Developer with expertise in Spring Boot, microservices, and cloud technologies. The ideal candidate will have experience in building scalable web applications, working with AWS, and integrating with third-party services like Kafka and RabbitMQ.

Key Responsibilities:
- Design, develop, and maintain backend solutions using Java and Spring Boot.
- Build and manage microservices and distributed systems.
- Integrate with third-party services (e.g., OAuth, cloud APIs, message brokers like Kafka and RabbitMQ).
- Ensure system performance, scalability, and reliability through proper design and architecture.
- Work with MongoDB and other databases to manage and optimize data storage.
- Deploy and manage services on AWS cloud infrastructure.
- Collaborate with front-end teams and stakeholders to deliver high-quality web applications.
- Participate in the SDLC, ensuring best practices are followed in code development and testing.

Required Skills:
- 4+ years of experience in Java backend development.
- Proficiency in Spring Boot and RESTful APIs.
- Experience with microservices architecture and AWS.
- Hands-on experience with Kafka, RabbitMQ, and MongoDB.
- Familiarity with the full SDLC and agile methodologies.
- Strong problem-solving skills and ability to optimize system performance.

Preferred Skills:
- Knowledge of Elasticsearch, Solr, Docker, and CI/CD pipelines.
- Familiarity with security best practices for web apps.

About Us: Entire Globe Allied Pvt. Ltd provides services in the IT & BPO industry, offering our services globally and connecting all to the world of innovation. We believe in providing the best solutions to our clients, keeping customer satisfaction and brand reputation in mind. We are in the business of outsourcing services, providing complete business solutions for start-ups and small and medium businesses, and we are currently expanding our reach towards large enterprises. At EG Allied we engage ourselves with innovative ideas to gain a competitive advantage over global competition.

Contact Us:
E-Mail: info@egallied.com
Website: www.egallied.com
To know more about us, visit our website.

Posted 3 weeks ago


0.0 - 5.0 years

0 Lacs

Hyderabad, Telangana

On-site


Location: Hyderabad, Telangana

Requirements:
- Bachelor's or master's degree in Computer Science, Information Systems, Engineering, or equivalent from a recognized college/university.
- 5 to 8 years of work experience as a Java Software Engineer/Java Developer or in a similar role.
- Good knowledge of the Java programming language, the Spring framework, and Spring Boot.
- Strong in OOP fundamentals and design using proven design patterns.
- Java full stack development experience is required.
- Experience in microservices architecture.
- Hands-on experience with messaging (Azure Event Hub + RabbitMQ).
- Working knowledge of unit testing frameworks such as JUnit and Mockito.
- Experience with Elasticsearch.
- Experience with test automation tools such as Selenium.
- Experience with SQL and NoSQL databases (preferably Postgres and MongoDB).
- Experience with JavaScript + React.
- Knowledge of DevOps and CI/CD, automated test and build tools.
- Source control: Git, Bitbucket.
- Continuous integration: Bamboo.
- Containers: Docker, Mesosphere.
- Experience or exposure to cloud environments, specifically Azure + AWS.
- Ability to work across waterfall, agile, and hybrid methodologies.

Roles & Responsibilities:
- Provides input for the prioritization of issues in the backlog and autonomously pulls issues or supports other team members as appropriate.
- Understands functional and technical requirements of software components.
- Investigates, troubleshoots, and provides expert solutions to complex technical issues.
- Participates in code reviews, ensures code quality, and conforms to best practices and industry standards.
- Clearly understands and communicates the impact of changes in the team's deliverables on other teams and customers.
- Provides assistance to junior developers.
- Strong communication skills and ability to troubleshoot and debug applications; strives to improve the overall product by researching alternative ways and technologies to achieve the overall goal.
- Designs and implements Java applications that fulfill requirements.
- Creates well-written code that runs efficiently and optimally.
- Tests completed software and debugs as necessary.
- Examines existing code and recommends patches, design solutions, or fixes for broken code.

Nice to Have:
- Recommended certifications: Java.

Job Type: Full-time
Pay: From ₹1,341,200.87 per year
Location Type: In-person
Schedule: Day shift
Experience: Java Developer: 5 years (Preferred)
Work Location: In person
Speak with the employer: +91 7400196230

Posted 3 weeks ago


0.0 years

0 Lacs

Chennai, Tamil Nadu

On-site


Chennai, Tamil Nadu, India | Job ID 766065

Join our Team

About this opportunity: The Senior DevOps Engineer will implement DevSecOps best practices in pipeline management and cloud access control, and will work cross-functionally with software engineers, broadcast teams, and operations to align on technical requirements.

What you will do:
- Design and build automated pipelines for media ingest, processing, and distribution.
- Implement and maintain CI/CD workflows tailored for media-centric applications and services.
- Architect and manage scalable cloud infrastructure (AWS) for high-availability media pipelines.
- Work on ways to automate and improve development and release processes.
- Implement robust monitoring/logging to ensure system reliability and performance.
- Mentor junior engineers and help establish best practices for DevOps in media environments.
- Test and examine code written by others and analyze results.
- Develop internal tools and scripts (Java, Python, Bash, Node.js) and use CloudFormation or similar tools to streamline media integration tasks and infrastructure as code (IaC).
- Assist with or perform software upgrades/migrations in the project.
- Maintain comprehensive documentation of pipelines, architectures, and integration touchpoints in Confluence.
- Provide reports and analysis on cost optimization and system performance (optional).
- Provide training sessions and documentation to operations and support teams for new solutions.

The skills you bring:
- 5-9 years of relevant experience in the IT industry and a bachelor's degree in computer engineering/information technology or equivalent.
- Strong AWS services knowledge (EC2, S3, Lambda, RDS, etc.).
- Expertise in CI/CD pipelines (Jenkins, Sonar, Git, etc.).
- Proficiency in container technologies, with a focus on Kubernetes.
- Experience with serverless, Kafka, and Elasticsearch.
- Strong programming skills in Python or scripting languages.
- Experience with monitoring and logging tools (CloudWatch, ELK).
- Hands-on experience with database administration and tuning, e.g., graph databases and DynamoDB.

Good to have:
- Understanding of IP networking and common protocols such as FTP, SFTP…
- Knowledge of broadcast video formats, protocols, and encoding standards.
- Knowledge of broadcast media terminology.

What happens once you apply? You will find all you need to know about what our typical hiring process looks like. We encourage you to consider applying to jobs where you might not meet all the criteria. We recognize that we all have transferrable skills, and we can support you with the skills that you need to develop. Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer.

Primary country and city: India (IN) || Chennai
Job details: Developer
Primary Recruiter: Hina Yadav
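As a small illustration of the Python-based automation and cost-optimization reporting this role mentions, here is a minimal boto3 sketch that flags unattached EBS volumes. The region is a placeholder, and credentials are assumed to come from the environment or an instance role.

```python
import boto3

# Hypothetical region; credentials come from the environment or an IAM role.
ec2 = boto3.client("ec2", region_name="ap-south-1")

# Unattached ("available") volumes are a common source of avoidable cost.
volumes = ec2.describe_volumes(
    Filters=[{"Name": "status", "Values": ["available"]}]
)["Volumes"]

total_gib = sum(v["Size"] for v in volumes)
print(f"{len(volumes)} unattached volumes, {total_gib} GiB total")
for v in volumes:
    print(v["VolumeId"], v["Size"], "GiB", v["AvailabilityZone"])
```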

Posted 3 weeks ago


0 years

0 Lacs

Delhi, India

On-site


Job Description: We are looking for a skilled DevOps Engineer to join our team at a client location in Delhi. This role is ideal for someone with hands-on experience in CI/CD implementation, infrastructure management, and automation within complex enterprise environments. If you're passionate about reliability, scalability, and performance, we'd love to hear from you.

Key Responsibilities:
- Maintain and manage software feature branches using a source control system.
- Develop tools, scripts, and procedures for automated build, deployment, and monitoring.
- Manage web applications, data loading processes, and performance monitoring.
- Understand client requirements and provide immediate DevOps support.
- Collaborate with development and QA teams to ensure product readiness.
- Act as release manager: oversee code deployment across environments.
- Provision and manage infrastructure (servers, storage, networking) as per application needs.
- Deploy updates and fixes, and optimize application reliability and client experience.
- Ensure adherence to security, network, and compliance standards.

Technical Skills (Mandatory):
- Operating systems: Windows, Linux (Ubuntu, CentOS)
- Web servers: JBoss, Tomcat
- Databases: Oracle, MySQL
- Big data tools: Cassandra, Elasticsearch, Hadoop, Spark, Logstash, R
- Programming and scripting: Python, Bash
- Containerization and orchestration: Docker, Kubernetes
- CI/CD: clear understanding and hands-on implementation

Preferred Skills:
- Exposure to cloud platforms: Azure or GCP
- Strong troubleshooting and incident management capabilities
- Familiarity with Infrastructure as Code (IaC) tools is a plus

Qualifications:
- B.E / MCA / M.Sc / BCA from a recognized institute
- A valid DevOps certification is mandatory

(ref:hirist.tech)
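Since the mandatory stack combines Elasticsearch with Python/Bash scripting, here is a minimal monitoring-style sketch that polls the standard _cluster/health API. The endpoint is a placeholder; a real check would also feed results into the alerting stack.

```python
import sys
import requests

ES_URL = "http://localhost:9200"  # hypothetical cluster endpoint

# The _cluster/health API reports overall status plus shard-level counters.
health = requests.get(f"{ES_URL}/_cluster/health", timeout=10).json()

print(f"cluster={health['cluster_name']} status={health['status']} "
      f"nodes={health['number_of_nodes']} "
      f"unassigned_shards={health['unassigned_shards']}")

# Non-zero exit code so a cron job or CI step can alert on a degraded cluster.
if health["status"] == "red":
    sys.exit(1)
```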

Posted 3 weeks ago


4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description

Key Responsibilities:
- Algorithm & model leadership: lead the design, development, and implementation of state-of-the-art NLP algorithms and models, with a strong focus on Transformers and similar architectures.
- Data quality assurance: ensure exceptional data quality throughout all stages of acquisition and processing, including data sourcing/collection, ground truth generation, normalization, transformation, and cross-lingual alignment/mapping.
- Independent project management: manage your own process by identifying and executing high-impact projects, triaging external requests, and ensuring projects reach conclusion in a timely manner for useful results.
- Technical communication: communicate complex proposals and results clearly and effectively, backed by data and coupled with actionable conclusions, to drive critical business decisions.
- Collaboration: work closely with a team of highly skilled and motivated data scientists and machine learning engineers.

Skills & Qualifications:
- Experience: must have 4+ years of industry experience in data science.
- Machine learning fundamentals: solid understanding of machine learning fundamentals and familiarity with standard algorithms and techniques.
- Statistical computing: expert knowledge of a statistical computing language such as Python.
- Probability & statistics: strong knowledge of probability and statistics, including experimental design, predictive modeling, optimization, and causal inference.
- Deep learning frameworks: good knowledge of deep learning frameworks like PyTorch and TensorFlow is a must.
- NLP expertise: proven experience in designing, developing, and implementing state-of-the-art NLP algorithms and models using Transformers and similar architectures.
- Data quality focus: demonstrated ability to ensure data quality across all stages of data acquisition and processing.
- Problem-solving: ability to identify and execute on high-impact projects and bring them to conclusion.
- Communication: excellent written and verbal technical communication skills.
- Team player: a strong team player who collaborates effectively.
- Education: Bachelor of Technology (B.Tech) or Bachelor of Science (B.S.) degree (or equivalent) in computer science, engineering, or a relevant field.

Preferred:
- Track record of having developed novel algorithms, e.g., publications in one or more of the following top-tier conferences: KDD, WWW, NIPS, ISWC, NAACL, ACL, SIGIR, EMNLP, ICML.
- Expertise in building and fine-tuning LLM (Large Language Model) models using Transformers and RAG (Retrieval-Augmented Generation) systems.
- Good understanding of MLOps tools/processes like Elasticsearch, Jenkins, Docker.

(ref:hirist.tech)
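To make the Transformer requirement concrete, here is a minimal inference sketch using the Hugging Face transformers pipeline (PyTorch under the hood). The checkpoint shown is just an example; actual work in this role would involve fine-tuning and evaluating task-specific models on quality-checked data.

```python
from transformers import pipeline

# Minimal inference sketch: a pretrained Transformer for sentiment analysis.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # example checkpoint
)

samples = [
    "The onboarding flow was quick and painless.",
    "Support never replied to my ticket.",
]
for text, result in zip(samples, classifier(samples)):
    print(f"{result['label']:>8} {result['score']:.3f}  {text}")
```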

Posted 3 weeks ago


6.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site


Company Description
Wiser Solutions is a suite of in-store and eCommerce intelligence and execution tools. We're on a mission to enable brands, retailers, and retail channel partners to gather intelligence and automate actions to optimize in-store and online pricing, marketing, and operations initiatives. Our Commerce Execution Suite is available globally.

Job Description
We are looking for a highly capable Senior Full Stack Engineer to be a core contributor in developing our suite of product offerings. If you love working on complex problems and writing clean code, you will love this role. Our goal is to solve a messy problem elegantly and cost effectively. Our job is to collect, categorize, and analyze semi-structured data from different sources (20 million+ products from 500+ websites into our catalog of 500 million+ products). We help our customers discover new patterns in their data that can be leveraged so that they can become more competitive and increase their revenue.

Essential Functions
- Think like our customers – you will work with product and engineering leaders to define intuitive solutions.
- Design customer-facing UI and back-end services for various business processes.
- Develop high-performance applications by writing testable, reusable, and efficient code.
- Implement effective security protocols, data protection measures, and storage solutions.
- Improve the quality of our solutions – you will hold yourself and your team members accountable to writing high-quality, well-designed, maintainable software.
- Own your work – you will take responsibility to shepherd your projects from idea through delivery into production.
- Bring new ideas to the table – some of our best innovations originate within the team.
- Guide and mentor others on the team.

Technologies We Use
- Languages: NodeJS/NestJS/TypeScript, SQL, React/Redux, GraphQL
- Infrastructure: AWS, Docker, Kubernetes, Terraform, GitHub Actions, ArgoCD
- Databases: Postgres, MongoDB, Redis, Elasticsearch, Trino, Iceberg
- Streaming and queuing: Kafka, NATS, Keda

Qualifications
- 6+ years of professional software engineering/development experience.
- Proficiency with architecting and delivering solutions within a distributed software platform.
- Full stack engineering experience, including front-end frameworks (React/TypeScript, Redux) and backend technologies such as NodeJS/NestJS/TypeScript and GraphQL.
- Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs.
- Proven ability to work effectively, and to prioritize and organize your work in a highly dynamic environment.
- Proven track record of working in highly distributed, event-driven systems.
- Strong proficiency working with RDBMS/NoSQL/big data solutions (Postgres, MongoDB, Trino, etc.).
- Solid understanding of data pipelines and workflow automation – orchestration tools, scheduling, and monitoring.
- Solid understanding of ETL/ELT and OLTP/OLAP concepts.
- Solid understanding of data lakes, data warehouses, and modeling practices (Data Vault, etc.), and experience leveraging data lake solutions (e.g., AWS Glue, DBT, Trino, Iceberg, etc.).
- Ability to clean, transform, and aggregate data using SQL or scripting languages.
- Ability to design and estimate tasks, and coordinate work with other team members during iteration planning.
- Solid understanding of AWS, Linux, and infrastructure concepts.
- Track record of lifting and challenging teammates to higher levels of achievement.
- Experience measuring, driving, and improving the software engineering process.
- Good testing habits and a strong eye for quality.
- Outstanding organizational, communication, and relationship-building skills conducive to driving consensus; able to work well in a cross-functional environment.
- Experience working in an agile team environment.
- Ownership – feel a sense of personal accountability/responsibility to drive execution from start to finish.
- Drive adoption of Wiser's Product Delivery organization principles across the department.

Bonus Points
- Experience with CQRS
- Experience with Domain Driven Design
- Experience with C4 modeling
- Experience working within a retail or ecommerce environment
- Experience with AI coding agents (Windsurf, Cursor, Claude, ChatGPT, etc.) – prompt engineering

Why Join Wiser Solutions?
- Work on an industry-leading product trusted by top retailers and brands.
- Be at the forefront of pricing intelligence and data-driven decision-making.
- A collaborative, fast-paced environment where your impact is tangible.
- Competitive compensation, benefits, and career growth opportunities.

Additional Information
EEO STATEMENT
Wiser Solutions, Inc. is an Equal Opportunity Employer and prohibits discrimination, harassment, and retaliation of any kind. Wiser Solutions, Inc. is committed to the principle of equal employment opportunity for all employees and applicants, providing a work environment free of discrimination, harassment, and retaliation. All employment decisions at Wiser Solutions, Inc. are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, national origin, family or parental status, disability, genetics, age, sexual orientation, veteran status, or any other status protected by state, federal, or local law. Wiser Solutions, Inc. will not tolerate discrimination, harassment, or retaliation based on any of these characteristics.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

4 - 9 Lacs

Chennai

Work from Office


**Position Overview:** We are seeking an experienced AWS Cloud Engineer with a robust background in Site Reliability Engineering (SRE). The ideal candidate will have 3 to 6 years of hands-on experience managing and optimizing AWS cloud environments with a strong focus on performance, reliability, scalability, and cost efficiency. **Key Responsibilities:** * Deploy, manage, and maintain AWS infrastructure, including EC2, ECS Fargate, EKS, RDS Aurora, VPC, Glue, Lambda, S3, CloudWatch, CloudTrail, API Gateway (REST), Cognito, Elasticsearch, ElastiCache, and Athena. * Implement and manage Kubernetes (K8s) clusters, ensuring high availability, security, and optimal performance. * Create, optimize, and manage containerized applications using Docker. * Develop and manage CI/CD pipelines using AWS native services and YAML configurations. * Proactively identify cost-saving opportunities and apply AWS cost optimization techniques. * Set up secure access and permissions using IAM roles and policies. * Install, configure, and maintain application environments including: * Python-based frameworks: Django, Flask, FastAPI * PHP frameworks: CodeIgniter 4 (CI4), Laravel * Node.js applications * Install and integrate AWS SDKs into application environments for seamless service interaction. * Automate infrastructure provisioning, monitoring, and remediation using scripting and Infrastructure as Code (IaC). * Monitor, log, and alert on infrastructure and application performance using CloudWatch and other observability tools. * Manage and configure SSL certificates with ACM and load balancing using ELB. * Conduct advanced troubleshooting and root-cause analysis to ensure system stability and resilience. **Technical Skills:** * Strong experience with AWS services: EC2, ECS, EKS, Lambda, RDS Aurora, S3, VPC, Glue, API Gateway, Cognito, IAM, CloudWatch, CloudTrail, Athena, ACM, ELB, ElastiCache, and Elasticsearch. * Proficiency in container orchestration and microservices using Docker and Kubernetes. * Competence in scripting (Shell/Bash), configuration with YAML, and automation tools. * Deep understanding of SRE best practices, SLAs, SLOs, and incident response. * Experience deploying and supporting production-grade applications in Python (Django, Flask, FastAPI), PHP (CI4, Laravel), and Node.js. * Solid grasp of CI/CD workflows using AWS services. * Strong troubleshooting skills and familiarity with logging/monitoring stacks.
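As a small illustration of the CloudWatch monitoring and alerting duties in the listing above, here is a minimal boto3 sketch that creates a CPU alarm for one EC2 instance. The region, instance ID, SNS topic, and thresholds are placeholders, not values from the listing.

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")  # hypothetical region

# Alarm when average CPU on one EC2 instance stays above 80% for two 5-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="high-cpu-web-1",                                   # placeholder name
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:ops-alerts"],      # placeholder topic
)
print("alarm created")
```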

Posted 3 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office


Job Summary Person at this position has gained significant work experience to be able to apply their knowledge effectively and deliver results. Person at this position is also able to demonstrate the ability to analyse and interpret complex problems and improve change or adapt existing methods to solve the problem. Person at this position regularly interacts with interfacing groups / customer on technical issue clarification and resolves the issues. Also participates actively in important project/ work related activities and contributes towards identifying important issues and risks. Reaches out for guidance and advice to ensure high quality of deliverables. Person at this position consistently seek opportunities to enhance their existing skills, acquire more complex skills and work towards enhancing their proficiency level in their field of specialisation. Works under limited supervision of Team Lead/ Project Manager. Roles & Responsibilities Responsible for design, coding, testing, bug fixing, documentation and technical support in the assigned area. Responsible for on time delivery while adhering to quality and productivity goals. Responsible for adhering to guidelines and checklists for all deliverable reviews, sending status report to team lead and following relevant organizational processes. Responsible for customer collaboration and interactions and support to customer queries. Expected to enhance technical capabilities by attending trainings, self-study and periodic technical assessments. Expected to participate in technical initiatives related to project and organization and deliver training as per plan and quality. Education and Experience Required Engineering graduate, MCA, etc Experience: 2-5 years Competencies Description Data engineering TCB is applicable to one who 1) Creates databases and storage for relational and non-relational data sources 2) Develops data pipelines (ETL/ ELT) to clean , transform and merge data sources into usable format 3) Creates reporting layer with pre-packaged scheduled reports , Dashboards and Charts for self-service BI 4) Has experience on cloud platforms such as AWS, Azure , GCP in implementing data workflows 5) Experience with tools like MongoDB, Hive, Hbase, Spark, Tableau, PowerBI, Python, Scala, SQL, ElasticSearch etc. Platforms- AWS, Azure , GCP Technology Standard- NA Tools- MongoDB, Hive, Hbase, Tableau, PowerBI, ElasticSearch, Qlikview Languages- Python, R, Spark,Scala, SQL Specialization- DWH, BIG DATA ENGINEERING, EDGE ANALYTICS
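As a minimal illustration of the ETL/ELT and reporting-layer competency described above, here is a short pandas sketch that merges two raw extracts and loads a monthly aggregate into a warehouse table. The file names, columns, and database DSN are hypothetical.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical inputs: raw exports from two systems.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
customers = pd.read_csv("customers.csv")          # assumed to contain customer_id, region

# Clean and merge into a usable format.
orders["amount"] = pd.to_numeric(orders["amount"], errors="coerce")
orders = orders.dropna(subset=["amount"])
merged = orders.merge(customers, on="customer_id", how="left")

# Aggregate into a reporting layer that a BI tool (Tableau, Power BI) can read directly.
monthly = (merged
           .groupby([pd.Grouper(key="order_date", freq="M"), "region"])["amount"]
           .sum()
           .reset_index())

engine = create_engine("postgresql://user:pass@localhost/dwh")  # hypothetical DSN
monthly.to_sql("monthly_sales_by_region", engine, if_exists="replace", index=False)
```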

Posted 3 weeks ago

Apply

5.0 - 8.0 years

14 - 19 Lacs

Bengaluru

Work from Office


Job Summary
A person at this position takes ownership of a module and the associated quality and delivery. They provide instructions, guidance, and advice to team members to ensure quality and on-time delivery, and are expected to be able to instruct and review the quality of work done by technical staff. They should be able to identify key issues and challenges on their own, prioritize tasks, and deliver results with minimal direction and supervision. They have the ability to investigate the root cause of a problem and come up with alternatives/solutions based on a sound technical foundation gained through in-depth knowledge of technology, standards, tools, and processes. They are able to organize and draw connections among ideas and distinguish between those which are implementable. They demonstrate a degree of flexibility in resolving problems/issues that attests to in-depth command of all techniques, processes, tools, and standards within the relevant field of specialisation.

Roles & Responsibilities
- Responsible for requirement analysis and feasibility study, including system-level work estimation while considering risk identification and mitigation.
- Responsible for design, coding, testing, bug fixing, documentation, and technical support in the assigned area.
- Responsible for on-time delivery while adhering to quality and productivity goals.
- Responsible for traceability of the requirements from design to delivery, code optimization, and coverage.
- Responsible for conducting reviews, identifying risks, and ownership of quality of deliverables.
- Responsible for identifying training needs of the team.
- Expected to enhance technical capabilities by attending trainings, self-study, and periodic technical assessments.
- Expected to participate in technical initiatives related to the project and organization and deliver training as per plan and quality.
- Expected to be a technical mentor for junior members.
- May be given additional responsibility of managing people at the discretion of the Project Manager.

Education and Experience Required
Engineering graduate, MCA, etc. Experience: 5-8 years.

Competencies Description
The data engineering TCB is applicable to one who:
1) Creates databases and storage for relational and non-relational data sources.
2) Develops data pipelines (ETL/ELT) to clean, transform, and merge data sources into a usable format.
3) Creates a reporting layer with pre-packaged scheduled reports, dashboards, and charts for self-service BI.
4) Has experience on cloud platforms such as AWS, Azure, and GCP in implementing data workflows.
5) Has experience with tools like MongoDB, Hive, HBase, Spark, Tableau, Power BI, Python, Scala, SQL, Elasticsearch, etc.

Platforms: AWS, Azure, GCP
Technology Standard: NA
Tools: MongoDB, Hive, HBase, Tableau, Power BI, Elasticsearch, QlikView
Languages: Python, R, Spark, Scala, SQL
Specialization: DWH, Big Data Engineering, Edge Analytics

Posted 3 weeks ago

Apply

5.0 - 7.0 years

6 - 10 Lacs

Mumbai

Work from Office


The Tech Lead will oversee the company's technical aspects across various projects such as creating system architecture, analyzing requirements, designing solutions, identifying risks and creating plans. The Tech Lead should be able to work with the team and inspire them to reach their goals. The Tech Lead will be involved in development using Node Js and No Code / Low Code platforms. Responsibilities : - Prepare system architecture diagrams, design databases and prepare solutions for multiple products. - Coordinating with development teams, both internal & external to determine application requirements. - Assessing and prioritizing project requirements, feature requests and developing work schedules for the team. - Delegating tasks and achieving daily, weekly, and monthly goals. - Liaising with team members, management, and clients to ensure projects are completed to the highest quality standards. - Identifying risks and forming contingency plans as soon as possible. - Updating work schedules and performing troubleshooting as required. - Motivating staff and creating a space where they can ask questions and voice their concerns. - Being transparent with the team about challenges, failures, and successes. - Writing progress reports and delivering presentations to the relevant stakeholders. - Reviewing code written by team members to ensure it is scalable, follows standards and best practices. - Writing scalable code using Node JS and No Code / Low Code platforms. - Testing and debugging applications. - Reprogramming existing databases to improve functionality. - Keeping up-to-date with industry trends and developments. Requirements Must Have : - Master's or Bachelor's degree in computer science, computer engineering or related field. - 5+ years of experience as a Node JS developer using Express framework. - Experience in working on projects requiring development of web apps, mobile apps and APIs. - Working experience in multiple databases such as MySQL, Postgres, MongoDB, or DynamoDB. - Experience implementing Redis, MemCache, ElasticSearch and Sockets. - Experience in architecting solutions on AWS, Google Cloud Platform or Azure. - Experience working with load balancers, API gateways. - Excellent communication, motivational, and interpersonal skills. - Strong leadership and organizational abilities. - Team management experience and working with clients. - Excellent technical, diagnostic, and troubleshooting skills. - Adaptability to learn new technologies including No code / Low code platforms. Requirements Good To Have : - Experience with GraphQL. - Experience with DevOps.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


About VOIS: VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

About VOIS India: In 2009, VOIS started operating in India and has now established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations and more.

Roles & Responsibilities
Location: Pune
Experience: 5-8 years
Skills: ELK administration, UNIX, cloud services knowledge, and knowledge of a ticketing tool such as BMC Remedy.
The purpose of the role is to provide services in the following areas:
- 5+ years of hands-on experience supporting Elasticsearch in production, handling medium to large clusters.
- Experience with Kibana visualization strategies, controls, and techniques.
- Experience with Elasticsearch index configuration options, sharding, aliases, etc.
- Strong experience in filters, X-Pack, metrics, cluster management, and pipelines.
- Working knowledge of Remedy or a similar ticketing tool.

India VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.

As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion, Top 50 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 10th Overall Best Workplaces in India by the Great Place to Work Institute in 2024. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills!
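To make the index-configuration and alias requirements above concrete, here is a minimal sketch using the official Elasticsearch Python client (assuming the 8.x-style API). The cluster endpoint, index name, shard counts, and mappings are placeholders.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical cluster endpoint

# Create a dated index with explicit shard/replica settings (placeholder values).
es.indices.create(
    index="app-logs-2024.06",
    settings={"number_of_shards": 3, "number_of_replicas": 1},
    mappings={
        "properties": {
            "@timestamp": {"type": "date"},
            "level": {"type": "keyword"},
            "message": {"type": "text"},
        }
    },
)

# Point a stable alias at the new index so Kibana and applications never hard-code dates.
es.indices.put_alias(index="app-logs-2024.06", name="app-logs-current")

print(es.cat.indices(index="app-logs-*", format="json"))
```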

Posted 3 weeks ago

Apply

1.0 - 4.0 years

3 - 6 Lacs

Noida, New Delhi, Delhi / NCR

Work from Office


Responsibilities Develop, maintain, and optimize Python/Django applications. Build and enhance RESTful APIs using Django REST Framework (DRF) or FastAPI . Work with Node.js for microservices or backend integrations. Implement Elasticsearch for efficient search and indexing. Write and optimize complex SQL queries in PostgreSQL . Design and implement scalable architecture and ensure performance improvements. Collaborate with cross-functional teams to understand requirements and provide technical solutions. Debug, troubleshoot, and resolve performance issues in web applications. Write clean, efficient, and maintainable code with proper documentation. Requirements Bachelors/Masters degree in Computer Science, Engineering, or related field. 1-3 years of hands-on experience with Python and Django . Strong understanding of Django ORM, Migrations, and REST API development . Experience with PostgreSQL and writing complex SQL queries . Proficiency in Node.js for backend integrations. Hands-on experience with Elasticsearch for search optimization. Good understanding of object-oriented programming (OOP) and design patterns . Experience working with Git, Docker, and cloud platforms (AWS/GCP) . Excellent problem-solving and debugging skills. Strong verbal and written communication skills. Preferred (Good to Have) Knowledge of Celery, Redis, and task queues . Exposure to message brokers like RabbitMQ or Kafka .
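Since this role pairs Django/DRF with Elasticsearch for search and indexing, here is a minimal sketch of the kind of full-text query a backend view might delegate to Elasticsearch rather than running LIKE queries in PostgreSQL. It assumes the 8.x Python client; the endpoint, index, and fields are hypothetical.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical endpoint

# Full-text match on a title field, filtered to published documents only.
resp = es.search(
    index="articles",                        # hypothetical index
    query={
        "bool": {
            "must": {"match": {"title": "django search optimization"}},
            "filter": {"term": {"status": "published"}},
        }
    },
    size=10,
)

for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```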

Posted 3 weeks ago

Apply

5.0 - 7.0 years

4 - 8 Lacs

Gurugram

Work from Office


Key Responsibilities: - Design and develop scalable, resilient, and secure backend services using Java, Springboot, Spring framework and Microservices architecture. - Implement containerized applications using Kubernetes for orchestration and management. - Develop, deploy, and maintain applications on AWS Cloud, ensuring high availability and reliability. - Conduct performance analysis and optimization to improve system efficiency and responsiveness. - Collaborate closely with cross-functional teams including Product Management, UX/UI, and DevOps to deliver end-to-end solutions. - Partner with DevOps teams to operationalize the product deliveries - Technical hands-on experience with Microservices Architecture Style and the related patterns, where software is developed as small and independently deployable services that work together modeled around a business domain. - Manage diverse requirements, negotiate, and effectively articulate rationale and vision behind technical decisions that support the business. - Conduct code reviews, ensure code quality, and enforce coding standards and best practices. - Participate in architectural design discussions and propose solutions to complex technical challenges. - Troubleshoot issues, perform root cause analysis, and implement solutions in a timely manner. - Stay updated on emerging technologies and industry trends, and apply them to improve our products and processes Required Skills and Qualifications: - Proven experience (5+ years) as a Software Development Engineer with expertise in Java, Microservices, Kubernetes, AWS Cloud, and Performance Tuning. - Experience in decouple architecture development using middle ware (eg Kafka) - Exposure to relational and Non relational DBs ( eg Casandra, Elastic, Mongo DB etc) - Strong understanding of software architecture principles, design patterns, and best practices - Proficiency in building RESTful APIs and microservices architecture. - Familiarity with Agile development methodologies and CI/CD pipelines. - Hands-on experience with cloud-native development, CI/CD pipelines, and infrastructure as code (IaC). - Proficiency in troubleshooting and debugging complex issues in distributed systems. - Excellent communication, Analytical skills and ability to work effectively in a collaborative team environment. - Prior experience in [Ecommerce, Retail Domain] is a plus.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Pune

Work from Office


BU/Product Description & Job Overview/Role of the candidate: - Participates in module design with focus paid to the production of high quality, portable, maintainable and BMC standards compliant software; Provides design and requirement inputs to product architect in support of aforementioned goals - A team member who is passionate about quality and demonstrate creativity and innovation in enhancing the product, with excellent problem solving, debugging, analytical and communication skills. - Ability to quickly learn new languages and technologies as required for a successful project delivery. - Works on complex problems where analysis of situations or data requires an in-depth evaluation of various factors. - Critiques the initial problem analysis and ensures that all documentation necessary for problem resolution is available in a timely manner. - Reviews and monitors the problem status data to ensure sufficient back up for the support team. - Coaches junior support team members in handling difficult customer situations. - Well versed in using Github Copilot for increased productivity and accelerated pace of development. - 8+ years of experience - Experience with object-oriented development, experience writing commercial-grade software applications. - Strong knowledge of the following technologies: o Core & Advanced Java (Threading, Design Patterns, Data Structures) J2EE, REST web services o Good DB concept with experience in MS-SQL 2005, MySQL, Postgres, Oracle, MongoDB. o JBOSS, Tomcat Application Server o Build tools: maven and Ant. o Windows, UNIX (LINUX, ubuntu) Operating Systems - Good knowledge or familiar with will be added benefit: o Microservice development and architecture o Spring boot framework o Kubernetes deployment o PostgreSQL, Kafka , Elasticsearch , VictoriaMetrics o Cloud technologies (AWS, OCI, GCP, Azure) o GIT repository o TestTrack , JIRA tools. - Bachelors degree in computer science or related disciplines preferred - Good written and oral communication skills in English. - Experience working in an agile development environment and tools is required

Posted 3 weeks ago

Apply

2.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


We are looking for an innovative and highly skilled AI Application Engineer to join our team and develop state-of-the-art, enterprise-scale, public-facing AI applications. This role will provide you with the opportunity to shape the next generation of autonomous agent-based systems, focusing on performance, scalability, and cutting-edge technologies such as LLMs, embedding techniques, and agentic frameworks. As an AI Application Engineer, you will play a critical role in designing and implementing solutions that delight users while optimizing system performance. This position requires a combination of deep technical aptitude, creativity, and a commitment to excellence in application engineering.

Requirements:
- Design, develop, and deploy enterprise-scale, public-facing AI applications.
- Implement advanced Retrieval-Augmented Generation (RAG) architectures, including hybrid search and multi-vector retrieval.
- Build and optimize systems for token usage, response caching, and performance tuning.
- Develop and maintain autonomous agent frameworks using LangGraph or a similar framework.
- Drive innovation in areas like embedding techniques, contextual compression, and multi-agent systems.
- Collaborate with cross-functional teams, including product managers, designers, and DevOps, to ensure robust and user-centric solutions.
- Troubleshoot complex technical challenges and mentor junior developers.
- Stay updated on the latest advancements in LLMs, agentic frameworks, and AI-driven application development.

Minimum Qualifications:
- 2-3 years of hands-on experience developing production-grade LLM applications.
- 3 to 5 years of overall software development experience.
- Minimum 1 year of experience in autonomous agent development using the LangGraph framework or similar.

Must-Have Skills:
1. Production-grade LLM development: proven experience in developing and deploying large language model applications.
2. Autonomous agent development: hands-on experience with frameworks like LangGraph for building autonomous agents.
3. Advanced RAG architectures: expertise in hybrid search, multi-vector retrieval, and contextual compression.
4. Prompt engineering and vector search optimization: proficiency in crafting effective prompts and optimizing vector searches.
5. Performance tuning and token optimization: strong understanding of response caching and techniques to optimize token usage.

Nice-to-Have Skills:
- Scalable architectures: experience with Kubernetes, Docker, and cloud platforms, preferably Azure.
- NoSQL databases: familiarity with MongoDB and Elasticsearch.
- API-driven development: knowledge of building and maintaining APIs.
- CI/CD pipelines and Agile methodologies: experience with continuous integration/continuous deployment pipelines and Agile practices.
- Analytical and problem-solving skills: strong analytical abilities and attention to detail.
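To illustrate the retrieve-then-prompt core of a RAG system in a framework-free way, here is a deliberately tiny Python sketch. The embed() function is a toy stand-in (hashed bag of words) for a real embedding model, and the in-memory store replaces the vector database, hybrid search, and LangGraph-style orchestration a production system would use.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy stand-in for a real embedding model: hashed bag-of-words vector."""
    v = np.zeros(256)
    for token in text.lower().split():
        v[hash(token) % 256] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

# Tiny in-memory "vector store": (chunk, vector) pairs. Real systems use a vector DB.
DOCS = [
    "Refunds are processed within 5 business days.",
    "Premium support is available 24x7 for enterprise plans.",
    "Passwords can be reset from the account settings page.",
]
STORE = [(doc, embed(doc)) for doc in DOCS]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank chunks by cosine-style similarity to the query and keep the top k."""
    q = embed(query)
    scored = sorted(STORE, key=lambda item: float(item[1] @ q), reverse=True)
    return [doc for doc, _ in scored[:k]]

def build_prompt(query: str) -> str:
    """Assemble the augmented prompt that would be sent to the LLM."""
    context = "\n".join(f"- {chunk}" for chunk in retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How long do refunds take?"))
```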

Posted 3 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Job Description: CoinCROWD is an innovative fintech company. We offer a crypto platform for seamless payments, crypto vouchers, crypto trading, portfolio management, real-time market data, breaking news, and powerful analytics.

Domain: Finance, Blockchain, Crypto
Role: Permanent full-time employment
Job Location: Work from home / Remote
Must-have experience: Cryptocurrency/Blockchain

Job Responsibilities:
- Understand project requirements, write bug-free clean code, and ensure that the solution works per the agreed architecture, SLAs, KPIs, and business model.
- Integrate the backend with third-party APIs.
- 100% hands-on role.
- Make design decisions that contribute to maintainable systems.
- Adapt to rapidly evolving requirements and changing priorities, and drive the team accordingly.
- Drive and support e-commerce project activities and ensure marketplace implementation; deepen customer engagement, satisfaction, and user engagement.
- Reverse engineer for debugging errors in code, ensuring quality control in the process.
- Continually drive products towards a meaningful balance between user needs, business objectives, and technical feasibility.
- Prepare documentation and references for users by writing operating instructions, including changes and revisions.

Qualifications:
- Bachelor's or master's degree in Computer Science or Software Engineering from a reputed university.
- 5+ years of experience; at least 2-3 years working in the Finance/Crypto domain is a plus.
- Node, Python, MySQL, Elasticsearch, WebSockets, JavaScript, JIRA, GitLab, REST APIs, GCP or AWS.
- Finance and social analytics; strong debugging and troubleshooting skills, with proven experience troubleshooting and fixing production bugs.
- Experience in writing unit tests and test case automation.
- Ability to operate in an Agile environment with a start-up mentality and an unstructured environment; energy, drive, and passion to work and operate in a digital world.
- Excellent communication skills and ability to work with remote teams.

What We Offer: In recognition of your valuable contributions, you will receive an equity-based compensation package. Join our dynamic and innovative team in the rapidly evolving fintech industry and play a key role in shaping the future of CoinCROWD's success.

Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Develop/convert the database and its specific objects (tables, views, procedures, functions, triggers, etc.) from one database platform to another (Hadoop to GCP).
- Implement a specific data replication mechanism (CDC, file data transfer, bulk data transfer, etc.).
- Expose data as APIs.
- Participate in the modernization roadmap journey.
- Analyze discovery and analysis outcomes; lead discovery and analysis workshops/playbacks.
- Identify application dependencies and source/target database incompatibilities.
- Analyze the non-functional requirements (security, HA, RTO/RPO, storage, compute, network, performance benchmarks, etc.).
- Prepare effort estimates, WBS, staffing plan, RACI, RAID, etc.
- Lead the team to adopt the right tools for various migration and modernization methods.

Preferred Technical and Professional Experience
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
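As one concrete example of the bulk data transfer step in a Hadoop-to-GCP migration, here is a minimal sketch using the google-cloud-bigquery client to load Parquet exports from Cloud Storage into a BigQuery table. The project, dataset, table, and bucket paths are placeholders, and credentials are assumed to come from the environment.

```python
from google.cloud import bigquery

client = bigquery.Client()  # project and credentials taken from the environment

# Bulk transfer: load Parquet files exported from the Hadoop side into BigQuery.
table_id = "my-project.analytics.orders"             # placeholder table
uri = "gs://my-migration-bucket/orders/*.parquet"    # placeholder GCS path

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # block until the load job finishes

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```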

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In This Role, Your Responsibilities May Include
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing enterprise search applications such as Elasticsearch and Splunk to meet client requirements
Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Preferred Education
Master's Degree

Required Technical And Professional Expertise
3+ years of relevant experience
Strong proficiency in Tableau Desktop and Tableau Server
Strong experience with SQL queries and data manipulation
Lead the team in adopting the right tools for various migration and modernization methods

Preferred Technical And Professional Experience
You thrive on teamwork and have excellent verbal and written communication skills
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
Ability to communicate results to technical and non-technical audiences

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

About PhonePe Group: PhonePe is India’s leading digital payments company with 50 crore (500 million) registered users and 3.7 crore (37 million) merchants, covering over 99% of the postal codes across India. On the back of its leadership in digital payments, PhonePe has expanded into financial services (Insurance, Mutual Funds, Stock Broking, and Lending) as well as adjacent tech-enabled businesses such as Pincode for hyperlocal shopping and Indus App Store, India's first localized app store. The PhonePe Group is a portfolio of businesses aligned with the company's vision to offer every Indian an equal opportunity to accelerate their progress by unlocking the flow of money and access to services.

Culture
At PhonePe, we take extra care to make sure you give your best at work, every day! And creating the right environment for you is just one of the things we do. We empower people and trust them to do the right thing. Here, you own your work from start to finish, right from day one. Being enthusiastic about tech is a big part of being at PhonePe. If you like building technology that impacts millions, ideating with some of the best minds in the country, and executing on your dreams with purpose and speed, join us!

Challenges
Building for scale, rapid iterative development, and customer-centric product thinking at each step define every day for a developer at PhonePe. Though we engineer for a 50 million+ strong user base, we code with every individual user in mind. While we are quick to adopt the latest in engineering, we care deeply about security, stability, and automation. Apply if you want to experience the best combination of passionate application development and product-driven thinking.

As a Software Engineer:
You will build robust and scalable web-based applications
You will need to think in terms of platforms and reuse
Build abstractions and contracts with separation of concerns for a larger scope
Drive problem-solving for high-level business and technical problems
Do high-level design with guidance; functional modeling and break-down of modules
Make incremental changes to architecture and analyze their impact
Do performance tuning and improvements in large-scale distributed systems
Mentor young minds and foster team spirit; break down execution into phases to bring predictability to overall execution
Work closely with the Product Manager to derive a capability view from features/solutions; lead execution of medium-sized projects
Work with broader stakeholders to track the impact of projects/features and proactively iterate to improve them

As a Senior Software Engineer, you must have:
Extensive and expert programming experience in at least one general-purpose programming language (e.g. Java, C, C++) and tech stack to write maintainable, scalable, unit-tested code
Experience with multi-threading and concurrency programming
Extensive experience in object-oriented design, knowledge of design patterns, and a strong passion and ability to design intuitive module- and class-level interfaces
Excellent coding skills – should be able to convert a design into code fluently
Knowledge of Test-Driven Development
Good understanding of databases (e.g. MySQL) and NoSQL stores (e.g. HBase, Elasticsearch, Aerospike, etc.)
Strong desire to solve complex and interesting real-world problems
Experience with full life-cycle development in any programming language on a Linux platform
A go-getter attitude that reflects in the energy and intent behind assigned tasks
Experience working in a startup environment with high levels of ownership and commitment
BTech, MTech, or Ph.D. in Computer Science or a related technical discipline (or equivalent)
Experience in building highly scalable business applications that involve implementing large, complex business flows and dealing with huge amounts of data
5-7 years of experience in the art of writing code and solving problems at large scale
An open communicator who shares thoughts and opinions frequently, listens intently, and takes constructive feedback

As a Software Engineer, good to have:
The ability to drive the design and architecture of multiple subsystems
Ability to break down larger/fuzzier problems into smaller ones within the scope of the product
Understanding of the industry's coding standards and the ability to create appropriate technical documentation

PhonePe Full Time Employee Benefits (Not applicable for Intern or Contract Roles)
Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance
Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System
Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program
Mobility Benefits - Relocation Benefits, Transfer Support Policy, Travel Policy
Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment
Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy

Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog.

Life at PhonePe
PhonePe in the news

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Lalpur, Raipur, Chhattisgarh

Remote

Indeed logo

Job Description:
● Understand project requirements, write bug-free, clean code, and ensure that the solution works per the agreed architecture, SLAs, KPIs, and business model
● Integrate the backend with third-party APIs
● 100% hands-on role
● Make design decisions that contribute to maintainable systems
● Adapt to rapidly evolving requirements and changing priorities, and drive the team accordingly
● Reverse engineer to debug errors in code and ensure quality control in the process
● Continually drive products towards a meaningful balance between user needs, business objectives, and technical feasibility

Qualifications:
● Bachelor's or Master's degree in Computer Science or Software Engineering from a reputed university
● 2+ years of experience with Django, Django Rest Framework, Python 3, MySQL, Elasticsearch, WebSockets, JavaScript, JIRA, GitLab, REST APIs, GCP or AWS
● Experience writing unit tests and automating test cases
● Ability to operate in an Agile, unstructured environment with a start-up mentality; energy, drive, and passion to work and operate in a digital world
● Excellent communication skills and the ability to work with remote teams
● Deep understanding of photo-editing industry trends, technology, and customer needs

Job Type: Full-time
Pay: ₹400,000.00 - ₹600,000.00 per year
Benefits: Health insurance, Provident Fund, Work from home
Location Type: In-person
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: Lalpur, Raipur, Chhattisgarh: Reliably commute or willing to relocate with an employer-provided relocation package (Preferred)
Location: Lalpur, Raipur, Chhattisgarh (Preferred)
Work Location: In person
Speak with the employer: +91 9031609758

Posted 3 weeks ago

Apply

4.0 - 7.0 years

12 - 15 Lacs

Pune, Chennai

Work from Office

Naukri logo

We are looking for an experienced ELK Developer with a strong background in Logstash, Elasticsearch, and Kibana. The role involves building dashboards, optimizing queries, and managing ELK stack architecture, including index lifecycle management, shards, and replicas. The candidate must be familiar with Elasticsearch DSL, alerting use cases, role-based access models, and Elastic APIs. Experience with Filebeat, Metricbeat, DevOps tools (Git, Jenkins, Nexus), and basic SQL querying is required. Strong problem-solving skills, a proactive attitude, and the ability to work both independently and collaboratively are essential.
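For a sense of the day-to-day work such a role involves, here is a minimal, hypothetical sketch: the index name, field names, and the local http://localhost:9200 endpoint are assumptions for illustration, not details from the listing. It creates an index with explicit shard and replica settings and then runs a simple Elasticsearch DSL query over the REST API.

```python
import requests

ES = "http://localhost:9200"   # assumed local cluster; adjust for your environment
INDEX = "app-logs"             # hypothetical index name

# Create an index with explicit shard/replica settings; lifecycle policies and
# richer mappings would normally be managed alongside this.
requests.put(f"{ES}/{INDEX}", json={
    "settings": {"number_of_shards": 3, "number_of_replicas": 1},
    "mappings": {"properties": {
        "message": {"type": "text"},
        "level": {"type": "keyword"},
        "@timestamp": {"type": "date"},
    }},
}).raise_for_status()

# A simple query DSL search: recent error messages mentioning "timeout".
resp = requests.get(f"{ES}/{INDEX}/_search", json={
    "query": {"bool": {
        "must": [{"match": {"message": "timeout"}}],
        "filter": [{"term": {"level": "ERROR"}},
                   {"range": {"@timestamp": {"gte": "now-1h"}}}],
    }},
    "sort": [{"@timestamp": "desc"}],
    "size": 10,
})
print(resp.json()["hits"]["total"])
```

The same requests could be issued from Kibana Dev Tools or curl; the Python form is only used here so the whole round trip fits in one runnable snippet.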

Posted 3 weeks ago

Apply

0.0 - 2.0 years

0 Lacs

Chennai G.P.O, Chennai, Tamil Nadu

On-site

Indeed logo

Company : amIT Global Solutions Sdn Bhd (https://www.amitglobal.com/)
Duration : 12 Months Contract
Payroll under : amIT Global Solutions Sdn Bhd
Location : Coplace2, Cyberjaya

Responsibilities:
Design, implement, and optimize ETL pipelines using Elasticsearch to support data ingestion, transformation, and loading (a brief ingestion sketch follows this listing).
Collaborate with data architects and business stakeholders to understand data requirements and ensure data is structured and available for analysis.
Write and maintain complex SQL queries, develop scripts, and build data pipelines to process data efficiently.
Implement data transformation logic using Elasticsearch tools and frameworks to ensure data is prepared for analytics.
Develop and manage data models for real-time analytics, integrating data from various sources and ensuring consistency and accuracy.
Troubleshoot and resolve issues related to data pipelines, performance, and data quality.
Work with data science and business intelligence teams to support reporting, dashboards, and data-driven decision-making.
Monitor and optimize ETL jobs for scalability, performance, and reliability.
Ensure data integrity and compliance with data governance and security standards.

Job Type: Contractual / Temporary
Contract length: 12 months
Pay: ₹100,000.00 - ₹130,000.00 per month
Ability to commute/relocate: Chennai G.P.O, Chennai, Tamil Nadu: Reliably commute or planning to relocate before starting work (Preferred)
Application Question(s): Willing to work in Malaysia? Notice period?
Experience: Elasticsearch: 3 years (Preferred); ETL: 2 years (Required)
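To make the ETL-with-Elasticsearch part of this role concrete, here is a hedged sketch; the database file, table, index name, and cluster endpoint are all hypothetical. It extracts rows with SQL, applies a small cleansing step, and loads the results through Elasticsearch's _bulk API.

```python
import json
import sqlite3
import requests

ES = "http://localhost:9200"   # assumed cluster endpoint
INDEX = "orders"               # hypothetical target index

# Extract: any SQL source works; sqlite3 keeps the sketch self-contained.
conn = sqlite3.connect("warehouse.db")   # hypothetical database file
rows = conn.execute(
    "SELECT id, customer, amount, created_at FROM orders"
).fetchall()

# Transform + Load: build newline-delimited bulk actions and POST them.
lines = []
for order_id, customer, amount, created_at in rows:
    lines.append(json.dumps({"index": {"_index": INDEX, "_id": order_id}}))
    lines.append(json.dumps({
        "customer": customer.strip().lower(),   # simple cleansing step
        "amount": float(amount),
        "created_at": created_at,
    }))
body = "\n".join(lines) + "\n"   # the bulk payload must end with a newline

resp = requests.post(f"{ES}/_bulk", data=body,
                     headers={"Content-Type": "application/x-ndjson"})
resp.raise_for_status()
print("bulk errors:", resp.json().get("errors"))
```

In production such a job would typically run on a schedule, batch the bulk payload, and monitor the per-item results, which is where the "monitor and optimize ETL jobs" responsibility comes in.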

Posted 3 weeks ago

Apply

0.0 - 2.0 years

0 Lacs

Chennimalai, Tamil Nadu

On-site

Indeed logo

Company : amIT Global Solutions Sdn Bhd (https://www.amitglobal.com/)
Duration : 12 Months Contract
Payroll under : amIT Global Solutions Sdn Bhd
Location : Coplace2, Cyberjaya

Responsibilities:
Design, implement, and optimize ETL pipelines using Elasticsearch to support data ingestion, transformation, and loading.
Collaborate with data architects and business stakeholders to understand data requirements and ensure data is structured and available for analysis.
Write and maintain complex SQL queries, develop scripts, and build data pipelines to process data efficiently.
Implement data transformation logic using Elasticsearch tools and frameworks to ensure data is prepared for analytics.
Develop and manage data models for real-time analytics, integrating data from various sources and ensuring consistency and accuracy.
Troubleshoot and resolve issues related to data pipelines, performance, and data quality.
Work with data science and business intelligence teams to support reporting, dashboards, and data-driven decision-making.
Monitor and optimize ETL jobs for scalability, performance, and reliability.
Ensure data integrity and compliance with data governance and security standards.

Job Type: Contractual / Temporary
Contract length: 12 months
Pay: ₹100,000.00 - ₹125,000.00 per month
Ability to commute/relocate: Chennai G.P.O, Chennai, Tamil Nadu: Reliably commute or planning to relocate before starting work (Preferred)
Application Question(s): Willing to work in Malaysia? Notice period?
Experience: Elasticsearch: 3 years (Preferred); ETL: 2 years (Required)

Posted 3 weeks ago

Apply

Exploring Elasticsearch Jobs in India

Elasticsearch is a powerful search and analytics engine used by businesses worldwide to manage and analyze their data efficiently. In India, the demand for Elasticsearch professionals is on the rise, with many companies seeking skilled individuals to work on various projects involving data management, search capabilities, and more.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

These cities are known for their thriving tech industries and have a high demand for Elasticsearch professionals.

Average Salary Range

The salary range for Elasticsearch professionals in India varies based on experience and skill level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

A typical career path in Elasticsearch may involve starting as a Junior Developer, moving on to become a Senior Developer, and eventually progressing to a Tech Lead position. With experience and expertise, one can also explore roles such as Solution Architect or Data Engineer.

Related Skills

Apart from Elasticsearch, professionals in this field are often expected to have knowledge of the following skills:

  • Apache Lucene
  • Java programming
  • Data modeling
  • RESTful APIs (illustrated briefly below)
  • Database management systems
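Because Elasticsearch is driven entirely over HTTP, the RESTful APIs skill above is easy to illustrate. The following sketch indexes one document and reads it back; the index name, document fields, and the localhost endpoint are assumptions made for the example.

```python
import requests

ES = "http://localhost:9200"   # assumed local cluster
doc = {"title": "Elasticsearch engineer", "city": "Bangalore", "experience_years": 4}

# Index (create or overwrite) a document with an explicit id.
requests.put(f"{ES}/jobs/_doc/1", json=doc).raise_for_status()

# Fetch it back; the original document is returned under "_source".
resp = requests.get(f"{ES}/jobs/_doc/1")
print(resp.json()["_source"]["title"])
```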

Interview Questions

  • What is Elasticsearch and how does it differ from traditional databases? (basic)
  • Explain the purpose of an inverted index in Elasticsearch. (medium)
  • How does sharding work in Elasticsearch and why is it important? (medium)
  • What are the different types of queries supported by Elasticsearch? (basic)
  • How can you improve the performance of Elasticsearch queries? (medium)
  • What is the role of analyzers in Elasticsearch? (basic)
  • Explain the concept of mapping in Elasticsearch. (medium)
  • How does Elasticsearch handle scalability and high availability? (medium)
  • What is the significance of the "_source" field in Elasticsearch documents? (basic)
  • How does Elasticsearch handle full-text search? (medium)
  • What is the purpose of the "cluster" in Elasticsearch? (basic)
  • Explain the role of the "query DSL" in Elasticsearch. (medium) (a short worked example follows this list)
  • How can you monitor the performance of an Elasticsearch cluster? (medium)
  • What are the different types of aggregations supported by Elasticsearch? (medium)
  • How does Elasticsearch handle document versioning? (medium)
  • What are the common data types supported by Elasticsearch? (basic)
  • How can you handle security in Elasticsearch? (medium)
  • Explain the concept of "indexing" in Elasticsearch. (basic)
  • What is the significance of the "refresh" interval in Elasticsearch? (basic)
  • How can you create a backup of an Elasticsearch cluster? (medium)
  • How does Elasticsearch handle conflicts during document updates? (medium)
  • Explain the concept of "relevance" in Elasticsearch search results. (medium)
  • How can you integrate Elasticsearch with other tools or platforms? (medium)
  • What are the key considerations for performance tuning in Elasticsearch? (advanced)
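To make a couple of the questions above more concrete (notably the query DSL, aggregation, and relevance ones), here is a hedged sketch of a full-text search combined with a terms aggregation; the index and field names are invented for illustration, and a local cluster is assumed.

```python
import requests

ES = "http://localhost:9200"   # assumed local cluster

# A match query (full-text, analyzed) plus a terms aggregation that groups
# matching documents by an exact keyword field.
resp = requests.get(f"{ES}/jobs/_search", json={
    "query": {"match": {"title": "elasticsearch developer"}},
    "aggs": {"by_city": {"terms": {"field": "city.keyword", "size": 5}}},
    "size": 3,
})
body = resp.json()
for hit in body["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])   # relevance-ranked hits
for bucket in body["aggregations"]["by_city"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])       # aggregation buckets
```

Being able to explain what the match query does at index time (analysis, inverted index lookup) versus what the terms aggregation does over keyword fields covers several of the basic and medium questions in one go.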

Closing Remark

As you explore job opportunities in Elasticsearch in India, remember to continuously enhance your skills and knowledge in this field. Prepare thoroughly for interviews and showcase your expertise confidently. With the right mindset and preparation, you can excel in your Elasticsearch career and contribute significantly to the tech industry in India. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies