20630 MongoDB Jobs - Page 25

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 years

10 - 15 Lacs

Pune, Maharashtra, India

On-site

🚀 We’re Hiring – Full Stack Developer (MEAN/MERN)
📍 Location: Pune (On-site/Hybrid as per role)
🧑‍💻 Experience: 5+ Years
💻 Skills: MEAN/MERN Stack, REST APIs, JavaScript, MongoDB, Node.js, Angular/React
📍 Note: Selected candidates will be required to work from our Bangalore office for the initial 3 months (hybrid, 12 days). After this period, they may relocate to and continue working from their home location (Pune/Nagpur) based on performance and project requirements.
We’re looking for a passionate Full Stack Developer with solid experience in building scalable web applications using the MEAN or MERN stack.
📧 Interested? Send your resume to: mayuri.jain@apex1team.com
Skills: REST APIs, JavaScript, MongoDB, Angular, MERN, MERN Stack, React, ReactJS, MEAN Stack, Node.js

Posted 3 days ago

Apply

0.0 - 2.0 years

4 - 7 Lacs

Mohali, Punjab

On-site

Job Summary: We are seeking a proactive and technically sound Business Analyst – Pre-Sales with hands-on experience in MEAN/MERN stack-based projects. The ideal candidate will play a dual role: bridging the gap between clients and development teams, gathering and analyzing requirements, preparing detailed documentation, and supporting pre-sales efforts by translating client needs into actionable solutions.

Key Responsibilities:
- Collaborate with potential clients to understand business needs, technical requirements, and project scope, primarily for web and mobile applications built using MEAN/MERN stacks.
- Conduct discovery sessions and translate high-level business requirements into detailed technical documentation, including BRDs, FRDs, SRS, and SoWs.
- Work closely with the pre-sales team to create proposals, solution briefs, wireframes, and project estimates tailored to client requirements.
- Engage in technical discussions with internal teams to ensure alignment on architecture, features, timelines, and delivery approach.
- Support RFP/RFI responses by providing domain knowledge, system workflows, and detailed documentation.
- Create and present user stories, flow diagrams, and demo scripts to clients and internal stakeholders.
- Stay updated with MEAN/MERN tech trends to provide recommendations and strategic inputs during client consultations.
- Ensure traceability of requirements through the SDLC and participate in UAT planning and execution.
- Work collaboratively with UI/UX teams, developers, QA, and project managers to ensure accurate delivery.

Required Skills & Qualifications:
- 2–5 years of experience in business analysis, with a strong background in pre-sales and client communication.
- Proven experience working on projects involving MEAN (MongoDB, Express, Angular, Node.js) or MERN (MongoDB, Express, React, Node.js) stacks.
- Excellent requirement gathering and analysis skills.
- Strong experience with documentation tools and methodologies (BRD, FRD, SRS, SoW).
- Ability to convert client ideas into technical solutions and communicate effectively with both technical and non-technical stakeholders.
- Familiarity with Agile/Scrum methodologies, wireframing tools (e.g., Balsamiq, Draw.io), and project management platforms like JIRA, Trello, or Azure DevOps.
- Experience supporting or directly involved in pre-sales, proposal writing, and client presentations.
- Excellent communication, problem-solving, and analytical thinking skills.

Job Type: Full-time
Pay: ₹400,000.00 - ₹700,000.00 per year
Benefits: Paid sick time, Provident Fund
Schedule: Monday to Friday
Experience: Business analysis: 2 years (Required)
Location: Mohali, Punjab (Required)
Work Location: In person

Posted 3 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

1. Programming Languages & Frameworks (Java 18, Spring Boot)
2. API Development (RESTful APIs, GraphQL, OpenAPI/Swagger)
3. Databases & ORM (PostgreSQL, MySQL, MongoDB, Hibernate, JPA)
4. CI/CD Pipelines (Jenkins, GitLab CI/CD, GitHub Actions)
5. Containerization & Orchestration (Docker, Kubernetes)
6. Cloud Platforms (Azure)
7. Monitoring & Logging (Prometheus, Grafana, ELK Stack, Splunk)
8. Testing Frameworks (JUnit, TestNG, Mockito, WireMock)
9. Messaging & Integration (Kafka, REST, SOAP)
10. Security & Authentication (OAuth2, JWT, Spring Security)

Posted 3 days ago

Apply

9.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Description

Your Impact: The Specialist will bring hands-on technological expertise, passion, and innovation to the table, and will be responsible for designing and enabling application support, handling production farms, and managing various infrastructure platforms for different delivery teams. As a subject-matter expert, the Specialist will act as a system architect to design and build scalable, efficient infrastructure platforms, while also establishing best practices, cultivating thought leadership, and developing common practices and solutions for infrastructure.

Qualifications

Your Skills & Experience:
- 9 to 13 years of experience in DevOps with a Bachelor's in Engineering/Technology or Master's in Engineering/Computer Applications
- Expertise in DevOps & cloud tools: Cloud (AWS); version control (Git, GitLab, GitHub)
- Hands-on experience in container infrastructure (Docker, Kubernetes, hosted solutions)
- Ability to define container-based environment topology following the principles of a well-architected framework
- Able to design and implement advanced aspects using service mesh technologies like Istio, Linkerd, Kuma, etc.
- Infrastructure automation (Chef/Puppet/Ansible, Terraform, ARM, CloudFormation)
- Build tools (Ant, Maven, Make, Gradle)
- Artifact repositories (Nexus, JFrog Artifactory)
- CI/CD tools on-premises/cloud (Jenkins, TeamCity)
- Monitoring, logging, and security (CloudWatch, CloudTrail, Log Analytics, hosted tools such as ELK, EFK, Splunk, Prometheus, OWASP, SAST, and DAST)
- Scripting languages: Python, Ant, Bash, and Shell
- Hands-on experience in designing pipelines and pipelines as code
- Hands-on experience in end-to-end deployment processes and strategy
- Good exposure to tools and technologies used in building a container-based infrastructure
- Hands-on experience with GCP/AWS/Azure, with a good understanding of compute, networks, IAM, security, and integration services, plus production knowledge of implementing strategies for reliability requirements, ensuring business continuity, meeting performance objectives, security requirements and controls, deployment strategies for business requirements, cost optimization, etc.
- Responsible for managing installation, configuration, automation, performance, monitoring, capacity planning, and availability management of various servers and databases
- An expert in automation skills
- Knowledge of load balancing and CDN options provided by multiple cloud vendors (e.g., Load Balancer and Application Gateway in Azure; ELB and ALB in AWS)
- Good knowledge of network algorithms for failover and availability
- Capability to write complex code, e.g., automation of recurring/mundane tasks and OS administration (CPU, memory, network performance troubleshooting); demonstrates strong troubleshooting skills (see the sketch after this listing)
- Demonstrates HA/DR design on a cloud platform as per SLAs/RTO/RPO
- Good knowledge of migration tools available from cloud vendors and independent providers

Set Yourself Apart With:
- The capability of estimating the setup time required for infrastructure and build & release activities
- Good working knowledge of the Linux operating system
- Skill development, knowledge-base creation, and toolset optimization for the practice
- Handling content delivery networks and performing root cause analysis
- Understanding of any one DBMS like MySQL or Oracle, or a NoSQL store like Cassandra, MongoDB, etc.
- Capacity planning and infrastructure estimation
- Working understanding of scripting in any one of: Bash/Python/Perl/Ruby
- Certification in any cloud (Architect or Professional)

Additional Information:
- Gender-Neutral Policy
- 18 paid holidays throughout the year
- Generous parental leave and new-parent transition program
- Flexible work arrangements
- Employee Assistance Programs to support your wellness and well-being

Company Description: Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.
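The scripting requirement above (automating recurring checks against cloud APIs in Python) might look like the following minimal sketch. It assumes boto3 is installed and AWS credentials are configured; the region and the choice to simply print results are illustrative, not from the posting.

```python
# Minimal sketch: report EC2 instances whose status checks are failing.
# Assumptions: boto3 installed, AWS credentials configured, region illustrative.
import boto3

def report_unhealthy_instances(region="ap-south-1"):
    ec2 = boto3.client("ec2", region_name=region)
    paginator = ec2.get_paginator("describe_instance_status")
    for page in paginator.paginate(IncludeAllInstances=True):
        for status in page["InstanceStatuses"]:
            instance_ok = status["InstanceStatus"]["Status"] == "ok"
            system_ok = status["SystemStatus"]["Status"] == "ok"
            if not (instance_ok and system_ok):
                print(status["InstanceId"],
                      status["InstanceStatus"]["Status"],
                      status["SystemStatus"]["Status"])

if __name__ == "__main__":
    report_unhealthy_instances()
```

In practice a script like this would run on a schedule and push findings into the team's alerting channel rather than printing them.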

Posted 3 days ago

Apply

0.0 - 2.0 years

0 Lacs

Katargam, Surat, Gujarat

On-site

Benzatine Infotech is a mobile/web development and information technology company. We have over 10 years of programming experience and have developed a wide range of over 200 iOS and Android mobile and web applications. Our team delivers professional, innovative solutions for our clients, quickly and cost-effectively.

Note: Only candidates based in Surat, Gujarat should apply for this job.

Responsibilities and Duties:
- Develop, record, and maintain cutting-edge web-based PHP applications on portal plus premium service platforms
- Meet both technical and consumer needs
- Develop front-end website architecture
- Create servers and databases for functionality
- Work alongside graphic designers for web design features
- Design and develop APIs
- Communicate with clients, understand requirements, and execute them independently

Qualifications and Skills:
- Bachelor's/Master's degree in Computer Science, Engineering, MIS, or a similar relevant field
- Proven working experience as a Laravel developer
- Experience with a frontend framework like Vue or ReactJS
- Hands-on experience with SQL schema design, SOLID principles, and REST API design
- Proficiency with fundamental front-end technologies such as HTML5, CSS3, jQuery, JavaScript, etc.
- MySQL/MongoDB profiling and query optimization
- Creative and efficient problem solver
- Ability to deliver the entire app life cycle: concept, design, build, deploy, test, release to app stores, and support

Experience: 2+ years of experience

Benefits:
- 5-day work week
- 12 paid leaves + holidays
- Festival celebrations
- On-time salary
- Career growth opportunities
- Best place to gain knowledge
- Friendly environment
- Cooperative senior developers

Location: Surat, Gujarat (On-site)
Job Type: Full-time
Experience: Laravel: 2 years (Preferred); JavaScript: 2 years (Preferred)
Work Location: In person

Posted 3 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Role Overview: We are seeking a highly motivated Senior Developer with 7+ years of experience in full stack development to join our team and help build intelligent, scalable web applications. This role blends software engineering with cutting-edge AI/ML integration, offering an exciting opportunity to work on impactful projects in a fast-paced environment.

Key Responsibilities:
- Design, develop, and maintain robust web applications using Python, Django, and Flask.
- Integrate AI/ML models into production environments, ensuring performance and scalability (see the sketch after this listing).
- Write high-quality, reusable code and implement best practices for development and deployment.
- Collaborate with data scientists and engineers to translate business requirements into technical solutions.
- Debug, test, and optimize applications for improved performance and reliability.
- Contribute to system architecture discussions and propose enhancements.
- Mentor junior developers (if applicable) and actively participate in code reviews.
- Stay abreast of advancements in AI, web frameworks, and related technologies.

Required Qualifications:
- 7+ years of professional experience in software development (Python focus preferred).
- Proven expertise in building applications with the Django and Flask frameworks.
- Hands-on experience with AI/ML libraries (e.g., TensorFlow, PyTorch, scikit-learn) and model deployment.
- Strong understanding of RESTful APIs, databases (e.g., PostgreSQL, MongoDB), and ORM tools.
- Proficiency with version control systems like Git.
- Ability to work independently and manage tasks with minimal supervision.
- Excellent problem-solving skills and attention to detail.
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).

Preferred Qualifications:
- Experience with containerization tools like Docker or Kubernetes.
- Familiarity with cloud platforms (e.g., AWS, GCP, Azure) for deploying applications.
- Knowledge of front-end technologies (e.g., HTML, CSS, JavaScript) is a plus.
- Contributions to open-source projects or a strong GitHub portfolio.
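As a concrete illustration of the posting's core task (serving an ML model from a Python web app), here is a minimal sketch assuming Flask and scikit-learn are installed; the toy iris model and the /predict route are invented for the example, not this employer's API.

```python
# Minimal sketch: a Flask endpoint serving a scikit-learn model.
# Assumptions: flask and scikit-learn installed; model and route illustrative.
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

app = Flask(__name__)

# Toy model trained at startup; production code would load a persisted model.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]  # e.g. [5.1, 3.5, 1.4, 0.2]
    prediction = model.predict([features])[0]
    return jsonify({"class": int(prediction)})

if __name__ == "__main__":
    app.run(port=5000)
```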

Posted 3 days ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

This role has been designed as ‘Hybrid’ with an expectation that you will work on average 2 days per week from an HPE office.

Who We Are: Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world. Our culture thrives on finding new and better ways to accelerate what’s next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career our culture will embrace you. Open up opportunities with HPE.

Job Description: A highly skilled Database Engineer with 6 years of experience in designing, implementing, and maintaining complex database systems. Proven expertise in performance tuning, data modeling, and ensuring data integrity. Adept at collaborating with cross-functional teams to optimize database functionality and support business objectives.

What You'll Do:
- Design, develop, and maintain database architectures, ensuring optimal performance and security.
- Implement data models and database structures that meet the needs of applications and reporting.
- Perform database performance tuning, indexing, and query optimization.
- Manage database backup and recovery processes to prevent data loss (see the sketch after this listing).
- Ensure compliance with data governance policies and security standards.
- Collaborate with software developers, system architects, and data analysts to support application development and data needs.
- Monitor database performance and troubleshoot issues to maintain high availability and reliability.
- Conduct regular database audits and implement improvements based on findings.
- Stay updated with the latest database technologies and best practices.

Skills and Technologies:
- Proficient in SQL and PL/SQL, with experience in relational database management systems (RDBMS) such as Oracle, MySQL, PostgreSQL, or SQL Server.
- Knowledge of NoSQL databases (e.g., MongoDB, Cassandra) and data warehousing solutions.
- Experience with database migration and upgrade processes.
- Familiarity with cloud database services (e.g., AWS RDS, Azure SQL Database).
- Strong understanding of data security practices and regulatory compliance.
- Scripting skills in languages like Python, Perl, Shell, or PowerShell for automation.
- Excellent analytical and problem-solving skills.

What You Need to Bring:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Relevant certifications (e.g., Microsoft Certified: Azure Database Administrator, Oracle Certified Professional) are a plus.

Experience:
- 4-6 years of experience in database engineering or a related field.
- Proven track record of successfully managing large-scale database projects.

Soft Skills:
- Strong communication skills for collaboration with technical and non-technical stakeholders.
- Detail-oriented with a focus on quality and accuracy.
- Ability to work independently and manage multiple priorities effectively.

Additional Skills: Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Security-First Mindset, Solutions Design, Testing & Automation, User Experience (UX)

What We Can Offer You:
Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.
Personal & Professional Development: We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have, whether you want to become a knowledge expert in your field or apply your skills to another division.
Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE.

Job: Engineering
Job Level: TCP_02

HPE is an Equal Employment Opportunity/Veterans/Disabled/LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Hewlett Packard Enterprise is EEO Protected Veteran/Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
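One way the backup-management and Python-scripting requirements combine in practice is a scheduled dump script. A minimal sketch follows, assuming the mongodump CLI is on PATH; the URI, backup path, and seven-backup retention rule are placeholder choices, not from the posting.

```python
# Minimal sketch: nightly MongoDB logical backup with simple retention.
# Assumptions: mongodump on PATH; URI/paths/retention are illustrative.
import datetime
import pathlib
import shutil
import subprocess

BACKUP_ROOT = pathlib.Path("/var/backups/mongodb")

def nightly_backup(uri="mongodb://localhost:27017"):
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    target = BACKUP_ROOT / stamp
    target.mkdir(parents=True, exist_ok=True)
    # --gzip compresses each dumped collection; --out sets the directory.
    subprocess.run(
        ["mongodump", "--uri", uri, "--gzip", "--out", str(target)],
        check=True,
    )
    # Keep only the seven most recent backups (names sort chronologically).
    for old in sorted(BACKUP_ROOT.iterdir())[:-7]:
        shutil.rmtree(old)

if __name__ == "__main__":
    nightly_backup()
```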

Posted 3 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Role: Data Engineer
Experience: 7+ Years
Mode: Hybrid

Key Responsibilities:
• Design and implement enterprise-grade Data Lake solutions using AWS (e.g., S3, Glue, Lake Formation).
• Define data architecture patterns, best practices, and frameworks for handling large-scale data ingestion, storage, compute, and processing.
• Optimize cloud infrastructure for performance, scalability, and cost-effectiveness.
• Develop and maintain ETL pipelines using tools such as AWS Glue or similar platforms, and manage CI/CD pipelines in DevOps (see the sketch after this listing).
• Create and manage robust data warehousing solutions using technologies such as Redshift.
• Ensure high data quality and integrity across all pipelines.
• Design and deploy dashboards and visualizations using tools like Tableau, Power BI, or Qlik.
• Collaborate with business stakeholders to define key metrics and deliver actionable insights.
• Implement best practices for data encryption, secure data transfer, and role-based access control.
• Lead audits and compliance certifications to maintain organizational standards.
• Work closely with cross-functional teams, including data scientists, analysts, and DevOps engineers.
• Mentor junior team members and provide technical guidance for complex projects.
• Partner with stakeholders to define and align data strategies that meet business objectives.

Qualifications & Skills:
• Strong experience in building Data Lakes using the AWS cloud platform tech stack.
• Proficiency with AWS technologies such as S3, EC2, Glue/Lake Formation (or EMR), QuickSight, Redshift, Athena, Airflow (or Lambda + Step Functions + EventBridge), and IAM.
• Expertise in AWS tools covering data lake storage, compute, security, and data governance.
• Advanced skills in ETL processes, SQL (e.g., Cloud SQL, Aurora, Postgres), NoSQL databases (e.g., DynamoDB, MongoDB, Cassandra), and programming languages (e.g., Python, Spark, or Scala); real-time streaming applications, preferably in Spark, Kafka, or other streaming platforms.
• AWS data security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager.
• Hands-on experience with data warehousing solutions and modern architectures like Lakehouses or Delta Lake; proficiency in visualization tools such as Tableau, Power BI, or Qlik.
• Strong problem-solving skills and the ability to debug and optimize applications for performance.
• Strong understanding of databases/SQL for database operations and data management.
• Familiarity with CI/CD pipelines and version control systems like Git.
• Strong understanding of Agile methodologies and working within Scrum teams.

Preferred Qualifications:
• Bachelor of Engineering degree in Computer Science, Information Technology, or a related field.
• AWS Certified Solutions Architect – Associate (required).
• Experience with Agile/Scrum methodologies and design sprints.
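A minimal PySpark sketch of the ETL pattern this role describes (raw zone in, partitioned Parquet out) follows. It assumes a Spark or Glue runtime with S3 access; the bucket names and columns are illustrative, not from the posting.

```python
# Minimal sketch: extract raw CSV, clean/cast, write partitioned Parquet.
# Assumptions: Spark/Glue runtime with S3 access; names illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV landed in the data lake's raw zone.
raw = spark.read.option("header", True).csv("s3://example-raw/orders/")

# Transform: type casting, deduplication, and a derived partition column.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: columnar, partitioned output for Athena/Redshift Spectrum queries.
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated/orders/"))
```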

Posted 3 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary: We are looking for an experienced MongoDB Developer with 5+ years of hands-on expertise in designing, developing, and optimizing MongoDB-based solutions. The ideal candidate will be proficient in NoSQL databases, performance tuning, and integration with backend technologies.

Key Responsibilities:
- Design and implement MongoDB database schemas and structures optimized for scalability and performance
- Strong proficiency in MongoDB and NoSQL database management best practices
- Develop and optimize queries, indexes, and aggregation pipelines for complex data scenarios (see the sketch after this listing)
- Integrate MongoDB with applications built in Node.js, Python, or Java
- Collaborate with backend developers to ensure seamless database interaction and data integrity
- Monitor performance and apply database tuning techniques using relevant tools
- Optimize read/write operations, especially on high-volume, large-scale datasets
- Implement data security, backup, and replication strategies (including MongoDB Atlas if applicable)
- Work with DevOps and CI/CD pipelines to automate database deployment and migration processes
- Participate in code reviews, data model validation, and documentation efforts
- Strong problem-solving, debugging, and analytical thinking skills
- Excellent communication and collaboration skills in a team environment
- Familiarity with Agile methodologies and version control systems (e.g., Git)

Good to Have:
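The heart of this role is indexes and aggregation pipelines. Here is a minimal pymongo sketch of both, assuming pymongo and a local mongod; the orders collection and its fields are invented for the example.

```python
# Minimal sketch: compound index plus an aggregation pipeline in pymongo.
# Assumptions: pymongo installed, local mongod running; names illustrative.
from pymongo import ASCENDING, DESCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["shop"]["orders"]

# Compound index to support the $match/$sort stages below.
orders.create_index([("status", ASCENDING), ("created_at", DESCENDING)])

# Aggregation pipeline: top customers by revenue among completed orders.
pipeline = [
    {"$match": {"status": "completed"}},
    {"$group": {"_id": "$customer_id",
                "revenue": {"$sum": "$amount"},
                "orders": {"$sum": 1}}},
    {"$sort": {"revenue": -1}},
    {"$limit": 10},
]
for row in orders.aggregate(pipeline):
    print(row["_id"], row["revenue"], row["orders"])
```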

Posted 3 days ago

Apply

0.0 - 3.0 years

16 - 18 Lacs

Bengaluru, Karnataka

On-site

Location: Bangalore, Electronics City

As an experienced Full Stack Developer you will have opportunities to work at all levels of our technology stack, from customer-facing dashboards and back-end business logic to high-volume data collection and processing. As a Full Stack Developer you should be comfortable with a range of different technologies and languages, and with the integration of third-party libraries and development frameworks.

Responsibilities:
- Work with project stakeholders to understand requirements and ideate software solutions
- Design client-side and server-side architectures
- Build front-end applications delivering on usability and performance
- Build back-end services for scalability and reliability
- Write effective APIs and build to third-party APIs
- Adhere to security and data protection standards and requirements
- Instrument and test software to ensure the highest quality
- Monitor, troubleshoot, debug and upgrade production systems
- Write technical documentation

REQUIREMENTS
- Proven experience as a Full Stack Developer or similar role
- Comfortable with Golang, Scala, Python, and Kafka, or the desire to learn these technologies
- Experience in front-end web development helping to create customer-facing user interfaces; experience with ReactJS a plus
- Familiarity with databases and data warehousing such as PostgreSQL, MongoDB, Snowflake
- Familiarity with the Amazon Web Services cloud platform
- Attention to detail, strong organizational skills, and a desire to be part of a team
- Degree in Computer Science, Engineering, or relevant field

Job Types: Full-time, Permanent
Pay: ₹1,600,000.00 - ₹1,800,000.00 per year
Benefits: Health insurance, Paid sick time, Provident Fund
Ability to commute/relocate: Bangalore, Karnataka: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): Are you OK working from Electronic City, Bangalore? Python backend & React JS are a must.
Experience: Full-stack development: 3 years (Required)
Location: Bangalore, Karnataka (Required)
Willingness to travel: 100% (Required)
Work Location: In person

Posted 3 days ago

Apply

4.0 years

0 Lacs

New Delhi, Delhi, India

On-site

About Us: We are building the next generation of intelligent developer tools designed to supercharge software development using AI. If you’re passionate about clean code, fast iteration, and the future of coding itself, join us in shaping how modern developers work.

Role Overview: We are seeking a talented and enthusiastic Full Stack Developer with 2–4 years of hands-on experience in building scalable applications using modern frontend and backend technologies. Prior exposure to AI-powered development environments, such as GitHub Copilot, Cursor, Tabnine, CodeWhisperer, or similar tools, is highly desirable.

Responsibilities:
- Build, enhance, and maintain web applications across the full stack (frontend to backend).
- Collaborate with designers, product managers, and AI engineers to build intuitive, performant user experiences.
- Contribute to architectural discussions and technical decisions.
- Write clean, modular, and testable code.
- Stay current with new technologies and advocate for continuous improvement.

Key Requirements:
- 2–4 years of experience as a Full Stack Developer.
- Strong proficiency in JavaScript/TypeScript, React (or similar), and Node.js/Python/PHP.
- Experience with databases like PostgreSQL, MySQL, MongoDB, Firebase.
- Exposure to AI-enhanced IDEs or developer tools (e.g., Cursor, GitHub Copilot, Tabnine, Windsurf, etc.).
- Understanding of RESTful APIs and modern CI/CD practices.
- Experience with cloud platforms (AWS or Azure).
- Familiarity with containers (Docker).
- Problem-solving skills and strong team player.

Nice to Have:
- Exposure to LLM-driven applications or retrieval-augmented generation (RAG).
- Familiarity with test-driven development frameworks.
- Familiarity with Git workflows and modern DevOps tooling.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Location: Kolkata, India (In-office)
Type: Full-time
Experience: 5+ years

About Client: Our client is building a next-generation platform to power preschools of the future, combining powerful tools for curriculum planning, classroom journaling, communication, and AI-driven insights, all designed to elevate the preschool experience for educators, parents, and school leaders. Built for web, iOS, and Android, it blends world-class engineering with deep domain expertise in early childhood education. We're already trusted by forward-thinking preschools across India and globally, and are just getting started.

About The Role: We're looking for a seasoned Python backend engineer to join our core engineering team at Agastya, a next-gen EdTech SaaS for preschools. You’ll work closely with the founders and product teams to build scalable, fast, and secure backend services, and power the AI-enabled features that are core to our platform.

Key Responsibilities:
- Architect and maintain microservices using Flask and FastAPI (see the sketch after this listing)
- Design performant, secure, and well-documented APIs
- Optimize systems for scalability and low latency
- Work with GCP services (Cloud Run, Cloud Storage, Firestore, IAM, etc.)
- Maintain existing CI/CD pipelines and observability tools (Sentry, logging, etc.)
- Collaborate with mobile/web engineers for full-stack delivery
- Support deployment and scaling of AI/ML features and services

Requirements:
- 5+ years of backend development experience in Python
- Familiar with AI tools and IDEs like Cursor, Claude Code, etc.
- Deep expertise in the Flask and FastAPI frameworks
- Hands-on experience with Google Cloud Platform (GCP)
- Strong understanding of performance optimization and async programming
- Experience with NoSQL databases like MongoDB or Firestore
- Comfortable with Docker and GitHub Actions
- Startup mindset: proactive, execution-oriented, and quality-driven

Bonus / Preferred:
- Familiarity with foundation model APIs and tools like OpenAI, CoreWeave, Pinecone, LangChain, etc.
- Exposure to building or integrating AI-driven features in production apps

Skills: FastAPI, Flask, Firestore, Python, MongoDB, Sentry, GCP, IAM, async programming, GitHub, CI/CD, cloud, GitHub Actions, NoSQL, Cloud Run, AI tools, Cursor, Docker, Claude Code, Cloud Storage
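A minimal sketch of the kind of async FastAPI-plus-MongoDB endpoint the role describes, assuming fastapi, uvicorn, and motor are installed; the journal collection and route are invented for the example, not the client's actual schema.

```python
# Minimal sketch: async FastAPI endpoint backed by MongoDB via Motor.
# Assumptions: fastapi, uvicorn, motor installed; schema illustrative.
from fastapi import FastAPI, HTTPException
from motor.motor_asyncio import AsyncIOMotorClient

app = FastAPI()
client = AsyncIOMotorClient("mongodb://localhost:27017")
journals = client["preschool"]["journal_entries"]

@app.get("/classes/{class_id}/journal")
async def list_entries(class_id: str, limit: int = 20):
    cursor = journals.find({"class_id": class_id}).sort("created_at", -1)
    entries = await cursor.to_list(length=limit)
    if not entries:
        raise HTTPException(status_code=404, detail="no entries for class")
    for entry in entries:
        entry["_id"] = str(entry["_id"])  # ObjectId is not JSON-serializable
    return entries

# Run with: uvicorn main:app --reload
```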

Posted 3 days ago

Apply

4.0 years

0 Lacs

India

Remote

Job Title: Golang Developer
Location: Remote
Job Type: Contract
Duration: 1.5 Months

Job Description: We are seeking a skilled Go (Golang) Developer with strong backend development experience to join our team on a short-term (1.5-month) remote contract. The ideal candidate will be responsible for building and optimizing backend services, managing MongoDB databases, and ensuring performance at scale.

Key Responsibilities:
- Design, develop, and maintain robust and scalable backend services using Go (Golang)
- Work with MongoDB to build efficient data models and handle data operations
- Optimize queries and database interactions for performance and scalability (see the sketch after this listing)
- Troubleshoot, debug, and resolve performance bottlenecks in backend systems
- Collaborate with front-end developers and other team members to integrate APIs and services
- Ensure code quality through best practices, testing, and documentation

Required Skills:
- 4+ years of hands-on experience with Go (Golang)
- Strong working knowledge of MongoDB, including data modeling and query optimization
- Experience in creating and consuming RESTful APIs and microservices
- Deep understanding of backend architecture and system performance tuning
- Ability to write clean, maintainable, and well-documented code
- Familiarity with version control systems like Git and Bitbucket
- Self-motivated and able to work independently in a remote environment

Nice to Have:
- Exposure to cloud platforms (Azure, AWS, or GCP)
- Experience with containerization tools like Docker
- Familiarity with CI/CD pipelines
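The query-optimization work described above usually starts with the server's query plan. The explain command is executed by the MongoDB server and is driver-agnostic; the sketch below issues it from Python with pymongo purely for brevity (the collection and filter are illustrative), and the same check can be made from the Go driver.

```python
# Minimal sketch: check whether a query uses an index via explain.
# Assumptions: pymongo installed, local mongod; names illustrative.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["app"]

plan = db.command(
    "explain",
    {"find": "users", "filter": {"email": "a@example.com"}},
    verbosity="queryPlanner",
)
stage = plan["queryPlanner"]["winningPlan"]["stage"]
print("winning plan stage:", stage)  # COLLSCAN suggests adding an index on email
```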

Posted 3 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

About the Role (Immediate Joiners Only): We are looking for a skilled Java Backend Developer to join our FINTECH engineering team working in product-driven environments.

Key Responsibilities:
- Design, develop, and deploy backend services using Java, Spring Boot, and a microservices architecture
- Build and maintain RESTful APIs with clean, scalable code
- Work with relational and NoSQL databases (MySQL, MongoDB/Cassandra), including schema design and performance tuning
- Implement secure, reliable, and scalable server-side logic with a strong focus on authentication, authorization, and application security
- Integrate with DevOps pipelines using Jenkins, Docker, Kubernetes, Maven, and Git
- Collaborate with frontend developers, QA, and product managers in an Agile environment
- Optimize application performance with caching, queuing, and load balancing

Must-Have Skills:
- Strong Java/J2EE programming skills with a solid understanding of Spring, Spring Boot, Spring Cloud, and Hibernate
- Hands-on experience in designing and consuming RESTful APIs
- Good understanding of object-oriented programming and server-side architecture
- Experience with both MySQL and NoSQL databases, with a focus on data modeling and tuning
- Exposure to Docker, Kubernetes, and message queues (e.g., RabbitMQ, Kafka)
- Experience with CI/CD tools like Jenkins, build tools like Maven, and testing frameworks like JUnit
- Familiarity with caching mechanisms (e.g., Redis, Memcached)

Preferred Background:
- Experience working in service- or product-based companies, especially FINTECH
- Proven track record of delivering scalable backend solutions
- Strong communication skills, a positive attitude, and a team-oriented mindset

Posted 3 days ago

Apply

0.0 - 5.0 years

1 - 9 Lacs

Mohali, Punjab

On-site

Apptunix is a leading mobile app & web solutions development agency, based out of Texas, US. The agency empowers cutting-edge startups & enterprise businesses, paving the path for their incremental growth via technology solutions. Established in mid-2013, Apptunix has since engaged in elevating clients' interests & satisfaction through rendering improved and innovative software and mobile development solutions. The company strongly comprehends business needs and implements them by merging advanced technologies with its seamless creativity. Apptunix currently employs 250+ in-house experts who work closely & dedicatedly with clients to build solutions as per their customers' needs.

Required Skills:
- Deep experience working on Node.js
- Understanding of SQL and NoSQL database systems with their pros and cons
- Experience working with databases like MongoDB
- Solid understanding of MVC and stateless APIs, and building RESTful APIs
- Experience and knowledge of scaling and security considerations
- Integration of user-facing elements developed by front-end developers with server-side logic
- Good experience with ExpressJS, MongoDB, AWS S3 and ES6
- Writing reusable, testable, and efficient code
- Design and implementation of low-latency, high-availability, performant applications
- Implementation of security and data protection
- Integration of data storage solutions and database structure
- Good experience in Next.js, microservices, RabbitMQ, sockets

Experience: 5-8 years
Job Type: Full-time
Pay: ₹186,545.16 - ₹992,440.36 per year
Experience: Node.js: 5 years (Required)
Location: Mohali, Punjab (Required)
Work Location: In person

Posted 3 days ago

Apply

4.0 - 7.0 years

12 - 18 Lacs

Pune, Maharashtra

On-site

Job Title: Senior MERN Stack Developer
Location: Pune (Onsite)
Experience Required: 7+ Years
Availability: Immediate Joiners Only (notice-period candidates will not be considered)

About the Role: We are seeking an experienced and highly skilled Senior MERN Stack Developer to join our dynamic technology team. The ideal candidate must possess deep technical expertise in React.js and Node.js, with additional knowledge of Python being a strong advantage. You will be responsible for leading and driving the development of scalable web applications, ensuring high performance, and collaborating closely with cross-functional teams.

Key Responsibilities:
- Design, develop, and maintain modern web applications using the MERN stack (MongoDB, Express.js, React.js, Node.js).
- Write reusable, testable, and efficient code following best practices and design patterns.
- Collaborate with product managers, UI/UX designers, and backend developers to deliver robust and user-friendly features.
- Optimize applications for maximum speed and scalability.
- Perform code reviews and mentor junior developers in the team.
- Work with RESTful APIs and third-party integrations.
- Utilize knowledge of Python in data processing, API development, or backend enhancements as needed.
- Troubleshoot and debug production issues quickly and effectively.
- Maintain documentation related to architecture, processes, and systems.
- Stay up to date with the latest trends and technologies in full-stack development.

Required Skills & Qualifications:
- Minimum 7 years of experience in full-stack development, primarily in the MERN stack.
- Strong proficiency in React.js and Node.js.
- Experience with MongoDB or other NoSQL databases.
- Hands-on experience with Express.js and creating RESTful APIs.
- Solid understanding of front-end technologies such as HTML5, CSS3, JavaScript (ES6+), and TypeScript.
- Exposure to Python or interest in integrating Python-based components into MERN applications.
- Knowledge of version control systems (Git).
- Ability to write clean, scalable, and maintainable code.
- Strong problem-solving, analytical, and debugging skills.
- Excellent communication and teamwork skills.

Nice-to-Have (Preferred):
- Experience with DevOps tools, CI/CD pipelines, or Docker.
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Basic understanding of microservices architecture.

Why Join Us?
- Opportunity to work on cutting-edge technology and large-scale applications.
- Collaborative work environment with a focus on continuous learning.
- Competitive compensation and growth opportunities.
- Exposure to diverse projects and clients across industries.

Important Note: Only candidates who are currently based in Pune and can join immediately will be considered. No relocation, notice period, or remote work options are available for this role. If you meet the above criteria and are ready to make an impact, apply now or refer someone who fits the bill! Apply here: dimple.patel@neosofttech.com

Job Types: Full-time, Permanent
Pay: ₹1,200,000.00 - ₹1,800,000.00 per year
Benefits: Health insurance, Paid sick time, Provident Fund
Ability to commute/relocate: Pune, Maharashtra: Reliably commute or planning to relocate before starting work (Required)
Education: Bachelor's (Required)
Experience: React: 7 years (Required); Node.js: 7 years (Required); Python: 4 years (Required)
Willingness to travel: 100% (Required)
Work Location: In person

Posted 3 days ago

Apply

0 years

0 Lacs

India

On-site

Requirements:
- In-depth knowledge of Python
- Understanding of Django/Flask and Pandas
- Familiarity with the AWS environment (EC2, S3, IAM, Athena)
- Working knowledge of NoSQL databases such as MongoDB
- Proficiency in consuming and developing REST APIs with JSON data
- Ability to perform data mining and data exploration, with an intuitive sense for problem solving and a strong desire for craftsmanship (see the sketch after this listing)

Specific Job Knowledge, Skills & Abilities:
- Real-world experience with large-scale data on AWS or a similar platform
- Must be a self-starter and an effective data wrangler
- Intellectual curiosity and a strong desire to learn new Big Data and Machine Learning technologies
- Deadline driven, and capable of delivering projects on time in a fast-paced, high-growth environment
- Willingness to work with unstructured and messy data
- Bachelor’s or Master’s degree in a relevant quantitative field (e.g. Computer Science, Statistics, Electrical Engineering, Applied Mathematics, etc.)
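A minimal sketch of the pandas-plus-MongoDB data exploration this role calls for, assuming pandas and pymongo are installed; the events collection and its fields are invented for the example.

```python
# Minimal sketch: pull MongoDB documents into pandas for quick profiling.
# Assumptions: pandas and pymongo installed; collection/fields illustrative.
import pandas as pd
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
events = client["analytics"]["events"]

df = pd.DataFrame(events.find({}, {"_id": 0, "user_id": 1,
                                   "event_type": 1, "ts": 1}))
df["ts"] = pd.to_datetime(df["ts"])

# Quick exploration: volume per event type, daily active users, missingness.
print(df["event_type"].value_counts())
print(df.set_index("ts").resample("D")["user_id"].nunique().tail())
print(df.isna().mean())
```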

Posted 3 days ago

Apply

11.0 years

0 Lacs

India

Remote

Job Title: Python Developer
Experience Required: 11+ Years
Location: Remote
Employment Type: Full-Time

Key Skills:
- Strong experience in Python and MongoDB, with hands-on experience developing code in both.
- Experience in the Identity Management, API Management, Security, Tokenization and Microservices domains.
- Experience with the following languages and frameworks: Django, Shell Scripting, Python, Maven, ReactJS, Redux, Hooks, Storybook, jQuery, TypeScript, HTML5, CSS3
- Must have experience with: Docker, AWS (EC2, S3, RDB, LB), microservices, Elasticsearch, Python, AWS Lambda functions, serverless architecture (see the sketch after this listing)
- CI/CD tools: Jenkins, Ansible
- SCM tools: SVN, Git
- Data formats: JSON, REST, SOAP, WSDL, XML, XPath, XSLT
- Cloud technologies: AWS, DevOps

Responsibilities:
- Handle multiple priorities with ease and adapt to any kind of environment.
- Work under Agile methodology; be a quick learner with a willingness to adapt to new challenges and new technologies.
- Ability to design databases. Experience in PostgreSQL and MongoDB is a must.
- Experience with REST APIs.
- Experience with both relational and non-relational databases.
- Experience in Linux is preferred.
- Must have experience in Lambda and serverless architecture.
- Must have good communication skills.
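Since Lambda and serverless architecture are must-haves here, a minimal Python handler sketch follows. It assumes deployment behind API Gateway's proxy integration (which supplies the event shape shown); the /hello greeting is purely illustrative.

```python
# Minimal sketch: AWS Lambda handler for GET /hello?name=... behind
# API Gateway's proxy integration. The route and payload are illustrative.
import json

def lambda_handler(event, context):
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```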

Posted 3 days ago

Apply

2.0 years

0 - 0 Lacs

Kankarbagh, Patna, Bihar

On-site

Job Description: Full Stack Developer (MERN Stack + Next.js)
Location: Patna, Bihar
Company: CodeQuery
Experience: Minimum 2 Years
Salary: ₹25,000 – ₹30,000 per month
Job Type: Full-Time | On-site

About the Role: CodeQuery is looking for a passionate and skilled Full Stack Developer with hands-on experience in the MERN stack and Next.js. If you're a problem solver with a strong development mindset and love building scalable web applications, we'd love to hear from you!

Key Responsibilities:
- Design, develop, and maintain scalable and secure web applications using MongoDB, Express.js, React.js, Node.js, and Next.js, with knowledge of TypeScript.
- Collaborate with UI/UX designers and product managers to translate designs and wireframes into high-quality code.
- Write clean, reusable, and efficient code while following best practices.
- Optimize applications for speed, scalability, and responsiveness.
- Integrate RESTful APIs and third-party services.
- Conduct unit and integration testing to ensure the quality and functionality of the application.
- Participate in code reviews and team meetings to improve overall software design and architecture.
- Troubleshoot and debug applications for performance issues and bugs.
- Stay up-to-date with new technologies and trends in full-stack development.

Requirements:
- Minimum 2 years of hands-on experience as a Full Stack Developer.
- Strong proficiency in the MERN stack (MongoDB, Express.js, React.js, Node.js).
- Solid experience with Next.js for server-side rendering and static site generation.
- Good understanding of front-end technologies, including HTML5, CSS3, JavaScript, Tailwind, and responsive design principles.
- Familiarity with Git and version control workflows.
- Strong problem-solving skills and ability to work in a team environment.
- Excellent communication and time management skills.

Perks & Benefits:
- Opportunity to work on real-time projects with impact.
- Supportive team and learning environment.
- Performance-based growth opportunities.

Job Types: Full-time, Permanent
Pay: ₹25,000.00 - ₹30,000.00 per month
Work Location: In person

Posted 3 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About Us: Paytm is India's leading mobile payments and financial services distribution company. Pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm’s mission is to serve half a billion Indians and bring them to the mainstream economy with the help of technology.

Job Summary:
- Build systems for collection & transformation of complex data sets for use in production systems
- Collaborate with engineers on building & maintaining back-end services
- Implement data schema and data management improvements for scale and performance
- Provide insights into key performance indicators for the product and customer usage
- Serve as the team's authority on data infrastructure, privacy controls and data security
- Collaborate with appropriate stakeholders to understand user requirements
- Support efforts for continuous improvement, metrics and test automation
- Maintain operations of the live service as issues arise, on a rotational on-call basis
- Verify whether data architecture meets security and compliance requirements and expectations
- Be a fast learner, able to adapt quickly at a rapid pace

Minimum Qualifications:
- Bachelor's degree in computer science, computer engineering or a related field, or equivalent experience
- 3+ years of progressive experience demonstrating strong architecture, programming and engineering skills
- Firm grasp of data structures and algorithms, with fluency in programming languages like Java, Python, Scala
- Strong SQL skills; able to write complex queries
- Strong experience with orchestration tools like Airflow
- Demonstrated ability to lead, partner, and collaborate cross-functionally across many engineering organizations
- Experience with streaming technologies such as Apache Spark, Kafka, Flink (see the sketch after this listing)
- Backend experience including Apache Cassandra, MongoDB and relational databases such as Oracle, PostgreSQL
- Solid hands-on AWS/GCP experience, 4+ years
- Strong communication and soft skills
- Knowledge and/or experience with containerized environments, Kubernetes, Docker
- Experience in implementing and maintaining highly scalable microservices in REST, Spring Boot, gRPC
- Appetite for trying new things and building rapid POCs

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines to support data ingestion, processing, and storage
- Implement data integration solutions to consolidate data from multiple sources into a centralized data warehouse or data lake
- Collaborate with data scientists and analysts to understand data requirements and translate them into technical specifications
- Ensure data quality and integrity by implementing robust data validation and cleansing processes
- Optimize data pipelines for performance, scalability, and reliability
- Develop and maintain ETL (Extract, Transform, Load) processes using tools such as Apache Spark, Apache NiFi, or similar technologies
- Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal downtime
- Implement best practices for data management, security, and compliance
- Document data engineering processes, workflows, and technical specifications
- Stay up-to-date with industry trends and emerging technologies in data engineering and big data

Compensation: If you are the right fit, we believe in creating wealth for you. With 500 mn+ registered users, 25 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants, and we are committed to it. India’s largest digital lending story is brewing here. It’s your opportunity to be a part of the story!
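A minimal sketch of the streaming side of this role (Spark Structured Streaming reading from Kafka) follows, assuming the spark-sql-kafka package is on the classpath; the topic, broker address, and JSON field are illustrative, not Paytm's actual systems.

```python
# Minimal sketch: Spark Structured Streaming consuming a Kafka topic.
# Assumptions: spark-sql-kafka package available; names illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("payments-stream").getOrCreate()

stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "payments")
          .load())

# Kafka delivers bytes; decode and parse before aggregating.
counts = (stream.selectExpr("CAST(value AS STRING) AS body")
          .select(F.get_json_object("body", "$.status").alias("status"))
          .groupBy("status").count())

query = (counts.writeStream.outputMode("complete")
         .format("console").start())
query.awaitTermination()
```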

Posted 3 days ago

Apply

1.0 - 3.0 years

0 Lacs

India

On-site

We’re looking for a hands-on, product-minded full-stack developer with a strong interest in AI and automation. This role is ideal for someone who loves to build, experiment, and bring ideas to life fast. You'll work closely with the founding team to prototype AI-powered tools and products from scratch. This is a highly AI-focused role where you will build tools powered by LLMs, workflow automation, and real-time data intelligence; not just web apps, but AI-first products.

Location: Kochi, Bangalore | Years of experience: 1-3 years

Hire22.ai connects top talent with executive roles anonymously and confidentially, transforming hiring through an AI-first, instant CoNCT model. Companies get interview-ready candidates in just 22 hours. No telecalling, no spam, no manual filtering.

Responsibilities:
- Build and experiment with AI-first features powered by LLMs, embeddings, vector databases, and prompt-based workflows
- Fine-tune or adapt AI/ML models for specific use cases such as job matching, summarization, scoring, and classification
- Integrate and orchestrate AI capabilities using tools like Vertex AI, LangChain, Cursor, n8n, Flowise, etc.
- Work with vector databases and implement retrieval-augmented generation (RAG) patterns to build intelligent, context-aware AI applications (see the retrieval sketch at the end of this listing)
- Design, build, and maintain full-stack web applications using Next.js and Python as supporting layers around core AI functionality
- Rapidly prototype ideas, test hypotheses, and iterate fast based on feedback
- Collaborate with product, design, and founders to transform internal ideas into deployable, AI-powered tools
- Build internal AI agents, assistants, or copilots
- Build tools for automated decision-making, resume/job matching, and workflow automation

Skills:
- Full-Stack Proficiency: Strong command of JavaScript/TypeScript with experience in modern frameworks like React or Next.js. Back-end experience with Python (FastAPI) or Go.
- Database Fluent: Comfortable working with both SQL (MySQL) and NoSQL databases (MongoDB, Redis), with good data modeling instincts.
- AI/ML-First Mindset: Hands-on with integrating and optimizing AI models using frameworks like OpenAI, Hugging Face, LangChain, or TensorFlow. You understand LLM architecture, prompt engineering, embeddings, and AI orchestration tools. You've ideally built or experimented with AI-driven applications beyond just using APIs.
- Builder Mentality: Passionate about product thinking and going from zero to one. You take ownership, work independently, and execute quickly without waiting for perfect clarity.
- Problem Solver: You break down complex problems, learn fast, and deliver clean, efficient solutions. You value both speed and quality.
- Communicator & Collaborator: You express your ideas clearly, ask good questions, and keep teams in sync by sharing progress and blockers openly.
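As referenced in the responsibilities above, here is a minimal sketch of the retrieval half of a RAG pattern. NumPy is the only dependency, and embed() is a hypothetical stand-in you would replace with a real embedding model or API; the in-memory matrix plays the role of the vector database.

```python
# Minimal sketch: retrieval for RAG over a tiny in-memory "vector database".
# Assumptions: numpy installed; embed() is a placeholder, not a real model.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder: hash bytes into a unit vector. Swap in a real model."""
    vec = np.zeros(64)
    for ch in text.encode():
        vec[ch % 64] += 1.0
    return vec / (np.linalg.norm(vec) or 1.0)

docs = [
    "Candidate has 5 years of React and Node.js experience.",
    "Role requires Kafka streaming and MongoDB data modelling.",
    "Intern familiar with Excel dashboards and reporting.",
]
index = np.stack([embed(d) for d in docs])  # one row per document

def retrieve(query: str, k: int = 2):
    scores = index @ embed(query)  # cosine similarity (unit-norm vectors)
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

# The retrieved chunks would be stuffed into the LLM prompt as context.
print(retrieve("who knows MongoDB and streaming?"))
```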

Posted 3 days ago

Apply

0 years

0 Lacs

India

On-site

Caprae Capital Partners is an innovative private equity firm led by the principal Kevin Hong, a serial tech entrepreneur who grew two startups to $31M ARR and $7M in revenue. The fund originated with two additional tech-entrepreneur friends of Kevin who have had ~8-figure and ~9-figure exits to Twitter and Square, respectively. Additional partners include an ex-NASA software engineer and an ex-Chief of Staff from Google. Caprae Capital, in conjunction with its portfolio company, launched AI-RaaS (AI Readiness as a Service) and is looking for teammates to join for the long haul.

If you have a passion for disrupting the finance industry and happen to be a mission-driven person, this is a great fit for you. Additionally, given the recent expansion of this particular firm, you will have the opportunity to work from the ground level and take on a leadership role for the internship program, which would result in a paid role. Lastly, this is also a great role for those who are looking into strategy and consulting roles in the future, as it will give you the exposure and experience necessary to develop strong business acumen.

Role Overview: We are looking for a Lead Full Stack Developer to architect and lead the development of new features for SaaSquatchLeads.com, an AI-driven lead generation and sales intelligence platform. You will own technical direction, guide other engineers, and ensure our stack is scalable, maintainable, and optimized for AI-powered workloads.

Key Responsibilities:
- Lead architectural design and technical strategy for SaaSquatchLeads.com.
- Develop, deploy, and maintain end-to-end features spanning frontend, backend, and AI integrations.
- Implement and optimize AI-driven services for lead scoring, personalization, and predictive analytics (see the lead-scoring sketch at the end of this listing).
- Build and maintain data pipelines for ingesting, processing, and analyzing large datasets.
- Mentor and guide a distributed engineering team, setting best coding practices.
- Collaborate with product, design, and data science teams to align technical execution with business goals.
- Ensure security, performance, and scalability of the platform.

Required Skills & Technologies:
- Frontend: React, JavaScript (ES6+), TypeScript, Redux/Zustand, HTML, CSS, TailwindCSS.
- Backend: Python (Flask, FastAPI, Django), Node.js (bonus).
- AI & Data Science: Python, PyTorch, Hugging Face, OpenAI APIs, LangChain, Pandas, NumPy.
- Databases: PostgreSQL, MySQL, MongoDB, Redis.
- DevOps & Infrastructure: Docker, Kubernetes, AWS (Lambda, S3, RDS, EC2), CI/CD pipelines.
- Data Processing: ETL tools, message queues (Kafka, RabbitMQ).
- Search & Indexing: Elasticsearch, Meilisearch (for fast lead lookups).
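As referenced under Key Responsibilities, a minimal sketch of AI-driven lead scoring follows, assuming scikit-learn; the features and training rows are synthetic, invented for the example, and not SaaSquatchLeads' actual model.

```python
# Minimal sketch: rank leads by predicted conversion probability.
# Assumptions: scikit-learn installed; features/labels synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Features per lead: [log(employees), visits_last_30d, opened_pricing_page]
X = np.array([[2.0, 1, 0], [4.5, 12, 1], [3.1, 4, 0],
              [5.0, 20, 1], [2.5, 2, 0], [4.0, 9, 1]])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = converted

model = LogisticRegression().fit(X, y)

def score_lead(log_employees, visits, opened_pricing) -> float:
    """Return conversion probability in [0, 1] for ranking leads."""
    proba = model.predict_proba([[log_employees, visits, opened_pricing]])
    return float(proba[0, 1])

print(round(score_lead(4.2, 10, 1), 3))
```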

Posted 3 days ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description

GPP Database Link: https://cummins365.sharepoint.com/sites/CS38534/

Leads projects for the design, development, and maintenance of a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts, and subject-matter experts to plan, design, and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities

Designs and automates deployment of a distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues.
Implements data governance processes and methods for managing metadata, access, and retention of data for internal and external users.
Designs and provides guidance on building reliable, efficient, scalable, high-quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
Designs and implements physical data models to define the database structure, optimizing database performance through efficient indexing and table relationships.
Participates in optimizing, testing, and troubleshooting of data pipelines.
Designs, develops, and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g., Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, and others).
Uses innovative and modern tools, techniques, and architectures to partially or completely automate the most common, repeatable, and tedious data preparation and integration tasks, in order to minimize manual and error-prone processes and improve productivity.
Assists with renovating the data management infrastructure to drive automation in data integration and management.
Ensures the timeliness and success of critical analytics initiatives by using agile development methods such as DevOps, Scrum, and Kanban.
Coaches and develops less experienced team members.

Responsibilities

Competencies:

System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation, and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.

Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.

Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.

Customer focus - Building strong customer relationships and delivering customer-centric solutions.

Decision quality - Making good and timely decisions that keep the organization moving forward.

Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users, using appropriate tools and technologies.

Programming - Creates, writes, and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance, and compliance requirements.

Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics, and key performance indicators, to deliver a quality product.

Solution Documentation - Documents information and solutions based on knowledge gained during product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not part of the initial learning.

Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools, and metrics, to ensure that it works as designed and meets customer requirements.

Data Quality - Identifies, understands, and corrects flaws in data to support effective information governance across operational business processes and decision making.

Problem Solving - Solves problems, and may mentor others on effective problem solving, using a systematic analysis process that leverages industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented.

Values differences - Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications

College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience

Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred and includes:

5-8 years of experience
Familiarity analyzing complex business systems, industry requirements, and/or data regulations
Background in processing and managing large data sets
Design and development for a Big Data platform using open-source and third-party tools
Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
SQL query language
Clustered compute cloud-based implementation experience
Experience developing applications requiring large file movement in a cloud-based environment, and other data extraction tools and methods from a variety of sources
Experience in building analytical solutions

Intermediate experience in the following is preferred:

Experience with IoT technology
Experience in Agile software development

Qualifications

Work closely with the business Product Owner to understand the product vision.
Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes.
Responsible for the creation, maintenance, and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs).
Take part in the evaluation of new data tools and POCs and provide suggestions.
Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization.
Proactively address and resolve issues that compromise data accuracy and usability.

Preferred Skills

Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
Database Management: Expertise in SQL and NoSQL databases.
Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
Cloud Services: Experience with Azure, Databricks, and AWS cloud platforms.
ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus.
APIs: Working knowledge of APIs to consume data from ERP and CRM systems.
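For candidates who want a concrete picture of the pipeline work described above, here is a minimal sketch of a batch ETL job written against Apache Spark's Java API: it extracts a relational export, applies a basic data-integrity filter (a crude rejected-row count standing in for the monitoring and alerting the posting mentions), and loads the result into a data lake as Parquet. The paths, column names, and quality rule are hypothetical placeholders, not details from the posting.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;

// Minimal batch ETL sketch: extract a relational export, apply a
// data-quality filter, and load the result into a data lake as Parquet.
// All paths and column names below are hypothetical placeholders.
public class OrdersEtlJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("orders-etl")
                .getOrCreate();

        // Extract: a CSV export from a transactional system (e.g. ERP).
        Dataset<Row> orders = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("/landing/erp/orders/");  // hypothetical landing path

        // Transform: drop rows failing a basic integrity rule, and count
        // rejections as a crude stand-in for monitoring/alerting.
        Dataset<Row> clean = orders.filter(
                col("order_id").isNotNull().and(col("amount").geq(0)));
        long rejected = orders.count() - clean.count();
        System.out.println("Rejected rows: " + rejected);

        // Load: write to the data lake, partitioned for downstream readers.
        clean.write()
                .mode("overwrite")
                .partitionBy("order_date")
                .parquet("/lake/curated/orders/");  // hypothetical lake path

        spark.stop();
    }
}
```

A production pipeline would route the rejected-row metric to an alerting system and run the job under an orchestrator, but the extract-transform-load shape stays the same.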

Posted 4 days ago

Apply

4.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description

GPP Database Link: https://cummins365.sharepoint.com/sites/CS38534/

Supports, develops, and maintains a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements and best leverage the technologies to enable agile data delivery at scale.

Key Responsibilities

Implements and automates deployment of a distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
Implements methods to continuously monitor and troubleshoot data quality and data integrity issues.
Implements data governance processes and methods for managing metadata, access, and retention of data for internal and external users.
Develops reliable, efficient, scalable, high-quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
Develops physical data models and implements data storage architectures as per design guidelines.
Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual, physical, and logical data models.
Participates in testing and troubleshooting of data pipelines.
Develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g., Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, and others).
Uses agile development methods, such as DevOps, Scrum, Kanban, and the continuous improvement cycle, for data-driven applications.

Responsibilities

Competencies:

System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation, and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.

Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.

Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.

Customer focus - Building strong customer relationships and delivering customer-centric solutions.

Decision quality - Making good and timely decisions that keep the organization moving forward.

Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users, using appropriate tools and technologies.

Programming - Creates, writes, and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance, and compliance requirements.

Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics, and key performance indicators, to deliver a quality product.

Solution Documentation - Documents information and solutions based on knowledge gained during product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not part of the initial learning.

Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools, and metrics, to ensure that it works as designed and meets customer requirements.

Data Quality - Identifies, understands, and corrects flaws in data to support effective information governance across operational business processes and decision making.

Problem Solving - Solves problems, and may mentor others on effective problem solving, using a systematic analysis process that leverages industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented.

Values differences - Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications

College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience

4-5 years of experience. Relevant experience preferred, such as temporary student employment, internships, co-ops, or other extracurricular team activities. Knowledge of the latest technologies in data engineering is highly preferred and includes:

Exposure to open-source Big Data tools: Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
SQL query language
Clustered compute cloud-based implementation experience
Familiarity developing applications requiring large file movement in a cloud-based environment
Exposure to Agile software development
Exposure to building analytical solutions
Exposure to IoT technology

Qualifications

Work closely with the business Product Owner to understand the product vision.
Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
Work under limited supervision to design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes.
Responsible for the creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs), with guidance and help from senior data engineers.
Take part in the evaluation of new data tools and POCs, with guidance and help from senior data engineers.
Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision.
Assist in resolving issues that compromise data accuracy and usability.

Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
Database Management: Intermediate-level expertise in SQL and NoSQL databases.
Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
Cloud Services: Experience with Azure, Databricks, and AWS cloud platforms.
ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
APIs: Working knowledge of APIs to consume data from ERP and CRM systems.
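To make the data-quality monitoring responsibility above more concrete, the sketch below computes per-column null rates on a Spark dataset using Spark's Java API and flags columns breaching a threshold. The dataset path, columns, and the 5% threshold are hypothetical illustrations, not details from the posting.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;

// Sketch of a simple data-quality monitor: measure the null rate of each
// column in a curated dataset and report columns breaching a threshold.
// The path and the 5% threshold are hypothetical placeholders.
public class NullRateMonitor {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("null-rate-monitor")
                .getOrCreate();

        Dataset<Row> df = spark.read().parquet("/lake/curated/orders/");
        long total = df.count();

        for (String column : df.columns()) {
            long nulls = df.filter(col(column).isNull()).count();
            double rate = total == 0 ? 0.0 : (double) nulls / total;
            if (rate > 0.05) {
                // A real pipeline would raise an alert rather than print.
                System.out.printf("ALERT: %s null rate %.1f%%%n",
                        column, rate * 100);
            }
        }
        spark.stop();
    }
}
```

In practice a check like this runs on a schedule after each pipeline load, with the results pushed to whatever alerting mechanism the platform uses.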

Posted 4 days ago

Apply

0.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

General Information

Country: India
State: Telangana
City: Hyderabad
Job ID: 45479
Department: Development

Description & Requirements

The Senior Java Developer is responsible for architecting and developing advanced Java solutions. This role involves leading the design and implementation of microservice architectures with Spring Boot, optimizing services for performance and scalability, and ensuring code quality. The Senior Developer will also mentor junior developers and collaborate closely with cross-functional teams to deliver comprehensive technical solutions.

Essential Duties:

Lead the development of scalable, robust, and secure Java components and services.
Architect and optimize microservice solutions using Spring Boot.
Translate customer requirements into comprehensive technical solutions.
Conduct code reviews and maintain high code-quality standards.
Optimize and scale microservices for performance and reliability.
Collaborate effectively with cross-functional teams to innovate and develop solutions.
Lead projects and mentor engineers in best practices and innovative solutions.
Coordinate with customer and client-facing teams for effective solution delivery.

Basic Qualifications:

Bachelor's degree in Computer Science or a related field.
7-9 years of experience in Java development.
Expertise in designing and implementing microservices with Spring Boot.
Extensive experience applying design patterns and system design principles, with expertise in event-driven and domain-driven design methodologies.
Extensive experience with multithreading and asynchronous and defensive programming.
Proficiency in MongoDB, SQL databases, and S3 data storage.
Experience with Kafka, Kubernetes, AWS services, and the AWS SDK.
Hands-on experience with Apache Spark.
Strong knowledge of Linux, Git, and Docker.
Familiarity with Agile methodologies and tools like Jira and Confluence.
Excellent communication and leadership skills.

Preferred Qualifications:

Experience with Spark using Spring Boot.
Familiarity with the C4 Software Architecture Model.
Experience using tools like Lucidchart for architecture and flow diagrams.

About Infor

Infor is a global leader in business cloud software products for companies in industry-specific markets. Infor builds complete industry suites in the cloud and efficiently deploys technology that puts the user experience first, leverages data science, and integrates easily into existing systems. Over 60,000 organizations worldwide rely on Infor to help overcome market disruptions and achieve business-wide digital transformation. For more information, visit www.infor.com.

Our Values

At Infor, we strive for an environment that is founded on a business philosophy called Principle Based Management™ (PBM™) and eight Guiding Principles: integrity, stewardship & compliance, transformation, principled entrepreneurship, knowledge, humility, respect, and self-actualization. Increasing diversity is important to reflect our markets, customers, partners, and the communities we serve, now and in the future. We have a relentless commitment to a culture based on PBM. Informed by the principles that allow a free and open society to flourish, PBM™ prepares individuals to innovate, improve, and transform while fostering a healthy, growing organization that creates long-term value for its clients and supporters and fulfillment for its employees.

Infor is an Equal Opportunity Employer. We are committed to creating a diverse and inclusive work environment. Infor does not discriminate against candidates or employees because of their sex, race, gender identity, disability, age, sexual orientation, religion, national origin, veteran status, or any other protected status under the law. If you require accommodation or assistance at any time during the application or selection process, please submit a request by following the directions located in the FAQ section at the bottom of the infor.com/about/careers webpage.
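To give a concrete picture of the Spring Boot microservice work this role centers on, here is a minimal hypothetical sketch of a single-class service exposing one REST endpoint. The class name, route, and payload are invented for illustration; a real service would add persistence (e.g., MongoDB, which the posting lists), validation, and observability.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// Minimal Spring Boot microservice sketch: one REST endpoint returning a
// placeholder payload. Names and the endpoint are hypothetical; production
// code would add a persistence layer, validation, and metrics.
@SpringBootApplication
@RestController
public class OrderServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(OrderServiceApplication.class, args);
    }

    // GET /orders/{id} echoes the id back in a stub JSON payload.
    @GetMapping("/orders/{id}")
    public String getOrder(@PathVariable String id) {
        return "{\"orderId\": \"" + id + "\", \"status\": \"PLACEHOLDER\"}";
    }
}
```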

Posted 4 days ago

Apply