
20630 MongoDB Jobs - Page 20

Set up a Job Alert
JobPe aggregates listings so they are easy to find in one place; applications are submitted directly on the original job portal.

3.0 - 5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Summary: Strong full-stack developer with 3-5 years of experience and proficiency in .NET Core (ASP.NET Core), SQL (PostgreSQL/SQL Server), NoSQL (MongoDB/CosmosDB), and RESTful web services. Experience building on web technologies and frameworks (React.js/Next.js/TypeScript). A commitment to collaborative problem-solving, sophisticated design, and quality products is important. The role involves optimizing services for maximum performance, performing thorough code reviews, and suggesting refactoring for better design and performance. Knowledge of cloud platforms such as AWS or Azure and familiarity with code versioning tools such as Git are expected.

Key Responsibilities and Duties:
- Collaborate with the team to design and implement the architecture of applications.
- Collaborate with the team to design and integrate user-facing elements with server-side logic.

Skills: .NET Core (ASP.NET Core), C#, React.js/Next.js, Redux, TypeScript, SQL/NoSQL, Flask/FastAPI, CI/CD, Docker, Git, AWS/Azure

Joining: We must fill this position urgently; candidates who can start within a week will be prioritized.

Pay range: INR 12-18 LPA, depending on experience.
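The listing pairs RESTful web services with SQL/NoSQL stores and names FastAPI and MongoDB among the skills. Purely as a hedged sketch of that kind of service layer (not this employer's codebase), the snippet below exposes a single read endpoint over MongoDB; the driver choice (Motor), database name, collection, and connection string are illustrative assumptions.

```python
# Minimal sketch: a REST read endpoint backed by MongoDB.
# Assumes a local MongoDB instance and a hypothetical "products" collection.
from fastapi import FastAPI, HTTPException
from motor.motor_asyncio import AsyncIOMotorClient

app = FastAPI()
client = AsyncIOMotorClient("mongodb://localhost:27017")  # assumed connection string
db = client["shop"]  # hypothetical database name

@app.get("/products/{sku}")
async def get_product(sku: str) -> dict:
    # Look up a single document by SKU; hide the internal _id and 404 if absent.
    doc = await db.products.find_one({"sku": sku}, {"_id": 0})
    if doc is None:
        raise HTTPException(status_code=404, detail="product not found")
    return doc
```

Such a module would typically be served with an ASGI server, for example `uvicorn app:app` (module name assumed).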

Posted 3 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About the Role: We're looking for a Senior Backend Engineer (SDE III) who can architect and build robust backend systems while also managing infrastructure and deployments. This is a hands-on role with full ownership, from API design and database performance to cloud infrastructure and CI/CD automation. You'll collaborate across product, design, and frontend teams while also mentoring junior developers and driving best practices.

Roles & Responsibilities:
- Design, develop, and maintain scalable backend services using a modern framework of your choice.
- Build well-structured APIs (REST or GraphQL) with robust authentication, authorization, and versioning.
- Define and evolve database schemas; optimize queries for performance and reliability.
- Use NoSQL databases (where required) for high-throughput or flexible data needs.
- Own infrastructure setup and manage deployments on cloud platforms; there is no separate DevOps team.
- Automate CI/CD workflows, containerize services using Docker, and maintain deployment pipelines.
- Ensure system performance, resilience, and observability through caching, queuing, and monitoring.
- Implement secure coding practices, including data encryption, access controls, and input validation.
- Debug and troubleshoot issues across the stack, from the database to the API layer to production.
- Collaborate with cross-functional teams to define integration contracts and delivery timelines.
- Mentor and guide junior engineers, participate in code reviews, and lead architecture discussions.

Required Skills & Experience:
- 5-8 years of strong hands-on experience with a modern backend framework (e.g., Node.js, Ruby on Rails, Python/Django, Spring Boot).
- Proficiency with relational databases such as PostgreSQL or MySQL: schema design, joins, and indexing.
- Experience with NoSQL databases (e.g., MongoDB, Redis) where applicable to the system design.
- Strong understanding of API design principles, security (OAuth2, JWT), and error-handling strategies.
- Hands-on experience with cloud infrastructure (AWS/GCP/Azure) and managing production environments.
- Proficiency in containerization (Docker) and deployment automation using CI/CD pipelines.
- Experience with background processing, message queues, or event-driven systems.
- Familiarity with monitoring, logging, and alerting tools to ensure system health and reliability.
- Understanding of infrastructure management practices: basic scripting, access control, and environment setup.
- Understanding of how different frontend/mobile components work, with a willingness to explore and work on them if required.
- Ability to independently take features from concept to deployment with a focus on reliability and scalability.
- Experience mentoring developers and contributing to high-level technical decisions.

Why Join Us:
- Be a key engineering leader in a product-first company.
- Take end-to-end ownership, from code to cloud.
- Work on real-world, scalable systems across diverse domains.
- Collaborate with a smart, passionate team based in Chennai.
- Help shape the architecture and engineering culture from the ground up.
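Among the requirements above is API security with OAuth2/JWT. As a small, hedged illustration only (not this team's actual implementation), the sketch below issues and verifies a signed access token with the PyJWT library; the claim names, secret handling, and expiry window are assumptions for demonstration.

```python
# Minimal sketch of stateless API auth with JWTs (library: PyJWT).
import datetime
import jwt  # pip install PyJWT

SECRET = "change-me"  # assumption: in practice, load from a secret manager, not source code

def issue_token(user_id: str, ttl_minutes: int = 30) -> str:
    """Sign a short-lived access token carrying the user id."""
    now = datetime.datetime.now(datetime.timezone.utc)
    claims = {
        "sub": user_id,                                        # subject (user id)
        "iat": now,                                            # issued-at
        "exp": now + datetime.timedelta(minutes=ttl_minutes),  # expiry
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def verify_token(token: str) -> str:
    """Validate signature and expiry; return the user id or raise jwt exceptions."""
    claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    return claims["sub"]

if __name__ == "__main__":
    token = issue_token("user-42")
    print(verify_token(token))  # -> user-42
```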

Posted 3 days ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Senior Manager – Technology
Job Type: Full-Time

Firstsource is a leading innovator in the technology sector, dedicated to delivering cutting-edge solutions and services. We are seeking a highly skilled and experienced Senior Manager – Technology to join our dynamic team.

Job Summary: The Senior Manager – Technology will be responsible for overseeing the implementation of contact center solutions, developing and integrating Generative AI (GenAI) technologies, and writing code modules. The ideal candidate will have a strong background in Python, a proven track record in managing technology projects, and familiarity with the software development life cycle (SDLC) from preliminary system analysis through testing and deployment. The goal is to build high-quality, innovative, and fully performing software that complies with coding standards and technical design.

Required Skills:
- Proven experience in implementing contact center solutions.
- Experience with at least one contact center application (Genesys, Five9, Cisco, Avaya, AWS Connect, Twilio, etc.).
- Strong troubleshooting skills; the applicant must be able to determine the causes of complex problems.
- Strong proficiency in Python programming.
- Experience with Generative AI (GenAI) development and integration.
- Knowledge of TensorFlow, Keras, PyTorch, or scikit-learn for developing machine learning models.
- Familiarity with SQL and NoSQL databases, such as PostgreSQL, MySQL, MongoDB, or Redis.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Expertise in database design and technical documentation.
- Excellent problem-solving and analytical skills.
- Strong leadership and team management abilities.
- Effective communication and interpersonal skills.
- Ability to work in a fast-paced, dynamic environment.
- Knowledge of software development best practices and methodologies.
- Knowledge of DevOps practices and tools such as Docker, Kubernetes, Jenkins, and Git.

Roles and Responsibilities:
- Lead the implementation of contact center solutions, ensuring seamless integration and optimal performance.
- Develop and integrate Generative AI (GenAI) technologies to enhance our product offerings.
- Write, review, and maintain code modules, ensuring high-quality and efficient code.
- Collaborate with cross-functional teams to define project requirements, scope, and deliverables.
- Manage and mentor a team of developers, providing guidance and support to achieve project goals.
- Stay updated with the latest industry trends and technologies to drive innovation within the team.
- Ensure compliance with best practices in software development, security, and data privacy.
- Execute the full software development life cycle (SDLC).
- Troubleshoot and resolve technical issues in a timely manner.

Qualifications:
- Degree in Information Technology or a related field.
- 12+ years of experience in technology management or a similar role.
- Demonstrated experience in managing technology projects from inception to completion.

⚠️ Disclaimer: Firstsource follows a fair, transparent, and merit-based hiring process. We never ask for money at any stage. Beware of fraudulent offers and always verify through our official channels or @firstsource.com email addresses.
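The role combines contact center work with Python and machine learning libraries such as scikit-learn. Purely as a hedged illustration of that kind of task (the tiny inline dataset, intent labels, and model choice are assumptions, not Firstsource's system), a minimal intent classifier over contact center utterances might look like this:

```python
# Minimal sketch: classify contact-center utterances by intent with scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative training data only (assumed), not production transcripts.
utterances = [
    "I want to check my order status",
    "my delivery has not arrived yet",
    "please cancel my subscription",
    "how do I close my account",
]
intents = ["order_status", "order_status", "cancellation", "cancellation"]

# TF-IDF features feeding a linear classifier: a common, simple baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(utterances, intents)

print(model.predict(["where is my parcel"]))  # likely -> ['order_status']
```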

Posted 3 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Summary

AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
- Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions.
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements.

Google Cloud Platform - Data Engineer
Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate it. Our clients' journeys span cloud strategy, implementation, migration of legacy applications, supporting operations of a cloud ecosystem, and everything in between. Deloitte's Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed. You will work with other technologists to deliver cutting-edge solutions using Google Cloud (GCP) services, programming, and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries, or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies, and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist.

Work you'll do
As a GCP Data Engineer you will have multiple responsibilities depending on the project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic cloud customers. Together with the team, you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring, and much more. The key responsibilities may involve some or all of the areas listed below:
- Act as a trusted technical advisor to customers and solve complex Big Data challenges.
- Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders.
- Identify new tools and processes to improve the cloud platform and automate processes.

Qualifications

Technical Requirements:
- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience.
- Experience with Cloud SQL and Cloud Bigtable.
- Experience with Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, and Genomics.
- Experience with Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer.
- Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and data processing algorithms (MapReduce, Flume).
- Experience working with technical customers.
- Experience writing software in one or more languages such as Java, C++, Python, Go, and/or JavaScript.

Consulting Requirements:
- 3-6 years of relevant consulting, industry, or technology experience.
- Strong problem-solving and troubleshooting skills.
- Strong communicator.
- Willingness to travel as required by the project.

Preferred Qualifications:
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Experience in technical consulting.
- Experience architecting, developing software, or building internet-scale, production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have).
- Experience with big data, information retrieval, data mining, or machine learning, as well as experience building multi-tier, high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow).
- Working knowledge of ITIL and/or agile methodologies.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300075
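The responsibilities above center on ingesting and analyzing data with GCP services such as BigQuery. Purely as a hedged sketch of that kind of task (the project, dataset, and table names are placeholders, not anything from this posting; credentials come from the environment's application default credentials), a small query job using the official google-cloud-bigquery client could look like this:

```python
# Minimal sketch: run an analytical query on BigQuery with the official Python client.
from google.cloud import bigquery  # pip install google-cloud-bigquery

def top_products(project_id: str, limit: int = 10) -> list:
    """Return the highest-volume products from a hypothetical sales.orders table."""
    client = bigquery.Client(project=project_id)  # uses application default credentials
    sql = f"""
        SELECT product_id, SUM(quantity) AS units
        FROM `{project_id}.sales.orders`   -- assumed dataset.table, for illustration only
        GROUP BY product_id
        ORDER BY units DESC
        LIMIT {limit}
    """
    rows = client.query(sql).result()  # blocks until the query job finishes
    return [dict(row) for row in rows]

if __name__ == "__main__":
    for row in top_products("my-gcp-project"):  # placeholder project id
        print(row)
```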

Posted 3 days ago

Apply

5.0 years

0 Lacs

South Delhi, Delhi, India

On-site

📍 Location: Greater Kailash II, South Delhi, New Delhi (on-site role only; local candidates preferred)

🕹️ About Us: We're a stealth-mode, bootstrapped startup building a next-gen SaaS platform at the convergence of gaming and ad-tech. We believe in lean, fast, and purposeful product development. With strong early traction and a focused vision, we are building for scale and sustainability from day one. As we scale our engineering efforts, we're hiring a Software Architect who can define, design, and drive the technical foundation of our platform while working hands-on with a full-stack TypeScript team (React, Next.js, NestJS, MongoDB, Deno).

🚀 Role Overview: You will be the technical backbone and system thinker on the product team, responsible for designing scalable system architecture, making long-term technology choices, and guiding implementation with a "less is more" mindset. This is a high-ownership role reporting directly to the founders.

🎯 Key Responsibilities:
- Architect and evolve the overall system: frontend, backend, APIs, data flows, security, and DevOps.
- Make foundational decisions on the tech stack, system design, infrastructure, and scalability.
- Lead the packaging and delivery of the SaaS product, with clear attention to versioning, deployment models (single/multi-tenant), and product modularity.
- Drive the design of core product modules, services, and integrations.
- Balance lean MVP delivery with long-term maintainability and performance.
- Set up engineering best practices: CI/CD pipelines, versioning, logging, monitoring, and testing frameworks.
- Collaborate with the product manager and developers to ensure architecture aligns with business goals.
- Mentor developers, perform architecture/code reviews, and raise engineering standards.
- Lead choices around data modeling, caching, message queues, and event-driven design (if required).

🧠 Required Skills & Experience:
- 5+ years of experience in software architecture or senior engineering roles.
- Strong expertise in TypeScript across the stack.
- Hands-on experience with React and Next.js (frontend, SSR, routing); NestJS or ExpressJS on the Deno runtime; and MongoDB, including schema design and optimization.
- Proven experience packaging and delivering a SaaS product, with a focus on deployment, scalability, and customer onboarding.
- Deep understanding of system design, REST APIs, microservices, and API security.
- Experience deploying scalable web apps on platforms like Vercel, Render, or containerized setups (Docker/K8s optional).
- Familiarity with CI/CD pipelines, GitHub workflows, and DevOps fundamentals.
- Excellent documentation, architectural diagramming, and communication skills.

✅ You're a Fit If You:
- Are a system thinker with a passion for elegant, efficient architecture.
- Have successfully packaged, deployed, and versioned SaaS applications in production.
- Believe in lean product development: building only what is essential and scalable.
- Can translate fuzzy product ideas into well-structured technical designs.
- Thrive in early-stage, ambiguous, high-ownership environments.
- Enjoy mentoring and leading small but high-performing technical teams.

🌱 Why Join Us:
- Build a modern product from scratch with real ownership.
- Work on a unique problem at the intersection of gaming, community, and advertising.
- Join a close-knit, ambitious founding team.
- Opportunity to grow into CTO / Head of Engineering as the company scales.

During the interview process, wherever asked, quote Flight No. AI 101 for ready reference.

📢 Ready to architect something exceptional from the ground up? Apply now and let's build the future of community-first ad-tech in gaming.

Posted 3 days ago

Apply

3.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our client’s journey spans across cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering these new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting edge solutions using Google Cloud Services ( GCP ), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building a new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist . Work you’ll do As GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through: architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges. 
Create and deliver best practices recommendations, tutorials, blog articles, sample code, and technical presentations adapting to different levels of key business and technical stakeholders. ▪ Identifying new tools and processes to improve the cloud platform and automate processes Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience Strong problem solving and troubleshooting skills Strong communicator Willingness to travel up in case of project requirement Preferred Qualifications Experience working data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or internet scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure(good to have) Experience working with big data, information retrieval, data mining or machine learning as well as experience in building multi-tier high availability applications with modern web technologies (such as NoSQL, Kafka,NPL, MongoDB, SparkML, Tensorflow). Working knowledge of ITIL and/or agile methodologies Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India . Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that helps that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. 
To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/ or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300075

Posted 3 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our client’s journey spans across cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering these new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting edge solutions using Google Cloud Services ( GCP ), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building a new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist . Work you’ll do As GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through: architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges. 
Create and deliver best practices recommendations, tutorials, blog articles, sample code, and technical presentations adapting to different levels of key business and technical stakeholders. ▪ Identifying new tools and processes to improve the cloud platform and automate processes Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience Strong problem solving and troubleshooting skills Strong communicator Willingness to travel up in case of project requirement Preferred Qualifications Experience working data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or internet scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure(good to have) Experience working with big data, information retrieval, data mining or machine learning as well as experience in building multi-tier high availability applications with modern web technologies (such as NoSQL, Kafka,NPL, MongoDB, SparkML, Tensorflow). Working knowledge of ITIL and/or agile methodologies Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India . Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that helps that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. 
To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/ or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300075

Posted 3 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our client’s journey spans across cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering these new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting edge solutions using Google Cloud Services ( GCP ), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building a new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist . Work you’ll do As GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through: architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges. 
Create and deliver best practices recommendations, tutorials, blog articles, sample code, and technical presentations adapting to different levels of key business and technical stakeholders. ▪ Identifying new tools and processes to improve the cloud platform and automate processes Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience Strong problem solving and troubleshooting skills Strong communicator Willingness to travel up in case of project requirement Preferred Qualifications Experience working data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or internet scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure(good to have) Experience working with big data, information retrieval, data mining or machine learning as well as experience in building multi-tier high availability applications with modern web technologies (such as NoSQL, Kafka,NPL, MongoDB, SparkML, Tensorflow). Working knowledge of ITIL and/or agile methodologies Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India . Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that helps that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. 
To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/ or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300075

Posted 3 days ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Google Cloud Platform - Data Engineer Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our client’s journey spans across cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering these new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting edge solutions using Google Cloud Services ( GCP ), programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building a new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded cutting-edge technologist . Work you’ll do As GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through: architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below: Act as a trusted technical advisor to customers and solve complex Big Data challenges. 
Create and deliver best practices recommendations, tutorials, blog articles, sample code, and technical presentations adapting to different levels of key business and technical stakeholders. ▪ Identifying new tools and processes to improve the cloud platform and automate processes Qualifications Technical Requirements BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript. Consulting Requirements 3-6 years of relevant consulting, industry or technology experience Strong problem solving and troubleshooting skills Strong communicator Willingness to travel up in case of project requirement Preferred Qualifications Experience working data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting, developing software, or internet scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure(good to have) Experience working with big data, information retrieval, data mining or machine learning as well as experience in building multi-tier high availability applications with modern web technologies (such as NoSQL, Kafka,NPL, MongoDB, SparkML, Tensorflow). Working knowledge of ITIL and/or agile methodologies Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India . Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that helps that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. 
To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/ or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300075

Posted 3 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Position Summary: AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms; leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions; and drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.
Google Cloud Platform - Data Engineer
Cloud is shifting business models at our clients, and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate through this journey. Our clients' journeys span from cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte’s Cloud Delivery Center supports our client project teams in this journey by delivering these new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Platform (GCP) services, programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries, or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist.
Work you’ll do
As a GCP Data Engineer you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below: act as a trusted technical advisor to customers and solve complex Big Data challenges; create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders; identify new tools and processes to improve the cloud platform and automate processes.
Qualifications
Technical Requirements: BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience. Experience in Cloud SQL and Cloud Bigtable. Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics. Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer. Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and data processing frameworks (MapReduce, Flume). Experience working with technical customers. Experience in writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript.
Consulting Requirements: 3-6 years of relevant consulting, industry or technology experience. Strong problem solving and troubleshooting skills. Strong communicator. Willingness to travel in case of project requirements.
Preferred Qualifications: Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments. Experience in technical consulting. Experience architecting and developing software or internet-scale, production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have). Experience working with big data, information retrieval, data mining or machine learning, as well as experience in building multi-tier, high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow). Working knowledge of ITIL and/or agile methodologies.
Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 300075
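For illustration only, here is a minimal sketch of the kind of Pub/Sub-to-BigQuery ingestion pipeline this posting describes, written with the Apache Beam Python SDK (which Dataflow executes). The project, topic, table, and field names are hypothetical placeholders, not details from the listing.

```python
# Minimal sketch: stream JSON events from Pub/Sub into BigQuery via Dataflow.
# Topic, table, and field names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode one Pub/Sub message into a flat row for BigQuery."""
    event = json.loads(message.decode("utf-8"))
    return {"user_id": event.get("user_id"),
            "action": event.get("action"),
            "ts": event.get("ts")}


def run():
    # Pass --runner=DataflowRunner --project=... --region=... to run on Dataflow.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (p
         | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
               topic="projects/my-project/topics/events")
         | "ParseJson" >> beam.Map(parse_event)
         | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
               "my-project:analytics.events",
               schema="user_id:STRING,action:STRING,ts:TIMESTAMP",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))


if __name__ == "__main__":
    run()
```

Run locally with the DirectRunner for testing, or supply a Dataflow runner, project, and region to execute the same pipeline as a managed streaming job.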

Posted 3 days ago

Apply


0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Introduction
A Software Developer plays a crucial role in designing, implementing, and maintaining efficient and reliable software systems. This role requires a solid understanding of programming languages, software engineering principles, and database concepts. The ideal candidate will have excellent problem-solving abilities and communication skills.
Your Role And Responsibilities
Collaborate with the tech team to understand project requirements and specifications. Write clean, well-organized, and efficient code using modern programming languages and frameworks. Conduct unit tests, debugging, and troubleshooting to ensure the software functions as intended. Participate in code reviews to improve code quality, consistency, and adherence to coding standards. Stay updated on emerging trends and technologies in the software development industry. Communicate progress, issues, and suggestions to the tech team and stakeholders. Adhere to project timelines, deadlines, and budgets. Provide assistance to junior developers and contribute to knowledge-sharing sessions.
Required Technical And Professional Expertise
Hands-on experience in object-oriented programming languages such as Java, C++, or Python. Knowledge of web development frameworks like React, Angular, or Vue.js. Knowledge of relational databases and SQL syntax (e.g., Postgres) as well as NoSQL databases (e.g., MongoDB). Excellent problem-solving and analytical skills. Strong verbal and written communication skills. Ability to work independently and collaboratively in a fast-paced agile environment. B.E/B.Tech/M.C.A/M.Tech in Computer Science or equivalent practical experience.
Preferred Technical And Professional Experience
Understanding of version control systems (e.g., Git) and continuous integration/continuous deployment (CI/CD) pipelines

Posted 3 days ago

Apply

2.0 years

0 Lacs

India

On-site

We’re building the next-generation communications analytics and automation platform—one that fuses deep telemetry, enterprise-scale voice/calling data, and AI-driven remediation. As a Senior Backend Engineer, you'll play a core role in designing the resilient, scalable backend of a high-visibility platform that already drives action across global Microsoft Teams deployments. This isn’t a maintenance gig. This is architecture, orchestration, and ownership. You’ll help design microservices, implement scalable APIs, and ensure data flows seamlessly from complex real-time systems (like call quality diagnostics and device telemetry) into actionable intelligence and automation pipelines. If you’re excited by backend systems with real-world impact—and want to transition into intelligent agentic systems powered by GenAI—this role is built for you.
What You'll Work On
Platform Engineering (Core Backend): Design and implement robust, cloud-native services using modern backend stacks (Node.js, Python, .NET Core, or similar). Build scalable APIs to surface data and actions across TeamsCoreIQ modules (call analytics, device insights, policy management, AI-based RCA). Integrate with Microsoft Graph APIs and Teams Calling infrastructure (Auto Attendants, Call Queues, Call Quality, Presence, Policies). Develop event-driven workflows using queues (Service Bus, Kafka, RabbitMQ) for high-throughput ingestion and action pipelines. Work with real-time data stores, telemetry ingestion, and time-series analytics backends (PostgreSQL, MongoDB, InfluxDB, or equivalent).
Infrastructure & DevOps Support: Help scale and secure workloads using Azure, Kubernetes, and CI/CD pipelines (GitHub Actions, Azure DevOps). Implement observability practices—logging, metrics, alerting—for zero-downtime insights and RCA.
Future-Forward (Agentic Track): Support the evolution of the backend toward intelligent agent orchestration: build services that allow modular “agents” to retrieve, infer, and act (e.g. provisioning, remediation, escalation); explore interfaces for integrating OpenAI, Azure AI, or RAG pipelines to make automation contextual and proactive.
What You Bring
Must-Have Technical Skills: 2+ years of backend engineering experience with production-grade systems. Strong proficiency in at least one modern backend language (Node.js, Python, Go, or .NET Core). Deep understanding of RESTful API design; GraphQL is a bonus. Experience building cloud-native apps on Azure (preferred), AWS or GCP. Familiarity with the Microsoft ecosystem: Graph API, Teams, Entra ID (AAD); SIP/VoIP call data a big plus. Experience with relational and NoSQL databases; data modeling and performance tuning.
Bonus (Not Mandatory, but Highly Valued): Exposure to AI/ML pipelines, LangChain, OpenAI API, or vector databases (Pinecone, Weaviate). Background in observability, root-cause analysis systems, or voice analytics. Experience with policy engines, RBAC, and multi-tenant SaaS platforms.
Traits We Love
Systems Thinker – You optimize for scale and understand how backend services interact across a distributed system. Builder’s DNA – You love to own, refine, and ship high-quality features fast. Learning Velocity – You’re interested in agentic architectures, GenAI, and eager to transition toward intelligent orchestration. Code Ethic – You write clean, maintainable, testable code—and always think security-first.
Performance Expectations (First 30 Days)
Ship a core module with full test coverage and observability. Deliver API endpoints for at least one major module (e.g. RCA, Call Analytics, DeviceIQ). Draft and refine at least one reusable internal service that improves time-to-market for future agents. Collaborate with frontend, DevOps, and AI teams to support rapid iteration and experimentation.
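As a hedged illustration of the "scalable APIs to surface data" responsibility above, here is a minimal FastAPI plus Motor (async MongoDB driver) sketch that exposes degraded-call telemetry for a tenant. The collection, field names, and packet-loss threshold are hypothetical placeholders and not details of the actual platform.

```python
# Minimal sketch of an API endpoint surfacing call-quality telemetry from MongoDB.
# Collection name, fields, and connection string are hypothetical placeholders.
from fastapi import FastAPI, HTTPException
from motor.motor_asyncio import AsyncIOMotorClient

app = FastAPI()
mongo = AsyncIOMotorClient("mongodb://localhost:27017")
calls = mongo["teams_analytics"]["call_quality"]


@app.get("/tenants/{tenant_id}/calls/poor")
async def poor_quality_calls(tenant_id: str, limit: int = 50):
    """Return recent calls whose packet loss exceeded 5% for one tenant."""
    cursor = (calls.find({"tenant_id": tenant_id, "packet_loss": {"$gt": 0.05}},
                         {"_id": 0})
                   .sort("started_at", -1)
                   .limit(limit))
    results = await cursor.to_list(length=limit)
    if not results:
        raise HTTPException(status_code=404, detail="No degraded calls found")
    return {"tenant_id": tenant_id, "count": len(results), "calls": results}
```

In a real service the threshold would be configurable per tenant and the query backed by a compound index on tenant_id and started_at.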

Posted 3 days ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

What You’ll Do
Eaton Corporation’s Device Integration team has an opening for a Software Engineer who is passionate about his or her craft. He/She will be directly involved in the architecture, design, and development of an Internet of Things application. Although Eaton is an established company with a diverse product portfolio, this product exhibits many characteristics of a start-up initiative. This product, being developed from the ground up, will be utilized by Eaton product teams as a framework on which to build and extend customer-facing applications and solutions. So, if you’re an experienced software professional yearning to work on a project leveraging the latest software technologies and trends such as IoT, NoSQL, big data, open source, DevOps, mobile, and cybersecurity, this is the position for you! Not only will you be working with some amazing technology, you’ll also be part of an enthusiastic team of software professionals working to make an immediate organizational impact and having lots of fun along the way!
Work with your team and others, contributing to the architecture, design, and implementation of an Internet of Things application. Development will be primarily in C# and .NET. Author high-quality, unit-tested code. Demonstrate and document solutions by using flowcharts, diagrams, code comments, code snippets, and performance instruments. Provide work estimates and participate in design, implementation, and code reviews. Execute agile work plans for iterative and incremental project delivery. Expand job knowledge by studying software development techniques and programming languages. Participate in educational opportunities and read professional publications. Work with test teams to ensure adequate and appropriate test case coverage; investigate and fix bugs; create automated test scripts.
Qualifications
BE/B.Tech/M.Tech/MCA. 6-9 years of progressive experience in the software industry developing, designing, and deploying technology solutions and shipping high-quality products. 5-7 years of experience with C# and .NET. 5+ years of experience working with Azure.
Skills
Proficient with C# and .NET technologies and associated IDEs (Visual Studio, Eclipse, IntelliJ, etc.). Understanding of databases and related concepts, both relational and non-relational (SQL Server, Cosmos DB, MongoDB, etc.). Understanding of software design principles, algorithms, data structures, and multithreading concepts. Understanding of object-oriented design and programming skills, including the use of design patterns. Working knowledge of cloud development platforms such as Azure or AWS. Working knowledge of security concepts such as encryption, certificates, and key management. Working knowledge of networking protocols and concepts (HTTP, TCP, WebSocket). Working knowledge of network and distributed computing concepts. Experience utilizing best practices in software engineering. Experience with Agile development methodologies and concepts. Strong problem solving and software debugging skills. Knowledge of CI/CD concepts, tools, and technologies. Knowledge of field bus protocols like Modbus TCP/RTU will be an added advantage. Excellent verbal and written communication skills, including the ability to effectively explain technical concepts. Very good at understanding and prioritizing tasks, issues and work.

Posted 3 days ago

Apply

10.0 years

0 Lacs

Delhi, India

On-site

Company Size Mid-Sized Experience Required 10 - 15 years Working Days 5 days/week Office Location Delhi Role & Responsibilities Lead and mentor a team of data engineers, ensuring high performance and career growth. Architect and optimize scalable data infrastructure, ensuring high availability and reliability. Drive the development and implementation of data governance frameworks and best practices. Work closely with cross-functional teams to define and execute a data roadmap. Optimize data processing workflows for performance and cost efficiency. Ensure data security, compliance, and quality across all data platforms. Foster a culture of innovation and technical excellence within the data team. Ideal Candidate 10+ years of experience in software/data engineering, with at least 3+ years in a leadership role. Expertise in backend development with programming languages such as Java, PHP, Python, Node.JS, GoLang, JavaScript, HTML, and CSS. Proficiency in SQL, Python, and Scala for data processing and analytics. Strong understanding of cloud platforms (AWS, GCP, or Azure) and their data services. Strong foundation and expertise in HLD and LLD, as well as design patterns, preferably using Spring Boot or Google Guice Experience in big data technologies such as Spark, Hadoop, Kafka, and distributed computing frameworks. Hands-on experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery Deep knowledge of data governance, security, and compliance (GDPR, SOC2, etc.). Experience in NoSQL databases like Redis, Cassandra, MongoDB, and TiDB. Familiarity with automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK. Proven ability to drive technical strategy and align it with business objectives. Strong leadership, communication, and stakeholder management skills. Preferred Qualifications Experience in machine learning infrastructure or MLOps is a plus. Exposure to real-time data processing and analytics. Interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture. Prior experience in a SaaS or high-growth tech company. Perks, Benefits and Work Culture Testimonial from a designer: 'One of the things I love about the design team at Wingify is the fact that every designer has a style which is unique to them. The second best thing is non-compliance to pre-existing rules for new products. So I just don't follow guidelines, I help create them.' Skills: infrastructure,soc2,ansible,drive,data governance,redshift,gdpr,javascript,cassandra,design,spring boot,jenkins,docker,mongodb,java,tidb,elk,python,php,aws,snowflake,lld,chef,bigquery,gcp,golang,html,data,kafka,grafana,kubernetes,scala,css,hadoop,azure,redis,sql,data processing,spark,hld,node.js,google guice,compliance
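Purely as an illustration of the batch-processing work this role oversees, here is a minimal PySpark sketch that rolls up raw events into a partitioned, warehouse-friendly table; the bucket paths and column names are hypothetical placeholders rather than anything from the posting.

```python
# Minimal PySpark sketch of a daily batch aggregation pipeline.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

# Read one day of raw events from the data lake.
events = spark.read.parquet("s3a://raw-zone/events/dt=2024-01-01/")

daily_rollup = (events
    .filter(F.col("event_type").isNotNull())
    .groupBy("tenant_id", "event_type")
    .agg(F.count("*").alias("event_count"),
         F.countDistinct("user_id").alias("unique_users")))

# A partitioned write keeps downstream warehouse loads cheap to prune.
daily_rollup.write.mode("overwrite").partitionBy("tenant_id").parquet(
    "s3a://curated-zone/daily_event_rollup/dt=2024-01-01/")
```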

Posted 3 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Orbital We’re Orbital — one of the fastest-growing sales AI startups. Based in New York City and Hyderabad, we’re building cutting-edge AI for go-to-market teams. With $4.5M in funding from Silicon Valley investors, Orbital is a platform that helps sales teams sell 10x faster to SMBs using AI agents. Today, we’re trusted by Fortune 100 companies and startups alike. Orbital was originally founded in 2022 by Ani Kunaparaju and Riley Soward, who split their time between New York City and Hyderabad. We have a fun, in-person culture, and a rapidly growing team around the world. About the Role We’re looking for a Senior Software Engineer to help design, build, and scale our AI-powered platform. As a core member of our engineering team, you’ll work directly with our founders and have a major impact on the architecture, performance, and reliability of Orbital. This is a high-impact, hands-on role where you'll shape the future of AI-driven sales while working with cutting-edge AI/ML technologies. What You’ll Do Architect & Develop: Design, implement, and optimize scalable backend systems and AI-driven workflows. AI & Automation: Work with AI models and no-code/low-code automation to enhance sales intelligence. Scalability & Performance: Optimize our platform for high availability, low latency, and seamless integrations. Code & Best Practices: Write clean, maintainable, and efficient code while following engineering best practices. Collaboration: Work closely with founders, product managers, and customers to ship impactful features. Mentorship: Guide junior engineers, contribute to code reviews, and help elevate the technical culture. What We’re Looking For 4+ years of experience in software engineering, preferably in fast-paced startups or high-growth tech companies. Expertise in backend development using Python, Node.js, or similar languages. Strong database skills (PostgreSQL, MongoDB, or similar). Experience with cloud platforms (AWS, GCP, or Azure). Problem-Solving Mindset: Ability to navigate ambiguity and develop innovative solutions. Hustle & Ownership: You thrive in a fast-paced environment and take full ownership of your work. Bonus Points for experience in AI & LLM agentic frameworks or building low-code platforms. Additional Perks Fast-paced, high-energy startup environment Free meals and drinks at the office Top-of-the-line health insurance Opportunity to work with cutting-edge AI technologies If you’re excited about building AI-powered products, working with a world-class team, and making an impact in a high-growth startup, we’d love to hear from you! 🚀

Posted 3 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Java Full Stack Developer
Exp: 5+ Years
Mandate Skill: Spring Boot for backend development and proficiency in ReactJS for front-end development
Required Skills
Backend: Java, Spring Boot, Microservices, REST APIs, JPA/Hibernate
Frontend: ReactJS, JavaScript, TypeScript, Redux
Database: PostgreSQL, MySQL, MongoDB
Cloud & DevOps: Docker, Kubernetes, CI/CD, GitHub Actions or Jenkins
Messaging & Caching: Kafka, Redis
Agile Practices: Jira, Confluence, Scrum
Salary: Up to INR 20 LPA
We are looking for a mid-level full stack developer with a strong backend focus to join our team. The ideal candidate should have hands-on experience in Spring Boot for backend development and be proficient in ReactJS for front-end development. The candidate will be responsible for developing, enhancing, and maintaining enterprise applications while working in an Agile environment.
Key Responsibilities
Backend Development: Design, develop, and maintain RESTful APIs using Spring Boot and Java. Implement microservices architecture and ensure high-performance applications. Work with relational and NoSQL databases, optimizing queries and performance. Integrate with third-party APIs and messaging queues (Kafka, RabbitMQ).
Frontend Development: Build and maintain user interfaces using ReactJS and modern UI frameworks. Ensure seamless API integration between front-end and back-end systems. Implement reusable components and optimize front-end performance.
DevOps & Deployment: Work with Docker and Kubernetes for application deployment. Ensure CI/CD pipeline integration and automation.
Collaboration & Agile Process: Work closely with onshore and offshore teams in a POD-based delivery model. Participate in daily stand-ups, sprint planning, and retrospectives. Write clean, maintainable, and well-documented code following best practices.
Preferred Qualifications
Prior experience working on Albertsons projects is a huge plus. Familiarity with Google Cloud Platform (GCP) or any cloud platform. Exposure to monitoring tools like Prometheus, Grafana. Strong problem-solving skills and ability to work independently.

Posted 3 days ago

Apply

1.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Snapmint
Snapmint is a leading fintech company redefining access to consumer credit in India. With over 10 million customers across 2,200+ cities, our zero-cost EMI platform enables responsible purchases without the need for a credit card across categories like fashion, electronics, and lifestyle. India has over 300 million credit-eligible consumers, yet fewer than 35 million actively use credit cards. Snapmint addresses this gap by offering a trusted, transparent alternative grounded in financial inclusion and ethical lending practices. Founded in 2017, Snapmint is a profitable, high-growth company doubling year-on-year. Our founding team, alumni of IIT Bombay and ISB, brings deep experience from companies like Oyo, Ola, Maruti Suzuki, and has successfully built and exited ventures in ad-tech, patent analytics, and bank-tech. We are building the future of responsible consumer finance: simple, transparent, and customer-first. https://snapmint.com/
About the role
The Snapmint team is looking for a Frontend Developer (SDE I) with a passion for working closely with product managers and other developers to implement innovative solutions to challenging web development problems. In this role, you will be responsible for supporting new and ongoing digital projects, including corporate and client microsites and integration with data and marketing platforms, as well as driving automation and ensuring automated test scripts are completed for new features. We are looking for a talented Frontend Engineer who will contribute to the success of Snapmint by providing analysis of problems and recommended solutions and working collaboratively with a tight-knit product development team.
Basic Qualifications
BE/BTech or equivalent. 1-3 years of work experience. 1-3 years of experience working in React and able to demonstrate strong JavaScript knowledge. Proficient in Next.js, TypeScript, and Tailwind CSS for building modern web applications; strong understanding of OOP (Object-Oriented Programming) concepts. Prior experience working with PostgreSQL and MongoDB is good to have.
Preferred Qualifications
Proven success in communicating with users, other technical teams, and senior management. Strong experience in systems architecture design and development. Strong experience building single-page and progressive web applications. Experience using test-driven development practices. Ability to efficiently manage and build large, complex web applications. Strong analytical and problem-solving skills, with good attention to detail. Excellent oral and written communication skills. Self-motivated, with the ability to work independently. Ability to use creative thinking to develop innovative solutions to business problems. Strong project management skills, including the ability to manage multiple projects simultaneously.
Location: Gurgaon (Unitech Cyber Park, Sector 39)
Work Days: Monday - Friday

Posted 3 days ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Role: Senior Software Engineer Experience Required : 4-6 years Skills: Java, Springboot Location : Sector 16 , Noida Work Mode: 5 days (Work from Office) Interview Mode : Face2Face Notice Period: Immediate/Serving only About Times Internet At Times Internet, we create premium digital products that simplify and enhance the lives of millions. As India’s largest digital products company, we have a significant presence across a wide range of categories, including News, Sports, Fintech, and Enterprise solutions. Our portfolio features market-leading and iconic brands such as TOI, ET, NBT, Cricbuzz, Times Prime, Times Card, Indiatimes, Whatshot, Abound, Willow TV, Techgig and Times Mobile among many more. Each of these products is crafted to enrich your experiences and bring you closer to your interests and aspirations. As an equal opportunity employer, Times Internet strongly promotes inclusivity and diversity. We are proud to have achieved overall gender pay parity in 2018, verified by an independent audit conducted by Aon Hewitt. We are driven by the excitement of new possibilities and are committed to bringing innovative products, ideas, and technologies to help people make the most of every day. Join us and take us to the next level! About the Business Unit: Architecture and Group Initiatives (AGI) AGI owns the world-class Enterprise CMS solutions that empower all digital newsrooms within Times Internet and beyond. The solutions include state-of-the-art authoring tools with AI-enabled generative and assistive features, analytics and reporting tools and services that easily scale to the millions of requests per minute. This unique scaling need and engineering of state-of-the-art products make AGI a place of constant evolution and innovation across product, design and engineering in the ever-growing digital and print media industry landscape. About the role: We seek a highly skilled and experienced Java Senior Software Engineer to join our dynamic team who can play a key role in designing, developing, and maintaining our Internet-based applications. As a Senior Engineer, you have to actively participate in designing and implementing projects with high technical complexity, scalability, and performance implications. You will collaborate with cross-functional teams to deliver high-quality software solutions that meet customer needs and business objectives. Roles and Responsibilities Design, development, and testing of large-scale and high-performance web applications and frameworks. Create reusable frameworks through hands-on development and unit testing. Write clean, efficient, and maintainable code following best practices and coding standards. Troubleshoot and debug issues, and implement solutions on time. Participate in architectural discussions and contribute to the overall technical roadmap. Stay updated on emerging technologies and trends in Java development, and make recommendations for adoption where appropriate. Skills Required: Bachelor's degree in Computer Science, Engineering, or a related field. 4+ years of hands-on experience in Java development, with a strong understanding of core Java concepts and object-oriented programming principles. Proficiency in Spring framework, including Spring Boot, Spring MVC, and Spring Data. Experience with Kafka for building distributed, real-time streaming applications. Strong understanding of relational databases such as MySQL, including schema design and optimization. Proficiency in writing SQL Queries is a must. 
Experience with NoSQL Databases such as MongoDB, and Redis. Experience with microservices architecture and containerization technologies such as Docker and Kubernetes. Excellent problem-solving skills and attention to detail. Knowledge of software development lifecycle methodologies such as Agile or Scrum. Strong communication and collaboration skills. Ability to work effectively in a fast-paced environment and manage multiple priorities. Self-motivation and the ability to work under minimal supervision.

Posted 3 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role Summary: As a Senior Database Developer at OneMagnify, you will play a pivotal role in designing, developing, and maintaining scalable database solutions that support critical business functions. This position requires expertise in database management systems, a collaborative mindset, and a passion for driving technical innovation. You will work closely with cross-functional teams to optimize database performance, implement robust data integration strategies, and ensure the integrity and reliability of our data systems. Core Responsibilities: Design, develop, and maintain scalable, high-performance database solutions that support critical business functions. Optimize database performance and ensure data integrity through proactive monitoring, tuning, and troubleshooting. Collaborate with cross-functional teams, including software engineers, data scientists, and business analysts, to determine database requirements and implement effective solutions. Lead data migration and integration projects to consolidate and transform data across multiple systems. Develop and enforce best practices for database design, data modeling, and query optimization. Implement high-availability and disaster recovery strategies to ensure system reliability. Contribute to architectural decisions involving database structures, scalability, and performance. Provide mentorship and technical guidance to junior developers through code reviews, knowledge sharing, and training sessions. Stay up-to-date with emerging database technologies and evaluate their applicability to our business needs. Requirements: Education & Experience: Bachelor’s degree in Information Technology, Computer Science, or a related field, or equivalent professional experience. Technical Proficiency: Expertise in relational database management systems (SQL Server, Oracle, PostgreSQL, etc.). Proficiency in writing and optimizing complex SQL queries and stored procedures. Familiarity with NoSQL databases (e.g., MongoDB, DynamoDB) and cloud database solutions (e.g., AWS RDS, Azure SQL Database). Soft Skills: Strong problem-solving abilities, attention to detail, and the ability to communicate technical insights to both technical and non-technical stakeholders. Bonus Qualifications: Experience with data security, compliance standards (e.g., GDPR, HIPAA), and disaster recovery. Familiarity with agile development methodologies and DevOps practices. Experience with database performance monitoring and automated testing tools.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Company Size Large-scale / Global Experience Required 5 - 8 years Working Days 6 days/week Office Location Viman Nagar, Pune Role & Responsibilities Lead scalable, high-performance application architecture. Develop and design enterprise-grade applications. Manage Azure DevOps processes and performance optimization. Conduct solution design, RCA documentation, and interact with cross-functional teams. Ideal Candidate Qualification: Graduation in Computers/Electronics or Post-Graduation in Computer Science. Experience: 5–8 years in software/application development. Mandatory Technical Skills Core Technologies: Python, FastAPI, React/TypeScript, Langchain, LangGraph, AI Agents Docker, Azure Open AI, Prompt Engineering Cloud & Infrastructure: AWS (Secrets Manager, IAM, ECS/EC2), Azure AD, Azure DevOps, GitHub Database & Performance: MongoDB (Motor, Beanie ODM), Redis, caching strategies Security: OAuth2/SAML, JWT, Azure AD integration, audit logging Soft Skills: Strong problem-solving, mentoring, technical communication, Independent contributor with high ownership mindset Perks, Benefits and Work Culture Our people define our passion and our audacious, incredibly rewarding achievements. Bajaj Finance Limited is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India. Skills: redis,motor,mentoring,python,langchain,ai agents,github,react,auditing logging,oauth2,problem-solving,azure,technical communication,azure open ai,langgraph,azure devops,jwt,fastapi,prompt engineering,docker,devops,beanie odm,mongodb,typescript,saml,azure ad,aws
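As a rough sketch of the caching strategy this posting calls out (MongoDB behind Redis), here is a minimal read-through cache helper in Python; the connection strings, key format, and TTL are hypothetical placeholders, and a production version would also invalidate entries on writes.

```python
# Minimal sketch of a read-through Redis cache in front of a MongoDB lookup.
# Connection strings, collection, key format, and TTL are hypothetical placeholders.
import json

import redis
from pymongo import MongoClient

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)
users = MongoClient("mongodb://localhost:27017")["app"]["user_profiles"]


def get_user_profile(user_id: str, ttl_seconds: int = 300) -> dict | None:
    """Return a user profile, serving from Redis when a fresh copy exists."""
    key = f"user_profile:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)

    # Cache miss: fall back to MongoDB and populate the cache with a TTL.
    profile = users.find_one({"_id": user_id}, {"password_hash": 0})
    if profile is not None:
        cache.setex(key, ttl_seconds, json.dumps(profile, default=str))
    return profile
```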

Posted 3 days ago

Apply

0 years

0 Lacs

Sahibzada Ajit Singh Nagar, Punjab, India

On-site

Apptunix is a leading Mobile App & Web Solutions development agency, based out of Texas, US. The agency empowers cutting-edge startups & enterprise businesses, paving the path for their incremental growth via technology solutions. Established in mid-2013, Apptunix has since then engaged in elevating the client’s interests & satisfaction through rendering improved and innovative Software and Mobile development solutions. The company strongly comprehends business needs and implements them by merging advanced technologies with its seamless creativity. Apptunix currently employs 250+ in-house experts who work closely & dedicatedly with clients to build solutions as per their customers' needs. Apptunix is hiring for NodeJs Team Lead. Hurry Up, Apply. Roles and Responsibilities: Deep Experience working on Node.js Understanding of SQL and NoSQL database systems with their pros and cons Experience working with databases like MongoDB. Solid Understanding of MVC and stateless APIs & building RESTful APIs Should have experience and knowledge of scaling and security considerations Integration of user-facing elements developed by front-end developers with server-side logic Good experience with ExpressJs, MongoDB, AWS S3 and ES6 Writing reusable, testable, and efficient code Design and implementation of low-latency, high-availability, and performance applications Implementation of security and data protection Integration of data storage solutions and Database structure

Posted 3 days ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Full-Stack Developer - Immediate Joiner
About the Role: We’re looking for a talented and motivated Full-Stack Developer who can join immediately and hit the ground running on our dynamic development team. You’ll work across the MERN stack (MongoDB, Express.js, React, Node.js) along with modern HTML and CSS to build and maintain responsive, high-quality web applications.
Responsibilities: Design, develop, and maintain web applications using the MERN stack. Create responsive and visually appealing user interfaces using HTML, CSS, and React. Write clean, efficient, and reusable code for both front-end and back-end components. Ensure the performance, quality, and responsiveness of applications. Identify and correct bottlenecks and fix bugs. Help maintain code quality, organization, and automation. Participate in code reviews and contribute to team discussions on architecture and design.
Requirements: Good communication skills. 1 year of professional work experience. Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent experience). Strong proficiency in HTML5, CSS3, and modern JavaScript (ES6+). Thorough understanding of React.js and its core principles. Proficiency in Node.js and Express.js for server-side development. Experience with MongoDB and Mongoose ODM. Familiarity with RESTful APIs and modern authorization mechanisms, such as JSON Web Tokens. Experience with modern front-end build pipelines and tools, such as Babel, Webpack, NPM, etc. Familiarity with code versioning tools (such as Git).
Nice to Have: Experience with GraphQL. Familiarity with cloud services (AWS, Google Cloud, or Azure).
Key Competencies: Strong problem-solving skills and attention to detail. Good communication and teamwork abilities. Self-motivated and able to work independently when required. Passion for learning and staying updated with the latest web technologies.
What We Offer: Competitive salary. Opportunities for professional growth and learning. Collaborative and innovative work environment. Chance to work on challenging and impactful projects. Flexible work arrangements.
Join our team and help build robust, scalable, and user-friendly web applications using the latest technologies in full-stack development. Your contributions will be key to delivering high-quality software solutions that not only function flawlessly but also look great and provide an excellent user experience.

Posted 3 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

MongoDB’s mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. We enable organizations of all sizes to easily build, scale, and run modern applications by helping them modernize legacy workloads, embrace innovation, and unleash AI. Our industry-leading developer data platform, MongoDB Atlas, is the only globally distributed, multi-cloud database and is available in more than 115 regions across AWS, Google Cloud, and Microsoft Azure. Atlas allows customers to build and run applications anywhere—on premises, or across cloud providers. With offices worldwide and over 175,000 new developers signing up to use MongoDB every month, it’s no wonder that leading organizations, like Samsung and Toyota, trust MongoDB to build next-generation, AI-powered applications. Cloud Operations Engineers are responsible for building internal tools and process automation. Day-to-day duties are creating and monitoring systems alert dashboards, reviewing critical event and system logs, accessing customer instances that underpin their production databases, and performing server administration duties including performance troubleshooting. Applicants must be critical thinkers who are quick to detect, resolve, or escalate issues that are sometimes broad in scope and difficult to trace. We are looking to speak to candidates who are based in Bengaluru for our hybrid working model. Responsibilities Help scale the Cloud Operations Engineering team with the strategic implementation and refinement of processes and tools Provide career development feedback and advice to direct reports Identify and measure team health indicators and performance metrics Ensure proper team focus on priorities, objectives, and related deliverables Collaborate with technical and non-technical teams across the company Balance your time between leading your team, working on customer incidents and being involved in projects Be a source of guidance and advice to your own team members and other teams within MongoDB Build a relationship with your team around trust Successfully coordinate with a global team of Cloud Operations Engineers who are tasked with ensuring our uptime guarantees to the MongoDB Atlas customer base Participate in designing and building internal tools Assist in scoping, designing and deploying systems that reduce Mean Time to Resolve for customer incidents Monitor and detect emerging customer-facing incidents on the Atlas platform; assist in their proactive resolution Automate internal processes, routine monitoring and troubleshooting tasks Diagnose live incidents, differentiate between platform issues versus usage issues, and take the next steps toward resolution Cooperate with our Product Management and Cloud Engineering organizations by identifying areas for improvement in the management applications powering the Atlas infrastructure Coordinate and participate in a weekly on-call rotation, where you will handle short term customer incidents (from direct surveillance or through alerts via our Technical Services Engineers) Requirements Management skills, with hands-on experience running small to mid sized Engineering Teams in a rapid-growth environment Strong diagnostic/troubleshooting process, with significant experience troubleshooting end-to-end technical issues in production environments Experience supervising, leading and monitoring progress of Software Development projects. 
Patience, empathy, and a genuine desire to help others Excellent communication skills, both written and verbal Ability to think on your feet, remain calm under pressure, and find solutions to challenges in real-time Experience with being an oncall DevOps, SRE, or Cloud Operations engineer Expertise with Linux system administration and networking technologies Knowledge of database and distributed system operations and concepts Knowledgeable about a wide range of web and internet technologies Familiarity with Amazon Web Services and other Cloud infrastructure platforms (e.g. GCP, Azure) Experience in monitoring, system performance data collection and analysis, and reporting Capability to write programs/scripts to solve both short-term systems problems and long term strategic objectives for the Atlas product A CS/CE degree or equivalent experience At least 2 of the following programming languages: Java, Go, Python, Typescript A keen interest in learning new skills and competencies To drive the personal growth and business impact of our employees, we’re committed to developing a supportive and enriching culture for everyone. From employee affinity groups, to fertility assistance and a generous parental leave policy, we value our employees’ wellbeing and want to support them along every step of their professional and personal journeys. Learn more about what it’s like to work at MongoDB, and help us make an impact on the world! MongoDB is committed to providing any necessary accommodations for individuals with disabilities within our application and interview process. To request an accommodation due to a disability, please inform your recruiter. MongoDB is an equal opportunities employer. Req ID 1263073054
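To illustrate the kind of routine monitoring automation this role mentions, here is a minimal Python sketch that flags replica-set members with excessive replication lag via the replSetGetStatus admin command; the connection string, threshold, and alerting hook are hypothetical placeholders, not MongoDB's internal tooling.

```python
# Minimal sketch of a routine operational check: flag replica-set members whose
# replication lag exceeds a threshold. Connection string and threshold are
# hypothetical placeholders; alerting is left as a simple print.
from pymongo import MongoClient

LAG_THRESHOLD_SECONDS = 30


def check_replication_lag(uri: str = "mongodb://localhost:27017") -> list[str]:
    client = MongoClient(uri)
    status = client.admin.command("replSetGetStatus")

    primaries = [m["optimeDate"] for m in status["members"]
                 if m["stateStr"] == "PRIMARY"]
    if not primaries:
        return ["no PRIMARY found - possible election in progress"]
    primary_optime = primaries[0]

    alerts = []
    for member in status["members"]:
        if member["stateStr"] != "SECONDARY":
            continue
        lag = (primary_optime - member["optimeDate"]).total_seconds()
        if lag > LAG_THRESHOLD_SECONDS:
            alerts.append(f"{member['name']} lagging by {lag:.0f}s")
    return alerts


if __name__ == "__main__":
    for alert in check_replication_lag():
        print(alert)  # in practice this would page the on-call rotation
```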

Posted 3 days ago

Apply

5.0 years

0 Lacs

India

Remote

Full Stack Web Developer REMOTE Full Time - Permanent Position We are searching for dynamic and creative full stack web developers who have the knowledge and experience to help shape our category leading solutions. Our small growing team is responsible for developing and maintaining products and services that aid clients in the property insurance industry in quickly accessing reliable, current risk data enabling more informed business decisions while enhancing the customer experience. If you are looking to join a team that is vital to the success of a market leader, this is the perfect opportunity for you. What the job is like: You will work closely with a cross functional team of developers, QA engineers, product owners and designers to build cutting edge single page web applications. Work with modern technology stack including A ngular, TypeScript, Sass, HTML, Node/npm, Java, PostgreSQL, MongoDB and AWS. Designing, coding, testing, documenting, and debugging single page web applications to ensure they remain in “category killer” status Install, configure, and administer your own workstation software, including editors, development servers, and client applications Skills Required: Minimum of Bachelor’s Degree in Computer Science / Software Engineering or equivalent degree from a four-year college or university with minimum 3.0 GPA with 5+ years of related work experience Work experience with Java, JavaScript, Angular, TypeScript, HTML, CSS, Sass, XML, SQL Ability to create simple and well-designed solutions to complex software problems Dedication to excellence and championship work ethic Knowledge of internet client/server technologies and experience working building enterprise single page web applications Knowledge of PC and web based software testing in both client and server sides Team-player mindset with strong communication and problem solving skills Regards, Anand Swaroop Saicon Consultants, Inc. 913-358-6868 (USA - Work) Email : aswaroop@saicongroup.com URL: www.saiconinc.com LinkedIn WBE/MBE Inc. 500 Company 2006, 2007, 2008, 2009 Ranked #1 " Fastest-Growing Area Businesses" - Kansas City Business Journal Ranked in Top 10 of Corporate 100 - Ingram's CMMI Level 3 Assessed

Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies