Home
Jobs

8288 Nosql Jobs - Page 14

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Setup a job Alert
Filter
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

0.0 - 8.0 years

0 Lacs

Gurugram, Haryana

On-site

Indeed logo

Role Description: As a Senior Technical Lead - Front End – React , you will be responsible for developing user interfaces using ReactJS. You will be expected to have a strong understanding of HTML, CSS, JavaScript, and ReactJS. You should also have experience in working with state management libraries like Redux and MobX. Roles & Responsibilities: Strong proficiency in JavaScript, including DOM manipulation & java script object model Thorough understanding of React.JS, its core principles like Hooks, Lifecycle, etc. and workflows such asFlux / Redux Familiar in writing test cases and providing thorough test coverage Familiar with newer specifications of ECMA Scripts along with Bootstrap, HTML & CSS Experience in designing Restful APIs Hands-On with design patterns, error / exception handling & resource management Exposure to DevOps, associated CI/CD tools and code versioning tools like GIT Knowledge of modern authorization mechanisms like JSON Web Token Experience working with various data stores, SQL or NoSQL Decent knowledge of OOPS concepts Technical Skills Skills Requirements: Strong proficiency in React.js and JavaScript. Experience in front-end web development using HTML, CSS, and JavaScript frameworks. Knowledge of web design principles and web accessibility standards. Familiarity with software development life cycle (SDLC) and agile methodologies. Must have excellent communication skills and be able to communicate complex technical information tonon- technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Should be open to new ideas and be willing to learn and develop new skills. Should also be able to work well under pressure and manage multiple tasks and priorities. Nice-to-have skills Qualifications Qualifications 8-10 years of work experience in relevant field B.Tech/B.E/M.Tech or MCA degree from a reputed university. Computer science background is preferred Job Types: Full-time, Permanent Pay: Up to ₹2,500,000.00 per year Benefits: Health insurance Provident Fund Schedule: Monday to Friday UK shift Supplemental Pay: Performance bonus Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required) Education: Bachelor's (Required) Experience: Software development: 8 years (Required) Location: Gurugram, Haryana (Preferred) Work Location: In person

Posted 1 day ago

Apply

6.0 years

10 Lacs

Hyderābād

On-site

GlassDoor logo

Experience- 6+ years Work Mode- Hybrid Job Summary: We are seeking a skilled Informatica ETL Developer with 5+ years of experience in ETL and Business Intelligence projects. The ideal candidate will have a strong background in Informatica PowerCenter , a solid understanding of data warehousing concepts , and hands-on experience in SQL, performance tuning , and production support . This role involves designing and maintaining robust ETL pipelines to support digital transformation initiatives for clients in manufacturing, automotive, transportation, and engineering domains. Key Responsibilities: Design, develop, and maintain ETL workflows using Informatica PowerCenter . Troubleshoot and optimize ETL jobs for performance and reliability. Analyze complex data sets and write advanced SQL queries for data validation and transformation. Collaborate with data architects and business analysts to implement data warehousing solutions . Apply SDLC methodologies throughout the ETL development lifecycle. Support production environments by identifying and resolving data and performance issues. Work with Unix shell scripting for job automation and scheduling. Contribute to the design of technical architectures that support digital transformation. Required Skills: 3–5 years of hands-on experience with Informatica PowerCenter . Proficiency in SQL and familiarity with NoSQL platforms . Experience in ETL performance tuning and troubleshooting . Solid understanding of Unix/Linux environments and scripting. Excellent verbal and written communication skills. Preferred Qualifications: AWS Certification or experience with cloud-based data integration is a plus. Exposure to data modeling and data governance practices. Job Type: Full-time Pay: From ₹1,000,000.00 per year Location Type: In-person Schedule: Monday to Friday Ability to commute/relocate: Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required) Application Question(s): What is your current CTC? What is your expected CTC? What is your current location? What is your notice period/ LWD? Are you comfortable attending L2 F2F interview in Hyderabad? Experience: Informatica powercenter: 5 years (Required) total work: 6 years (Required) Work Location: In person

Posted 1 day ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

About Us ACKO is India’s first and only fully-digital Insurtech (product) company to have taken insurance by storm. You might have seen our cool ads or are already a customer and we hope you have noticed how we are rewriting the rules of the insurance game constantly and persistently. Based out of Bangalore, we are solving for the Indian market. But we are a part of a global wave of insurtech startups such as ZhongAn in China , Oscar, Lemonade, Metromile in the US, that are known to succeed owing to their business models and technology. We are a unicorn backed by a slate of marquee investors like Binny Bansal, Amazon, Ascent capital, Accel, SAIF, Catamaran, General Atlantic and Multiples. In only four years since our inception and operations, our products have reached ~75M unique users. We have partnered with some of the biggest names of the digital ecosystem such as Amazon, Ola, RedBus, Oyo, Lendingkart, ZestMoney, GOMMT group etc. At ACKO, job roles are focused at impact and we’re here to transform the way the industry operates. Innovation drives us and our products, and we are poised to disrupt insurance, powered by our pioneering products. We have changed the landscape of this age old sector in a growing economy like India and have miles to go from here. After having crossed the $1B valuation mark, our eyes set on even bigger milestones. If you think we’re just about growth and numbers, employee wellbeing lies at the core of all our programs and policies. We are a regular ‘Great Place to Work’ winner and consistently feature on Linkedin’s list of top startups. Currently 1000 strong, we are hiring across all functions. The Software Development Engineer - 3ʼs core responsibilities include designing, developing, leading by example, mentoring, and guiding team members on everything from structured problem-solving and architecting large systems to the development of best practices. You'd be working on technologies like Java, Python, Postgres, hazelcast, DynamoDB, SQL, lambda, Kubernetes, Cloud, etc., and highly maintainable and unit-tested software components/systems that address real-world problems. You will be working in a fast-paced and agile work environment delivering quality and scalable solutions that have an immediate business impact. Primary responsibilities: High-level design, development, and evolution management of complex features and subsystems Driving the adoption of best practices & regular participation in code reviews, design, documentation Monitoring and improvement of key engineering metrics such as uptime, performance, and modularity of subsystems Work closely with engineering and non-engineering stakeholders like the product, business, and third-party stakeholdersduring planning and throughout the SDLC to drive engineering in the right direction Collaborate within and outside the team to ensure engineering cohesiveness and consistency Mentor junior engineers and contribute to their success. Hereʼs what we are looking for: Experience level of 6-8 years in fairly complex/large-scale backend systems Strong problem-solving skills, design/architecture skills, and computer science fundamentals Strong hands-on and practical working experience with some high-level programming language(s), with a high focus on LLD & HLD Strong debugging skills, using logs and other monitoring systems Excellent coding skills - should be able to fluently convert the design into code. Hands-on experience working with some kinds of databases, caching, and queuing tools B.E. / B. Tech in Computer Science or equivalent from a reputed college. Practical coding knowledge of Java, Microservices, Distributed Systems Good to Have: Hands-on experience in using cloud infra - like AWS Practical coding knowledge of Python, React Understanding how a mobile app works end-to-end Have used tools for metrics and monitoring of the applications Sense of urgency and ownership Hands-on experience with one of the Postgres/MySql and some NoSQL databases Understanding of Security fundamentals - DDOS/API level security etc Understanding of Microservices architecture Knowledge of standard Queueing mechanisms Understanding of standard Caching mechanisms Understanding of Database Schema Design Show more Show less

Posted 1 day ago

Apply

2.0 - 5.0 years

6 Lacs

Hyderābād

On-site

GlassDoor logo

Must-Have Skills & Traits Core Engineering Advanced Python skills with a strong grasp of clean, modular, and maintainable code practices Experience building production-ready backend services using frameworks like FastAPI, Flask, or Django Strong understanding of software architecture , including RESTful API design, modularity, testing, and versioning. Experience working with databases (SQL/NoSQL), caching layers, and background job queues. AI/ML & GenAI Expertise Hands-on experience with machine learning workflows: data preprocessing, model training, evaluation, and deployment Practical experience with LLMs and GenAI tools such as OpenAI APIs, Hugging Face, LangChain, or Transformers Understanding of how to integrate LLMs into applications through prompt engineering, retrieval-augmented generation (RAG), and vector search Comfortable working with unstructured data (text, images) in real-world product environments Bonus: experience with model fine-tuning, evaluation metrics, or vector databases like FAISS, Pinecone, or Weaviate Ownership & Execution Demonstrated ability to take full ownership of features or modules from architecture to delivery Able to work independently in ambiguous situations and drive solutions with minimal guidance Experience collaborating cross-functionally with designers, PMs, and other engineers to deliver user-focused solutions Strong debugging, systems thinking, and decision-making skills with an eye toward scalability and performance Nice-to-Have Skills Experience in startup or fast-paced product environments. 2-5 years of relevant experience. Familiarity with asynchronous programming patterns in Python. Exposure to event-driven architecture and tools such as Kafka, RabbitMQ, or AWS EventBridge Data science exposure: exploratory data analysis (EDA), statistical modeling, or experimentation Built or contributed to agentic systems, ML/AI pipelines, or intelligent automation tools Understanding of MLOps: model deployment, monitoring, drift detection, or retraining pipelines Frontend familiarity (React, Tailwind) for prototyping or contributing to full-stack features

Posted 1 day ago

Apply

5.0 years

0 Lacs

Gurgaon

On-site

GlassDoor logo

Job Description Alimentation Couche-Tard Inc., (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space with over 17,000 stores in 31 countries, serving more than 6 million customers each day It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long term success. About the role We are looking for a Senior Data Engineer with a collaborative, “can-do” attitude who is committed & strives with determination and motivation to make their team successful. A Sr. Data Engineer who has experience architecting and implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands on sourcing, manipulation, and delivery of data from enterprise business systems to data lake and data warehouse. This role will help drive Circle K’s next phase in the digital journey by modeling and transforming data to achieve actionable business outcomes. The Sr. Data Engineer will create, troubleshoot and support ETL pipelines and the cloud infrastructure involved in the process, will be able to support the visualizations team. Roles and Responsibilities Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals. Demonstrate deep technical and domain knowledge of relational and non-relation databases, Data Warehouses, Data lakes among other structured and unstructured storage options. Determine solutions that are best suited to develop a pipeline for a particular data source. Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development. Efficient in ETL/ELT development using Azure cloud services and Snowflake, Testing and operation/support process (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance). Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery. Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders. Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability). Stay current with and adopt new tools and applications to ensure high quality and efficient solutions. Build cross-platform data strategy to aggregate multiple sources and process development datasets. Proactive in stakeholder communication, mentor/guide junior resources by doing regular KT/reverse KT and help them in identifying production bugs/issues if needed and provide resolution recommendation. Job Requirements Bachelor’s Degree in Computer Engineering, Computer Science or related discipline, Master’s Degree preferred. 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment. 5+ years of experience with setting up and operating data pipelines using Python or SQL 5+ years of advanced SQL Programming: PL/SQL, T-SQL 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization. Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads. 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data. 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions. 5+ years of experience in defining and enabling data quality standards for auditing, and monitoring. Strong analytical abilities and a strong intellectual curiosity In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts Understanding of REST and good API design. Experience working with Apache Iceberg, Delta tables and distributed computing frameworks Strong collaboration and teamwork skills & excellent written and verbal communications skills. Self-starter and motivated with ability to work in a fast-paced development environment. Agile experience highly desirable. Proficiency in the development environment, including IDE, database server, GIT, Continuous Integration, unit-testing tool, and defect management tools. Knowledge Strong Knowledge of Data Engineering concepts (Data pipelines creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management). Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques. Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks. Working Knowledge of Dev-Ops processes (CI/CD), Git/Jenkins version control tool, Master Data Management (MDM) and Data Quality tools. Strong Experience in ETL/ELT development, QA and operation/support process (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance). Hands on experience in Databases like (Azure SQL DB, MySQL/, Cosmos DB etc.), File system (Blob Storage), Python/Unix shell Scripting. ADF, Databricks and Azure certification is a plus. Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (Powershell, Bash), Git, Terraform, Power BI, Snowflake #LI-DS1

Posted 1 day ago

Apply

7.0 years

7 - 7 Lacs

Gurgaon

On-site

GlassDoor logo

Engineer III, Database Engineering Gurgaon, India; Hyderabad, India Information Technology 316332 Job Description About The Role: Grade Level (for internal use): 10 Role: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P CapitalIQ Pro to serve-up value-added Ratings, Research and related information to the Institutional clients. The Team: Our team is responsible for the gathering data from multiple sources spread across the globe using different mechanism (ETL/GG/SQL Rep/Informatica/Data Pipeline) and convert them to a common format which can be used by Client facing UI tools and other Data providing Applications. This application is the backbone of many of S&P applications and is critical to our client needs. You will get to work on wide range of technologies and tools like Oracle/SQL/.Net/Informatica/Kafka/Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global? Impact: Our Team is responsible to deliver essential and business critical data with applied intelligence to power the market of the future. This enables our customer to make decisions with conviction. Contribute significantly to the growth of the firm by- Developing innovative functionality in existing and new products Supporting and maintaining high revenue productionized products Achieve the above intelligently and economically using best practices Career: This is the place to hone your existing Database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated and collaborate with developers, business analysts and product managers who are experts in their domain. Your skills: You should be able to demonstrate that you have an outstanding knowledge and hands-on experience in the below areas: Complete SDLC: architecture, design, development and support of tech solutions Play a key role in the development team to build high-quality, high-performance, scalable code Engineer components, and common services based on standard corporate development models, languages and tools Produce technical design documents and conduct technical walkthroughs Collaborate effectively with technical and non-technical stakeholders Be part of a culture to continuously improve the technical design and code base Document and demonstrate solutions using technical design docs, diagrams and stubbed code Our Hiring Manager says: I’m looking for a person that gets excited about technology and motivated by seeing how our individual contribution and team work to the world class web products affect the workflow of thousands of clients resulting in revenue for the company. Qualifications Required: Bachelor’s degree in computer science, Information Systems or Engineering. 7+ years of experience on Transactional Databases like SQL server, Oracle, PostgreSQL and other NoSQL databases like Amazon DynamoDB, MongoDB Strong Database development skills on SQL Server, Oracle Strong knowledge of Database architecture, Data Modeling and Data warehouse. Knowledge on object-oriented design, and design patterns. Familiar with various design and architectural patterns Strong development experience with Microsoft SQL Server Experience in cloud native development and AWS is a big plus Experience with Kafka/Sonic Broker messaging systems Nice to have: Experience in developing data pipelines using Java or C# is a significant advantage. Strong knowledge around ETL Tools – Informatica, SSIS Exposure with Informatica is an advantage. Familiarity with Agile and Scrum models Working Knowledge of VSTS. Working knowledge of AWS cloud is an added advantage. Understanding of fundamental design principles for building a scalable system. Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options, Index/Benchmarks is desirable. Additionally, experience with Scala, Python and Spark applications is a plus. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316332 Posted On: 2025-06-16 Location: Gurgaon, Haryana, India

Posted 1 day ago

Apply

3.0 years

8 - 9 Lacs

Gurgaon

On-site

GlassDoor logo

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together. Description United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions. Our Values : At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world. Job overview and responsibilities United Airlines is seeking talented people to join the Data Engineering team. Data Engineering organization is responsible for driving data driven insights & innovation to support the data needs for commercial and operational projects with a digital focus. You will work as a Senior Engineer - Machine Learning and collaborate with data scientists and data engineers to: Build high-performance, cloud-native machine learning infrastructure and services to enable rapid innovation across United Build complex data ingestion and transformation pipelines for batch and real-time data Support large scale model training and serving pipelines in distributed and scalable environment This position is offered on local terms and conditions within United’s wholly owned subsidiary United Airlines Business Services Pvt. Ltd. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. United Airlines is an equal opportunity employer. United Airlines recruits, employs, trains, compensates, and promotes regardless of race, religion, color, national origin, gender identity, sexual orientation, physical ability, age, veteran status, and other protected status as required by applicable law. Qualifications Required BS/BA, in Advanced Computer Science, Data Science, Engineering or related discipline or Mathematics experience required Strong software engineering experience with Python and at least one additional language such as Go, Java, or C/C++ Familiarity with ML methodologies and frameworks (e.g., PyTorch, Tensorflow) and preferably building and deploying production ML pipelines Experience developing cloud-native solutions with Docker and Kubernetes Cloud-native DevOps, CI/CD experience using tools such as Jenkins or AWS CodePipeline; preferably experience with GitOps using tools such as ArgoCD, Flux, or Jenkins X Experience building real-time and event-driven stream processing pipelines with technologies such as Kafka, Flink, and Spark Experience setting up and optimizing data stores (RDBMS/NoSQL) for production use in the ML app context Strong desire to stay aligned with the latest developments in cloud-native and ML ops/engineering and to experiment with and learn new technologies Experience 3 + years of software engineering experience with languages such as Python, Go, Java, Scala, Kotlin, or C/C++ 2 + years of experience working in cloud environments (AWS preferred) 2 + years of experience with Big Data technologies such as Spark, Flink 2 + years of experience with cloud-native DevOps, CI/CD At least one year of experience with Docker and Kubernetes in a production environment Must be legally authorized to work in India for any employer without sponsorship Must be fluent in English and Hindi (written and spoken) Successful completion of interview required to meet job qualification Reliable, punctual attendance is an essential function of the position Preferred Masters in computer science or related STEM field

Posted 1 day ago

Apply

3.0 years

0 Lacs

Gurgaon

On-site

GlassDoor logo

Job Description Alimentation Couche-Tard Inc., (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space with over 17,000 stores in 31 countries, serving more than 6 million customers each day It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long term success. About the role We are looking for a Data Engineer with a collaborative, “can-do” attitude who is committed & strives with determination and motivation to make their team successful. A Data Engineer who has experience implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands on sourcing, manipulation, and delivery of data from enterprise business systems to data lake and data warehouse. This role will help drive Circle K’s next phase in the digital journey by transforming data to achieve actionable business outcomes. Roles and Responsibilities Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals Demonstrate technical and domain knowledge of relational and non-relational databases, Data Warehouses, Data lakes among other structured and unstructured storage options Determine solutions that are best suited to develop a pipeline for a particular data source Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development Efficient in ELT/ETL development using Azure cloud services and Snowflake, including Testing and operational support (RCA, Monitoring, Maintenance) Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics deliver Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability) Stay current with and adopt new tools and applications to ensure high quality and efficient solutions Build cross-platform data strategy to aggregate multiple sources and process development datasets Proactive in stakeholder communication, mentor/guide junior resources by doing regular KT/reverse KT and help them in identifying production bugs/issues if needed and provide resolution recommendation Job Requirements Bachelor’s degree in Computer Engineering, Computer Science or related discipline, Master’s Degree preferred 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment 3+ years of experience with setting up and operating data pipelines using Python or SQL 3+ years of advanced SQL Programming: PL/SQL, T-SQL 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads 3+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions 3+ years of experience in defining and enabling data quality standards for auditing, and monitoring Strong analytical abilities and a strong intellectual curiosity. In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts Understanding of REST and good API design Experience working with Apache Iceberg, Delta tables and distributed computing frameworks Strong collaboration, teamwork skills, excellent written and verbal communications skills Self-starter and motivated with ability to work in a fast-paced development environment Agile experience highly desirable Proficiency in the development environment, including IDE, database server, GIT, Continuous Integration, unit-testing tool, and defect management tools Preferred Skills Strong Knowledge of Data Engineering concepts (Data pipelines creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management) Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks Working Knowledge of Dev-Ops processes (CI/CD), Git/Jenkins version control tool, Master Data Management (MDM) and Data Quality tools Strong Experience in ETL/ELT development, QA and operation/support process (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance) Hands on experience in Databases like (Azure SQL DB, MySQL/, Cosmos DB etc.), File system (Blob Storage), Python/Unix shell Scripting ADF, Databricks and Azure certification is a plus Technologies we use : Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (Powershell, Bash), Git, Terraform, Power BI, Snowflake #LI-DS1

Posted 1 day ago

Apply

4.0 years

5 - 10 Lacs

Gurgaon

On-site

GlassDoor logo

As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact. How You Will Contribute: As an AI Verification Engineer, you will report to the [Insert Hiring Manager Title or Name], and work closely with cross-functional teams to ensure the quality, performance, and security of AI-powered components embedded within Ciena’s intelligent networking solutions. You will be a key player in validating AI/ML models, prompt engineering strategies, and knowledge base integrations to drive scalable and trustworthy AI solutions. Key responsibilities include: Designing and executing comprehensive test strategies and frameworks for LLM-powered AI agents and applications. Conducting adversarial and edge-case testing to ensure robustness and mitigate risks such as RAG poisoning or prompt injection. Validating the accuracy, concurrency, and effectiveness of RAG pipelines and knowledge base integrations. Engineering and optimizing prompts for generative models using techniques like zero-shot, few-shot, and chain-of-thought prompting. Collaborating with AI/ML and DevOps teams to resolve performance issues and contribute to continuous improvement efforts. The Must Haves: Bachelor’s or Master’s degree in Computer Science, Data Science, Artificial Intelligence, or a related field. 4+ years of experience in software testing, preferably focused on AI/ML or cloud-based systems. Proficient in Python or similar programming languages. Hands-on experience with AI/ML model testing methodologies (functional, performance, integration, security, metamorphic testing, etc.). Working knowledge of APIs, SQL/NoSQL databases, and CI/CD pipelines. Experience validating and troubleshooting large-scale datasets, data pipelines, and LLM applications. Understanding of AI vulnerabilities and risk mitigation strategies in model validation. Assets: Experience with prompt engineering, including iterative refinement and prompt performance evaluation. Familiarity with frameworks such as TensorFlow, PyTorch, or Google ADK. Exposure to testing and evaluating RAG pipelines and knowledge-grounded AI systems. Background in AI system security, including adversarial testing and prevention strategies. Strong communication skills and the ability to document reusable test and prompt strategies effectively. #LI-FA Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox. At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

TransUnion's Job Applicant Privacy Notice What We'll Bring Dot net Full Stack Engineer What You'll Bring Key Responsibilities - Develop and maintain front-end & back-end components of our fraud detection platform. Implement real time data processing and streaming functionalities Design and develop APIs for integrating various microservices Collaborate with cross-functional teams to deliver high quality software solutions Participate in entire application lifecycle , focusing on coding, debugging and testing Ensure the implementation of security protocols and data protection measures Stay UpToDate with emerging trends and technologies in AI/ML, fraud detection, and full stack development Required Qualifications - Bachelors or Masters degree in Computer Science, Engineering or a related field. Minimum of 5 yrs. of experience as a .Net Full Stack Developer. Strong proficiency in programming languages such as .Net, ASP, C and C# Experience with data streaming and processing tools (e.g. Apache Kafka, Spark) Solid experience with RDBMS and NoSQL database concepts Experience with developing RESTful or GraphQL APIs Familiarity with cloud platforms (GCP,AWS, Azure) and containerization tools (Docker, Kubernetes) Strong analytical and problem-solving skills. Impact You'll Make NA This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week. TransUnion Job Title Developer, Software Development Show more Show less

Posted 1 day ago

Apply

4.0 - 6.0 years

0 Lacs

Gurgaon

On-site

GlassDoor logo

Locations: Bengaluru | Gurgaon Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation-inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do As a part of BCG's X team, you will work closely with consulting teams on a diverse range of advanced analytics and engineering topics. You will have the opportunity to leverage analytical methodologies to deliver value to BCG's Consulting (case) teams and Practice Areas (domain) through providing analytical and engineering subject matter expertise.As a Data Engineer, you will play a crucial role in designing, developing, and maintaining data pipelines, systems, and solutions that empower our clients to make informed business decisions. You will collaborate closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver high-quality data solutions that meet our clients' needs. YOU'RE GOOD AT Delivering original analysis and insights to case teams, typically owning all or part of an analytics module whilst integrating with a case team. Design, develop, and maintain efficient and robust data pipelines for extracting, transforming, and loading data from various sources to data warehouses, data lakes, and other storage solutions. Building data-intensive solutions that are highly available, scalable, reliable, secure, and cost-effective using programming languages like Python and PySpark. Deep knowledge of Big Data querying and analysis tools, such as PySpark, Hive, Snowflake and Databricks. Broad expertise in at least one Cloud platform like AWS/GCP/Azure.* Working knowledge of automation and deployment tools such as Airflow, Jenkins, GitHub Actions, etc., as well as infrastructure-as-code technologies like Terraform and CloudFormation. Good understanding of DevOps, CI/CD pipelines, orchestration, and containerization tools like Docker and Kubernetes. Basic understanding on Machine Learning methodologies and pipelines. Communicating analytical insights through sophisticated synthesis and packaging of results (including PPT slides and charts) with consultants, collecting, synthesizing, analyzing case team learning & inputs into new best practices and methodologies. Communication Skills: Strong communication skills, enabling effective collaboration with both technical and non-technical team members. Thinking Analytically You should be strong in analytical solutioning with hands on experience in advanced analytics delivery, through the entire life cycle of analytics. Strong analytics skills with the ability to develop and codify knowledge and provide analytical advice where required. What You'll Bring Bachelor's / Master's degree in computer science engineering/technology At least 4-6 years within relevant domain of Data Engineering across industries and work experience providing analytics solutions in a commercial setting. Consulting experience will be considered a plus. Proficient understanding of distributed computing principles including management of Spark clusters, with all included services - various implementations of Spark preferred. Basic hands-on experience with Data engineering tasks like productizing data pipelines, building CI/CD pipeline, code orchestration using tools like Airflow, DevOps etc.Good to have: - Software engineering concepts and best practices, like API design and development, testing frameworks, packaging etc. Experience with NoSQL databases, such as HBase, Cassandra, MongoDB Knowledge on web development technologies. Understanding of different stages of machine learning system design and development Who You'll Work With You will work with the case team and/or client technical POCs and border X team. Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E - Verify Employer. Click here for more information on E-Verify.

Posted 1 day ago

Apply

2.0 years

0 Lacs

Gurgaon

On-site

GlassDoor logo

Job Summary: We are looking for a passionate and skilled Full Stack Developer with 2–3 years of hands-on experience in building scalable web applications using modern web technologies. The ideal candidate should have strong knowledge in both front-end and back-end development, particularly with JavaScript-based frameworks and NoSQL databases . Key Responsibilities: Design, develop, and maintain full-stack web applications. Write clean, maintainable, and efficient code using Node.js, React.js, and AngularJS . Create responsive front-end interfaces with HTML, CSS, and JavaScript . Integrate with NoSQL databases such as MongoDB or Opensearch. Collaborate with cross-functional teams including designers, product managers, and QA engineers. Troubleshoot, debug, and upgrade existing applications. Participate in code reviews and contribute to a culture of continuous improvement. Ensure security and data protection across applications. Required Skills: Proficient in Node.js , React.js , and AngularJS . Strong front-end skills with HTML , CSS , and JavaScript (ES6+) . Experience working with NoSQL databases (e.g., MongoDB, Opensearch). Working knowledge of Python for backend logic or scripting. Familiarity with RESTful APIs and asynchronous request handling. Good understanding of Git, version control, and development best practices. Strong problem-solving skills and the ability to work independently or in a team. Education UG: Graduate in Any Specialization (B.Tech/B.E.), PG: Post Graduation Not Required. Job Type: Full-time Pay: From ₹30,000.00 per month Benefits: Health insurance Schedule: Day shift Supplemental Pay: Performance bonus Experience: Node.js: 1 year (Required) React: 1 year (Required) Work Location: In person

Posted 1 day ago

Apply

2.0 years

10 - 12 Lacs

Gurgaon

On-site

GlassDoor logo

Job Overview We are looking for a dynamic and innovative Full Stack Data Scientist with 2+ years of experience who excels in end-to-end data science solutions. The ideal candidate is a tech-savvy professional passionate about leveraging data to solve complex problems, develop predictive models, and drive business impact in the MarTech domain. Key Responsibilities 1. Data Engineering & Preprocessing Collect, clean, and preprocess structured and unstructured data from various sources. Perform advanced feature engineering, outlier detection, and data transformation. Collaborate with data engineers to ensure seamless data pipeline development. 2. Machine Learning Model Development Design, train, and validate machine learning models (supervised, unsupervised, deep learning). Optimize models for business KPIs such as accuracy, recall, and precision. Innovate with advanced algorithms tailored to marketing technologies. 3. Full Stack Development Build production-grade APIs for model deployment using frameworks like Flask, FastAPI, or Django. Develop scalable and modular code for data processing and ML integration. 4. Deployment & Operationalization Deploy models on cloud platforms (AWS, Azure, or GCP) using tools like Docker and Kubernetes. Implement continuous monitoring, logging, and retraining strategies for deployed models. 5. Insight Visualization & Communication Create visually compelling dashboards and reports using Tableau, Power BI, or similar tools. Present insights and actionable recommendations to stakeholders effectively. 6. Collaboration & Teamwork Work closely with marketing analysts, product managers, and engineering teams to solve business challenges. Foster a collaborative environment that encourages innovation and shared learning. 7. Continuous Learning & Innovation Stay updated on the latest trends in AI/ML, especially in marketing automation and analytics. Identify new opportunities for leveraging data science in MarTech solutions. Qualifications Educational Background Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, Mathematics, or a related field. Technical Skills Programming Languages: Python (must-have), R, or Julia; familiarity with Java or C++ is a plus. ML Frameworks: TensorFlow, PyTorch, Scikit-learn, or XGBoost. Big Data Tools: Spark, Hadoop, or Kafka. Cloud Platforms: AWS, Azure, or GCP for model deployment and data pipelines. Databases: Expertise in SQL and NoSQL (e.g., MongoDB, Cassandra). Visualization: Mastery of Tableau, Power BI, Plotly, or D3.js. Version Control: Proficiency with Git for collaborative coding. Experience 2+ years of hands-on experience in data science, machine learning, and software engineering. Proven expertise in deploying machine learning models in production environments. Experience in handling large datasets and implementing big data technologies. Soft Skills Strong problem-solving and analytical thinking. Excellent communication and storytelling skills for technical and non-technical audiences. Ability to work collaboratively in diverse and cross-functional teams. Preferred Qualifications Experience with Natural Language Processing (NLP) and Computer Vision (CV). Familiarity with CI/CD pipelines and DevOps for ML workflows. Exposure to Agile project management methodologies. Why Join Us? Opportunity to work on innovative projects with cutting-edge technologies. Collaborative and inclusive work environment that values creativity and growth. If you're passionate about turning data into actionable insights and driving impactful business decisions, we’d love to hear from you! Job Types: Full-time, Permanent Pay: ₹1,000,000.00 - ₹1,200,000.00 per year Benefits: Flexible schedule Health insurance Life insurance Paid sick time Paid time off Provident Fund Schedule: Day shift Fixed shift Monday to Friday Experience: Data science: 2 years (Required) Location: Gurugram, Haryana (Preferred) Work Location: In person

Posted 1 day ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai

Work from Office

Naukri logo

Job Summary This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects. Responsibilities: Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products. Technology: Java, Couchbase/NoSQL, Spring Boot, Microservices/REST API,, Message Broker (AMQ, WMQ), JDBC, CI/CD Pipeline,Cloud Technologies, Application Security, Database, Linux and some shell scripting, Qualifications: Bachelors Degree or International equivalent Bachelor's Degree or International equivalent in Computer Science, Information Systems, Mathematics, Statistics, or related field - Preferred

Posted 1 day ago

Apply

5.0 years

0 Lacs

Mohali

On-site

GlassDoor logo

Apptunix is a leading Mobile App & Web Solutions development agency, based out of Texas, US. The agency empowers cutting-edge startups & enterprise businesses, paving the path for their incremental growth via technology solutions. Established in mid-2013, Apptunix has since then engaged in elevating the client’s interests & satisfaction through rendering improved and innovative Software and Mobile development solutions. The company strongly comprehends business needs and implements them by merging advanced technologies with its seamless creativity. Apptunix currently employs 250+ in-house experts who work closely & dedicatedly with clients to build solutions as per their customers' needs. Required Skills: - Deep Experience working on Node.js - Understanding of SQL and NoSQL database systems with their pros and cons - Experience working with databases like MongoDB. - Solid Understanding of MVC and stateless APIs & building RESTful APIs - Should have experience and knowledge of scaling and security considerations - Integration of user-facing elements developed by front-end developers with server-side logic - Good experience with ExpressJs, MongoDB, AWS S3 and ES6 - Writing reusable, testable, and efficient code - Design and implementation of low-latency, high-availability, and performance applications - Implementation of security and data protection - Integration of data storage solutions and Database structure - Good experience in Nextjs, Microservices, RabbitMQ, Sockets Experience: 5-8 years Job Type: Full-time Schedule: Monday to Friday Work Location: In person

Posted 1 day ago

Apply

3.0 years

0 - 0 Lacs

Mohali

On-site

GlassDoor logo

iTechnolabs is a software development company specializing in web applications, mobile apps and digital marketing services for businesses of all sizes. We help clients with consulting on technology and business strategies to achieve their goals and objectives Job description We are looking for a passionate, highly talented Python Developer with a minimum of 3+ years of experience to develop web applications using Django(Preferable)/flask. The candidate should be a strong individual contributor and be part of the development team and enjoys coding, thinks strongly about how things should be done. Must have: Hands-on experience in python Design, build and maintain efficient, reusable and reliable python code Hands-on experience in frameworks like Django or Flask Should adhere to best practices in Python/Django/Flask Good problem solving skills Knowledge of any of the databases. Relational (PostgreSQL, MySQL) or NoSQL (MongoDB). Testing and debugging software applications with Python test framework tools like Robot, PyTest, PyUnit, etc. Understanding of WebServer, Load Balancer, deployment process / activities, queuing mechanisms and background tasks. Good communication skills Good to have: Knowledge of front-end interfaces using HTML, CSS would be an added advantage. Familiarity with deployments using AWS will be a plus. Experience in Data Analytics/Data Science/Machine Learning. Working with Python libraries like Pandas, NumPy, etc. Familiarity with building RESTful APIs. Experience in monitoring and improving application performance will be a plus. Job Type: Full-time Pay: ₹30,000.00 - ₹90,000.00 per month Location Type: In-person Schedule: Day shift Education: Bachelor's (Preferred) Experience: Python: 4 years (Preferred) Work Location: In person

Posted 1 day ago

Apply

4.0 - 6.0 years

0 Lacs

Bhubaneshwar

On-site

GlassDoor logo

Position: Data Migration Engineer (NV46FCT RM 3324) Required Qualifications: 4–6 years of experience in data migration, data integration, and ETL development. Hands-on experience with both relational (PostgreSQL, MySQL, Oracle, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) databases Experience in Google BigQuery for data ingestion, transformation, and performance optimization. Proficiency in SQL and scripting languages such as Python or Shell for custom ETL logic. Familiarity with ETL tools like Talend, Apache NiFi, Informatica, or AWS Glue. Experience working in cloud environments such as AWS, GCP, or Azure. Solid understanding of data modeling, schema design, and transformation best practices. Preferred Qualifications: Experience in BigQuery optimization, federated queries, and integration with external data sources. Exposure to data warehouses and lakes such as Redshift, Snowflake, or BigQuery. Experience with streaming data ingestion tools like Kafka, Debezium, or Google Dataflow. Familiarity with workflow orchestration tools such as Apache Airflow or DBT. Knowledge of data security, masking, encryption, and compliance requirements in migration scenarios. Soft Skills: Strong problem-solving and analytical mindset with high attention to data quality. Excellent communication and collaboration skills to work with engineering and client teams. Ability to handle complex migrations under tight deadlines with minimal supervision. ******************************************************************************************************************************************* Job Category: Digital_Cloud_Web Technologies Job Type: Full Time Job Location: BhubaneshwarNoida Experience: 4-6 years Notice period: 0-30 days

Posted 1 day ago

Apply

3.0 - 4.0 years

0 Lacs

Raipur

On-site

GlassDoor logo

Job Title: Backend Developer Experience: 3–4 Years Location: Raipur (On-site / Hybrid as applicable) Employment Type: Full-Time Job Summary: We are seeking a skilled and motivated Backend Developer with 3–4 years of hands-on experience in designing, developing, and maintaining robust backend systems. The ideal candidate will work closely with our frontend, DevOps, and product teams to build scalable and high-performance applications. Key Responsibilities: Develop and maintain secure, scalable backend services and APIs. Design database schemas, write optimized queries, and ensure data integrity. Integrate with third-party services and external APIs. Collaborate with frontend and DevOps teams to ensure seamless deployments. Troubleshoot and debug backend issues and implement effective solutions. Participate in code reviews, sprint planning, and architecture discussions. Write clean, maintainable, and well-documented code. Ensure best practices in security, performance, and scalability. Required Skills & Qualifications: 3–4 years of professional experience in backend development. Strong proficiency in one or more backend languages/frameworks: Node.js / Express.js Python / Django / Flask Java / Spring Boot Experience with RESTful APIs and microservice architecture. Strong understanding of relational and NoSQL databases (e.g., PostgreSQL, MongoDB). Experience with version control systems (Git). Familiarity with CI/CD pipelines and cloud platforms (AWS/GCP/Azure) is a plus. Solid understanding of software engineering principles and design patterns. Education: Bachelor’s or Master’s degree in Computer Science, IT, or a related field. Salary: As per industry standards and experience. Job Types: Full-time, Permanent Benefits: Paid sick time Location Type: In-person Schedule: Day shift Morning shift Education: Bachelor's (Required) Language: Hindi, English (Required) Location: Raipur, Chhattisgarh (Required) Work Location: In person

Posted 1 day ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As Consultant, you are responsible to develop design of application, provide regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new mobile solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvements by testing the build solution and working under an agile framework. Discover and implement the latest technologies trends to maximize and build creative solutions Preferred Education Master's Degree Required Technical And Professional Expertise Configure DataStax Cassandra as per requirement of the project solution Design the database system specific to Cassandra in consultation with the data modelers, data architects and specialists as well as the microservices/ functional specialists. Thereby produce an effective database system in Cassandra according to the solution & client's needs and specifications. Interface with functional & data teams to ensure the integrations with other functional and data systems are working correctly and as designed. Participate in responsible or supporting roles in different tests or UAT that involve the DataStax Cassandra database. The role will also need to ensure that the Cassandra database is performing and error free. This will involve troubleshooting errors and performance issues and resolution of the same as well as plan for further database improvement. Ensure the database documentation & operation manual is up to date and usable Preferred Technical And Professional Experience Has expertise, experience and deep knowledge in the configuration, design, troubleshooting of NoSQL server software and related products on Cloud, specifically DataStax Cassandra. Has knowledge/ experience in other NoSQl/ Cloud database. Installs, configures and upgrades RDBMS or NoSQL server software and related products on Cloud Show more Show less

Posted 1 day ago

Apply

5.0 - 7.0 years

0 Lacs

Noida

Remote

GlassDoor logo

ITX is seeking a Fullstack Developer with strong experience in building applications using Java and modern frontend technologies, with a particular emphasis on integrating Kafka for event-driven architectures. The ideal candidate will have expertise in developing robust and scalable backend services, as well as creating dynamic and engaging user interfaces. This role requires a collaborative mindset to work across cross-functional teams and the ability to thrive in an agile and evolving environment. Note: This role is restricted to candidates residing in India. Applications from other locations will not be considered for this position. Responsibilities: Develop and maintain end-to-end web applications using Java for the backend and modern frameworks such as React for the frontend. Build highly responsive user interfaces that integrate seamlessly with backend systems. Implement and maintain Kafka for building scalable and decoupled event-driven systems. Design and manage RESTful APIs , ensuring smooth interaction between frontend applications and third-party services. Collaborate with Product Managers, QA engineers, and other team members to deliver high-quality software solutions. Contribute to the entire software development lifecycle, from concept and design to testing, deployment, and maintenance. Participate in code reviews and provide constructive feedback to peers. Ensure code quality and application performance through effective unit testing and performance optimization. Required Skills: Strong proficiency in Java (Java 8 or above). Experience with Spring Boot for creating microservices and web applications. Solid understanding of Kafka and event-driven architectures. Familiarity with CI/CD practices using tools like Jenkins. Knowledge of relational databases (e.g., Oracle) and NoSQL databases (e.g., MongoDB). Proficient in building and consuming REST APIs . Experience with Agile/Scrum methodologies . Proficient in JavaScript and modern frontend frameworks (e.g., React). Nice-to-Have Skills: Familiarity with React or similar frontend frameworks. Experience with WebFlux and reactive programming. Exposure to DevOps tools and practices. Knowledge of other frontend frameworks like Angular . Experience working with distributed teams and remote collaboration. Qualifications: Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience. 5-7 years of experience in Java backend development. Previous experience working in a multicultural and distributed environment is a plus. Strong interpersonal and communication skills. English proficiency is required for communication with US-based teams.

Posted 1 day ago

Apply

2.0 years

0 - 0 Lacs

Nakūr

On-site

GlassDoor logo

Job Title : Backend Developer (On-Site) Location : Vishavkarma Chowk, Bai Pass, Near State Bank of India, Nakur, Saharanpur UP Company : Digital Innovations Salary : ₹10,000 – ₹25,000 per month (Based on experience) Experience : Fresher to 2+ years Job Summary : We are hiring backend developers who are passionate about building scalable web applications and APIs. The role involves working closely with the frontend and project team to create reliable and secure server-side logic and database architecture. Key Responsibilities : Develop and maintain backend code using PHP (Laravel), Java (Spring Boot), Python (Django), or Node.js Design and manage relational databases like MySQL/PostgreSQL or NoSQL like MongoDB Create, test, and deploy RESTful APIs Implement data security and protection best practices Collaborate with the frontend team to integrate APIs Debug, test, and improve application performance Required Skills : Strong understanding of backend development using any of the following: PHP (Laravel), Java, Python, or Node.js Basic knowledge of REST APIs and database management Familiarity with version control tools like Git Understanding of MVC architecture Knowledge of deployment and hosting is a plus Eligibility : Freshers and candidates with hands-on training/internships welcome Must be able to work full-time on-site in Nakur Why Join Us? Work on live projects across domains Learning-driven environment with full-stack exposure Team-oriented workplace with growth potential Working Days : Monday to Saturday (On-site) Job Type: Full-time Pay: ₹8,000.00 - ₹25,000.00 per month Location Type: In-person Schedule: Day shift Work Location: In person

Posted 1 day ago

Apply

3.0 years

0 Lacs

India

Remote

Linkedin logo

Ready to be pushed beyond what you think you’re capable of? At Coinbase, our mission is to increase economic freedom in the world. It’s a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform — and with it, the future global financial system. To achieve our mission, we’re seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company’s hardest problems. Our work culture is intense and isn’t for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there’s no better place to be. While many roles at Coinbase are remote-first, we are not remote-only. In-person participation is required throughout the year. Team and company-wide offsites are held multiple times annually to foster collaboration, connection, and alignment. Attendance is expected and fully supported. Team Coinbase is seeking a software engineer to join our India pod to drive the launch and growth of Coinbase in India. You will solve unique, large-scale, highly complex technical problems. You will help build the next generation of systems to make cryptocurrency accessible to everyone across multiple platforms (web, iOS, Android), operating real-time applications with high frequency and low latency updates, keeping the platform safe from fraud, enabling delightful experiences, and managing the most secure, containerized infrastructure running in the cloud. What you’ll be doing (i.e., job duties): Build high-performance services using Golang and gRPC, creating seamless integrations that elevate Coinbase's customer experience. Adopt, learn, and drive best practices in design techniques, coding, testing, documentation, monitoring, and alerting. Demonstrate a keen awareness of Coinbase’s platform, development practices, and various technical domains, and build upon them to efficiently deliver improvements across multiple teams. Add positive energy in every meeting and make your coworkers feel included in every interaction. Communicate across the company to both technical and non-technical leaders with ease. Deliver top-quality services in a tight timeframe by navigating seamlessly through uncertainties. Work with teams and teammates across multiple time zones. What we look for in you (i.e., job requirements): 3+ years of experience as a software engineer and 1+ years building backend services using Golang and gRPC. A self-starter capable of executing complex solutions with minimal guidance while ensuring efficiency and scalability. Proven experience integrating at least two third-party applications using Golang. Hands-on experience with AWS, Kubernetes, Terraform, Buildkite, or similar cloud infrastructure tools. Working knowledge of event-driven architectures (Kafka, MQ, etc.) and hands-on experience with SQL or NoSQL databases. Good understanding of gRPC, GraphQL, ETL pipelines, and modern development practices. Nice to haves: SaaS platform experience (Salesforce, Amazon Connect, Sprinklr). Experience with AWS, Kubernetes, Terraform, GitHub Actions, or similar tools. Familiarity with rate limiters, caching, metrics, logging, and debugging. Req ID - GCBE04IN Please be advised that each candidate may submit a maximum of four applications within any 30-day period. We encourage you to carefully evaluate how your skills and interests align with Coinbase's roles before applying. Commitment to Equal Opportunity Coinbase is committed to diversity in its workforce and is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, creed, gender, national origin, age, disability, veteran status, sex, gender expression or identity, sexual orientation or any other basis protected by applicable law. Coinbase will also consider for employment qualified applicants with criminal histories in a manner consistent with applicable federal, state and local law. For US applicants, you may view the Know Your Rights notice here . Additionally, Coinbase participates in the E-Verify program in certain locations, as required by law. Coinbase is also committed to providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, please contact us at accommodations[at]coinbase.com to let us know the nature of your request and your contact information. For quick access to screen reading technology compatible with this site click here to download a free compatible screen reader (free step by step tutorial can be found here) . Global Data Privacy Notice for Job Candidates and Applicants Depending on your location, the General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) may regulate the way we manage the data of job applicants. Our full notice outlining how data will be processed as part of the application procedure for applicable locations is available here. By submitting your application, you are agreeing to our use and processing of your data as required. For US applicants only, by submitting your application you are agreeing to arbitration of disputes as outlined here. Show more Show less

Posted 1 day ago

Apply

2.0 - 3.0 years

5 - 7 Lacs

Noida

On-site

GlassDoor logo

Noida,Uttar Pradesh,India Job ID 764288 Join our Team About this opportunity: Join Ericsson as an Oracle Database Administrator and play a key role in managing and optimizing our critical database infrastructure. As an Oracle DBA, you will be responsible for installing, configuring, Upgrading and maintaining Oracle databases, ensuring high availability, performance, and security. You’ll work closely with cross-functional teams to support business-critical applications, troubleshoot issues, and implement database upgrades and patches. This role offers a dynamic and collaborative environment where you can leverage your expertise to drive automation, improve efficiency, and contribute to innovative database solutions. What you will do: Oracle, PostgreSQL, MySQL, and/or MariaDB database administration in production environments. Experience with Container Databases (CDBs) and Pluggable Databases (PDBs) for better resource utilization and simplified management. High availability configuration using Oracle Dataguard, PostgreSQL, MySQL replication, and/or MariaDB Galera clusters. Oracle Enterprise Manager administration which includes alarm integration. Familiarity with Linux tooling such as iotop, vmstat, nmap, OpenSSL, grep, ping, find, df, ssh, and dnf. Familiarity with Oracle SQL Developer, Oracle Data Modeler, pgadmin, toad, PHP, MyAdmin, and MySQL Workbench is a plus. Familiarity with NoSQL, such as MongoDB is a plus. Knowledge of Middle ware like Golden-gate both oracle to oracle and oracle to BigData. Oracle, PostgreSQL, MySQL, and/or MariaDB database administration in production environments. Conduct detailed performance analysis and fine-tuning of SQL queries and stored procedures. Analyze AWR, ADDMreports to identify and resolve performance bottlenecks. Implement and manage backup strategies using RMAN and other industry-standard tools. Performing pre-patch validation using opatch and datapatch. Testing patches in a non-production environment to identify potential issues before applying to production. Apply Oracle quarterly patches and security updates. Implement and manage backup strategies using RMAN and other industry-standard tools. The skills you bring: Bachelor of Engineering or equivalent experience with at least 2 to 3 years in the field of IT. Must have experience in handling operations in any customer service delivery organization. Thorough understanding of basic framework of Telecom / IT processes. Willingness to work in a 24x7 operational environment with rotating shifts, including weekends and holidays, to support critical infra and ensure minimal downtime. Strong understanding of Linux systems and networking fundamentals. Knowledge of cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes) is a plus. Oracle Certified Professional (OCP) is preferred Why join Ericsson? At Ericsson, you´ll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what´s possible. To build solutions never seen before to some of the world’s toughest problems. You´ll be challenged, but you won’t be alone. You´ll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next. What happens once you apply?

Posted 1 day ago

Apply

2.0 years

2 - 9 Lacs

India

On-site

GlassDoor logo

EXPERIENCE- 2-5 YEARS We’re seeking a skilled Backend Engineer with strong expertise in systems architecture and database design . You'll be responsible for designing scalable backend systems, optimizing performance, and building efficient data models to support high-availability applications. RESPONSIBILITIES Design and implement scalable backend architectures Develop and optimize relational and NoSQL databases Build RESTful and/or GraphQL APIs Ensure system reliability, performance, and security Collaborate with frontend, DevOps, and product teams REQUIREMENTS 2+ years of backend development experience (Node.js) Strong understanding of system design principles and database architecture Proficiency with PostgreSQL, MySQL, MongoDB. Experience with cloud platforms (AWS) Excellent problem-solving and communication skills Experience with microservices and event-driven architecture Knowledge of CI/CD and containerization (Docker, Kubernetes) EDUCATION Bachelor's in Computer Science, IT, or equivalent experience. Monday-Friday Work from office only. Regards, Guddi Tamkoria Job Types: Full-time, Permanent Pay: ₹240,000.00 - ₹900,000.00 per year Schedule: Day shift Work Location: In person

Posted 1 day ago

Apply

10.0 years

6 - 8 Lacs

Calcutta

On-site

GlassDoor logo

Join our Team About this opportunity: We are seeking a highly skilled, hands-on AI Architect - GenAI to lead the design and implementation of production-grade, cloud-native AI and NLP solutions that drive business value and enhance decision-making processes. The ideal candidate will have a robust background in machine learning, generative AI, and the architecture of scalable production systems. As an AI Architect, you will play a key role in shaping the direction of advanced AI technologies and leading teams in the development of cutting-edge solutions. What you will do: Architect and design AI and NLP solutions to address complex business challenges and support strategic decision-making. Lead the design and development of scalable machine learning models and applications using Python, Spark, NoSQL databases, and other advanced technologies. Spearhead the integration of Generative AI techniques in production systems to deliver innovative solutions such as chatbots, automated document generation, and workflow optimization. Guide teams in conducting comprehensive data analysis and exploration to extract actionable insights from large datasets, ensuring these findings are communicated effectively to stakeholders. Collaborate with cross-functional teams, including software engineers and data engineers, to integrate AI models into production environments, ensuring scalability, reliability, and performance. Stay at the forefront of advancements in AI, NLP, and Generative AI, incorporating emerging methodologies into existing models and developing new algorithms to solve complex challenges. Provide thought leadership on best practices for AI model architecture, deployment, and continuous optimization. Ensure that AI solutions are built with scalability, reliability, and compliance in mind. The skills you bring: Minimum of 10+ years of experience in AI, machine learning, or a similar role, with a proven track record of delivering AI-driven solutions. Hands-on experience in designing and implementing end-to-end GenAI-based solutions, particularly in chatbots, document generation, workflow automation, and other generative use cases. Expertise in Python programming and extensive experience with AI frameworks and libraries such as TensorFlow, PyTorch, scikit-learn, and vector databases. Deep understanding and experience with distributed data processing using Spark. Proven experience in architecting, deploying, and optimizing machine learning models in production environments at scale. Expertise in working with open-source Generative AI models (e.g., GPT-4, Mistral, Code-Llama, StarCoder) and applying them to real-world use cases. Expertise in designing cloud-native architectures and microservices for AI/ML applications. Why join Ericsson? At Ericsson, you´ll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what´s possible. To build solutions never seen before to some of the world’s toughest problems. You´ll be challenged, but you won’t be alone. You´ll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next. What happens once you apply? Click Here to find all you need to know about what our typical hiring process looks like. Encouraging a diverse and inclusive organization is core to our values at Ericsson, that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer. learn more. Primary country and city: India (IN) || Kolkata Req ID: 763161

Posted 1 day ago

Apply

Exploring NoSQL Jobs in India

NoSQL is a rapidly growing field in India with plenty of job opportunities for skilled professionals. Companies across various industries are increasingly adopting NoSQL databases to handle massive amounts of data efficiently. If you are a job seeker interested in pursuing a career in NoSQL, here is a guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industry and have a high demand for NoSQL professionals.

Average Salary Range

The average salary range for NoSQL professionals in India varies based on experience and expertise. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with multiple years of experience can earn upwards of INR 15 lakhs per annum.

Career Path

Typically, a career in NoSQL progresses as follows: - Junior Developer - Developer - Senior Developer - Tech Lead - Architect

With each role, you take on more responsibilities and work on more complex projects.

Related Skills

In addition to NoSQL expertise, other skills that are often expected or helpful in this field include: - Data modeling - Database administration - Cloud computing - Programming languages such as Java, Python, or JavaScript

Interview Questions

Here are 25 interview questions for NoSQL roles to help you prepare:

  • What is NoSQL and why is it used? (basic)
  • What are the different types of NoSQL databases? (basic)
  • Explain the CAP theorem. (medium)
  • What is eventual consistency? (medium)
  • How does NoSQL differ from SQL databases? (basic)
  • What is sharding in NoSQL databases? (medium)
  • Explain the concept of denormalization. (medium)
  • What is ACID in database systems? (basic)
  • What is the difference between document-oriented and key-value NoSQL databases? (medium)
  • How do you handle data consistency in a NoSQL database? (medium)
  • Explain the concept of secondary indexes. (medium)
  • What is MapReduce and how is it used in NoSQL databases? (medium)
  • How do you ensure data security in a NoSQL database? (medium)
  • What is the purpose of a distributed database? (medium)
  • What are the advantages of using NoSQL databases for big data applications? (medium)
  • Explain the concept of eventual consistency in NoSQL databases. (medium)
  • How do you handle transactions in a NoSQL database? (medium)
  • What are the common challenges of using NoSQL databases? (medium)
  • How do you optimize queries in a NoSQL database? (medium)
  • Explain the concept of horizontal scaling. (medium)
  • How would you design a schema in a document-oriented NoSQL database? (medium)
  • What is the role of indexes in a NoSQL database? (medium)
  • How do you ensure data durability in a NoSQL database? (medium)
  • What is the difference between NoSQL and NewSQL databases? (medium)
  • Can you explain the concept of eventual consistency? (medium)

Closing Remark

As you prepare for your journey into the world of NoSQL jobs in India, remember to stay updated on industry trends, continuously upskill yourself, and showcase your expertise confidently during interviews. With determination and dedication, you can land a rewarding career in the dynamic field of NoSQL. Good luck!

cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies