Home
Jobs

8550 Nosql Jobs - Page 24

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Setup a job Alert
Filter
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

6.0 years

0 Lacs

Hyderābād

On-site

GlassDoor logo

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com. Job Description Role Purpose The purpose of this role is to design, test and maintain software programs for operating systems or applications which needs to be deployed at a client end and ensure its meet 100% quality assurance parameters J͏ava react nextjs Lead Mandatory skills: Java, springboot, Microservices, React JS+nextjs/AIML Responsibilities include designing and developing high-volume, low-latency applications for mission-critical business systems / application services and modules. Delivering high-availability and performance. We expect them to contribute to all phases of the development lifecycle including writing well designed, testable, efficient code. Must be capable of working independently and collaboratively. Responsibilities: Developer responsibilities include, but are not limited to the following: Experience as a Sun Certified Java Developer with proven hands-on Software Development experience. We use Java 8 6-10 years java development experience with JSE/JEE, Java based Micro-services framework and implementation, Spring framework, Hibernate framework, SQL etc Hands on experience on Spring boot & SPARK Microservices and OSGi specifications Hands on experience on ReactJS Strong knowledge of micro-service logging, monitoring, debugging and testing Implementations experience of micro-service integration, packaging, build automation and deployment At least two years of experience in SOA & Micro services based process applications using BPM (Activiti/JBPM/Camunda) Object Oriented analysis and design using common design patterns. Insight of Java and JEE internals (Class loading, Memory Management, Transaction management etc) Excellent knowledge of Relational Databases, SQL and ORM technologies (JPA2, Hibernate) Experience in developing web applications using at least one popular web framework (JSF, Wicket, GWT, Spring MVC, Spring Boot) Hands on experience with Relational and NOSQL databases (Mongo DB or Cassandra either one is must) Hands on experience in one of the cloud AWS, Google or Azure. ͏ Deliver No. Performance Parameter Measure 1. Continuous Integration, Deployment & Monitoring of Software 100% error free on boarding & implementation, throughput %, Adherence to the schedule/ release plan 2. Quality & CSAT On-Time Delivery, Manage software, Troubleshoot queries, Customer experience, completion of assigned certifications for skill upgradation 3. MIS & Reporting 100% on time MIS & report generation Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 2 days ago

Apply

5.0 - 9.0 years

15 - 24 Lacs

Hyderābād

On-site

GlassDoor logo

We are hiring a .NET Full Stack Developer with 5 to 9 years of experience in .NET Core and frontend frameworks like React or Angular. Strong expertise in full stack development, API integration, and modern web technologies is required. Experience with Microservices Architecture – Proven ability to design, develop, and maintain scalable microservices using .NET Core and related tools. Database Proficiency – Strong hands-on experience with SQL Server and/or NoSQL databases like MongoDB, including writing complex queries, stored procedures, and performance tuning. DevOps & CI/CD Knowledge – Familiarity with CI/CD pipelines, version control (e.g., Git), and deployment tools like Azure DevOps, Jenkins, or GitHub Actions. Cloud Platform Experience – Exposure to cloud services, preferably Microsoft Azure or AWS, for hosting, storage, and authentication purposes. Agile/Scrum Environment – Ability to work in an Agile development environment, including participation in sprint planning, daily stand-ups, and code reviews. Job Type: Full-time Pay: ₹1,500,000.00 - ₹2,400,000.00 per year Schedule: Day shift Application Question(s): In how many days you can join if you get selected ? Are you interested to work from office? Work Location: In person

Posted 2 days ago

Apply

5.0 years

6 - 8 Lacs

Hyderābād

Remote

GlassDoor logo

- 5+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience - Experience with data visualization using Tableau, Quicksight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in Statistical Analysis packages such as R, SAS and Matlab - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Want to join the Earth’s most customer centric company? Do you like to dive deep to understand problems? Are you someone who likes to challenge Status Quo? Do you strive to excel at goals assigned to you? If yes, we have opportunities for you. Global Operations – Artificial Intelligence (GO-AI) at Amazon is looking to hire candidates who can excel in a fast-paced dynamic environment. Are you somebody that likes to use and analyze big data to drive business decisions? Do you enjoy converting data into insights that will be used to enhance customer decisions worldwide for business leaders? Do you want to be part of the data team which measures the pulse of innovative machine vision-based projects? If your answer is yes, join our team. GO-AI is looking for a motivated individual with strong skills and experience in resource utilization planning, process optimization and execution of scalable and robust operational mechanisms, to join the GO-AI Ops DnA team. In this position you will be responsible for supporting our sites to build solutions for the rapidly expanding GO-AI team. The role requires the ability to work with a variety of key stakeholders across job functions with multiple sites. We are looking for an entrepreneurial and analytical program manager, who is passionate about their work, understands how to manage service levels across multiple skills/programs, and who is willing to move fast and experiment often. Key job responsibilities • Ability to maintain and refine straightforward ETL and write secure, stable, testable, maintainable code with minimal defects and automate manual processes. • Proficiency in one or more industry analytics visualization tools (e.g. Excel, Tableau/Quicksight/PowerBI) and, as needed, statistical methods (e.g. t-test, Chi-squared) to deliver actionable insights to stakeholders. • Building and owning small to mid-size BI solutions with high accuracy and on time delivery using data sets, queries, reports, dashboards, analyses or components of larger solutions to answer straightforward business questions with data incorporating business intelligence best practices, data management fundamentals, and analysis principles. • Good understanding of the relevant data lineage: including sources of data; how metrics are aggregated; and how the resulting business intelligence is consumed, interpreted and acted upon by the business where the end product enables effective, data-driven business decisions. • Having high responsibility for the code, queries, reports and analyses that are inherited or produced and having analyses and code reviewed periodically. • Effective partnering with peer BIEs and others in your team to troubleshoot, research root causes, propose solutions, by either take ownership for their resolution or ensure a clear hand-off to the right owner. About the team The Global Operations – Artificial Intelligence (GO-AI) team is an initiative, which remotely handles exceptions in the Amazon Robotic Fulfillment Centers Globally. GO-AI seeks to complement automated vision based decision-making technologies by providing remote human support for the subset of tasks which require higher cognitive ability and cannot be processed through automated decision making with high confidence. This team provides end-to-end solutions through inbuilt competencies of Operations and strong central specialized teams to deliver programs at Amazon scale. It is operating multiple programs and other new initiatives in partnership with global technology and operations teams. Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Posted 2 days ago

Apply

5.0 - 9.0 years

13 - 26 Lacs

Hyderābād

On-site

GlassDoor logo

We are hiring a Java Developer with 5 to 9 years of experience in Java, Spring Boot, and Microservices architecture. Strong skills in backend development, REST APIs, and scalable systems are essential. Experience with Cloud Platforms – Hands-on experience with cloud environments such as AWS, Azure, or Google Cloud, including deployment, monitoring, and scaling of services. Database Expertise – Proficiency in working with relational databases (like MySQL, PostgreSQL) and NoSQL databases (like MongoDB, Cassandra), including query optimization and data modeling. Proficient in CI/CD Tools – Familiarity with continuous integration and deployment tools such as Jenkins, GitLab CI/CD, or Maven, along with version control systems like Git. Strong Problem-Solving Skills – Ability to troubleshoot complex systems, debug code efficiently, and implement effective solutions with performance in mind. Agile Methodologies – Experience working in Agile/Scrum teams with participation in sprint planning, daily standups, and retrospectives. Job Type: Full-time Pay: ₹1,300,000.00 - ₹2,600,000.00 per year Location Type: In-person Schedule: Day shift Application Question(s): Are you interested to work from office ? In how many days you will join if you get selected ? Work Location: In person

Posted 2 days ago

Apply

15.0 years

4 - 10 Lacs

Hyderābād

On-site

GlassDoor logo

Associate Vice President - Sr. Product/Solution Architect Role Overview : As a Sr. Product/Solution Architect , you will actively engage in your software architecture craft, taking a hands-on approach to multiple high-visibility projects, while also being the visionary and driving force behind our modern product technology strategy, roadmap, and implementation. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive engineering craftsmanship and expert proficiency across multiple programming languages and modern frameworks, consistently demonstrating your exemplary track record in delivering high-quality, outcome-focused solutions. The ideal candidate will be a role model and engineering mentor, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions. Key Responsibilities : Strategic Vision and Alignment: Craft and articulate a vision for modern product architecture as it specifically applies to the product engineering teams in alignment with the Business Strategy and US Deloitte Technology strategy, mapping business capabilities to the enterprise technology landscape. Collaborate with diverse stakeholders, including product, engineering, experience, delivery, security, and infrastructure teams across various organizational levels. Advocacy and Technology Roadmap: Advocate for, develop, and communicate engineering group’s integrated architecture/technology strategy and implementation approach to the product engineering teams and business stakeholders. Ensure the organization is well-informed about objectives, KPIs, technology roadmaps, and progress. Always have an eye on reuse and leverage of the existing technology assets to minimize overall costs. Craft Mastery and Objectives Realization: Define, measure, and drive the achievement of KPIs and NFRs related to product architecture and engineering, including aspects such as system performance, scalability, and maintainability. Establish and evolve product architecture and engineering domain reference architecture, standards, and best practices. Actively be hands-on with design, architecture, and code part of the time, contributing to team velocity, and be actively engaged with engineers across SSDLC. Review code, drive tech debt reduction, and experiment with new technologies, driving their adoption together with engineers, inspiring them to stay current with the technology industry evolution. Capability Evolution and Development: Being an engineering expert, mentor and develop engineers. Coach and develop skills in modern architecture and engineering practices, related to microservices, cloud-native design, containers, AI/ML/GenAI, DevSecOps, and deployment techniques like, Blue-Green, Canary to minimize down-time, enabling A/B testing approaches. Showcase learning and mastery by showcasing experiments internally, speaking at conferences, writing whitepapers or blogs, and leading R&D collaborations. Iterative Value Delivery: Embrace an iterative and incremental approach to product architecture and engineering. Apply a leaning-forward approach to navigate complexity and uncertainty. Ensure alignment with customer and business goals through iterative steps and empirical evidence, adjusting architecture direction to meet customer needs and business viability. Customer-Centric Problem Solving: Demonstrate a relentless focus on addressing the most critical issues faced by customers, aligning technical solutions with business objectives. Exhibit deep expertise in minimizing unnecessary technical complexities, features, and functionalities that do not add value (no “overengineering”). Drive teams toward peak performance through continuous learning and improvement. Expert Proficiency and Continuous Improvement: Possess a keen ability to identify inefficiencies and opportunities for innovation within the product development lifecycle. Continuously enhance the product engineering operating model to be lean, adaptable, and responsive to changes, ensuring that engineering teams can deliver business value efficiently and effectively. Guide and transform the organization to embrace lean principles and foster a culture of innovation. Tech/Quality Risk Management: Establish and evolve reference architectures, coding standards, and best architecture/engineering practices. Ensure that the product architecture designs support performance, scalability, and reliability/resilience requirements, including guidance for necessary optimizations. Identify potential technical risks and develop mitigation strategies via proactive problem-solving and contingency planning to address any issues that may arise during development. Influential Communication: Influence, persuade, and drive decision-making processes. Communicate effectively in both written and verbal forms. Craft clear, structured arguments and technical trade-offs, supported by evidence. Organizational Engagement and Collaboration: Engage stakeholders at all levels of the organization, from team members to middle management to executives. Build collaborative and constructive relationships, co-creating and driving momentum and value across multiple organizational levels. The team : US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value/outcomes that leverages a progressive and responsive talent structure. As Deloitte’s primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte’s success. It is the engine that drives Deloitte, serving many of the world’s largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. Key Qualifications : A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor. Excellent software engineering and product architecture/design foundation with deep understanding of Business Context Diagrams (BCD), sequence/activity/state/ER/DFD diagrams, OOP/OOD, data structures, algorithms, code instrumentations, etc. 15+ years proven experience with programming languages like Angular, React, NodeJS, Python, Streamlit, C#, .NET Core, Golang, SQL/NoSQL, unit testing frameworks with 8+ years’ experience in architecting enterprise solutions on modern technology stacks. 8+ years of hands-on experience with cloud-native engineering, using FaaS/PaaS/micro-services on cloud hyper-scalers like Azure, AWS, and GCP. 5+ years of hands-on experience with Azure cloud-native services specifically (e.g., API Management, Event Hub, Service Bus, Functions, Service Mash, Logic Apps, AKS, Batch, Istio, Archive Storage, Data Lakes, Synapse, SQL, Redis, CosmosDB, DocumentDB, PowerBI, Key Vault, Application Insights, etc. 3+ years of experience with AI/ML and GenAI is preferred. Deep understanding of methodologies & tools like, XP, Lean, SAFe, DevSecOps, SRE, ADO, GitHub, SonarQube, etc. to deliver high quality products rapidly. Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care. Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 304886

Posted 2 days ago

Apply

7.0 years

5 - 7 Lacs

Hyderābād

On-site

GlassDoor logo

About the Role: Grade Level (for internal use): 10 Role: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P CapitalIQ Pro to serve-up value-added Ratings, Research and related information to the Institutional clients. The Team: Our team is responsible for the gathering data from multiple sources spread across the globe using different mechanism (ETL/GG/SQL Rep/Informatica/Data Pipeline) and convert them to a common format which can be used by Client facing UI tools and other Data providing Applications. This application is the backbone of many of S&P applications and is critical to our client needs. You will get to work on wide range of technologies and tools like Oracle/SQL/.Net/Informatica/Kafka/Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global? Impact: Our Team is responsible to deliver essential and business critical data with applied intelligence to power the market of the future. This enables our customer to make decisions with conviction. Contribute significantly to the growth of the firm by- Developing innovative functionality in existing and new products Supporting and maintaining high revenue productionized products Achieve the above intelligently and economically using best practices Career: This is the place to hone your existing Database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated and collaborate with developers, business analysts and product managers who are experts in their domain. Your skills: You should be able to demonstrate that you have an outstanding knowledge and hands-on experience in the below areas: Complete SDLC: architecture, design, development and support of tech solutions Play a key role in the development team to build high-quality, high-performance, scalable code Engineer components, and common services based on standard corporate development models, languages and tools Produce technical design documents and conduct technical walkthroughs Collaborate effectively with technical and non-technical stakeholders Be part of a culture to continuously improve the technical design and code base Document and demonstrate solutions using technical design docs, diagrams and stubbed code Our Hiring Manager says: I’m looking for a person that gets excited about technology and motivated by seeing how our individual contribution and team work to the world class web products affect the workflow of thousands of clients resulting in revenue for the company. Qualifications Required: Bachelor’s degree in computer science, Information Systems or Engineering. 7+ years of experience on Transactional Databases like SQL server, Oracle, PostgreSQL and other NoSQL databases like Amazon DynamoDB, MongoDB Strong Database development skills on SQL Server, Oracle Strong knowledge of Database architecture, Data Modeling and Data warehouse. Knowledge on object-oriented design, and design patterns. Familiar with various design and architectural patterns Strong development experience with Microsoft SQL Server Experience in cloud native development and AWS is a big plus Experience with Kafka/Sonic Broker messaging systems Nice to have: Experience in developing data pipelines using Java or C# is a significant advantage. Strong knowledge around ETL Tools – Informatica, SSIS Exposure with Informatica is an advantage. Familiarity with Agile and Scrum models Working Knowledge of VSTS. Working knowledge of AWS cloud is an added advantage. Understanding of fundamental design principles for building a scalable system. Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options, Index/Benchmarks is desirable. Additionally, experience with Scala, Python and Spark applications is a plus. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence . What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316332 Posted On: 2025-06-16 Location: Gurgaon, Haryana, India

Posted 2 days ago

Apply

3.0 years

4 - 6 Lacs

Hyderābād

On-site

GlassDoor logo

- 3+ years of data engineering experience - Experience with data modeling, warehousing and building ETL pipelines - Experience with one or more scripting language (e.g., Python, KornShell) - 3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience As part of the Last Mile Science & Technology organization, you’ll partner closely with Product Managers, Data Scientists, and Software Engineers to drive improvements in Amazon's Last Mile delivery network. You will leverage data and analytics to generate insights that accelerate the scale, efficiency, and quality of the routes we build for our drivers through our end-to-end last mile planning systems. You will develop complex data engineering solutions using AWS technology stack (S3, Glue, IAM, Redshift, Athena). You should have deep expertise and passion in working with large data sets, building complex data processes, performance tuning, bringing data from disparate data stores and programmatically identifying patterns. You will work with business owners to develop and define key business questions and requirements. You will provide guidance and support for other engineers with industry best practices and direction. Analytical ingenuity and leadership, business acumen, effective communication capabilities, and the ability to work effectively with cross-functional teams in a fast-paced environment are critical skills for this role. Key job responsibilities • Design, implement, and support data warehouse / data lake infrastructure using AWS big data stack, Python, Redshift, Quicksight, Glue/lake formation, EMR/Spark/Scala, Athena etc. • Extract huge volumes of structured and unstructured data from various sources (Relational /Non-relational/No-SQL database) and message streams and construct complex analyses. • Develop and manage ETLs to source data from various systems and create unified data model for analytics and reporting • Perform detailed source-system analysis, source-to-target data analysis, and transformation analysis • Participate in the full development cycle for ETL: design, implementation, validation, documentation, and maintenance. Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience with big data processing technology (e.g., Hadoop or ApacheSpark), data warehouse technical architecture, infrastructure components, ETL, and reporting/analytic tools and environments Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Posted 2 days ago

Apply

8.0 years

4 - 7 Lacs

Hyderābād

On-site

GlassDoor logo

About the Role As a Sr. Data Engineer in the Sales Automation Engineering team you should be able to work through the different areas of Data Engineering & Data Architecture including the following: Data Migration - From Hive/other DBs to Salesforce/other DBs and vice versa Data Modeling - Understand existing sources & data models and identify the gaps and building future state architecture Data Pipelines - Building Data Pipelines for several Data Mart/Data Warehouse and Reporting requirements Data Governance - Build the framework for DG & Data Quality Profiling & Reporting What the Candidate Will Need / Bonus Points - What the Candidate Will Do - Demonstrate strong knowledge of and ability to operationalize, leading data technologies and best practices. Collaborate with internal business units and data teams on business requirements, data access, processing/transformation and reporting needs and leverage existing and new tools to provide solutions. Build dimensional data models to support business requirements and reporting needs. Design, build and automate the deployment of data pipelines and applications to support reporting and data requirements. Research and recommend technologies and processes to support rapid scale and future state growth initiatives from the data front. Prioritize business needs, leadership questions, and ad-hoc requests for on-time delivery. Collaborate on architecture and technical design discussions to identify and evaluate high impact process initiatives. Work with the team to implement data governance, access control and identify and reduce security risks. Perform and participate in code reviews, peer inspections and technical design/specifications. Develop performance metrics to establish process success and work cross-functionally to consistently and accurately measure success over time Delivers measurable business process improvements while re-engineering key processes and capabilities and maps to future-state vision Prepare documentations and specifications on detailed design. Be able to work in a globally distributed team in an Agile/Scrum approach. - Basic Qualifications - Bachelor's Degree in computer science or similar technical field of study or equivalent practical experience. 8+ years professional software development experience, including experience in the Data Engineering & Architecture space Interact with product managers, and business stakeholders to understand data needs and help build data infrastructure that scales across the company Very strong SQL skills - know advanced level SQL coding (windows functions, CTEs, dynamic variables, Hierarchical queries, Materialized views etc) Experience with data-driven architecture and systems design knowledge of Hadoop related technologies such as HDFS, Apache Spark, Apache Flink, Hive, and Presto. Good hands on experience with Object Oriented programming languages like Python. Proven experience in large-scale distributed storage and database systems (SQL or NoSQL, e.g. HIVE, MySQL, Cassandra) and data warehousing architecture and data modeling. Working experience in cloud technologies like GCP, AWS, Azure Knowledge of reporting tools like Tableau and/or other BI tools. - Preferred Qualifications - Python libraries (Apache spark, Scala) Working experience in cloud technologies like GCP, AWS, Azure

Posted 2 days ago

Apply

2.0 years

5 - 8 Lacs

Hyderābād

On-site

GlassDoor logo

About Reputation Reputation has changed the way companies improve their customer experience through feedback. Based in Silicon Valley and founded in 2013, Reputation is the only platform that empowers companies to fulfill their brand promise by measuring, managing, and scaling their reputation performance in real-time, everywhere. Functioning as a business’ eyes and ears in the spaces where customers talk, post, review, and recommend, Reputation AI-powered product stack analyzes vast amounts of public and private feedback data to uncover predictive insights for companies to act on, and improve their online reputations. Visit reputation.com to learn more. Reputation continues to earn recognition as a trusted leader in both innovation and partnership. Most recently, the company was named an Inc. Power Partner, a distinction awarded to B2B organizations with a proven track record of helping clients thrive. Reputation was also officially Certified™ as a Great Place to Work, reflecting its commitment to cultivating a world-class culture that fuels long-term success for employees and customers alike. Why Work at Reputation? Reputation has achieved substantial annual recurring revenue from Global Fortune 1000 companies and continues to grow worldwide. We've secured significant funding from A-list venture capital firms such as Bessemer Venture Partner and Kleiner Perkins, including a major equity financing from Marlin Equity Partners in January 2022. Reputation is trusted by more than 250 partners, including Google, Meta, Yelp, Apple Business Connect, Healthgrades and Entrata. The platform is used by major automotive OEMs and thousands of their new vehicle dealerships. Additionally hundreds of healthcare systems and their locations, along with top property management firms have integrated Reputation within their organizations. Our executive management team is committed to building a performance-based culture where excellence is rewarded and careers are developed. Who thrives at Reputation? Managers who embody a player-coach mentality. Employees who value teamwork and cross-functional collaboration. People who emphasize perseverance and hustle over quick wins and luck. Our Mission: Help businesses always know what their customers are saying about them and always act on that feedback. Reputation is seeking a Full Stack Software Engineer to help push our enterprise social media SaaS application forward. This position will work on a wide variety of projects relating to the social suite of products offered to our clients. We are looking for engineers who can build simple, fast, and elegant software. The Reputation Engineering team is small, flat, and close knit. We want to hear from you if you are ready to build your technical skill set in a fast-paced, CI/CD environment. Responsibilities: Build high-quality, clean, scalable and reusable code by enforcing best practices around software engineering architecture and processes (Code Reviews, Unit testing, etc.). Work with the product owners to understand detailed requirements and own your code from design, implementation, test automation, and delivery of high-quality products to our users. Work in a fast-paced CI/CD Kanban environment and participate actively in feature development and bug resolution Capability to manage multiple projects with material-technical risk across teams and processes; may serve as a functional lead or technical owner. Work on several Reputation products to extend functionality and to maintain zero customer-reported bugs. Be a mentor for colleagues and help promote knowledge-sharing Additional duties as assigned. Qualifications: 2-5 years of experience in designing & implementing highly interactive UI for high-volume, robust web applications. Must be a graduate in BTECH/BE/MS/MTECH - IT/CS/Machine Leaning/Data Science/Artificial Intelligence Solid programming skills in JavaScript and Java/J2EE, with experience building reusable components using JavaScript libraries such as React and Node.js. Experience with Spring Boot for building scalable and efficient backend services. Proven ability to design, develop, and maintain microservices-based applications. Experience using GoLang to build and optimize data pipelines. Advanced knowledge of data structures, algorithms, object-oriented design, design patterns, and performance/scale considerations. Hands-on experience with NoSQL databases, such as MongoDB, Elasticsearch, and BigQuery, including development, troubleshooting, and performance optimization. Observability experience or willingness to learn. Experience working in a cloud environment and developing scalable, distributed systems. Strong sense of empathy for end-users, with a drive to enhance their experience. Comfortable working with data-intensive applications and performance-critical systems. We understand that not everyone will have experience with every technology, but familiarity with any of the following will help you stand out: GraphQL RabbitMQ Redis Elasticsearch Social Media APIs (Facebook, Instagram, LinkedIn, Twitter, YouTube, TikTok, etc) When you join Reputation, you can expect: Flexible working arrangements. Career growth with paid training tuition opportunities. Active Employee Resource Groups (ERGs) to engage with. An equitable work environment. We are an equal opportunity employer and value diversity at our company. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status or disability status. At Reputation, we’re committed to building a workforce that reflects a broad range of backgrounds, experiences, and perspectives. We believe that diversity strengthens our team, drives innovation, and helps us better serve our customers and communities. Through inclusive hiring practices and ongoing initiatives, we strive to create a workplace where everyone feels valued and empowered to contribute. Additionally, we offer a variety of benefits and perks, such as: Health Insurance & Wellness Benefits: Group Health Insurance: Medical Insurance with floater policy of up to 10,00,000 for employee + spouse + 2 dependent children + 2 parents / parent-in-laws Maternity Benefits: Medical insurance up to 75,000 INR, 26 weeks of leave for birth, adoption or surrogacy Life Insurance: Insurance at 3x annual cost to the company (Term Insurance, GPA) Accident/Disability Insurance: Insured at 3x base salary for permanent total disability, permanent partial disability and temporary total disability (GPA) OPD: of 7500 per annum per employee Leaves 10 Company observed holidays a year (Refer to the Holiday Calendar for the Year) 12 Casual/Sick leaves (Pro-rata calculated) 2 Earned Leaves per Month (Pro-rata calculated) 4 Employee Recharge days (aka company holiday/office closed) Maternity & Paternity (6 months) Bereavement Leave (10 Days) Car Lease: Reputation is offering a Car Lease Program that allows employees to lease a car with no upfront cost or down payment. They benefit from a fixed monthly lease rental and 20-30% tax savings. We are an equal opportunity employer and value diversity at our company. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. To learn more about how we handle the personal data of applicants, visit our Candidate Privacy Notice . Applicants only - No 3rd party agency candidates.

Posted 2 days ago

Apply

10.0 years

7 - 20 Lacs

India

On-site

GlassDoor logo

About MostEdge At MostEdge , we’re on a mission to accelerate commerce and build sustainable, trusted experiences . Our slogan — Protect Every Penny. Power Every Possibility. —reflects our commitment to operational excellence, data integrity, and real-time intelligence that help retailers run smarter, faster, and stronger. Our systems are mission-critical and designed for 99.99999% uptime , powering millions of transactions and inventory updates daily . We work at the intersection of AI, microservices, and retail commerce—and we win as a team. Role Overview We are looking for a Senior Database Administrator (DBA) to own the design, implementation, scaling, and performance of our data infrastructure. You will be responsible for mission-critical OLTP systems spanning MariaDB, MySQL, PostgreSQL, and MongoDB , deployed across AWS, GCP, and containerized Kubernetes clusters . This role plays a key part in ensuring data consistency, security, and speed across billions of rows and real-time operations. Scope & Accountability What You Will Own Manage and optimize multi-tenant, high-availability databases for real-time inventory, pricing, sales, and vendor data. Design and maintain scalable, partitioned database architectures across SQL and NoSQL systems. Monitor and tune query performance and ensure fast recovery, replication, and backup practices. Partner with developers, analysts, and DevOps teams on schema design, ETL pipelines, and microservices integration . Maintain security best practices, audit logging, encryption standards, and data retention compliance . What Success Looks Like 99.99999% uptime maintained across all environments. <100ms query response times for large-scale datasets. Zero unplanned data loss or corruption incidents. Developer teams experience zero bottlenecks from DB-related delays. Skills & Experience Must-Have 10+ years of experience managing OLTP systems at scale. Strong hands-on with MySQL, MariaDB, PostgreSQL, and MongoDB . Proven expertise in replication, clustering, indexing, and sharding . Experience with Kubernetes-based deployments , Kafka queues , and Dockerized apps . Familiarity with AWS S3 storage , GCP services, and hybrid cloud data replication. Experience in startup environments with fast-moving agile teams. Track record of creating clear documentation and managing tasks via JIRA . Nice-to-Have Experience with AI/ML data pipelines , vector databases, or embedding stores. Exposure to infrastructure as code (e.g., Terraform, Helm). Familiarity with LangChain, FastAPI , or modern LLM-driven architectures. How You Reflect Our Values Lead with Purpose : You enable smarter, faster systems that empower our retail customers. Build Trust : You create safe, accurate, and recoverable environments. Own the Outcome : You take responsibility for uptime, audits, and incident resolution. Win Together : You collaborate seamlessly across product, ops, and engineering. Keep It Simple : You design intuitive schemas, efficient queries, and clear alerts. Why Join MostEdge? Work on high-impact systems powering real-time retail intelligence . Collaborate with a passionate, values-driven team across AI, engineering, and operations. Build at scale—with autonomy, ownership, and cutting-edge tech. Job Types: Full-time, Permanent Pay: ₹727,996.91 - ₹2,032,140.73 per year Benefits: Health insurance Life insurance Paid sick time Paid time off Provident Fund Schedule: Evening shift Morning shift US shift Supplemental Pay: Performance bonus Yearly bonus Work Location: In person Expected Start Date: 31/07/2025

Posted 2 days ago

Apply

0.0 - 8.0 years

0 Lacs

Gurugram, Haryana

On-site

Indeed logo

Role Description: As a Senior Technical Lead - Front End – React , you will be responsible for developing user interfaces using ReactJS. You will be expected to have a strong understanding of HTML, CSS, JavaScript, and ReactJS. You should also have experience in working with state management libraries like Redux and MobX. Roles & Responsibilities: Strong proficiency in JavaScript, including DOM manipulation & java script object model Thorough understanding of React.JS, its core principles like Hooks, Lifecycle, etc. and workflows such asFlux / Redux Familiar in writing test cases and providing thorough test coverage Familiar with newer specifications of ECMA Scripts along with Bootstrap, HTML & CSS Experience in designing Restful APIs Hands-On with design patterns, error / exception handling & resource management Exposure to DevOps, associated CI/CD tools and code versioning tools like GIT Knowledge of modern authorization mechanisms like JSON Web Token Experience working with various data stores, SQL or NoSQL Decent knowledge of OOPS concepts Technical Skills Skills Requirements: Strong proficiency in React.js and JavaScript. Experience in front-end web development using HTML, CSS, and JavaScript frameworks. Knowledge of web design principles and web accessibility standards. Familiarity with software development life cycle (SDLC) and agile methodologies. Must have excellent communication skills and be able to communicate complex technical information tonon- technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Should be open to new ideas and be willing to learn and develop new skills. Should also be able to work well under pressure and manage multiple tasks and priorities. Nice-to-have skills Qualifications Qualifications 8-10 years of work experience in relevant field B.Tech/B.E/M.Tech or MCA degree from a reputed university. Computer science background is preferred Job Types: Full-time, Permanent Pay: Up to ₹2,500,000.00 per year Benefits: Health insurance Provident Fund Schedule: Monday to Friday UK shift Supplemental Pay: Performance bonus Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required) Education: Bachelor's (Required) Experience: Software development: 8 years (Required) Location: Gurugram, Haryana (Preferred) Work Location: In person

Posted 2 days ago

Apply

6.0 years

10 Lacs

Hyderābād

On-site

GlassDoor logo

Experience- 6+ years Work Mode- Hybrid Job Summary: We are seeking a skilled Informatica ETL Developer with 5+ years of experience in ETL and Business Intelligence projects. The ideal candidate will have a strong background in Informatica PowerCenter , a solid understanding of data warehousing concepts , and hands-on experience in SQL, performance tuning , and production support . This role involves designing and maintaining robust ETL pipelines to support digital transformation initiatives for clients in manufacturing, automotive, transportation, and engineering domains. Key Responsibilities: Design, develop, and maintain ETL workflows using Informatica PowerCenter . Troubleshoot and optimize ETL jobs for performance and reliability. Analyze complex data sets and write advanced SQL queries for data validation and transformation. Collaborate with data architects and business analysts to implement data warehousing solutions . Apply SDLC methodologies throughout the ETL development lifecycle. Support production environments by identifying and resolving data and performance issues. Work with Unix shell scripting for job automation and scheduling. Contribute to the design of technical architectures that support digital transformation. Required Skills: 3–5 years of hands-on experience with Informatica PowerCenter . Proficiency in SQL and familiarity with NoSQL platforms . Experience in ETL performance tuning and troubleshooting . Solid understanding of Unix/Linux environments and scripting. Excellent verbal and written communication skills. Preferred Qualifications: AWS Certification or experience with cloud-based data integration is a plus. Exposure to data modeling and data governance practices. Job Type: Full-time Pay: From ₹1,000,000.00 per year Location Type: In-person Schedule: Monday to Friday Ability to commute/relocate: Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required) Application Question(s): What is your current CTC? What is your expected CTC? What is your current location? What is your notice period/ LWD? Are you comfortable attending L2 F2F interview in Hyderabad? Experience: Informatica powercenter: 5 years (Required) total work: 6 years (Required) Work Location: In person

Posted 2 days ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

About Us ACKO is India’s first and only fully-digital Insurtech (product) company to have taken insurance by storm. You might have seen our cool ads or are already a customer and we hope you have noticed how we are rewriting the rules of the insurance game constantly and persistently. Based out of Bangalore, we are solving for the Indian market. But we are a part of a global wave of insurtech startups such as ZhongAn in China , Oscar, Lemonade, Metromile in the US, that are known to succeed owing to their business models and technology. We are a unicorn backed by a slate of marquee investors like Binny Bansal, Amazon, Ascent capital, Accel, SAIF, Catamaran, General Atlantic and Multiples. In only four years since our inception and operations, our products have reached ~75M unique users. We have partnered with some of the biggest names of the digital ecosystem such as Amazon, Ola, RedBus, Oyo, Lendingkart, ZestMoney, GOMMT group etc. At ACKO, job roles are focused at impact and we’re here to transform the way the industry operates. Innovation drives us and our products, and we are poised to disrupt insurance, powered by our pioneering products. We have changed the landscape of this age old sector in a growing economy like India and have miles to go from here. After having crossed the $1B valuation mark, our eyes set on even bigger milestones. If you think we’re just about growth and numbers, employee wellbeing lies at the core of all our programs and policies. We are a regular ‘Great Place to Work’ winner and consistently feature on Linkedin’s list of top startups. Currently 1000 strong, we are hiring across all functions. The Software Development Engineer - 3ʼs core responsibilities include designing, developing, leading by example, mentoring, and guiding team members on everything from structured problem-solving and architecting large systems to the development of best practices. You'd be working on technologies like Java, Python, Postgres, hazelcast, DynamoDB, SQL, lambda, Kubernetes, Cloud, etc., and highly maintainable and unit-tested software components/systems that address real-world problems. You will be working in a fast-paced and agile work environment delivering quality and scalable solutions that have an immediate business impact. Primary responsibilities: High-level design, development, and evolution management of complex features and subsystems Driving the adoption of best practices & regular participation in code reviews, design, documentation Monitoring and improvement of key engineering metrics such as uptime, performance, and modularity of subsystems Work closely with engineering and non-engineering stakeholders like the product, business, and third-party stakeholdersduring planning and throughout the SDLC to drive engineering in the right direction Collaborate within and outside the team to ensure engineering cohesiveness and consistency Mentor junior engineers and contribute to their success. Hereʼs what we are looking for: Experience level of 6-8 years in fairly complex/large-scale backend systems Strong problem-solving skills, design/architecture skills, and computer science fundamentals Strong hands-on and practical working experience with some high-level programming language(s), with a high focus on LLD & HLD Strong debugging skills, using logs and other monitoring systems Excellent coding skills - should be able to fluently convert the design into code. Hands-on experience working with some kinds of databases, caching, and queuing tools B.E. / B. Tech in Computer Science or equivalent from a reputed college. Practical coding knowledge of Java, Microservices, Distributed Systems Good to Have: Hands-on experience in using cloud infra - like AWS Practical coding knowledge of Python, React Understanding how a mobile app works end-to-end Have used tools for metrics and monitoring of the applications Sense of urgency and ownership Hands-on experience with one of the Postgres/MySql and some NoSQL databases Understanding of Security fundamentals - DDOS/API level security etc Understanding of Microservices architecture Knowledge of standard Queueing mechanisms Understanding of standard Caching mechanisms Understanding of Database Schema Design Show more Show less

Posted 2 days ago

Apply

2.0 - 5.0 years

6 Lacs

Hyderābād

On-site

GlassDoor logo

Must-Have Skills & Traits Core Engineering Advanced Python skills with a strong grasp of clean, modular, and maintainable code practices Experience building production-ready backend services using frameworks like FastAPI, Flask, or Django Strong understanding of software architecture , including RESTful API design, modularity, testing, and versioning. Experience working with databases (SQL/NoSQL), caching layers, and background job queues. AI/ML & GenAI Expertise Hands-on experience with machine learning workflows: data preprocessing, model training, evaluation, and deployment Practical experience with LLMs and GenAI tools such as OpenAI APIs, Hugging Face, LangChain, or Transformers Understanding of how to integrate LLMs into applications through prompt engineering, retrieval-augmented generation (RAG), and vector search Comfortable working with unstructured data (text, images) in real-world product environments Bonus: experience with model fine-tuning, evaluation metrics, or vector databases like FAISS, Pinecone, or Weaviate Ownership & Execution Demonstrated ability to take full ownership of features or modules from architecture to delivery Able to work independently in ambiguous situations and drive solutions with minimal guidance Experience collaborating cross-functionally with designers, PMs, and other engineers to deliver user-focused solutions Strong debugging, systems thinking, and decision-making skills with an eye toward scalability and performance Nice-to-Have Skills Experience in startup or fast-paced product environments. 2-5 years of relevant experience. Familiarity with asynchronous programming patterns in Python. Exposure to event-driven architecture and tools such as Kafka, RabbitMQ, or AWS EventBridge Data science exposure: exploratory data analysis (EDA), statistical modeling, or experimentation Built or contributed to agentic systems, ML/AI pipelines, or intelligent automation tools Understanding of MLOps: model deployment, monitoring, drift detection, or retraining pipelines Frontend familiarity (React, Tailwind) for prototyping or contributing to full-stack features

Posted 2 days ago

Apply

5.0 years

0 Lacs

Gurgaon

On-site

GlassDoor logo

Job Description Alimentation Couche-Tard Inc., (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space with over 17,000 stores in 31 countries, serving more than 6 million customers each day It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long term success. About the role We are looking for a Senior Data Engineer with a collaborative, “can-do” attitude who is committed & strives with determination and motivation to make their team successful. A Sr. Data Engineer who has experience architecting and implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands on sourcing, manipulation, and delivery of data from enterprise business systems to data lake and data warehouse. This role will help drive Circle K’s next phase in the digital journey by modeling and transforming data to achieve actionable business outcomes. The Sr. Data Engineer will create, troubleshoot and support ETL pipelines and the cloud infrastructure involved in the process, will be able to support the visualizations team. Roles and Responsibilities Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals. Demonstrate deep technical and domain knowledge of relational and non-relation databases, Data Warehouses, Data lakes among other structured and unstructured storage options. Determine solutions that are best suited to develop a pipeline for a particular data source. Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development. Efficient in ETL/ELT development using Azure cloud services and Snowflake, Testing and operation/support process (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance). Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery. Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders. Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability). Stay current with and adopt new tools and applications to ensure high quality and efficient solutions. Build cross-platform data strategy to aggregate multiple sources and process development datasets. Proactive in stakeholder communication, mentor/guide junior resources by doing regular KT/reverse KT and help them in identifying production bugs/issues if needed and provide resolution recommendation. Job Requirements Bachelor’s Degree in Computer Engineering, Computer Science or related discipline, Master’s Degree preferred. 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment. 5+ years of experience with setting up and operating data pipelines using Python or SQL 5+ years of advanced SQL Programming: PL/SQL, T-SQL 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization. Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads. 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data. 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions. 5+ years of experience in defining and enabling data quality standards for auditing, and monitoring. Strong analytical abilities and a strong intellectual curiosity In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts Understanding of REST and good API design. Experience working with Apache Iceberg, Delta tables and distributed computing frameworks Strong collaboration and teamwork skills & excellent written and verbal communications skills. Self-starter and motivated with ability to work in a fast-paced development environment. Agile experience highly desirable. Proficiency in the development environment, including IDE, database server, GIT, Continuous Integration, unit-testing tool, and defect management tools. Knowledge Strong Knowledge of Data Engineering concepts (Data pipelines creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management). Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques. Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks. Working Knowledge of Dev-Ops processes (CI/CD), Git/Jenkins version control tool, Master Data Management (MDM) and Data Quality tools. Strong Experience in ETL/ELT development, QA and operation/support process (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance). Hands on experience in Databases like (Azure SQL DB, MySQL/, Cosmos DB etc.), File system (Blob Storage), Python/Unix shell Scripting. ADF, Databricks and Azure certification is a plus. Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (Powershell, Bash), Git, Terraform, Power BI, Snowflake #LI-DS1

Posted 2 days ago

Apply

7.0 years

7 - 7 Lacs

Gurgaon

On-site

GlassDoor logo

Engineer III, Database Engineering Gurgaon, India; Hyderabad, India Information Technology 316332 Job Description About The Role: Grade Level (for internal use): 10 Role: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P CapitalIQ Pro to serve-up value-added Ratings, Research and related information to the Institutional clients. The Team: Our team is responsible for the gathering data from multiple sources spread across the globe using different mechanism (ETL/GG/SQL Rep/Informatica/Data Pipeline) and convert them to a common format which can be used by Client facing UI tools and other Data providing Applications. This application is the backbone of many of S&P applications and is critical to our client needs. You will get to work on wide range of technologies and tools like Oracle/SQL/.Net/Informatica/Kafka/Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global? Impact: Our Team is responsible to deliver essential and business critical data with applied intelligence to power the market of the future. This enables our customer to make decisions with conviction. Contribute significantly to the growth of the firm by- Developing innovative functionality in existing and new products Supporting and maintaining high revenue productionized products Achieve the above intelligently and economically using best practices Career: This is the place to hone your existing Database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated and collaborate with developers, business analysts and product managers who are experts in their domain. Your skills: You should be able to demonstrate that you have an outstanding knowledge and hands-on experience in the below areas: Complete SDLC: architecture, design, development and support of tech solutions Play a key role in the development team to build high-quality, high-performance, scalable code Engineer components, and common services based on standard corporate development models, languages and tools Produce technical design documents and conduct technical walkthroughs Collaborate effectively with technical and non-technical stakeholders Be part of a culture to continuously improve the technical design and code base Document and demonstrate solutions using technical design docs, diagrams and stubbed code Our Hiring Manager says: I’m looking for a person that gets excited about technology and motivated by seeing how our individual contribution and team work to the world class web products affect the workflow of thousands of clients resulting in revenue for the company. Qualifications Required: Bachelor’s degree in computer science, Information Systems or Engineering. 7+ years of experience on Transactional Databases like SQL server, Oracle, PostgreSQL and other NoSQL databases like Amazon DynamoDB, MongoDB Strong Database development skills on SQL Server, Oracle Strong knowledge of Database architecture, Data Modeling and Data warehouse. Knowledge on object-oriented design, and design patterns. Familiar with various design and architectural patterns Strong development experience with Microsoft SQL Server Experience in cloud native development and AWS is a big plus Experience with Kafka/Sonic Broker messaging systems Nice to have: Experience in developing data pipelines using Java or C# is a significant advantage. Strong knowledge around ETL Tools – Informatica, SSIS Exposure with Informatica is an advantage. Familiarity with Agile and Scrum models Working Knowledge of VSTS. Working knowledge of AWS cloud is an added advantage. Understanding of fundamental design principles for building a scalable system. Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options, Index/Benchmarks is desirable. Additionally, experience with Scala, Python and Spark applications is a plus. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316332 Posted On: 2025-06-16 Location: Gurgaon, Haryana, India

Posted 2 days ago

Apply

3.0 years

8 - 9 Lacs

Gurgaon

On-site

GlassDoor logo

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together. Description United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions. Our Values : At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world. Job overview and responsibilities United Airlines is seeking talented people to join the Data Engineering team. Data Engineering organization is responsible for driving data driven insights & innovation to support the data needs for commercial and operational projects with a digital focus. You will work as a Senior Engineer - Machine Learning and collaborate with data scientists and data engineers to: Build high-performance, cloud-native machine learning infrastructure and services to enable rapid innovation across United Build complex data ingestion and transformation pipelines for batch and real-time data Support large scale model training and serving pipelines in distributed and scalable environment This position is offered on local terms and conditions within United’s wholly owned subsidiary United Airlines Business Services Pvt. Ltd. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. United Airlines is an equal opportunity employer. United Airlines recruits, employs, trains, compensates, and promotes regardless of race, religion, color, national origin, gender identity, sexual orientation, physical ability, age, veteran status, and other protected status as required by applicable law. Qualifications Required BS/BA, in Advanced Computer Science, Data Science, Engineering or related discipline or Mathematics experience required Strong software engineering experience with Python and at least one additional language such as Go, Java, or C/C++ Familiarity with ML methodologies and frameworks (e.g., PyTorch, Tensorflow) and preferably building and deploying production ML pipelines Experience developing cloud-native solutions with Docker and Kubernetes Cloud-native DevOps, CI/CD experience using tools such as Jenkins or AWS CodePipeline; preferably experience with GitOps using tools such as ArgoCD, Flux, or Jenkins X Experience building real-time and event-driven stream processing pipelines with technologies such as Kafka, Flink, and Spark Experience setting up and optimizing data stores (RDBMS/NoSQL) for production use in the ML app context Strong desire to stay aligned with the latest developments in cloud-native and ML ops/engineering and to experiment with and learn new technologies Experience 3 + years of software engineering experience with languages such as Python, Go, Java, Scala, Kotlin, or C/C++ 2 + years of experience working in cloud environments (AWS preferred) 2 + years of experience with Big Data technologies such as Spark, Flink 2 + years of experience with cloud-native DevOps, CI/CD At least one year of experience with Docker and Kubernetes in a production environment Must be legally authorized to work in India for any employer without sponsorship Must be fluent in English and Hindi (written and spoken) Successful completion of interview required to meet job qualification Reliable, punctual attendance is an essential function of the position Preferred Masters in computer science or related STEM field

Posted 2 days ago

Apply

3.0 years

0 Lacs

Gurgaon

On-site

GlassDoor logo

Job Description Alimentation Couche-Tard Inc., (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space with over 17,000 stores in 31 countries, serving more than 6 million customers each day It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long term success. About the role We are looking for a Data Engineer with a collaborative, “can-do” attitude who is committed & strives with determination and motivation to make their team successful. A Data Engineer who has experience implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands on sourcing, manipulation, and delivery of data from enterprise business systems to data lake and data warehouse. This role will help drive Circle K’s next phase in the digital journey by transforming data to achieve actionable business outcomes. Roles and Responsibilities Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals Demonstrate technical and domain knowledge of relational and non-relational databases, Data Warehouses, Data lakes among other structured and unstructured storage options Determine solutions that are best suited to develop a pipeline for a particular data source Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development Efficient in ELT/ETL development using Azure cloud services and Snowflake, including Testing and operational support (RCA, Monitoring, Maintenance) Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics deliver Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability) Stay current with and adopt new tools and applications to ensure high quality and efficient solutions Build cross-platform data strategy to aggregate multiple sources and process development datasets Proactive in stakeholder communication, mentor/guide junior resources by doing regular KT/reverse KT and help them in identifying production bugs/issues if needed and provide resolution recommendation Job Requirements Bachelor’s degree in Computer Engineering, Computer Science or related discipline, Master’s Degree preferred 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment 3+ years of experience with setting up and operating data pipelines using Python or SQL 3+ years of advanced SQL Programming: PL/SQL, T-SQL 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads 3+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions 3+ years of experience in defining and enabling data quality standards for auditing, and monitoring Strong analytical abilities and a strong intellectual curiosity. In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts Understanding of REST and good API design Experience working with Apache Iceberg, Delta tables and distributed computing frameworks Strong collaboration, teamwork skills, excellent written and verbal communications skills Self-starter and motivated with ability to work in a fast-paced development environment Agile experience highly desirable Proficiency in the development environment, including IDE, database server, GIT, Continuous Integration, unit-testing tool, and defect management tools Preferred Skills Strong Knowledge of Data Engineering concepts (Data pipelines creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management) Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks Working Knowledge of Dev-Ops processes (CI/CD), Git/Jenkins version control tool, Master Data Management (MDM) and Data Quality tools Strong Experience in ETL/ELT development, QA and operation/support process (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance) Hands on experience in Databases like (Azure SQL DB, MySQL/, Cosmos DB etc.), File system (Blob Storage), Python/Unix shell Scripting ADF, Databricks and Azure certification is a plus Technologies we use : Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (Powershell, Bash), Git, Terraform, Power BI, Snowflake #LI-DS1

Posted 2 days ago

Apply

4.0 years

5 - 10 Lacs

Gurgaon

On-site

GlassDoor logo

As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact. How You Will Contribute: As an AI Verification Engineer, you will report to the [Insert Hiring Manager Title or Name], and work closely with cross-functional teams to ensure the quality, performance, and security of AI-powered components embedded within Ciena’s intelligent networking solutions. You will be a key player in validating AI/ML models, prompt engineering strategies, and knowledge base integrations to drive scalable and trustworthy AI solutions. Key responsibilities include: Designing and executing comprehensive test strategies and frameworks for LLM-powered AI agents and applications. Conducting adversarial and edge-case testing to ensure robustness and mitigate risks such as RAG poisoning or prompt injection. Validating the accuracy, concurrency, and effectiveness of RAG pipelines and knowledge base integrations. Engineering and optimizing prompts for generative models using techniques like zero-shot, few-shot, and chain-of-thought prompting. Collaborating with AI/ML and DevOps teams to resolve performance issues and contribute to continuous improvement efforts. The Must Haves: Bachelor’s or Master’s degree in Computer Science, Data Science, Artificial Intelligence, or a related field. 4+ years of experience in software testing, preferably focused on AI/ML or cloud-based systems. Proficient in Python or similar programming languages. Hands-on experience with AI/ML model testing methodologies (functional, performance, integration, security, metamorphic testing, etc.). Working knowledge of APIs, SQL/NoSQL databases, and CI/CD pipelines. Experience validating and troubleshooting large-scale datasets, data pipelines, and LLM applications. Understanding of AI vulnerabilities and risk mitigation strategies in model validation. Assets: Experience with prompt engineering, including iterative refinement and prompt performance evaluation. Familiarity with frameworks such as TensorFlow, PyTorch, or Google ADK. Exposure to testing and evaluating RAG pipelines and knowledge-grounded AI systems. Background in AI system security, including adversarial testing and prevention strategies. Strong communication skills and the ability to document reusable test and prompt strategies effectively. #LI-FA Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox. At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

TransUnion's Job Applicant Privacy Notice What We'll Bring Dot net Full Stack Engineer What You'll Bring Key Responsibilities - Develop and maintain front-end & back-end components of our fraud detection platform. Implement real time data processing and streaming functionalities Design and develop APIs for integrating various microservices Collaborate with cross-functional teams to deliver high quality software solutions Participate in entire application lifecycle , focusing on coding, debugging and testing Ensure the implementation of security protocols and data protection measures Stay UpToDate with emerging trends and technologies in AI/ML, fraud detection, and full stack development Required Qualifications - Bachelors or Masters degree in Computer Science, Engineering or a related field. Minimum of 5 yrs. of experience as a .Net Full Stack Developer. Strong proficiency in programming languages such as .Net, ASP, C and C# Experience with data streaming and processing tools (e.g. Apache Kafka, Spark) Solid experience with RDBMS and NoSQL database concepts Experience with developing RESTful or GraphQL APIs Familiarity with cloud platforms (GCP,AWS, Azure) and containerization tools (Docker, Kubernetes) Strong analytical and problem-solving skills. Impact You'll Make NA This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week. TransUnion Job Title Developer, Software Development Show more Show less

Posted 2 days ago

Apply

4.0 - 6.0 years

0 Lacs

Gurgaon

On-site

GlassDoor logo

Locations: Bengaluru | Gurgaon Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation-inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do As a part of BCG's X team, you will work closely with consulting teams on a diverse range of advanced analytics and engineering topics. You will have the opportunity to leverage analytical methodologies to deliver value to BCG's Consulting (case) teams and Practice Areas (domain) through providing analytical and engineering subject matter expertise.As a Data Engineer, you will play a crucial role in designing, developing, and maintaining data pipelines, systems, and solutions that empower our clients to make informed business decisions. You will collaborate closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver high-quality data solutions that meet our clients' needs. YOU'RE GOOD AT Delivering original analysis and insights to case teams, typically owning all or part of an analytics module whilst integrating with a case team. Design, develop, and maintain efficient and robust data pipelines for extracting, transforming, and loading data from various sources to data warehouses, data lakes, and other storage solutions. Building data-intensive solutions that are highly available, scalable, reliable, secure, and cost-effective using programming languages like Python and PySpark. Deep knowledge of Big Data querying and analysis tools, such as PySpark, Hive, Snowflake and Databricks. Broad expertise in at least one Cloud platform like AWS/GCP/Azure.* Working knowledge of automation and deployment tools such as Airflow, Jenkins, GitHub Actions, etc., as well as infrastructure-as-code technologies like Terraform and CloudFormation. Good understanding of DevOps, CI/CD pipelines, orchestration, and containerization tools like Docker and Kubernetes. Basic understanding on Machine Learning methodologies and pipelines. Communicating analytical insights through sophisticated synthesis and packaging of results (including PPT slides and charts) with consultants, collecting, synthesizing, analyzing case team learning & inputs into new best practices and methodologies. Communication Skills: Strong communication skills, enabling effective collaboration with both technical and non-technical team members. Thinking Analytically You should be strong in analytical solutioning with hands on experience in advanced analytics delivery, through the entire life cycle of analytics. Strong analytics skills with the ability to develop and codify knowledge and provide analytical advice where required. What You'll Bring Bachelor's / Master's degree in computer science engineering/technology At least 4-6 years within relevant domain of Data Engineering across industries and work experience providing analytics solutions in a commercial setting. Consulting experience will be considered a plus. Proficient understanding of distributed computing principles including management of Spark clusters, with all included services - various implementations of Spark preferred. Basic hands-on experience with Data engineering tasks like productizing data pipelines, building CI/CD pipeline, code orchestration using tools like Airflow, DevOps etc.Good to have: - Software engineering concepts and best practices, like API design and development, testing frameworks, packaging etc. Experience with NoSQL databases, such as HBase, Cassandra, MongoDB Knowledge on web development technologies. Understanding of different stages of machine learning system design and development Who You'll Work With You will work with the case team and/or client technical POCs and border X team. Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E - Verify Employer. Click here for more information on E-Verify.

Posted 2 days ago

Apply

2.0 years

0 Lacs

Gurgaon

On-site

GlassDoor logo

Job Summary: We are looking for a passionate and skilled Full Stack Developer with 2–3 years of hands-on experience in building scalable web applications using modern web technologies. The ideal candidate should have strong knowledge in both front-end and back-end development, particularly with JavaScript-based frameworks and NoSQL databases . Key Responsibilities: Design, develop, and maintain full-stack web applications. Write clean, maintainable, and efficient code using Node.js, React.js, and AngularJS . Create responsive front-end interfaces with HTML, CSS, and JavaScript . Integrate with NoSQL databases such as MongoDB or Opensearch. Collaborate with cross-functional teams including designers, product managers, and QA engineers. Troubleshoot, debug, and upgrade existing applications. Participate in code reviews and contribute to a culture of continuous improvement. Ensure security and data protection across applications. Required Skills: Proficient in Node.js , React.js , and AngularJS . Strong front-end skills with HTML , CSS , and JavaScript (ES6+) . Experience working with NoSQL databases (e.g., MongoDB, Opensearch). Working knowledge of Python for backend logic or scripting. Familiarity with RESTful APIs and asynchronous request handling. Good understanding of Git, version control, and development best practices. Strong problem-solving skills and the ability to work independently or in a team. Education UG: Graduate in Any Specialization (B.Tech/B.E.), PG: Post Graduation Not Required. Job Type: Full-time Pay: From ₹30,000.00 per month Benefits: Health insurance Schedule: Day shift Supplemental Pay: Performance bonus Experience: Node.js: 1 year (Required) React: 1 year (Required) Work Location: In person

Posted 2 days ago

Apply

2.0 years

10 - 12 Lacs

Gurgaon

On-site

GlassDoor logo

Job Overview We are looking for a dynamic and innovative Full Stack Data Scientist with 2+ years of experience who excels in end-to-end data science solutions. The ideal candidate is a tech-savvy professional passionate about leveraging data to solve complex problems, develop predictive models, and drive business impact in the MarTech domain. Key Responsibilities 1. Data Engineering & Preprocessing Collect, clean, and preprocess structured and unstructured data from various sources. Perform advanced feature engineering, outlier detection, and data transformation. Collaborate with data engineers to ensure seamless data pipeline development. 2. Machine Learning Model Development Design, train, and validate machine learning models (supervised, unsupervised, deep learning). Optimize models for business KPIs such as accuracy, recall, and precision. Innovate with advanced algorithms tailored to marketing technologies. 3. Full Stack Development Build production-grade APIs for model deployment using frameworks like Flask, FastAPI, or Django. Develop scalable and modular code for data processing and ML integration. 4. Deployment & Operationalization Deploy models on cloud platforms (AWS, Azure, or GCP) using tools like Docker and Kubernetes. Implement continuous monitoring, logging, and retraining strategies for deployed models. 5. Insight Visualization & Communication Create visually compelling dashboards and reports using Tableau, Power BI, or similar tools. Present insights and actionable recommendations to stakeholders effectively. 6. Collaboration & Teamwork Work closely with marketing analysts, product managers, and engineering teams to solve business challenges. Foster a collaborative environment that encourages innovation and shared learning. 7. Continuous Learning & Innovation Stay updated on the latest trends in AI/ML, especially in marketing automation and analytics. Identify new opportunities for leveraging data science in MarTech solutions. Qualifications Educational Background Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, Mathematics, or a related field. Technical Skills Programming Languages: Python (must-have), R, or Julia; familiarity with Java or C++ is a plus. ML Frameworks: TensorFlow, PyTorch, Scikit-learn, or XGBoost. Big Data Tools: Spark, Hadoop, or Kafka. Cloud Platforms: AWS, Azure, or GCP for model deployment and data pipelines. Databases: Expertise in SQL and NoSQL (e.g., MongoDB, Cassandra). Visualization: Mastery of Tableau, Power BI, Plotly, or D3.js. Version Control: Proficiency with Git for collaborative coding. Experience 2+ years of hands-on experience in data science, machine learning, and software engineering. Proven expertise in deploying machine learning models in production environments. Experience in handling large datasets and implementing big data technologies. Soft Skills Strong problem-solving and analytical thinking. Excellent communication and storytelling skills for technical and non-technical audiences. Ability to work collaboratively in diverse and cross-functional teams. Preferred Qualifications Experience with Natural Language Processing (NLP) and Computer Vision (CV). Familiarity with CI/CD pipelines and DevOps for ML workflows. Exposure to Agile project management methodologies. Why Join Us? Opportunity to work on innovative projects with cutting-edge technologies. Collaborative and inclusive work environment that values creativity and growth. If you're passionate about turning data into actionable insights and driving impactful business decisions, we’d love to hear from you! Job Types: Full-time, Permanent Pay: ₹1,000,000.00 - ₹1,200,000.00 per year Benefits: Flexible schedule Health insurance Life insurance Paid sick time Paid time off Provident Fund Schedule: Day shift Fixed shift Monday to Friday Experience: Data science: 2 years (Required) Location: Gurugram, Haryana (Preferred) Work Location: In person

Posted 2 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai

Work from Office

Naukri logo

Job Summary This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects. Responsibilities: Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products. Technology: Java, Couchbase/NoSQL, Spring Boot, Microservices/REST API,, Message Broker (AMQ, WMQ), JDBC, CI/CD Pipeline,Cloud Technologies, Application Security, Database, Linux and some shell scripting, Qualifications: Bachelors Degree or International equivalent Bachelor's Degree or International equivalent in Computer Science, Information Systems, Mathematics, Statistics, or related field - Preferred

Posted 2 days ago

Apply

5.0 years

0 Lacs

Mohali

On-site

GlassDoor logo

Apptunix is a leading Mobile App & Web Solutions development agency, based out of Texas, US. The agency empowers cutting-edge startups & enterprise businesses, paving the path for their incremental growth via technology solutions. Established in mid-2013, Apptunix has since then engaged in elevating the client’s interests & satisfaction through rendering improved and innovative Software and Mobile development solutions. The company strongly comprehends business needs and implements them by merging advanced technologies with its seamless creativity. Apptunix currently employs 250+ in-house experts who work closely & dedicatedly with clients to build solutions as per their customers' needs. Required Skills: - Deep Experience working on Node.js - Understanding of SQL and NoSQL database systems with their pros and cons - Experience working with databases like MongoDB. - Solid Understanding of MVC and stateless APIs & building RESTful APIs - Should have experience and knowledge of scaling and security considerations - Integration of user-facing elements developed by front-end developers with server-side logic - Good experience with ExpressJs, MongoDB, AWS S3 and ES6 - Writing reusable, testable, and efficient code - Design and implementation of low-latency, high-availability, and performance applications - Implementation of security and data protection - Integration of data storage solutions and Database structure - Good experience in Nextjs, Microservices, RabbitMQ, Sockets Experience: 5-8 years Job Type: Full-time Schedule: Monday to Friday Work Location: In person

Posted 2 days ago

Apply

Exploring NoSQL Jobs in India

NoSQL is a rapidly growing field in India with plenty of job opportunities for skilled professionals. Companies across various industries are increasingly adopting NoSQL databases to handle massive amounts of data efficiently. If you are a job seeker interested in pursuing a career in NoSQL, here is a guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industry and have a high demand for NoSQL professionals.

Average Salary Range

The average salary range for NoSQL professionals in India varies based on experience and expertise. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with multiple years of experience can earn upwards of INR 15 lakhs per annum.

Career Path

Typically, a career in NoSQL progresses as follows: - Junior Developer - Developer - Senior Developer - Tech Lead - Architect

With each role, you take on more responsibilities and work on more complex projects.

Related Skills

In addition to NoSQL expertise, other skills that are often expected or helpful in this field include: - Data modeling - Database administration - Cloud computing - Programming languages such as Java, Python, or JavaScript

Interview Questions

Here are 25 interview questions for NoSQL roles to help you prepare:

  • What is NoSQL and why is it used? (basic)
  • What are the different types of NoSQL databases? (basic)
  • Explain the CAP theorem. (medium)
  • What is eventual consistency? (medium)
  • How does NoSQL differ from SQL databases? (basic)
  • What is sharding in NoSQL databases? (medium)
  • Explain the concept of denormalization. (medium)
  • What is ACID in database systems? (basic)
  • What is the difference between document-oriented and key-value NoSQL databases? (medium)
  • How do you handle data consistency in a NoSQL database? (medium)
  • Explain the concept of secondary indexes. (medium)
  • What is MapReduce and how is it used in NoSQL databases? (medium)
  • How do you ensure data security in a NoSQL database? (medium)
  • What is the purpose of a distributed database? (medium)
  • What are the advantages of using NoSQL databases for big data applications? (medium)
  • Explain the concept of eventual consistency in NoSQL databases. (medium)
  • How do you handle transactions in a NoSQL database? (medium)
  • What are the common challenges of using NoSQL databases? (medium)
  • How do you optimize queries in a NoSQL database? (medium)
  • Explain the concept of horizontal scaling. (medium)
  • How would you design a schema in a document-oriented NoSQL database? (medium)
  • What is the role of indexes in a NoSQL database? (medium)
  • How do you ensure data durability in a NoSQL database? (medium)
  • What is the difference between NoSQL and NewSQL databases? (medium)
  • Can you explain the concept of eventual consistency? (medium)

Closing Remark

As you prepare for your journey into the world of NoSQL jobs in India, remember to stay updated on industry trends, continuously upskill yourself, and showcase your expertise confidently during interviews. With determination and dedication, you can land a rewarding career in the dynamic field of NoSQL. Good luck!

cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies