2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Description: Retail Business Services (RBS) supports Amazon's Retail business growth worldwide through three core tasks: (a) Selection, where RBS sources, creates, and enriches ASINs to drive GMS growth; (b) Defect Elimination, where RBS resolves inbound supply chain defects and develops root-cause fixes to improve free cash flow; and (c) supporting operational processes for worldwide Retail teams where there is an air gap in the tech stack. The tech team in RBS develops automation that leverages machine/deep learning to scale execution of these highly complex tasks that currently require human cognitive skills. Our solutions ensure that information in Amazon's catalog is complete, correct, and comprehensive enough to give Amazon customers a great shopping experience every time. That's where you can help.

We are looking for a sharp, experienced Application Engineer (AE) with a diverse skillset and background. As an AE, you will work directly with our business teams to solve their support needs with existing applications, and collect requirements and design highly scalable solutions in collaboration with other technical teams. You will play an active role in translating business and functional requirements into concrete deliverables and building scalable systems. You will also contribute to keeping services healthy and robust, and will be responsible for implementing and maintaining the solutions you provide. You will work closely with engineers on maintaining multiple products and services, creating process automation scripts, monitoring, and handling ad-hoc operational asks.

Basic Qualifications: 2+ years of software development or 2+ years of technical support experience. Experience troubleshooting and debugging technical systems. Experience in Unix. Experience scripting in modern programming languages. Knowledge of Python, PySpark, Big Data, and SQL queries.

Preferred Qualifications: Knowledge of web services, distributed systems, and web application development. Experience with REST web services, XML, and JSON.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, visit amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner. Company: ADCI BLR 14 SEZ. Job ID: A3032552
Posted 4 days ago
2.0 - 6.0 years
5 - 10 Lacs
Bengaluru
Work from Office
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role? In the role of Engineer I, you will be an individual contributor for the GCP applications that are critical in the Amex environment, engineering and developing strategic frameworks, processes, tools, and actionable insights. As a Data Engineer, you will be responsible for designing, developing, and maintaining robust and scalable frameworks, services, applications, and pipelines for processing huge volumes of data. You will work closely with cross-functional teams to deliver high-quality software solutions that meet our organizational needs.

Responsibilities: Architect, design, and build GCP solutions using SQL, PySpark, Python, and cloud technologies. Design and develop solutions using Big Data tools and technologies such as MapReduce, Hive, and Spark. Ensure the performance, quality, and responsiveness of solutions. Participate in code reviews to maintain code quality. Conduct IT requirements gathering. Define problems and provide solution alternatives. Create detailed computer system design documentation. Implement deployment plans. Conduct knowledge transfer with the objective of providing high-quality IT consulting solutions. Support the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. Under supervision, participate in unit-level and organizational initiatives with the objective of providing high-quality, value-adding consulting solutions. Understand issues and diagnose their root causes. Perform secondary research as instructed by your supervisor to assist in strategy and business planning.

Minimum Qualifications: 8+ years of experience in cloud applications, with experience leading a team. Industry knowledge of GCP cloud applications and deployment. Bachelor's degree in Computer Science, Engineering, or a related field. Ability to write shell scripts. Utilize Git for source version control. Set up and maintain CI/CD pipelines. Troubleshoot, debug, and upgrade existing applications and ETL job chains. Ability to effectively interpret technical and business objectives and challenges, and articulate solutions. Experience managing teams and balancing multiple priorities. Willingness to learn new technologies and exploit them to their optimal potential. Strong experience with Data Engineering and Big Data applications. Strong background in Python, PySpark, Java, Spark, PL/SQL, and Airflow DAGs. Cloud experience with GCP is a must. Excellent communication and analytical skills. Excellent team player with the ability to work with a global team.

Preferred Qualifications: Proven experience as a Data Engineer or in a similar role. Strong proficiency in object-oriented programming using Python. Experience with ETL job design principles. Solid understanding of HQL, SQL, and data modelling. Knowledge of Unix/Linux and shell scripting principles.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries. Bonus incentives. Support for financial well-being and retirement. Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location). Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need. Generous paid parental leave policies (depending on your location). Free access to global on-site wellness centers staffed with nurses and doctors (depending on location). Free and confidential counseling support through our Healthy Minds program. Career development and training opportunities.

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 4 days ago
0 years
0 Lacs
India
Remote
Ready to embark on a journey where your growth is intertwined with our commitment to making a positive impact? Join the Delphi family, where Growth Meets Values. At Delphi Consulting Pvt. Ltd., we foster a thriving environment with a hybrid work model that lets you prioritize what matters most. Interviews and onboarding are conducted virtually, reflecting our digital-first mindset. We specialize in Data, Advanced Analytics, AI, Infrastructure, Cloud Security, and Application Modernization, delivering impactful solutions that drive smarter, more efficient futures for our clients.

About the Role: The QA/Test Engineer will validate different functionalities of the platform, including APIs, data ingestion pipelines, backend services, infrastructure components, and Azure services, to ensure the end-to-end platform is robust and secure.

What you'll do: Create and execute test plans for all components. Automate API and data testing. Perform security, performance, and integration testing. Work closely with data engineers, API developers, and stakeholders to execute test cases.

What you'll bring: Create detailed test cases covering platform, infra, data, and APIs. Perform API testing using Postman, Swagger, and Azure API Management. Develop data validation scripts (PySpark/Python/SQL). Automate testing and integrate with CI/CD pipelines. Implement data quality checks and recommend Azure-compatible testing tools. Conduct unit, integration, performance, and UAT testing. Document and share test results with stakeholders.

What We Offer: At Delphi, we are dedicated to creating an environment where you can thrive, both professionally and personally. Our competitive compensation package, performance-based incentives, and health benefits are designed to ensure you're well-supported. We believe in your continuous growth and offer company-sponsored certifications, training programs, and skill-building opportunities to help you succeed.
We foster a culture of inclusivity and support, with remote work options and a fully supported work-from-home setup to ensure your comfort and productivity. Our positive and inclusive culture includes team activities, wellness and mental health programs to ensure you feel supported.
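Data validation scripting of the kind this role describes can be as simple as asserting schema and null constraints over ingested records. A minimal pure-Python sketch is below; the field names and rules are hypothetical, and a real pipeline would express the same checks in PySpark or SQL as the posting notes:

```python
# Minimal data-quality check: validate records against a declared schema.
# Field names and rules are illustrative, not from any specific platform.

REQUIRED_FIELDS = {"id": int, "country": str, "amount": float}

def validate(record: dict) -> list[str]:
    """Return a list of human-readable violations for one record."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record or record[field] is None:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
    if isinstance(record.get("amount"), float) and record["amount"] < 0:
        errors.append("amount: must be non-negative")
    return errors

def run_checks(records: list[dict]) -> dict:
    """Summarize pass/fail counts, as a CI/CD quality gate might."""
    failures = {i: errs for i, errs in
                ((i, validate(r)) for i, r in enumerate(records)) if errs}
    return {"total": len(records), "failed": len(failures), "details": failures}

if __name__ == "__main__":
    rows = [
        {"id": 1, "country": "IN", "amount": 10.5},
        {"id": 2, "country": None, "amount": -3.0},
    ]
    print(run_checks(rows))
```

Wiring such a script into a CI/CD pipeline so that builds fail on quality violations is one common way the "automate testing" and "data quality checks" bullets above meet in practice.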
Posted 4 days ago
5.0 - 8.0 years
14 - 24 Lacs
Noida, Pune, Bengaluru
Hybrid
Mandatory skills: Azure Databricks, Azure Data Factory, PySpark, SQL. Experience: 5 to 8 years. Location: Mumbai/Bangalore/Pune/Chennai/Hyderabad/Indore/Kolkata/Noida/Coimbatore/Bhubaneswar.

Key Responsibilities: Design and build data pipelines and ETL/ELT workflows using Azure Databricks and Azure Data Factory. Ingest, clean, transform, and process large datasets from diverse sources (structured and unstructured). Implement Delta Lake solutions and optimize Spark jobs for performance and reliability. Integrate Azure Databricks with other Azure services, including Data Lake Storage, Synapse Analytics, and Event Hubs.

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the below details: Candidate's name- Email and Alternate Email ID- Contact and Alternate Contact no- Total exp- Relevant experience- Current Org- Notice period- CCTC- ECTC- Current Location- Preferred Location- Pancard No-
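The "ingest, clean, transform" step in such pipelines is commonly expressed as SQL over a staging table. A minimal sketch using Python's built-in sqlite3 as a stand-in for the Databricks SQL engine (the table and column names are invented for illustration):

```python
import sqlite3

# Stand-in for a cleaning/transform step: stage raw rows, then produce a
# de-duplicated, typed table. sqlite3 substitutes for Spark SQL here.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id TEXT, amount TEXT, city TEXT);
    INSERT INTO raw_orders VALUES
        ('1', '100.50', ' Pune '),
        ('1', '100.50', ' Pune '),      -- duplicate to be removed
        ('2', NULL,     'Mumbai');      -- null amount to be filtered
""")
conn.execute("""
    CREATE TABLE clean_orders AS
    SELECT DISTINCT
        CAST(order_id AS INTEGER) AS order_id,
        CAST(amount AS REAL)      AS amount,
        TRIM(city)                AS city
    FROM raw_orders
    WHERE amount IS NOT NULL
""")
rows = conn.execute("SELECT * FROM clean_orders ORDER BY order_id").fetchall()
print(rows)  # one cleaned row per distinct, non-null order
```

In Databricks the equivalent statement would typically run via `spark.sql` against Delta tables, with the same cast/trim/de-duplicate shape.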
Posted 4 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role Overview: We are looking for a highly skilled Generative AI Engineer with 4 to 5 years of experience to design and deploy enterprise-grade GenAI systems. This role blends platform architecture, LLM integration, and operationalization, and is ideal for engineers with strong hands-on experience in large language models, RAG pipelines, and AI orchestration.

Responsibilities: Platform Leadership: Architect GenAI platforms powering copilots, document AI, multi-agent systems, and RAG pipelines. LLM Expertise: Build and fine-tune GPT, Claude, Gemini, LLaMA 2/3, and Mistral models; deep expertise in RLHF, transformer internals, and multi-modal integration. RAG Systems: Develop scalable pipelines with embeddings, hybrid retrieval, prompt orchestration, and vector DBs (Pinecone, FAISS, pgvector). Orchestration & Hosting: Lead LLM hosting, LangChain/LangGraph/AutoGen orchestration, and AWS SageMaker/Bedrock integration. Responsible AI: Implement guardrails for PII redaction, moderation, lineage, and access, aligned with enterprise security standards. LLMOps/MLOps: Deploy CI/CD pipelines, automate tuning and rollout, and handle drift, rollback, and incidents with KPI dashboards. Cost Optimization: Reduce TCO via dynamic routing, GPU autoscaling, context compression, and chargeback tooling. Agentic AI: Build autonomous, critic-supervised agents using MCP, A2A, and LGPL patterns. Evaluation: Use LangSmith, BLEU, ROUGE, BERTScore, and HIL to track hallucination, toxicity, latency, and sustainability.

Skills Required: 4-5 years in AI/ML (2+ in GenAI). Strong Python, PySpark, Scala; APIs via FastAPI, GraphQL, gRPC. Proficiency with MLflow, Kubeflow, Airflow, Prompt flow. Experience with LLMs, vector DBs, prompt engineering, MLOps. Solid foundation in applied mathematics and statistics.

Nice to Have: Open-source contributions, AI publications. Hands-on experience with cloud-native GenAI deployment. Deep interest in ethical AI and AI safety.

2 Days WFO Mandatory. Don't meet every job requirement? That's okay!
Our company is dedicated to building a diverse, inclusive, and authentic workplace. If you're excited about this role, but your experience doesn't perfectly fit every qualification, we encourage you to apply anyway. You may be just the right person for this role or others.
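The RAG retrieval step this role centers on reduces, at its core, to ranking documents by embedding similarity. Below is a toy pure-Python sketch with hand-made vectors; a real system would use learned embeddings and a vector DB such as FAISS or pgvector, as the posting lists:

```python
import math

# Toy retrieval core of a RAG pipeline: rank documents by cosine
# similarity of their embeddings to the query embedding.
# Vectors here are hand-made 3-d stand-ins for learned embeddings.

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, docs, top_k=2):
    """docs: list of (doc_id, embedding). Return top_k ids by similarity."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:top_k]]

if __name__ == "__main__":
    corpus = [
        ("refund-policy", [0.9, 0.1, 0.0]),
        ("shipping-faq",  [0.1, 0.9, 0.0]),
        ("api-reference", [0.0, 0.1, 0.9]),
    ]
    query = [0.8, 0.2, 0.0]  # closest to refund-policy by construction
    print(retrieve(query, corpus))  # these ids feed the LLM prompt context
```

Hybrid retrieval, as mentioned above, typically combines a score like this with keyword-based ranking (e.g. BM25) before prompt orchestration.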
Posted 4 days ago
6.0 - 11.0 years
12 - 17 Lacs
Pune
Work from Office
Roles and Responsibility: The Senior Tech Lead - Databricks leads the design, development, and implementation of advanced data solutions. The role requires extensive experience in Databricks, cloud platforms, and data engineering, with a proven ability to lead teams and deliver complex projects.

Responsibilities: Lead the design and implementation of Databricks-based data solutions. Architect and optimize data pipelines for batch and streaming data. Provide technical leadership and mentorship to a team of data engineers. Collaborate with stakeholders to define project requirements and deliverables. Ensure best practices in data security, governance, and compliance. Troubleshoot and resolve complex technical issues in Databricks environments. Stay updated on the latest Databricks features and industry trends.

Key Technical Skills & Responsibilities: Experience in data engineering using Databricks or Apache Spark-based platforms. Proven track record of building and optimizing ETL/ELT pipelines for batch and streaming data ingestion. Hands-on experience with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Synapse Analytics, or Azure SQL Data Warehouse. Proficiency in programming languages such as Python, Scala, and SQL for data processing and transformation. Expertise in Spark (PySpark, Spark SQL, or Scala) and Databricks notebooks for large-scale data processing. Familiarity with Delta Lake, Delta Live Tables, and the medallion architecture for data lakehouse implementations. Experience with orchestration tools like Azure Data Factory or Databricks Jobs for scheduling and automation. Design and implement Azure Key Vault and scoped credentials. Knowledge of Git for source control, CI/CD integration for Databricks workflows, cost optimization, and performance tuning. Familiarity with Unity Catalog, RBAC, or enterprise-level Databricks setups.
Ability to create reusable components, templates, and documentation to standardize data engineering workflows is a plus. Ability to define best practices, support multiple projects, and sometimes mentor junior engineers is a plus. Must have experience working with streaming data sources and Kafka (preferred).

Eligibility Criteria: Bachelor's degree in Computer Science, Data Engineering, or a related field. Extensive experience with Databricks, Delta Lake, PySpark, and SQL. Databricks certification (e.g., Certified Data Engineer Professional). Experience with machine learning and AI integration in Databricks. Strong understanding of cloud platforms (AWS, Azure, or GCP). Proven leadership experience in managing technical teams. Excellent problem-solving and communication skills.

Our Offering: Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment. Wellbeing programs and work-life balance, with integration and passion-sharing events. Attractive salary and company initiative benefits. Courses and conferences. Hybrid work culture.
Posted 4 days ago
2.0 - 3.0 years
4 - 6 Lacs
Bengaluru
Work from Office
Job Title: Python Developer - Machine Learning & AI (2-3 Years Experience)

Job Summary: We are seeking a skilled and motivated Python Developer with 2 to 3 years of experience in Machine Learning and Artificial Intelligence. The ideal candidate will have hands-on experience in developing, training, and deploying machine learning models, and should be proficient in Python and associated data science libraries. You will work with our data science and engineering teams to build intelligent solutions that solve real-world problems.

Key Responsibilities: Develop and maintain machine learning models using Python. Work on AI-driven applications, including predictive modeling, natural language processing, and computer vision (based on project requirements). Collaborate with cross-functional teams to understand business requirements and translate them into ML solutions. Preprocess, clean, and transform data for training and evaluation. Perform model training, tuning, evaluation, and deployment using tools like scikit-learn, TensorFlow, or PyTorch. Write modular, efficient, and testable code. Document processes, models, and experiments clearly for team use and future reference. Stay updated with the latest trends and advancements in AI and machine learning.

Required Skills: 2-3 years of hands-on experience with Python programming. Solid understanding of machine learning algorithms (supervised, unsupervised, and reinforcement learning). Experience with libraries such as scikit-learn, pandas, NumPy, Matplotlib, and Seaborn. Exposure to deep learning frameworks like TensorFlow, Keras, or PyTorch. Good understanding of data structures and algorithms. Experience with model evaluation techniques and performance metrics. Familiarity with Jupyter Notebooks, version control (Git), and cloud platforms (AWS, GCP, or Azure) is a plus. Strong analytical and problem-solving skills.
Preferred Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Mathematics, or a related field. Experience with deploying ML models using Flask, FastAPI, or Docker. Knowledge of MLOps and model lifecycle management is an advantage. Understanding of NLP or Computer Vision is a plus.
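Among the skills this posting lists, "model evaluation techniques and performance metrics" is easy to make concrete. A minimal pure-Python sketch computing accuracy, precision, and recall from confusion-matrix counts; in practice a library such as scikit-learn's `metrics` module would be used:

```python
# Binary-classification metrics from scratch, to illustrate what
# library functions like those in sklearn.metrics compute.

def confusion_counts(y_true, y_pred):
    """Return (tp, fp, fn, tn) for binary labels 0/1."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def metrics(y_true, y_pred):
    tp, fp, fn, tn = confusion_counts(y_true, y_pred)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall}

if __name__ == "__main__":
    y_true = [1, 0, 1, 1, 0, 0, 1, 0]
    y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
    print(metrics(y_true, y_pred))
```

Precision and recall diverge from accuracy precisely when classes are imbalanced, which is why interviews for roles like this often probe the distinction.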
Posted 4 days ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Expedia Group brands power global travel for everyone, everywhere. We design cutting-edge tech to make travel smoother and more memorable, and we create groundbreaking solutions for our partners. Our diverse, vibrant, and welcoming community is essential in driving our success.

Why Join Us? To shape the future of travel, people must come first. Guided by our Values and Leadership Agreements, we foster an open culture where everyone belongs, differences are celebrated, and we know that when one of us wins, we all win. We provide a full benefits package, including exciting travel perks, generous time-off, parental leave, a flexible work model (with some pretty cool offices), and career development resources, all to fuel our employees' passion for travel and ensure a rewarding career journey. We're building a more open world. Join us.

Senior Manager, Marketing Analytics (Bangalore)

Introduction to team: The Traveler Business Team builds and drives growth for our global consumer businesses: Expedia, Hotels.com, and Vrbo. This division creates compelling and differentiated traveler value for each brand by setting the strategic vision, operating strategy, and plan. Responsibilities include investment allocation and prioritization, P&L accountability, and leading cross-functional teams across Expedia Group, who are all held accountable to a single scorecard.

This is a leadership role dedicated to shared success and committed to achieving excellence, guiding both junior and senior analysts. It manages work across distinct domains and capability areas with clear guidance from senior management, collaborating with senior leaders and stakeholders, and aims to foster a high-performing and efficiently managed team with a focus on delivering high-quality results. Additionally, it consistently offers constructive feedback and mentorship to enhance analytical skillsets within the immediate and broader team.
In This Role You Will: You will be responsible for leading a team which automates and extracts trends from our channel performance datasets and transforms them into data products, performance insights, and actionable strategies that improve our channel performance. You will use analytical thinking and deep knowledge of data to turn complex trends into compelling narratives and recommendations for channel operations. Acting as a key source of truth for all channel performance data-related matters within Marketing Analytics, you will guide the consumption of our in-house measurement to inform marketing partners and shape business decisions.

Data Analysis & Insights: Support a large network of stakeholders by analysing attribution and channel operations data to generate insights and explanations for performance trends. Inform the business clearly of the commercial implications of performance changes. Self-Service Enablement: Develop and deliver self-service analytics products for a wide variety of stakeholders to increase the accessibility, speed, and simplicity of transforming data trends into commercial insights. Data Quality Investigations: Employ logical thinking and root-cause analysis to distinguish changes in customer behaviour from underlying data quality issues. Act as the subject matter expert for the datasets powering our channel measurement. Project Management: Balance the delivery of long-term roadmap deliverables against ad-hoc, high-urgency investigations. Identify opportunities to proactively anticipate requests for support. Stakeholder Partnership: Partnering closely with Channel Teams, Finance, and Data Engineering, you will act as the expert mediator between our channel measurement data and the teams that rely on it to understand commercial performance.
Experience & Qualifications: PhD, Master's, or Bachelor's degree (preferably in Mathematics or a scientific field) with 4-7 years of work experience, OR 7+ years of experience in a comparable data analytics role. 2-4 years of managing marketing analytics teams. Managed or mentored at least one Data Scientist I or Data Scientist II. Strong SQL skills; demonstrated experience of using PySpark/Python to structure, transform, and visualize big data, and a willingness to learn new frameworks and languages required for the task. Deep logical thinking and experience in distilling crisp insights from highly complex datasets. Experience designing, delivering, and maintaining data visualisation products through tools like Tableau and Power BI. Experience partnering with other teams and disciplines (Finance, Channels, Engineering, etc.) and collaborating with other analytics teams to deliver projects.

Accommodation requests: If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please reach out to our Recruiting Accommodations Team through the Accommodation Request. We are proud to be named a Best Place to Work on Glassdoor in 2024 and to be recognized for award-winning culture by organizations like Forbes, TIME, Disability:IN, and others.

Expedia Group's family of brands includes: Brand Expedia®, Hotels.com®, Expedia® Partner Solutions, Vrbo®, trivago®, Orbitz®, Travelocity®, Hotwire®, Wotif®, ebookers®, CheapTickets®, Expedia Group™ Media Solutions, Expedia Local Expert®, CarRentals.com™, and Expedia Cruises™. © 2024 Expedia, Inc. All rights reserved. Trademarks and logos are the property of their respective owners. CST: 2029030-50

Employment opportunities and job offers at Expedia Group will always come from Expedia Group's Talent Acquisition and hiring teams. Never provide sensitive, personal information to someone unless you're confident who the recipient is.
Expedia Group does not extend job offers via email or any other messaging tools to individuals with whom we have not made prior contact. Our email domain is @expediagroup.com. The official website to find and apply for job openings at Expedia Group is careers.expediagroup.com/jobs. Expedia is committed to creating an inclusive work environment with a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, gender, sexual orientation, national origin, disability or age.
Posted 4 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Immediate #HIRING for a highly motivated and experienced GCP Data Engineer to join our growing team. We're a leading software company specializing in Artificial Intelligence, Machine Learning, Data Analytics, innovative data solutions, and cloud-based technologies. If you're passionate about building robust applications and thrive in a dynamic environment, please share your resume at rizwana@randomtrees.com.

Job Title: GCP Data Engineer. Experience: 4-8 years. Notice: Immediate. Location: Hyderabad/Chennai - Hybrid Mode. Job Type: Full-time Employment.

Job Description: We are looking for an experienced GCP Data Engineer to design, develop, and optimize data pipelines and solutions on Google Cloud Platform (GCP). The ideal candidate should have hands-on experience with BigQuery, DataFlow, PySpark, GCS, and Airflow (Cloud Composer), along with strong expertise or knowledge in DBT.

Key Responsibilities: Design and develop scalable ETL/ELT data pipelines using DataFlow (Apache Beam), PySpark, and Airflow (Cloud Composer). Work extensively with BigQuery for data transformation, storage, and analytics. Implement data ingestion, processing, and transformation workflows using GCP-native services. Optimize and troubleshoot performance issues in BigQuery and DataFlow pipelines. Manage data storage and governance using Google Cloud Storage (GCS) and other GCP services. Ensure data quality, security, and compliance with industry standards. Work closely with data scientists, analysts, and business teams to provide data solutions. Automate workflows, monitor jobs, and improve pipeline efficiency.
Required Skills: ✔ Google Cloud Platform (GCP) Data Engineering (GCP DE Certification preferred); DBT knowledge or experience is mandatory ✔ BigQuery - data modeling, query optimization, and performance tuning ✔ PySpark - data processing and transformation ✔ GCS (Google Cloud Storage) - data storage and management ✔ Airflow / Cloud Composer - workflow orchestration and scheduling ✔ SQL & Python - strong hands-on experience ✔ Experience with CI/CD pipelines, Terraform, or Infrastructure as Code (IaC) is a plus.
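The workflow orchestration that Airflow/Cloud Composer provides is, at its heart, running tasks in dependency order. A small pure-Python sketch of resolving a DAG of pipeline steps is below; the task names are invented, and real Airflow DAGs are declared with its `DAG`/operator API and executed by a scheduler rather than in-process like this:

```python
from graphlib import TopologicalSorter

# Toy DAG of pipeline steps: each task maps to the set of tasks it
# depends on. An orchestrator like Airflow runs a task only once all
# of its upstream tasks have succeeded.
dag = {
    "extract_gcs":     set(),
    "transform_spark": {"extract_gcs"},
    "load_bigquery":   {"transform_spark"},
    "data_quality":    {"load_bigquery"},
}

def execution_order(dependencies: dict) -> list[str]:
    """Return one valid run order honoring every dependency edge."""
    return list(TopologicalSorter(dependencies).static_order())

if __name__ == "__main__":
    print(execution_order(dag))
```

For a linear chain like this the order is fully determined; with branching DAGs, `TopologicalSorter` also exposes `get_ready()` for dispatching independent tasks in parallel, which is closer to what a scheduler actually does.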
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
Noida, Pune, Gurugram
Hybrid
IRIS Software, a prominent IT company, is looking for a Senior AWS Data Engineer. Please find the job description below and share your updated resume at Prateek.gautam@irissoftware.com.

Role: Senior AWS Data Engineer. Location: Pune / Noida / Gurgaon. Hybrid: 3 days office, 2 days work from home.

Job Description: 6 to 10 years of overall experience. Good experience in data engineering is required. Good experience in AWS, SQL, AWS Glue, PySpark, Airflow, CDK, and Redshift is required. Good communication skills are required.

About Iris Software Inc.: With 4,000+ associates and offices in India, U.S.A. and Canada, Iris Software delivers technology services and solutions that help clients complete fast, far-reaching digital transformations and achieve their business goals. A strategic partner to Fortune 500 and other top companies in financial services and many other industries, Iris provides a value-driven approach - a unique blend of highly-skilled specialists, software engineering expertise, cutting-edge technology, and flexible engagement models. High customer satisfaction has translated into long-standing relationships and preferred-partner status with many of our clients, who rely on our 30+ years of technical and domain expertise to future-proof their enterprises. Associates of Iris work on mission-critical applications supported by a workplace culture that has won numerous awards in the last few years, including Certified Great Place to Work in India; Top 25 GPW in IT & IT-BPM; Ambition Box Best Place to Work, #3 in IT/ITES; and Top Workplace NJ-USA.
Posted 4 days ago
5.0 - 10.0 years
20 - 30 Lacs
Bengaluru
Work from Office
Please apply only if your notice period is less than 15 days. Years of exp: 5+ years. Location: Bangalore.

Data Engineer Job Summary: The Data Engineer is responsible for implementing and managing the operational aspects of cloud-native and hybrid data platform solutions built with Azure Databricks. They ensure the efficient and effective functioning of the Azure Databricks environment, including monitoring and troubleshooting data pipelines, managing data storage and access, and optimizing performance. They work closely with data engineers, data scientists, and other stakeholders to understand data requirements, design solutions, and implement data integration and transformation processes.

Key Responsibilities: Provide expertise and ownership of Azure Databricks development tasks within the scrum team. Interact effectively with clients and leadership, and adapt communication for the appropriate audience. Read and comprehend software requirements, assisting with the development of agile user stories and tasks. Assist with troubleshooting configuration and performance issues. Assist with Azure Databricks deployments, testing, configuration, and installation. Ensure security is a priority, and understand the various areas where security vulnerabilities arise with database technologies. Ensure database resiliency and disaster recovery capabilities.

Required Skills & Qualifications: 5+ years proven experience working with Azure Databricks analytics database capabilities, specifically Azure Databricks and other relational database technologies supported in Azure. 5+ years proven experience with Azure Data Lake Storage Gen 2, Azure Databricks, Azure Data Explorer, Azure Event Hubs, Spark Pools, Python, PySpark, SQL, Azure Landing Zone, Azure Networking Services, and Microsoft Entra ID. 5+ years proven experience with Azure geo-redundancy and HA/failover technologies.
5+ years proven experience designing and implementing data pipelines using Azure Databricks for data cleaning, transformation, and loading into a Data Lakehouse. 5+ years proven experience with Infrastructure as Code (IaC) tools such as Terraform. 5+ years proven experience with programming languages such as Python and PySpark, and data constructs such as JSON or XML.
Posted 4 days ago
5.0 - 10.0 years
25 - 40 Lacs
Gurugram
Work from Office
Job Title: Data Engineer. Job Type: Full-time. Department: Data Engineering / Data Science. Reports To: Data Engineering Manager / Chief Data Officer.

About the Role: We are looking for a talented Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining robust data pipelines and systems that process and store large volumes of data. You will collaborate closely with data scientists, analysts, and business stakeholders to deliver high-quality, actionable data solutions. This role requires a strong background in data engineering, database technologies, and cloud platforms, along with the ability to work in an Agile environment to drive data initiatives forward.

Responsibilities: Design, build, and maintain scalable and efficient data pipelines that move, transform, and store large datasets. Develop and optimize ETL processes using tools such as Apache Spark, Apache Kafka, or AWS Glue. Work with SQL and NoSQL databases to ensure the availability, consistency, and reliability of data. Collaborate with data scientists and analysts to ensure data requirements and quality standards are met. Design and implement data models, schemas, and architectures for data lakes and data warehouses. Automate manual data processes to improve efficiency and data processing speed. Ensure data security, privacy, and compliance with industry standards and regulations. Continuously evaluate and integrate new tools and technologies to enhance data engineering processes. Troubleshoot and resolve data quality and performance issues. Participate in code reviews and contribute to a culture of best practices in data engineering.

Requirements: 3-10 years of experience as a Data Engineer or in a similar role. Strong proficiency in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra). Experience with big data technologies such as Apache Hadoop, Spark, Hive, and Kafka.
Hands-on experience with cloud platforms like AWS , Azure , or Google Cloud . Proficiency in Python , Java , or Scala for data processing and scripting. Familiarity with data warehousing concepts, tools, and technologies (e.g., Snowflake , Redshift , BigQuery ). Experience working with data modeling, data lakes, and data pipelines. Solid understanding of data governance, data privacy, and security best practices. Strong problem-solving and debugging skills. Ability to work in an Agile development environment. Excellent communication skills and the ability to work cross-functionally.
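For illustration, the extract-transform-load flow this role centers on can be sketched end to end in a few lines. This is a minimal sketch only, using Python's built-in sqlite3 as a stand-in for a real warehouse such as Snowflake or Redshift; the table, columns, and data are invented:

```python
import sqlite3

# Extract: raw records as they might arrive from a source system.
raw_orders = [
    ("o1", "2024-01-05", " 120.50"),
    ("o2", "2024-01-06", "75.00 "),
    ("o3", "2024-01-06", "310.25"),
]

# Transform: normalize whitespace and cast amounts before loading.
clean_orders = [(oid, day, float(amount.strip())) for oid, day, amount in raw_orders]

# Load: write into a SQL store (in-memory here) and aggregate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, order_date TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean_orders)

daily_totals = conn.execute(
    "SELECT order_date, SUM(amount) FROM orders GROUP BY order_date ORDER BY order_date"
).fetchall()
print(daily_totals)  # [('2024-01-05', 120.5), ('2024-01-06', 385.25)]
```

In a production pipeline, the same three stages would be orchestrated by a tool such as Spark or Glue rather than run in a single script.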
Posted 4 days ago
6.0 - 8.0 years
15 - 20 Lacs
Pune, Chennai, Bengaluru
Work from Office
Position Description
At CGI, we're a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.

This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.

Job Title: Big Data Developer
Location: Bangalore / Hyderabad / Pune / Chennai
Experience: 6-8 Years
Category: Software Development / Engineering
Main location: Bangalore / Hyderabad / Pune / Chennai
Employment Type: Full Time

Your future duties and responsibilities
• Design and develop scalable data engineering solutions using Google Cloud Platform (GCP) and PySpark.
• Optimize Spark jobs for performance, scalability, and efficient resource utilization.
• Develop, maintain, and enhance ETL pipelines using BigQuery, Apache Airflow, and Cloud Composer.
• Collaborate with data scientists, analysts, and DevOps teams to translate business requirements into technical solutions.
• Ensure data integrity and security by implementing data governance, compliance, and security best practices.
• Monitor production workloads, troubleshoot performance issues, and implement enhancements.
• Implement and enforce coding standards, best practices, and performance tuning strategies.
• Support migration activities from on-premises data warehouses to GCP-based solutions.
• Mentor junior developers and contribute to knowledge-sharing within the team.
• Stay up to date with emerging cloud technologies, tools, and best practices in the data engineering ecosystem.

Required qualifications to be successful in this role
Skills: 5 years of experience in Big Data, Hadoop, Spark, Python, PySpark, Hive, SQL
Location: Pune / Bangalore / Chennai / Hyderabad
Education: BE / B.Tech / MCA / BCA
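The groupBy-style aggregations at the heart of the Spark pipelines described above reduce to a key-and-combine pattern. A local, stdlib-only sketch of that pattern (data invented); in an actual PySpark job the equivalent would be expressed as df.groupBy("user_id").agg(F.sum("bytes")):

```python
from collections import defaultdict

# Events as (user_id, bytes) pairs, standing in for a distributed dataset.
events = [("u1", 100), ("u2", 250), ("u1", 50), ("u3", 75), ("u2", 25)]

# Local equivalent of a Spark reduce-by-key: combine values per key.
totals = defaultdict(int)
for user_id, nbytes in events:
    totals[user_id] += nbytes

print(dict(totals))  # {'u1': 150, 'u2': 275, 'u3': 75}
```

Pre-combining per key like this (a map-side combine) is one of the basic levers for the Spark job optimization the posting mentions, since it shrinks the data shuffled between executors.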
Posted 4 days ago
4.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Role Overview:
We are seeking a highly skilled and experienced Senior Data Scientist to join our innovative Data Science team. Reporting to the Data Science Director, you will contribute to the development of advanced Machine Learning (ML) solutions for cybersecurity challenges, including threat detection, malware analysis, and anomaly detection. Your expertise will help drive end-to-end ML product development, from data preparation to deployment, while ensuring seamless integration into our core products.

What You Will Do:
As a Senior Data Scientist, you will work in a team of smart data scientists, reporting to the Data Science Director, that does full-lifecycle, full-stack Machine Learning product development, from feature engineering to model building and evaluation. Our team's use cases include, but are not limited to, threat detection, threat hunting, malware detection, anomaly detection, and MLOps. You will work with other Senior Data Scientists in the team to execute data science projects. You will identify issues with models running in production and resolve them; this may require retraining models from scratch, adding new features to models, or setting up automated model training and deployment pipelines. These models will be integrated into popular products of the company for maximum impact.

About You:
- A Master's degree or equivalent in Machine Learning, Computer Science, Electrical Engineering, Mathematics, or Statistics
- In-depth understanding of all major Machine Learning and Deep Learning algorithms, both supervised and unsupervised
- Passion for leveraging ML/AI to solve real-world business problems
- 4-7 years of industry experience in one or more machine/deep learning frameworks
- 4-7 years of industry experience with Python/PySpark and SQL
- Experience solving multiple business problems using Machine Learning
- Experience with various public cloud services (such as AWS, Google, Azure) and ML automation platforms (such as MLflow)
- Ability to drive end-to-end machine learning projects with limited guidance
- Solid computer science foundation
- Good written and verbal communication
- A Ph.D. in Cyber Security, Machine Learning, or a related field is an added advantage
- 4-7 years of industry experience in the field of Data Science/Machine Learning
- Prior experience solving cyber security problems using machine learning
- Familiarity with the security domain is a plus

Company Benefits and Perks:
We believe that the best solutions are developed by teams who embrace each other's unique experiences, skills, and abilities. We work hard to create a dynamic workforce where we encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees.
- Retirement Plans
- Medical, Dental and Vision Coverage
- Paid Time Off
- Paid Parental Leave
- Support for Community Involvement
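One of the simplest baselines for the anomaly detection use case named above is a z-score filter. A minimal, stdlib-only sketch; production detectors would use far richer models and features, and the data and threshold here are invented:

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=3.0):
    """Flag points whose z-score exceeds the threshold.

    A deliberately simple baseline for anomaly detection;
    real systems would model seasonality, context, and more.
    """
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []  # no spread, nothing can stand out
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hourly login counts with one suspicious spike (invented data).
logins_per_hour = [12, 14, 11, 13, 12, 15, 11, 90]
print(zscore_anomalies(logins_per_hour, threshold=2.0))  # [90]
```

The same shape of check (score each point against a learned baseline, alert past a threshold) underlies much more sophisticated threat-hunting models.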
Posted 4 days ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Job Description
1. SQL: Proficient in database object creation, including tables, views, indexes, etc. Strong expertise in SQL queries, stored procedures, functions, etc. Experienced in performance tuning and optimization techniques.
2. Power BI: Proficiency in Power BI development, including report and dashboard creation. Design, develop, and maintain complex Power BI data models, ensuring data integrity and consistency. Comprehensive understanding of data modeling and data visualization concepts. Identify and resolve performance bottlenecks in Power BI reports and data models. Experience with Power Query and DAX.
3. Problem-Solving Skills: Strong analytical and problem-solving skills to identify and resolve data-related issues.
4. Python: Strong proficiency in Python programming.
5. PySpark: Extensive experience with PySpark, including DataFrames and Spark SQL.
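The database-object work in point 1 (tables, views, indexes) can be illustrated with Python's built-in sqlite3 standing in for the production RDBMS; the schema and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Table, index, and view creation of the kind listed above.
cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")
cur.execute("CREATE INDEX idx_sales_region ON sales (region)")
cur.execute("""
    CREATE VIEW region_totals AS
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
""")

cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("north", 100.0), ("south", 40.0), ("north", 60.0)])

# The index lets the planner satisfy region filters without a full scan.
result = cur.execute("SELECT * FROM region_totals ORDER BY region").fetchall()
print(result)  # [('north', 160.0), ('south', 40.0)]
```

The view gives report consumers (e.g., a Power BI dataset) a stable, pre-aggregated interface, which is also a common performance-tuning tactic.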
Posted 4 days ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the Role
This is an exciting opportunity to work with a global team and act as the backbone of all reporting and analysis for the P&C Finance team.

Roles and Responsibilities
We're looking for someone who enjoys working with data and is comfortable wearing multiple hats — from working in raw, messy files to shaping dashboards people can actually use. Here's what the day-to-day may look like:
- Collaborating with finance and non-finance teams to retrieve data in a timely manner across systems and divisions.
- Designing and building robust data pipelines in Palantir Foundry — working with large datasets and Foundry Ontology models.
- Using PySpark and Apache-based logic to enrich, align, and transform raw data from multiple source systems into streamlined data models.
- Supporting and enhancing existing reporting platforms, particularly in Power BI, by updating datasets, fixing DAX, or adjusting visuals as per stakeholder needs.
- Building new reporting tools or dashboards that help visualize financial and operational data clearly and efficiently.
- Constantly looking for ways to automate manual reporting tasks — whether via data flows, transformation logic, or reusable queries.
- Working closely with stakeholders (finance, ops, and others) to understand their problems, resolve data queries, and offer practical, scalable solutions.
- Taking ownership of reporting problems with a solution-first mindset — if something breaks, you're the type who dives in to figure out why and how to fix it.

About You
You don't need years of experience — but you do need curiosity, ownership, and a willingness to learn fast. This could be a perfect fit if:
- You're a fresher or someone with 6 months to 1 year of experience, ideally in a data, analytics, or reporting-heavy role.
- You're comfortable with SQL and Python, and you've built things using Advanced Excel, Power Query, or Power BI.
- You've written some DAX logic, or are excited to learn more about how to shape metrics and KPIs.
- You like working with big, messy datasets and finding ways to clean and align them so others can use them with confidence.
- You're comfortable talking to business users, not just writing code — and can explain your logic without needing to "sound technical".
- Maybe you've worked on a college or internship project where you pulled together data from different places and made something useful. That's great.
- Prior experience with Palantir Foundry, or working with finance data, is a big plus.

We're more interested in how you think and solve problems than just checking boxes. So if you're eager to learn, open to feedback, and enjoy finding insights in data — we'd love to hear from you.

Nice to Have (but not mandatory)
These aren't must-haves, but if you've worked on any of the following, it'll definitely make you stand out:
- You've written user-defined functions (UDFs) in PySpark to make your transformation logic reusable and cleaner across multiple pipelines.
- You try to follow systematic coding practices — like organizing logic into steps, adding meaningful comments, or handling edge cases cleanly.
- You've worked with version control (Git or similar), and understand how to manage updates to code or revert changes if something breaks.
- You care about performance optimization — like reducing pipeline runtime, minimizing joins, or improving how fast visuals load in tools like Power BI.
- You're comfortable thinking not just about "how to get it to work" but also "how to make it better, faster, and easier to maintain."

About Swiss Re
Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health.
Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world. Our success depends on our ability to build an inclusive culture encouraging fresh perspectives and innovative thinking. We embrace a workplace where everyone has equal opportunities to thrive and develop professionally regardless of their age, gender, race, ethnicity, gender identity and/or expression, sexual orientation, physical or mental ability, skillset, thought or other characteristics. In our inclusive and flexible environment everyone can bring their authentic selves to work and their passion for sustainability. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.

Reference Code: 134825
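A small sketch of the reusable-UDF practice this posting highlights: keeping the core logic in a plain, unit-testable Python function. The function name and formats are invented; in an actual pipeline the same function could be registered with pyspark.sql.functions.udf (not shown here, to keep the example self-contained):

```python
def normalize_currency(raw):
    """Turn messy source strings like ' 1,234.50 ' into floats.

    Keeping the logic in a plain function makes it reusable
    across pipelines and trivially unit-testable, which is the
    point of factoring transformation logic out of UDF wrappers.
    """
    if raw is None:
        return None
    cleaned = raw.strip().replace(",", "")
    return float(cleaned) if cleaned else None

print([normalize_currency(x) for x in [" 1,234.50 ", "87", "", None]])
# [1234.5, 87.0, None, None]
```

Wrapping such a function as a UDF then becomes a one-liner, while the behavior stays covered by ordinary unit tests.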
Posted 4 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Associate Managing Consultant, Strategy & Transformation

Overview
Associate Managing Consultant – Performance Analytics, Advisors & Consulting Services
Services within Mastercard is responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience. We provide value-added services and leverage expertise, data-driven insights, and execution. Our Advisors & Consulting Services team combines traditional management consulting with Mastercard's rich data assets, proprietary platforms, and technologies to provide clients with powerful strategic insights and recommendations. Our teams work with a diverse global customer base across industries, from banking and payments to retail and restaurants.
The Advisors & Consulting Services group has five specializations: Strategy & Transformation, Performance Analytics, Business Experimentation, Marketing, and Program Management. Our Performance Analytics consultants translate data into insights by leveraging Mastercard and customer data to design, implement, and scale analytical solutions for customers. They use qualitative and quantitative analytical techniques and enterprise applications to synthesize analyses into clear recommendations and impactful narratives. Positions for different specializations and levels are available in separate job postings.
Please review our consulting specializations to learn more about all opportunities and apply for the position that is best suited to your background and experience: https://careers.mastercard.com/us/en/consulting-specializations-at-mastercard

Roles and Responsibilities
Client Impact
- Manage deliverable development and workstreams on projects across a range of industries and problem statements
- Contribute to and/or develop analytics strategies and programs for large, regional, and global clients by leveraging data and technology solutions to unlock client value
- Manage working relationships with client managers, and act as a trusted and reliable partner
- Create predictive models using segmentation and regression techniques to drive profits
- Review analytics end-products to ensure accuracy, quality and timeliness
- Proactively seek new knowledge and structure project work to facilitate the capture of Intellectual Capital with minimal oversight

Team Collaboration & Culture
- Develop sound business recommendations and deliver effective client presentations
- Plan, organize, and structure own work and that of junior project delivery consultants to identify effective analysis structures to address client problems and synthesize analyses into relevant findings
- Lead team and external meetings, and lead or co-lead project management
- Contribute to the firm's intellectual capital and solution development
- Grow from coaching to enable ownership of day-to-day project management across client projects, and mentor junior consultants
- Develop effective working relationships with local and global teams including business partners

Qualifications
Basic qualifications
- Undergraduate degree with data and analytics experience in business intelligence and/or descriptive, predictive, or prescriptive analytics
- Experience managing clients or internal stakeholders
- Ability to analyze large datasets and synthesize key findings to provide recommendations via descriptive analytics and business intelligence
- Knowledge of metrics, measurements, and benchmarking for complex and demanding solutions across multiple industry verticals
- Data and analytics experience such as working with data analytics software (e.g., Python, R, SQL, SAS) and building, managing, and maintaining database structures
- Advanced Word, Excel, and PowerPoint skills
- Ability to perform multiple tasks with multiple clients in a fast-paced, deadline-driven environment
- Ability to communicate effectively in English and the local office language (if applicable)
- Eligibility to work in the country where you are applying, as well as to apply for travel visas as required by travel needs

Preferred qualifications
- Additional data and analytics experience working with the Hadoop framework and coding using Impala, Hive, or PySpark, or working with data visualization tools (e.g., Tableau, Power BI)
- Experience managing tasks or workstreams in a collaborative team environment
- Experience coaching junior delivery consultants
- Relevant industry expertise
- MBA or master's degree with relevant specialization (not required)

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
Posted 4 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Senior Data Engineer

Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Overview
Ethoca, a Mastercard company, is seeking a Senior Data Engineer to join our team in Pune, India to drive data enablement and explore big data solutions within our technology landscape. The role is visible and critical as part of a high-performing team – it will appeal to you if you have an effective combination of domain knowledge, relevant experience and the ability to execute on the details. You will bring cutting-edge software and full-stack development skills with advanced knowledge of cloud and data lake experience while working with massive data volumes. You will own this – our teams are small, agile and focused on the needs of the high-growth fintech marketplace.
You will be working across functional teams within Ethoca and Mastercard to deliver on our cloud strategy. We are committed to making our systems resilient and responsive yet easily maintainable on cloud.

Key Responsibilities
- Design, develop, and optimize batch and real-time data pipelines using Snowflake, Snowpark, Python, and PySpark.
- Build data transformation workflows using dbt, with a strong focus on Test-Driven Development (TDD) and modular design.
- Implement and manage CI/CD pipelines using GitLab and Jenkins, enabling automated testing, deployment, and monitoring of data workflows.
- Deploy and manage Snowflake objects using Schema Change, ensuring controlled, auditable, and repeatable releases across environments.
- Administer and optimize the Snowflake platform, handling performance tuning, access management, cost control, and platform scalability.
- Drive DataOps practices by integrating testing, monitoring, versioning, and collaboration into every phase of the data pipeline lifecycle.
- Build scalable and reusable data models that support business analytics and dashboarding in Power BI.
- Develop and support real-time data streaming pipelines (e.g., using Kafka, Spark Structured Streaming) for near-instant data availability.
- Establish and implement data observability practices, including monitoring data quality, freshness, lineage, and anomaly detection across the platform.
- Plan and own deployments, migrations, and upgrades across data platforms and pipelines to minimize service impacts, including developing and executing mitigation plans.
- Collaborate with stakeholders to understand data requirements and deliver reliable, high-impact data solutions.
- Document pipeline architecture, processes, and standards, promoting consistency and transparency across the team.
- Apply exceptional problem-solving and analytical skills to troubleshoot complex data and system issues.
- Demonstrate excellent written and verbal communication skills when collaborating across technical and non-technical teams.

Required Qualifications
- Bachelor's degree in computer science or a related technical field, including programming, with an established background in Computer Science/Engineering or Software Engineering.
- Deep hands-on experience with Snowflake (including administration), Snowpark, and Python.
- Strong background in PySpark and distributed data processing.
- Proven track record using dbt for building robust, testable data transformation workflows following TDD.
- Familiarity with Schema Change for Snowflake object deployment and version control.
- Good to have: familiarity with Java JDK 8 or greater and exposure to the Spring and Spring Boot frameworks.
- Good to have: understanding and knowledge of Databricks.
- Proficient in CI/CD tooling, especially GitLab and Jenkins, with a focus on automation and DataOps.
- Experience with real-time data processing and streaming pipelines.
- Strong grasp of cloud-based database infrastructure (AWS, Azure, or GCP).
- Skilled in developing insightful dashboards and scalable data models using Power BI.
- Expert in SQL development and performance optimization.
- Demonstrated success in building and maintaining data observability tools and frameworks.
- Proven ability to plan and execute deployments, upgrades, and migrations with minimal disruption to operations.
- Strong communication, collaboration, and analytical thinking across technical and non-technical stakeholders.
- Ideally, you have experience in banking, e-commerce, credit cards or payment processing, and exposure to both SaaS and premises-based architectures. In addition, you have a post-secondary degree in computer science, mathematics, or quantitative science.
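The data observability responsibilities above (quality, freshness, anomaly monitoring) can be sketched as a simple batch validator. This is an illustrative, stdlib-only sketch; the field names, thresholds, and data are all invented:

```python
from datetime import date, timedelta

def check_batch(rows, today, max_staleness_days=1):
    """Run basic observability checks on a loaded batch:
    emptiness, null-rate on the key column, and freshness."""
    issues = []
    if not rows:
        return ["empty batch"]
    null_ids = sum(1 for r in rows if r["id"] is None)
    if null_ids / len(rows) > 0.01:
        issues.append("null-rate above 1% on id")
    newest = max(r["loaded_on"] for r in rows)
    if (today - newest) > timedelta(days=max_staleness_days):
        issues.append("batch is stale")
    return issues

rows = [
    {"id": 1, "loaded_on": date(2024, 3, 1)},
    {"id": None, "loaded_on": date(2024, 3, 1)},
]
print(check_batch(rows, today=date(2024, 3, 5)))
# ['null-rate above 1% on id', 'batch is stale']
```

In practice, checks like these run after every pipeline stage and feed alerts or dashboards rather than a print statement.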
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
Posted 4 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Senior Analyst, Big Data Analytics & Engineering

Overview
Job Title: Sr. Analyst, Data Engineering, Value Quantification Team (Based in Pune, India)

About Mastercard
Mastercard is a global technology leader in the payments industry, committed to powering an inclusive, digital economy that benefits everyone, everywhere. By leveraging secure data, cutting-edge technology, and innovative solutions, we empower individuals, financial institutions, governments, and businesses to achieve their potential. Our culture is driven by our Decency Quotient (DQ), ensuring inclusivity, respect, and integrity guide everything we do. Operating across 210+ countries and territories, Mastercard is dedicated to building a sustainable world with priceless opportunities for all.

Position Overview
This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, and calls for 5-7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard's internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact.

Role Responsibilities
- Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation.
- Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts.
- Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
- Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations.
- Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage.

All About You
- Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx.
- Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. Experience with database management systems and data warehousing solutions.
- Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
- Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI.
- Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making.
- Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment.
- Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams.
- Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value.

Education
Bachelor's degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred.

Why Us?
At Mastercard, you'll have the opportunity to shape the future of internal operations by leading the development of data engineering solutions that empower teams across the organization. Join us to make a meaningful impact, drive business outcomes, and help Mastercard's internal teams create better customer engagement strategies through innovative value-based ROI narratives.
Location: Gurgaon/Pune, India Employment Type: Full-Time Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
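The core ETL skills this posting asks for (extraction, transformation, and loading into a queryable store) can be illustrated with a minimal sketch. This toy uses Python's built-in sqlite3; the table and column names are invented for the example and do not reflect Mastercard's actual stack:

```python
import sqlite3

def run_etl(raw_rows):
    """Toy ETL: extract raw service-usage rows, clean them, load a reporting table."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE raw_usage (customer TEXT, spend TEXT)")
    con.executemany("INSERT INTO raw_usage VALUES (?, ?)", raw_rows)

    # Transform: drop rows with missing spend and cast spend to a number.
    cleaned = [
        (cust, float(spend))
        for cust, spend in con.execute("SELECT customer, spend FROM raw_usage")
        if spend not in (None, "", "NA")
    ]

    # Load: write the cleaned rows into the reporting table.
    con.execute("CREATE TABLE usage_report (customer TEXT, spend REAL)")
    con.executemany("INSERT INTO usage_report VALUES (?, ?)", cleaned)

    # A downstream value-quantification query: total spend per customer.
    return dict(con.execute(
        "SELECT customer, SUM(spend) FROM usage_report GROUP BY customer"))

totals = run_etl([("acme", "10.5"), ("acme", "4.5"), ("beta", "NA"), ("beta", "7")])
print(totals)  # {'acme': 15.0, 'beta': 7.0}
```

In production the same extract-clean-load pattern would typically be orchestrated by one of the tools the posting lists (Azure Data Factory, NiFi, Talend, etc.) rather than hand-rolled.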
Posted 4 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Hi All, Greetings! We have openings at our Gurugram location for the following role: Hands-on in SQL and its Big Data variants (HiveQL, Snowflake ANSI SQL, Redshift SQL); Python and Spark and one or more of its APIs (PySpark, Spark SQL, Scala); Bash/shell scripting; Experience with source code control - GitHub, VSTS, etc.; Knowledge of and exposure to Big Data technologies in the Hadoop stack such as HDFS, Hive, Impala, Spark, etc., and cloud Big Data warehouses - Redshift, Snowflake, etc.; Experience with UNIX command-line tools; Exposure to AWS technologies including EMR, Glue, Athena, Data Pipeline, Lambda, etc.; Understanding and ability to translate/physicalise data models (Star Schema, Data Vault 2.0, etc.); Design, develop, test, deploy, maintain and improve software; Develop flowcharts, layouts and documentation to identify requirements & solutions. Skill: AWS + SQL + Python is mandatory. Experience: 4 to 12 years. NOTE: Face-to-face interviews are happening in the Gurugram office on 2nd August 2025. NOTE: We need people who can join by September at the latest. Apply at rashwinder.kaur@qmail.quesscorp.com
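For a sense of what "complex SQL queries for data analysis" means here, a typical aggregate-filter-sort query looks like the sketch below. It uses Python's bundled sqlite3 so it runs anywhere; in Spark SQL the identical statement would be passed to `spark.sql(...)`. The table and values are invented for illustration:

```python
import sqlite3

# Toy sales table; schema and data are invented for the example.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("north", 100), ("north", 250), ("south", 80), ("south", 40)])

# Aggregate per region, keep only regions above a threshold, rank by total.
rows = con.execute("""
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    HAVING total > 200
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('north', 350.0)]
```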
Posted 4 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Title And Summary Associate Managing Consultant-2 Associate Managing Consultant – Performance Analytics Advisors & Consulting Services Services within Mastercard is responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience. We provide value-added services and leverage expertise, data-driven insights, and execution. Our Advisors & Consulting Services team combines traditional management consulting with Mastercard’s rich data assets, proprietary platforms, and technologies to provide clients with powerful strategic insights and recommendations. Our teams work with a diverse global customer base across industries, from banking and payments to retail and restaurants. The Advisors & Consulting Services group has five specializations: Strategy & Transformation, Performance Analytics, Business Experimentation, Marketing, and Program Management. Our Performance Analytics consultants translate data into insights by leveraging Mastercard and customer data to design, implement, and scale analytical solutions for customers. They use qualitative and quantitative analytical techniques and enterprise applications to synthesize analyses into clear recommendations and impactful narratives. Positions for different specializations and levels are available in separate job postings. 
Please review our consulting specializations to learn more about all opportunities and apply for the position that is best suited to your background and experience: https://careers.mastercard.com/us/en/consulting-specializations-at-mastercard Roles and Responsibilities Client Impact Manage deliverable development and workstreams on projects across a range of industries and problem statements Contribute to and/or develop analytics strategies and programs for large, regional, and global clients by leveraging data and technology solutions to unlock client value Manage working relationships with client managers, and act as a trusted and reliable partner Create predictive models using segmentation and regression techniques to drive profits Review analytics end-products to ensure accuracy, quality and timeliness. Proactively seek new knowledge and structure project work to facilitate the capture of Intellectual Capital with minimal oversight Team Collaboration & Culture Develop sound business recommendations and deliver effective client presentations Plan, organize, and structure own work and that of junior project delivery consultants to identify effective analysis structures to address client problems and synthesize analyses into relevant findings Lead team and external meetings, and lead or co-lead project management Contribute to the firm's intellectual capital and solution development Grow from coaching to enable ownership of day-to-day project management across client projects, and mentor junior consultants Develop effective working relationships with local and global teams including business partners Qualifications Basic qualifications Undergraduate degree with data and analytics experience in business intelligence and/or descriptive, predictive, or prescriptive analytics Experience managing clients or internal stakeholders Ability to analyze large datasets and synthesize key findings to provide recommendations via descriptive analytics and business intelligence 
Knowledge of metrics, measurements, and benchmarking applied to complex and demanding solutions across multiple industry verticals Data and analytics experience such as working with data analytics software (e.g., Python, R, SQL, SAS) and building, managing, and maintaining database structures Advanced Word, Excel, and PowerPoint skills Ability to perform multiple tasks with multiple clients in a fast-paced, deadline-driven environment Ability to communicate effectively in English and the local office language (if applicable) Eligibility to work in the country where you are applying, as well as apply for travel visas as required by travel needs Preferred Qualifications Additional data and analytics experience working with Hadoop framework and coding using Impala, Hive, or PySpark or working with data visualization tools (e.g., Tableau, Power BI) Experience managing tasks or workstreams in a collaborative team environment Experience coaching junior delivery consultants Relevant industry expertise MBA or master’s degree with relevant specialization (not required)
Posted 4 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Title And Summary Associate Managing Consultant, Advisors Client Services, Performance Analytics Associate Managing Consultant – Performance Analytics Advisors & Consulting Services Services within Mastercard is responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience. We provide value-added services and leverage expertise, data-driven insights, and execution. Our Advisors & Consulting Services team combines traditional management consulting with Mastercard’s rich data assets, proprietary platforms, and technologies to provide clients with powerful strategic insights and recommendations. Our teams work with a diverse global customer base across industries, from banking and payments to retail and restaurants. The Advisors & Consulting Services group has five specializations: Strategy & Transformation, Performance Analytics, Business Experimentation, Marketing, and Program Management. Our Performance Analytics consultants translate data into insights by leveraging Mastercard and customer data to design, implement, and scale analytical solutions for customers. They use qualitative and quantitative analytical techniques and enterprise applications to synthesize analyses into clear recommendations and impactful narratives. Positions for different specializations and levels are available in separate job postings. 
Please review our consulting specializations to learn more about all opportunities and apply for the position that is best suited to your background and experience: https://careers.mastercard.com/us/en/consulting-specializations-at-mastercard Roles and Responsibilities Client Impact Manage deliverable development and workstreams on projects across a range of industries and problem statements Contribute to and/or develop analytics strategies and programs for large, regional, and global clients by leveraging data and technology solutions to unlock client value Manage working relationships with client managers, and act as a trusted and reliable partner Team Collaboration & Culture Develop sound business recommendations and deliver effective client presentations Plan, organize, and structure own work and that of junior project delivery consultants to identify effective analysis structures to address client problems and synthesize analyses into relevant findings Lead team and external meetings, and lead or co-lead project management Contribute to the firm's intellectual capital and solution development Grow from coaching to enable ownership of day-to-day project management across client projects, and mentor junior consultants Qualifications Basic qualifications Undergraduate degree with data and analytics experience in business intelligence and/or descriptive, predictive, or prescriptive analytics Experience managing clients or internal stakeholders Ability to analyze large datasets and synthesize key findings to provide recommendations via descriptive analytics and business intelligence Data and analytics experience such as working with data analytics software (e.g., Python, R, SQL, SAS) and building, managing, and maintaining database structures Strong experience in authorization, fraud, and/or credit risk Advanced Word, Excel, and PowerPoint skills Ability to perform multiple tasks with multiple clients in a fast-paced, deadline-driven environment Ability 
to communicate effectively in English and the local office language (if applicable) Eligibility to work in the country where you are applying, as well as apply for travel visas as required by travel needs Preferred Qualifications Additional data and analytics experience working with Hadoop framework and coding using Impala, Hive, or PySpark or working with data visualization tools (e.g., Tableau, Power BI) Experience managing tasks or workstreams in a collaborative team environment Experience coaching junior delivery consultants Relevant industry expertise MBA or master’s degree with relevant specialization (not required) Our Purpose We work to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. We cultivate a culture of inclusion for all employees that respects their individual strengths, views, and experiences. We believe that our differences enable us to be a better team – one that makes better decisions, drives innovation, and delivers better business results.
Posted 4 days ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Title And Summary Managing Consultant – Performance Analytics Advisors & Consulting Services Services within Mastercard is responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience. We provide value-added services and leverage expertise, data-driven insights, and execution. Our Advisors & Consulting Services team combines traditional management consulting with Mastercard’s rich data assets, proprietary platforms, and technologies to provide clients with powerful strategic insights and recommendations. Our teams work with a diverse global customer base across industries, from banking and payments to retail and restaurants. The Advisors & Consulting Services group has five specializations: Strategy & Transformation, Performance Analytics, Business Experimentation, Marketing, and Program Management. Our Performance Analytics consultants translate data into insights by leveraging Mastercard and customer data to design, implement, and scale analytical solutions for customers. They use qualitative and quantitative analytical techniques and enterprise applications to synthesize analyses into clear recommendations and impactful narratives. Positions for different specializations and levels are available in separate job postings. 
Please review our consulting specializations to learn more about all opportunities and apply for the position that is best suited to your background and experience: https://careers.mastercard.com/us/en/consulting-specializations-at-mastercard Roles and Responsibilities Client Impact Lead client engagements across a range of industries and problem statements Develop analytics strategies and programs for large, regional, and global clients by leveraging data and technology solutions to unlock client value Own key relationships with mid-level to senior client stakeholders and independently assess client agenda, internal culture, and change readiness Team Collaboration & Culture Lead team to creative insights and sound business recommendations, and deliver impactful client presentations while growing team members’ roles and skills Provide analytical and day-to-day project delivery team leadership, and create a collaborative and inclusive environment for all levels Collaborate with internal Mastercard stakeholders including Product and Business Development to scope projects, create relevant solutions for clients, and build the firm's intellectual capital Provide on-the-job training, coaching, and mentorship to junior consultants Qualifications 8-10 years of overall career experience post-Masters/MBA or 10-12 years of experience post-graduation Excellent expertise in Performance Analytics, Python, PySpark & SQL Undergraduate degree with data and analytics experience in business intelligence and/or descriptive, predictive, or prescriptive analytics Experience coaching and managing teams across multiple projects Experience managing key client relationships Knowledge of business KPIs, financials and organizational leadership Ability to identify new business development opportunities, and experience drafting proposals and scoping new opportunities Analytical, interpretive, and problem-solving skills, including the proven ability to analyze large amounts of data and 
synthesize key findings and recommendations Data and analytics experience such as working with data analytics software (e.g., Python, R, SQL, SAS), building, managing, and maintaining database structures, working with data visualization tools (e.g., Tableau, Power BI) Advanced Word, Excel, and PowerPoint skills Ability to manage multiple tasks and clients in a fast-paced, deadline-driven environment Ability to communicate effectively in English and the local office language (if applicable) Eligibility to work in the country where you are applying, as well as apply for travel visas as required by travel needs Preferred Qualifications Additional data and analytics experience in Hadoop framework and coding using Impala, Hive, or PySpark Experience generating new knowledge or creating innovative solutions for a firm Relevant industry expertise Master’s degree with relevant specialization such as advanced analytics, big data, or mathematical discipline (not required)
Posted 4 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
GCP Data Engineer Job Location: Chennai Experience: 5 to 7 years Open Positions: 2 Job Description: Role: GCP Data Engineer (Mid Level) Position Overview: We are seeking a mid-level GCP Data Engineer with 4+ years of experience in ETL, Data Warehousing, and Data Engineering. The ideal candidate will have hands-on experience with GCP tools, solid data analysis skills, and a strong understanding of Data Warehousing principles. Qualifications: 4+ years of experience in ETL & Data Warehousing; excellent leadership & communication skills; experience developing Data Engineering solutions with Airflow, GCP BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, etc.; built solution automations in any of the above ETL tools; executed at least 2 GCP Cloud Data Warehousing projects; worked on at least 2 projects using Agile/SAFe methodology; mid-level experience in PySpark and Teradata; working experience with DevOps tools like GitHub, Jenkins, Cloud Native, etc., with semi-structured data formats like JSON, Parquet and/or XML files, and writing complex SQL queries for data analysis and extraction; in-depth understanding of Data Warehousing, Data Analysis, Data Profiling, Data Quality & Data Mapping. Education: B.Tech./B.E. in Computer Science or related field. Certifications: Google Cloud Professional Data Engineer Certification. 
Roles & Responsibilities: Analyze the different source systems, profile data, and understand, document & fix Data Quality issues; gather requirements and business process knowledge in order to transform the data in a way that is geared towards the needs of end users; write complex SQLs to extract & format source data for ETL/data pipelines; create design documents, Source-to-Target Mapping documents and any supporting documents needed for deployment/migration; design, develop and test ETL/data pipelines; design & build metadata-based frameworks for data pipelines; write unit test cases, execute unit testing and document unit test results; deploy ETL/data pipelines; use DevOps tools to version, push/pull code and deploy across environments; support the team during troubleshooting & debugging of defects & bug fixes, business requests, environment migrations & other ad-hoc requests; do production support, enhancements and bug fixes; work with business and technology stakeholders to communicate EDW incidents/problems and manage their expectations; leverage ITIL concepts to circumvent incidents, manage problems and document knowledge; perform data cleaning, transformation, and validation to ensure accuracy and consistency across various data sources; stay current on industry best practices and emerging technologies in data analysis and cloud computing, particularly within the GCP ecosystem
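The data-profiling and data-quality responsibilities above (null rates, distinct counts, consistency checks) can be sketched as a toy helper. In this role the equivalent checks would run inside BigQuery or Dataflow; the helper and its column values here are invented for illustration:

```python
def profile_column(values):
    """Toy data-profiling helper: row count, null rate, and distinct count
    for one column, treating None, "", and "NULL" as missing."""
    missing = {None, "", "NULL"}
    nulls = sum(1 for v in values if v in missing)
    distinct = len({v for v in values if v not in missing})
    return {"rows": len(values), "null_rate": nulls / len(values), "distinct": distinct}

stats = profile_column(["a", "b", None, "a", ""])
print(stats)  # {'rows': 5, 'null_rate': 0.4, 'distinct': 2}
```

A profile like this is typically the first artifact produced when analyzing a new source system, before any Source-to-Target Mapping is written.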
Posted 4 days ago