5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Big Data Engineer (AWS-Scala Specialist)
Location: Greater Noida/Hyderabad
Experience: 5-10 Years

About the Role
We are seeking a highly skilled Senior Big Data Engineer with deep expertise in Big Data technologies and AWS Cloud Services. The ideal candidate will bring strong hands-on experience in designing, architecting, and implementing scalable data engineering solutions while driving innovation within the team.

Key Responsibilities
Design, develop, and optimize Big Data architectures leveraging AWS services for large-scale, complex data processing.
Build and maintain data pipelines using Spark (Scala) for both structured and unstructured datasets.
Architect and operationalize data engineering and analytics platforms (AWS preferred; Hortonworks, Cloudera, or MapR experience a plus).
Implement and manage AWS services including EMR, Glue, Kinesis, DynamoDB, Athena, CloudFormation, API Gateway, and S3.
Work on real-time streaming solutions using Kafka and AWS Kinesis.
Support ML model operationalization on AWS (deployment, scheduling, and monitoring).
Analyze source system data and data flows to ensure high-quality, reliable data delivery for business needs.
Write highly efficient SQL queries and support data warehouse initiatives using Apache NiFi, Airflow, and Kylo.
Collaborate with cross-functional teams to provide technical leadership, mentor team members, and strengthen the data engineering capability.
Troubleshoot and resolve complex technical issues, ensuring scalability, performance, and security of data solutions.

Mandatory Skills & Qualifications
✅ 5+ years of solid hands-on experience in Big Data technologies (AWS, Scala, Hadoop, and Spark mandatory)
✅ Proven expertise in Spark with Scala
✅ Hands-on experience with AWS services (EMR, Glue, Lambda, S3, CloudFormation, API Gateway, Athena, Lake Formation)

Share your resume at Aarushi.Shukla@coforge.com if you have experience with the mandatory skills and are an early joiner.
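For orientation, here is a minimal sketch of the kind of AWS batch pipeline work this posting describes. The role mandates Spark with Scala; the sketch below uses PySpark purely for brevity, and the bucket paths, column names, and app name are hypothetical assumptions, not a specific employer's pipeline.

```python
# Hypothetical sketch: a minimal Spark batch pipeline reading raw JSON from S3
# and landing curated, partitioned Parquet for downstream Glue/Athena queries.
# Paths and fields are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-batch-pipeline").getOrCreate()

# Read raw events from S3 (placeholder bucket/path).
raw = spark.read.json("s3://example-raw-bucket/events/")

# Basic cleansing and enrichment (assumed columns event_id, event_ts).
cleaned = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write back as partitioned Parquet for analytical querying.
(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-bucket/events/"))
```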
Posted 5 days ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Manager Software Engineer

Overview
We are the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Our Team Within Mastercard – Data & Services
The Data & Services team is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services.

Targeting Analytics Program
Within the D&S Technology Team, the Targeting Analytics program is a relatively new program that is comprised of a rich set of products that provide accurate perspectives on Credit Risk, Portfolio Optimization, and Ad Insights. Currently, we are enhancing our customer experience with new user interfaces, moving to API-based data publishing to allow for seamless integration in other Mastercard products and externally, utilizing new data sets and algorithms to further analytic capabilities, and generating scalable big data processes.

We are seeking an innovative Lead Software Engineer to lead our team in designing and building a full stack web application and data pipelines. The goal is to deliver custom analytics efficiently, leveraging machine learning and AI solutions. This individual will thrive in a fast-paced, agile environment and partner closely with other areas of the business to build and enhance solutions that drive value for our customers. Engineers work in small, flexible teams. Every team member contributes to designing, building, and testing features. The range of work you will encounter varies from building intuitive, responsive UIs to designing backend data models, architecting data flows, and beyond. There are no rigid organizational structures, and each team uses processes that work best for its members and projects.

Here are a few examples of products in our space:
Portfolio Optimizer (PO) is a solution that leverages Mastercard’s data assets and analytics to allow issuers to identify and increase revenue opportunities within their credit and debit portfolios.
Audiences uses anonymized and aggregated transaction insights to offer targeting segments that have high likelihood to make purchases within a category to allow for more effective campaign planning and activation.
Credit Risk products are a new suite of APIs and tooling that provide lenders real-time access to KPIs and insights, serving thousands of clients who make smarter risk decisions using Mastercard data.

Help found a new, fast-growing engineering team!

Position Responsibilities
As a Lead Software Engineer, you will:
Lead the scoping, design, and implementation of complex features
Lead and push the boundaries of analytics and powerful, scalable applications
Design and implement intuitive, responsive UIs that allow issuers to better understand data and analytics
Build and maintain analytics and data models to enable performant and scalable products
Ensure a high-quality code base by writing and reviewing performant, well-tested code
Mentor junior software engineers and teammates
Drive innovative improvements to team development processes
Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases, and apply that knowledge to scoping and building new modules and features
Collaborate across teams with exceptional peers who are passionate about what they do

Ideal Candidate Qualifications
10+ years of engineering experience in an agile production environment.
Experience leading the design and implementation of complex features in full-stack applications.
Proficiency with object-oriented languages, preferably Java/Spring.
Proficiency with modern front-end frameworks, preferably React with Redux and TypeScript.
High proficiency in using Python or Scala, Spark, and Hadoop platforms & tools (Hive, Impala, Airflow, NiFi, Sqoop).
Fluency with Git and Jenkins.
Solid experience with RESTful APIs and JSON/SOAP-based APIs.
Solid experience with SQL, multi-threading, and message queuing.
Experience building and deploying production-level data-driven applications and data processing workflows/pipelines, and/or implementing machine learning systems at scale in Java, Scala, or Python, delivering analytics across all phases.

Desirable Capabilities
Hands-on experience with cloud-native development using microservices.
Hands-on experience with Kafka and Zookeeper.
Knowledge of security concepts and protocols in enterprise applications.
Expertise with automated E2E and unit testing frameworks.
Knowledge of Splunk or other alerting and monitoring solutions.

Core Competencies
Strong technologist eager to learn new technologies and frameworks.
Experience coaching and mentoring junior teammates.
Customer-centric development approach.
Passion for analytical/quantitative problem solving.
Ability to identify and implement improvements to team development processes.
Strong collaboration skills with experience collaborating across many people, roles, and geographies.
Motivation, creativity, self-direction, and desire to thrive on small project teams.
Superior academic record with a degree in Computer Science or a related technical field.
Strong written and verbal English communication skills.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
Posted 5 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description and Requirements
"At BMC trust is not just a word - it's a way of life!"
Hybrid

We are an award-winning, equal opportunity, culturally diverse, fun place to be. Giving back to the community drives us to be better every single day. Our work environment allows you to balance your priorities, because we know you will bring your best every day. We will champion your wins and shout them from the rooftops. Your peers will inspire, drive, support you, and make you laugh out loud! We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead - and are relentless in the pursuit of innovation!

The IZOT product line includes BMC’s Intelligent Z Optimization & Transformation products, which help the world’s largest companies to monitor and manage their mainframe systems. The modernization of the mainframe is the beating heart of our product line, and we achieve this goal by developing products that improve the developer experience, mainframe integration, the speed of application development, the quality of the code, and the applications’ security, while reducing operational costs and risks. We acquired several companies along the way, and we continue to grow, innovate, and perfect our solutions on an ongoing basis.

BMC is looking for a Product Owner to join our amazing team! The BMC AMI Cloud Analytics product can quickly transfer, transform, and integrate mainframe data so it can be shared with the organizational data lake and used by artificial intelligence, machine learning (AI/ML), and analytics solutions. In this role, you will lead the transformation of this cutting-edge product, originally developed by Model9, a startup acquired by BMC, into a solution designed to meet the rigorous demands of enterprise customers. This exciting opportunity combines innovation, scalability, and leadership, giving you a chance to shape the product’s evolution as it reaches new heights in enterprise markets. You’ll analyze business opportunities, specify and prioritize customer requirements, and guide product development teams to deliver cutting-edge solutions that resonate with global B2B customers. As a product owner, you will be or become an expert on the product, market, and related business domains.

Here is how, through this exciting role, YOU will contribute to BMC's and your own success:
Lead the transformation of a startup-level solution from Model9 into a robust enterprise-grade product, addressing the complex needs of global organizations.
Collaborate with engineering and QA teams to ensure technical feasibility, resolve roadblocks, and deliver solutions that align with customer needs.
Help plan product deliveries, including documenting detailed requirements, scheduling releases, and publishing roadmaps, while maintaining a strategic backlog of prioritized features.
Drive cross-functional collaboration across development, QA, product management, and support teams to ensure seamless product delivery and customer satisfaction.
Distil complex business and technical requirements into clear, concise PRDs and prioritized feature backlogs.
To ensure you’re set up for success, you will bring the following skillset & experience:
3+ years of software product owner experience in an enterprise/B2B software company, including experience working with global B2B customers
Solid technical background (preferably previous experience as a developer or QA)
Deep familiarity with public cloud services and storage services (AWS EC2/FSx/EFS/EBS/S3, RDS, Aurora, etc.)
Strong understanding of ETL/ELT solutions and data transformation techniques
Knowledge of modern data lakehouse architectures (e.g., Databricks, Snowflake)
B.Sc. in a related field (preferably Software Engineering or similar) or equivalent
Experience leading new products and product features through ideation, research, planning, development, go-to-market, and feedback cycles
Fluent English, spoken and written
Willingness to travel, typically 1-2 times a quarter

Whilst these are nice to have, our team can help you develop the following skills:
Background as a DBA or system engineer with hands-on experience with commercial and open-source databases like MSSQL, Oracle, PostgreSQL, etc.
Knowledge/experience of agile methods (especially lean); familiarity with Aha!, Jira, Confluence
Experience with ETL/ELT tools (e.g., Apache NiFi, Qlik, Precisely, Informatica, Talend, AWS Glue, Azure Data Factory)
Understanding of programming languages commonly used on z/OS, such as COBOL, PL/I, REXX, and assembler
Understanding of z/OS subsystems such as JES2/JES3, RACF, DB2, CICS, MQ, and IMS
Experience in cloud-based products and technologies (containerization, serverless approaches, vendor-specific cloud services, cloud security)

Our commitment to you!
BMC’s culture is built around its people. We have 6000+ brilliant minds working together across the globe. You won’t be known just by your employee number, but for your true authentic self. BMC lets you be YOU! If after reading the above, you’re unsure if you meet the qualifications of this role but are deeply excited about BMC and this team, we still encourage you to apply! We want to attract talents from diverse backgrounds and experience to ensure we face the world together with the best ideas!

BMC is committed to equal opportunity employment regardless of race, age, sex, creed, color, religion, citizenship status, sexual orientation, gender, gender expression, gender identity, national origin, disability, marital status, pregnancy, disabled veteran or status as a protected veteran. If you need a reasonable accommodation for any part of the application and hiring process, visit the accommodation request page. BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process.

At BMC we believe in pay transparency and have set the midpoint of the salary band for this role at 2,790,000 INR. Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The salary listed is just one component of BMC's employee compensation package. Other rewards may include a variable plan and country-specific benefits. We are committed to ensuring that our employees are paid fairly and equitably, and that we are transparent about our compensation practices.

(Returnship@BMC) Had a break in your career? No worries.
This role is eligible for candidates who have taken a break in their career and want to re-enter the workforce. If your expertise matches the above job, visit https://bmcrecruit.avature.net/returnship to learn more and find out how to apply.
Posted 5 days ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY-Consulting - Data and Analytics – Senior - Clinical Integration Developer

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
We’re looking for Clinical Trials Integration Developers with 5+ years of experience in software development within the life sciences domain to support the integration of Medidata’s clinical trial systems across the client R&D environment. This role offers the chance to build robust, compliant integration solutions, contribute to the design of clinical data workflows, and ensure interoperability across critical clinical applications. You will collaborate closely with business and IT teams, playing a key role in enhancing data flow, supporting trial operations, and driving innovation in clinical research.

Your Key Responsibilities
Design and implement integration solutions to connect Medidata clinical trial systems with other applications within the clinical data landscape.
Develop and configure system interfaces using programming languages (e.g., Java, Python, C#) or integration middleware tools (e.g., Informatica, AWS, Apache NiFi).
Collaborate with clinical business stakeholders and IT teams to gather requirements, define technical specifications, and ensure interoperability.
Create and maintain integration workflows and data mappings that align with clinical trial data standards (e.g., CDISC, SDTM, ADaM).
Ensure all development and implementation activities comply with GxP regulations and are aligned with validation best practices.
Participate in agile development processes, including sprint planning, code reviews, testing, and deployment.
Troubleshoot and resolve integration-related issues, ensuring stable and accurate data flow across systems.
Document integration designs, workflows, and technical procedures to support long-term maintainability.
Contribute to team knowledge sharing and continuous improvement initiatives within the integration space.

Skills And Attributes For Success
Apply a hands-on, solution-driven approach to implement integration workflows using code or middleware tools within clinical data environments.
Strong communication and problem-solving skills with the ability to collaborate effectively with both technical and clinical teams.
Ability to understand and apply clinical data standards and validation requirements when developing system integrations.
To qualify for the role, you must have:
Experience: Minimum 5 years in software development within the life sciences domain, preferably in clinical trial management systems.
Education: Must be a graduate, preferably BE/B.Tech/BCA/B.Sc IT.
Technical Skills: Proficiency in programming languages such as Java, Python, or C#, and experience with integration middleware like Informatica, AWS, or Apache NiFi; strong background in API-based system integration.
Domain Knowledge: Solid understanding of clinical trial data standards (e.g., CDISC, SDTM, ADaM) and data management processes; experience with agile methodologies and GxP-compliant development environments.
Soft Skills: Strong problem-solving abilities, clear communication, and the ability to work collaboratively with clinical and technical stakeholders.
Additional Attributes: Capable of implementing integration workflows and mappings, with attention to detail and a focus on delivering compliant and scalable solutions.

Ideally, you’ll also have:
Hands-on experience with ETL tools and clinical data pipeline orchestration frameworks relevant to clinical research.
Hands-on experience with clinical R&D platforms such as Oracle Clinical, Medidata RAVE, or other EDC systems.
Proven experience leading small integration teams and engaging with cross-functional stakeholders in regulated (GxP) environments.

What We Look For
A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
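As an illustration of the integration work described above, here is a minimal, hypothetical sketch of pulling subject records from an EDC-style REST endpoint and mapping them toward SDTM DM-like variables. The base URL, authentication scheme, and source field names are invented for illustration and are not Medidata's actual API.

```python
# Hypothetical sketch: fetch subjects from an EDC-style REST endpoint and map
# them to SDTM DM-style fields. Endpoint, auth, and field names are assumed.
import requests

BASE_URL = "https://edc.example.com/api/v1"  # placeholder endpoint

def fetch_subjects(study_id: str, token: str) -> list[dict]:
    resp = requests.get(
        f"{BASE_URL}/studies/{study_id}/subjects",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["subjects"]  # assumed response shape

def to_dm_row(subject: dict, study_id: str) -> dict:
    # Map assumed source keys to SDTM DM-style variables.
    return {
        "STUDYID": study_id,
        "USUBJID": f"{study_id}-{subject['id']}",
        "SEX": subject.get("sex"),
        "COUNTRY": subject.get("country"),
    }

if __name__ == "__main__":
    rows = [to_dm_row(s, "ABC-001") for s in fetch_subjects("ABC-001", "demo-token")]
    print(rows[:3])
```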
Posted 5 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Senior Software Developer - Java and React/Angular, SQL, API

Mastercard is a global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Your primary responsibilities would include designing, developing, and maintaining software applications using Java and related technologies. In addition to your Java development skills, expertise in React or Angular would be beneficial in building modern and dynamic user interfaces for web applications. This role requires strong skills in HTML, CSS, and JavaScript, as well as experience working with libraries and frameworks like Angular or React.

Other important skills for an SSE with full-stack development experience may include:
Knowledge of software design patterns and best practices
Experience working in a Unix environment
Proficiency in database technologies such as SQL and NoSQL
Experience working with databases like Oracle and Netezza, with strong SQL knowledge
Experience with RESTful web services and API design
Experience in full-stack Java development, along with proficiency in Angular or React, would be a valuable asset to this team.
Knowledge of Redis will be an added advantage
Experience working with NiFi will be an added advantage
Experience working with APIs will be an added advantage
Experience working in Agile teams
Experience in Data Engineering and implementing multiple end-to-end DW projects in a Big Data environment will be an added advantage
Strong analytical skills, required for debugging production issues, providing root cause analysis, and implementing mitigation plans
Strong communication skills, both verbal and written, and strong relationship, collaboration, and organizational skills
Ability to multi-task across multiple projects, interface with external/internal resources, and provide technical leadership to junior team members
Ability to be high-energy, detail-oriented, and proactive, and to function under pressure in an independent environment, along with a high degree of initiative and self-motivation to drive results
Ability to quickly learn and implement new technologies, and to perform POCs to explore the best solution for a problem statement
Flexibility to work as a member of matrix-based, diverse, and geographically distributed project teams

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
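Since the posting flags Redis as an added advantage, here is a minimal sketch of one common use: caching an expensive database lookup. The host, key naming, TTL, and the stand-in query are assumptions for illustration only.

```python
# Hypothetical sketch: cache an expensive lookup in Redis (assumes a local
# Redis instance). Key names, TTL, and the stand-in DB query are placeholders.
import json
import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def load_summary_from_db(account_id: str) -> dict:
    # Placeholder for an Oracle/Netezza query via a DB driver.
    return {"account_id": account_id, "balance": 1250.75}

def get_account_summary(account_id: str) -> dict:
    key = f"account:{account_id}:summary"
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)          # cache hit
    summary = load_summary_from_db(account_id)
    cache.setex(key, 300, json.dumps(summary))  # cache for 5 minutes
    return summary

if __name__ == "__main__":
    print(get_account_summary("12345"))
```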
Posted 5 days ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Senior Data Scientist, Product Data & Analytics

Our Vision:
The Product Data & Analytics team builds internal analytic partnerships, strengthening focus on the health of the business, portfolio and revenue optimization opportunities, initiative tracking, new product development, and go-to-market strategies. We are a hands-on global team providing scalable end-to-end data solutions by working closely with the business. We influence decisions across Mastercard through data-driven insights. We are a team of analytics engineers, data architects, BI developers, data analysts, and data scientists, and we fully manage our own data assets and solutions.

Are you excited about Data Assets and the value they bring to an organization? Are you an evangelist for data-driven decision making? Are you motivated to be part of a global analytics team that builds large-scale analytical capabilities supporting end users across the continents? Are you interested in proactively looking to improve data-driven decisions for a global corporation?

Role
Responsible for developing data-driven, innovative, scalable analytical solutions and identifying opportunities to support business and client needs in a quantitative manner and facilitate informed recommendations/decisions.
Accountable for delivering high-quality project solutions and tools within agreed-upon timelines and budget parameters, and conducting post-implementation reviews.
Contributes to the development of custom analyses and solutions, deriving insights from extracted data to solve critical business questions. Activities include developing and creating predictive models, behavioural segmentation frameworks, profitability analyses, ad hoc reporting, and data visualizations.
Able to develop AI/ML capabilities, as needed, on large volumes of data to support analytics and reporting needs across products, markets, and services.
Able to build end-to-end reusable, multi-purpose AI models to drive automated insights and recommendations.
Leverage open and closed source technologies to solve business problems.
Work closely with global and regional teams to architect, develop, and maintain advanced reporting and data visualization capabilities on large volumes of data to support analytics and reporting needs across products, markets, and services.
Support initiatives in developing predictive models, behavioural segmentation frameworks, profitability analyses, ad hoc reporting, and data visualizations.
Translate client/stakeholder needs into technical analyses and/or custom solutions in collaboration with internal and external partners; derive insights and present findings and outcomes to clients/stakeholders to solve critical business questions.
Create repeatable processes to support development of modelling and reporting.
Delegate and review work for junior-level colleagues to ensure downstream applications and tools are not compromised or delayed.
Serve as a mentor for junior-level colleagues and develop talent via ongoing technical training, peer review, etc.

All About You
6-8 years of experience in data management, data mining, data analytics, data reporting, data product development, and quantitative analysis.
Advanced SQL skills, with the ability to write optimized queries for large data sets.
Experience with platforms/environments such as Cloudera Hadoop, the big data technology stack, SQL Server, the Microsoft BI stack, cloud platforms, Snowflake, and other relevant technologies.
Data visualization tools (Tableau, Domo, and/or Power BI or similar tools) experience is a plus.
Experience applying data validation, quality control, and cleansing processes to new and existing data sources.
Experience with classical and deep machine learning algorithms such as logistic regression, decision trees, clustering (K-means, hierarchical, and self-organizing maps), t-SNE, PCA, Bayesian models, time series (ARIMA/ARMA), random forest, GBM, KNN, SVM, text mining techniques, multilayer perceptrons, and neural networks (feedforward, CNN, NLP, etc.).
Experience with deep learning techniques, open-source tools and technologies, statistical tools, and programming environments such as Python and R, and big data platforms such as Hadoop, Hive, Spark, and GPU clusters for deep learning.
Experience in automating and creating data pipelines via tools such as Alteryx and SSIS; NiFi is a plus.
Financial institution or payments experience is a plus.

Additional Competencies
Excellent English, quantitative, technical, and communication (oral/written) skills.
Ownership of end-to-end project delivery and risk mitigation.
Virtual team management; able to manage stakeholders by influence.
Analytical/problem-solving skills.
Able to prioritize and perform multiple tasks simultaneously.
Able to work across varying time zones.
Strong attention to detail and quality.
Creativity/innovation.
Self-motivated; operates with a sense of urgency.
In-depth technical knowledge, drive, and ability to learn new technologies.
Must be able to interact with management and internal stakeholders.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
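To ground the modelling list above, here is a small sketch of two of the named techniques, logistic regression and K-means segmentation, run on synthetic data with scikit-learn. Features, labels, and cluster count are invented for illustration.

```python
# Hypothetical sketch: classification plus behavioural-segmentation flavour on
# synthetic data. Nothing here reflects real Mastercard data or models.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 5))                  # synthetic features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Supervised piece: logistic regression with a holdout check.
clf = LogisticRegression().fit(X_train, y_train)
print("holdout accuracy:", clf.score(X_test, y_test))

# Unsupervised piece: K-means clustering as a toy segmentation.
segments = KMeans(n_clusters=4, n_init=10, random_state=42).fit_predict(X)
print("segment sizes:", np.bincount(segments))
```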
Posted 5 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Senior Software Engineer

Overview
We are the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Our Team Within Mastercard – Data & Services
The Data & Services team is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services.

Data Analytics And AI Solutions (DAAI) Program
Within the D&S Technology Team, the DAAI program is a relatively new program that is comprised of a rich set of products that provide accurate perspectives on Portfolio Optimization and Ad Insights. Currently, we are enhancing our customer experience with new user interfaces, moving to API and web application-based data publishing to allow for seamless integration in other Mastercard products and externally, utilizing new data sets and algorithms to further analytic capabilities, and generating scalable big data processes.

We are looking for an innovative software engineering lead who will lead the technical design and development of an Analytic Foundation. The Analytic Foundation is a suite of individually commercialized analytical capabilities (think prediction as a service, matching as a service, or forecasting as a service) that also includes a comprehensive data platform. These services will be offered through a series of APIs that deliver data and insights from various points along a central data store. This individual will partner closely with other areas of the business to build and enhance solutions that drive value for our customers. Engineers work in small, flexible teams. Every team member contributes to designing, building, and testing features. The range of work you will encounter varies from building intuitive, responsive UIs to designing backend data models, architecting data flows, and beyond. There are no rigid organizational structures, and each team uses processes that work best for its members and projects.

Here are a few examples of products in our space:
Portfolio Optimizer (PO) is a solution that leverages Mastercard’s data assets and analytics to allow issuers to identify and increase revenue opportunities within their credit and debit portfolios.
Ad Insights uses anonymized and aggregated transaction insights to offer targeting segments that have a high likelihood to make purchases within a category, allowing for more effective campaign planning and activation.

Help found a new, fast-growing engineering team!

Position Responsibilities
As a Senior Software Engineer, you will:
Play a large role in the scoping, design, and implementation of complex features
Push the boundaries of analytics and powerful, scalable applications
Design and implement intuitive, responsive UIs that allow issuers to better understand data and analytics
Build and maintain analytics and data models to enable performant and scalable products
Ensure a high-quality code base by writing and reviewing performant, well-tested code
Mentor junior software engineers and teammates
Drive innovative improvements to team development processes
Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases, and apply that knowledge to scoping and building new modules and features
Collaborate across teams with exceptional peers who are passionate about what they do

Ideal Candidate Qualifications
5+ years of full-stack engineering experience in an agile production environment
Experience leading the design and implementation of large, complex features in full-stack applications
Ability to easily move between business, data management, and technical teams; ability to quickly intuit the business use case and identify technical solutions to enable it
Experience coaching and mentoring junior teammates
Experience leading a large technical effort that spans multiple people and teams
Proficiency with Java/Spring Boot, .NET/C#, SQL Server, or other object-oriented languages, front-end frameworks, and/or relational database technologies
Some proficiency in using Python or Scala, Spark, Hadoop platforms & tools (Hive, Impala, Airflow, NiFi, Sqoop), and SQL to build Big Data products & platforms
Some experience in building and deploying production-level data-driven applications and data processing workflows/pipelines, and/or implementing machine learning systems at scale in Java, Scala, or Python
Strong technologist with a proven track record of learning new technologies and frameworks
Customer-centric development approach
Passion for analytical/quantitative problem solving
Experience identifying and implementing technical improvements to development processes
Collaboration skills with experience working with people across roles and geographies
Motivation, creativity, self-direction, and desire to thrive on small project teams
Superior academic record with a degree in Computer Science or a related technical field
Strong written and verbal English communication skills

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
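As a sketch of the "prediction as a service" idea described for the Analytic Foundation, here is a minimal Flask endpoint that scores a JSON payload. The route, payload shape, and scoring function are assumptions for illustration, not Mastercard's actual API; Flask 2.x is assumed for the `app.post` shortcut.

```python
# Hypothetical sketch: a tiny "prediction as a service" REST endpoint.
# The scoring function stands in for a real trained model.
from flask import Flask, jsonify, request

app = Flask(__name__)

def score(features: dict) -> float:
    # Placeholder linear score over assumed feature names.
    return 0.7 * features.get("spend", 0.0) + 0.3 * features.get("txn_count", 0.0)

@app.post("/v1/predict")
def predict():
    payload = request.get_json(force=True)
    return jsonify({"prediction": score(payload)})

if __name__ == "__main__":
    # e.g. curl -X POST localhost:8080/v1/predict -d '{"spend": 10, "txn_count": 3}'
    app.run(port=8080)
```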
Posted 5 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Senior Software Engineer

Overview
Mastercard is a global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

As a Data Engineer in Data Platform and Engineering Services, you will have the opportunity to build high-performance data pipelines to load into the Mastercard Data Warehouse. Our Data Warehouse provides analytical capabilities to a number of business users who help different customers provide answers to their business problems through data. You will play a vital role within a rapidly growing organization, while working closely with experienced and driven engineers to solve challenging problems.

Role
Develop high-quality, secure, and scalable data pipelines using Spark with Java/Scala on object storage and Hadoop
Follow Mastercard Quality Assurance and Quality Control processes
Leverage new technologies and approaches to innovate with increasingly large data sets
Work with the project team to meet scheduled due dates, while identifying emerging issues and recommending solutions for problems
Perform assigned tasks and production incident support independently
Contribute ideas to help ensure that required standards and processes are in place, and actively look for opportunities to enhance standards and improve process efficiency

All About You
3 to 5 years of experience in Data Engineering and implementing multiple end-to-end DW projects in a Big Data environment
Experience building data pipelines through Spark with Java/Scala on Hadoop or object storage
Experience working with databases like Oracle and Netezza, with strong SQL knowledge
Experience working with NiFi will be an added advantage
Experience working with APIs will be an added advantage
Experience working in Agile teams
Strong analytical skills, required for debugging production issues, providing root cause analysis, and implementing mitigation plans
Strong communication skills, both verbal and written, and strong relationship, collaboration, and organizational skills
Ability to multi-task across multiple projects, interface with external/internal resources, and provide technical leadership to junior team members
Ability to be high-energy, detail-oriented, and proactive, and to function under pressure in an independent environment, along with a high degree of initiative and self-motivation to drive results
Ability to quickly learn and implement new technologies, and to perform POCs to explore the best solution for a problem statement
Flexibility to work as a member of matrix-based, diverse, and geographically distributed project teams

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
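A minimal sketch of the pipeline pattern this role describes: ingesting a relational source over JDBC and landing it in a warehouse table. The posting names Java/Scala; PySpark is used here for brevity, and the JDBC URL, credentials, and table names are placeholders.

```python
# Hypothetical sketch: Oracle-to-warehouse load via Spark JDBC. Assumes an
# Oracle JDBC driver on the classpath and Hive support in the Spark build.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("oracle-to-warehouse")
         .enableHiveSupport()
         .getOrCreate())

# Ingest a source table from Oracle over JDBC (all options are placeholders).
src = (spark.read.format("jdbc")
       .option("url", "jdbc:oracle:thin:@//db.example.com:1521/ORCL")
       .option("dbtable", "SALES.TRANSACTIONS")
       .option("user", "etl_user")
       .option("password", "change-me")
       .load())

# Light transformation, then land in the warehouse as a managed table.
src.dropDuplicates(["txn_id"]).write.mode("append").saveAsTable("dw.transactions")
```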
Posted 5 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Senior Analyst, Big Data Analytics & Engineering

Overview
Job Title: Sr. Analyst, Data Engineering, Value Quantification Team (Based in Pune, India)

About Mastercard
Mastercard is a global technology leader in the payments industry, committed to powering an inclusive, digital economy that benefits everyone, everywhere. By leveraging secure data, cutting-edge technology, and innovative solutions, we empower individuals, financial institutions, governments, and businesses to achieve their potential. Our culture is driven by our Decency Quotient (DQ), ensuring inclusivity, respect, and integrity guide everything we do. Operating across 210+ countries and territories, Mastercard is dedicated to building a sustainable world with priceless opportunities for all.

Position Overview
This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, and calls for 5-7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard’s internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact.

Role Responsibilities
Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation.
Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts.
Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations.
Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment.
Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage.

All About You
Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx.
Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. Experience with database management systems and data warehousing solutions.
Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI.
Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making.
Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment.
Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams.
Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value.

Education
Bachelor’s degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred.

Why Us?
At Mastercard, you’ll have the opportunity to shape the future of internal operations by leading the development of data engineering solutions that empower teams across the organization. Join us to make a meaningful impact, drive business outcomes, and help Mastercard’s internal teams create better customer engagement strategies through innovative value-based ROI narratives.

Location: Gurgaon/Pune, India
Employment Type: Full-Time

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
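For flavour, here is a small, hypothetical pandas transformation of the kind a value-quantification pipeline might perform; the input columns and the "estimated value" metric are invented for illustration.

```python
# Hypothetical sketch: derive a per-client value estimate and roll it up for a
# dashboard-ready extract. All data and column names are invented.
import pandas as pd

engagements = pd.DataFrame({
    "client": ["A", "A", "B", "B", "C"],
    "service": ["fraud", "loyalty", "fraud", "consulting", "loyalty"],
    "baseline_cost": [120.0, 80.0, 200.0, 150.0, 90.0],
    "cost_with_service": [95.0, 70.0, 160.0, 140.0, 72.0],
})

# Value = cost avoided by using the service (a toy metric).
engagements["estimated_value"] = (
    engagements["baseline_cost"] - engagements["cost_with_service"]
)

summary = (engagements.groupby("client", as_index=False)["estimated_value"]
           .sum()
           .sort_values("estimated_value", ascending=False))
print(summary)
```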
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
Maharashtra, India
On-site
The position is for an Officer/Assistant Manager based in Mumbai. The ideal candidate should have a qualification of B.E./MCA/B.Tech/M.Sc. (I.T.) and be aged between 25 and 30 years. You should have a minimum of 2-3 years of ETL development experience, with strong knowledge of ETL concepts, tools, and data structures. It is essential to be able to analyze and troubleshoot complex data sets and determine data storage needs. Familiarity with data warehousing concepts, in order to build a data warehouse for the internal departments of the organization, is required.

Your responsibilities will include creating and enhancing data solutions to enable seamless delivery of data, and collecting, parsing, managing, and analyzing large sets of data. You will lead the design of the logical data model, implement the physical database structure, and construct and implement operational data stores and data marts. Designing, developing, automating, and supporting complex applications to extract, transform, and load data will be part of your role. You must ensure data quality at the time of ETL and develop logical and physical data flow models for ETL applications, and you should have advanced knowledge of SQL, Oracle, Sqoop, and NiFi tools, commands, and queries.

Current CTC and expected CTC should be clearly mentioned. To apply, please email your resume to careers@cdslindia.com with the position applied for in the subject line.
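As an illustration of "ensuring data quality at the time of ETL", here is a minimal sketch of an in-flight quality gate, using sqlite3 as a stand-in for Oracle; table names, rules, and data are invented.

```python
# Hypothetical sketch: quarantine rows that fail basic quality rules before
# loading staging data into the target table. sqlite3 stands in for Oracle.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_trades (trade_id TEXT, qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO staging_trades VALUES (?, ?, ?)",
    [("T1", 100, 10.5), ("T2", None, 9.9), ("T3", 50, None)],
)

# Count rows failing the quality rules.
bad = conn.execute(
    "SELECT COUNT(*) FROM staging_trades WHERE qty IS NULL OR price IS NULL"
).fetchone()[0]
print(f"{bad} rows quarantined by quality rules")

# Load only clean rows into the target.
conn.execute(
    "CREATE TABLE trades AS "
    "SELECT * FROM staging_trades WHERE qty IS NOT NULL AND price IS NOT NULL"
)
```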
Posted 5 days ago
6.0 - 10.0 years
0 Lacs
Karnataka, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services leveraging deep industry experience with strong functional and technical capabilities and product knowledge. The financial services practice at EY offers integrated advisory services to financial institutions and other capital markets participants. Within EY's Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way, we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

We're looking for Senior and Manager Big Data Experts with expertise in the Financial Services domain and hands-on experience with the Big Data ecosystem:
Expertise in data engineering, including design and development of big data platforms.
Deep understanding of modern data processing technology stacks such as Spark, HBase, and other Hadoop ecosystem technologies. Development using Scala is a plus.
Deep understanding of streaming data architectures and technologies for real-time and low-latency data processing.
Experience with agile development methods, including core values, guiding principles, and key agile practices.
Understanding of the theory and application of Continuous Integration/Delivery.
Experience with NoSQL technologies and a passion for software craftsmanship.
Experience in the financial industry is a plus.

Nice-to-have skills include an understanding of and familiarity with all Hadoop ecosystem components, Hadoop administrative fundamentals, experience working with NoSQL data stores like HBase, Cassandra, and MongoDB as well as HDFS, Hive, and Impala, schedulers like Airflow and NiFi, and experience with Hadoop clustering and auto-scaling.

You will develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis, and define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud.

To qualify for the role, you must have a BE/BTech/MCA/MBA degree, a minimum of 3 years of hands-on experience in one or more relevant areas, and a total of 6-10 years of industry experience. Ideally, you'll also have experience in the Banking and Capital Markets domains.

Skills and attributes for success include using an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates; strong communication, presentation, and team-building skills; experience in producing high-quality reports, papers, and presentations; and experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.
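To make the streaming requirement concrete, here is a minimal sketch of the low-latency pattern the posting alludes to: Spark Structured Streaming reading from Kafka and counting events per minute. The broker address and topic are placeholders, and the Kafka source assumes the spark-sql-kafka package is available to the Spark session.

```python
# Hypothetical sketch: windowed event counts over a Kafka stream with Spark
# Structured Streaming. Broker and topic are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker.example.com:9092")
          .option("subscribe", "trade-events")
          .load())

# Kafka values arrive as bytes; decode, then count events per 1-minute window.
counts = (events.select(F.col("value").cast("string").alias("payload"),
                        F.col("timestamp"))
          .groupBy(F.window("timestamp", "1 minute"))
          .count())

query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```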
What We Look For
A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment; an opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide; and opportunities to work with EY Advisory practices globally with leading businesses across a range of industries.

Working at EY offers inspiring and meaningful projects, education and coaching alongside practical experience for personal development, support, coaching, and feedback from engaging colleagues, opportunities to develop new skills and progress your career, and the freedom and flexibility to handle your role in a way that's right for you.

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 5 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Role: We are seeking a highly skilled and experienced Data Architect with expertise in designing and building data platforms in cloud environments. The ideal candidate will have a strong background in either AWS Data Engineering or Azure Data Engineering, along with proficiency in distributed data processing systems like Spark. Proficiency in SQL, data modeling, building data warehouses, and knowledge of ingestion tools and data governance are essential for this role. The Data Architect will also need experience with orchestration tools such as Airflow or Dagster and proficiency in Python, with knowledge of Pandas being beneficial.

Why Choose Ideas2IT: Ideas2IT has the good attributes of both a product startup and a services company. Because we launch our own products, you will have ample opportunities to learn and contribute. Single-product companies can stagnate in the technologies they use; across our multiple product initiatives and customer-facing projects, you will have the opportunity to work on a variety of technologies. AGI is going to change the world, and big companies like Microsoft are betting heavily on it. We are following suit.

What's in it for you?
You will get to work on impactful products instead of back-office applications, for customers such as Facebook, Siemens, and Roche.
You will get to work on interesting projects like the Cloud AI platform for personalized cancer treatment.
Opportunity to continuously learn newer technologies.
Freedom to bring your ideas to the table and make a difference, instead of being a small cog in a big wheel.
Showcase your talent in Shark Tanks and Hackathons conducted in the company.

Here's what you'll bring:
Experience in designing and building data platforms in any cloud, with strong expertise in either AWS Data Engineering or Azure Data Engineering.
Develop and optimize data processing pipelines using distributed systems like Spark.
Create and maintain data models to support efficient storage and retrieval.
Build and optimize data warehouses for analytical and reporting purposes, utilizing technologies such as Postgres, Redshift, and Snowflake.
Knowledge of ingestion tools such as Apache Kafka, Apache NiFi, AWS Glue, or Azure Data Factory.
Establish and enforce data governance policies and procedures to ensure data quality and security.
Utilize orchestration tools like Airflow or Dagster to schedule and manage data workflows (see the sketch below).
Develop scripts and applications in Python to automate tasks and processes.
Collaborate with stakeholders to gather requirements and translate them into technical specifications.
Communicate technical solutions effectively to clients and stakeholders.
Familiarity with multiple cloud ecosystems such as AWS, Azure, and Google Cloud Platform (GCP).
Experience with containerization and orchestration technologies like Docker and Kubernetes.
Knowledge of machine learning and data science concepts.
Experience with data visualization tools such as Tableau or Power BI.
Understanding of DevOps principles and practices.
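To illustrate the kind of orchestration work this role involves, here is a minimal Airflow DAG sketch in Python (assuming Airflow 2.4+). The DAG id, task names, and the extract/transform functions are hypothetical placeholders, not part of the job description.

```python
# Minimal sketch of an Airflow DAG wiring an extract -> transform dependency.
# All names (dag id, task ids, functions) are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull a batch from a source system (e.g., S3, a database).
    print("extracting source data")


def transform():
    # Placeholder: run a Spark job or pandas transformation on the batch.
    print("transforming extracted data")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # run once per day
    catchup=False,       # skip backfilling past intervals
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # transform runs only after extract succeeds
```

The same dependency pattern extends naturally to Spark-submit or ingestion operators in place of the Python callables.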
Posted 6 days ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview: TekWissen is a global workforce management provider with operations in India and many other countries around the world. The client below is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place – one that benefits lives, communities and the planet.

Job Title: Software Engineer Senior
Location: Chennai
Work Type: Hybrid

Position Description: This fast-paced position is intended for people who like to build analytics platforms and tooling that deliver real value to the business. Applicants should have a strong desire to learn new technologies and an interest in providing guidance that will help drive the adoption of these tools. The Analytics Data Management (ADM) Product Engineer will assist with the engineering of strategic data management platforms from Informatica, primarily Enterprise Data Catalog, and Apache NiFi. Other technologies include Informatica IICS/IDMC, PowerCenter, Data Catalog, and Master Data Management; IBM Information Server and Cloud Pak for Data (CP4D); and Google Cloud Data Fusion. This person will also collaborate with Infrastructure Architects to design and implement environments based on these technologies for use in the client's enterprise data centers. Platforms may be based on-premises or hosted in Google Cloud.

Skills Required: Informatica
Skills Preferred: Cloud Infrastructure

Experience Required:
Informatica products (IICS/IDMC, PowerCenter, Data Catalog, Master Data Management): installation, configuration, administration, and troubleshooting. Specific experience with Informatica Data Catalog is essential.
Apache NiFi: strong Java development experience to create custom NiFi processors, and expertise in deploying and managing NiFi applications on Red Hat OS environments (see the scripting sketch below).
Google Cloud Platform (GCP): provisioning, administration, and troubleshooting of products. Specific experience with Dataplex or Google Cloud Data Fusion (CDF) is highly preferred.

Experience Range: 5-8 years

Summary of Responsibilities: Engineer, test, and modernize data management platforms, primarily Informatica Enterprise Data Catalog and Apache NiFi. Enable cloud migrations for analytics platforms. Define, document, and monitor global (follow-the-sun) support procedures (Incident Management, Request Management, Event Management, etc.). Provide Asia-Pacific (IST) 2nd-level support for these products.

Responsibilities Detail: Installing and configuring products; working with platform support teams and vendor support to resolve issues; thoroughly testing product functionality on the platform; developing custom installation guides, configurations, and scripts consistent with the client's IT security policy; providing 2nd-level support for product-related issues; developing new tools and processes to ensure effective implementation and use of the technologies; implementing monitoring/alerting and analyzing usage data to ensure optimal performance of the infrastructure; and maintaining a SharePoint site with the documentation, FAQs, and processes necessary to promote and support the use of these technologies.

Required Skills: Ability to collect and clearly document requirements. Ability to prioritize work and manage multiple assignments. Ability to create and execute detailed project plans and test plans.
Education Required: Bachelor's Degree
Education Preferred: Bachelor's Degree
TekWissen® Group is an equal opportunity employer supporting workforce diversity.
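Full custom NiFi processors are written in Java by extending AbstractProcessor, as the posting notes. As a lighter-weight illustration of the same FlowFile model, here is a minimal sketch for NiFi's ExecuteScript processor using its Jython (Python-syntax) engine; this stands in for, rather than replaces, a compiled custom processor, and the attribute name and value are hypothetical.

```python
# Minimal ExecuteScript (Jython) sketch for Apache NiFi.
# NiFi injects `session`, `REL_SUCCESS`, and `REL_FAILURE` into the script scope.
flowFile = session.get()  # take one FlowFile from the incoming queue
if flowFile is not None:
    # Tag the FlowFile with a hypothetical attribute; putAttribute returns
    # the updated FlowFile reference, which must be used from then on.
    flowFile = session.putAttribute(flowFile, 'processed.by', 'executescript-demo')
    session.transfer(flowFile, REL_SUCCESS)  # route downstream on success
```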
Posted 6 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Systems Engineering Senior
Location: Chennai 34329
Compensation: Up to ₹17 LPA
Type: Contract
Notice Period: Immediate Joiners Only

Position Overview: This is a dynamic role ideal for individuals passionate about building scalable analytics platforms and data management tools that deliver real business value. The role focuses on engineering strategic data management platforms, particularly Informatica Enterprise Data Catalog and Apache NiFi, while collaborating with infrastructure teams to design and deploy solutions in both on-premises and cloud (GCP) environments. The selected candidate will be part of a global product engineering team, supporting enterprise-wide implementations and integration of cutting-edge data technologies.

Key Responsibilities
Engineer, modernize, and maintain data management platforms with a focus on Informatica Data Catalog and Apache NiFi.
Enable and support cloud migration efforts for analytics infrastructure.
Install, configure, and administer Informatica products, ensuring compliance with security policies and operational standards.
Develop and test custom configurations, installation scripts, and environment tuning strategies.
Create documentation, FAQs, and guides for platform use and issue resolution.
Implement monitoring, logging, and alerting for performance optimization and proactive issue detection.
Provide 2nd-level support for data platforms in the Asia-Pacific (IST) time zone.
Work collaboratively with support teams and vendors to troubleshoot and resolve technical issues.
Maintain and manage SharePoint or internal documentation portals related to the platform.
Create project and test plans, prioritize tasks, and manage platform upgrades and enhancements.

Must-Have Skills
Informatica IICS/IDMC, PowerCenter, Data Catalog, and MDM installation, administration, and troubleshooting.
Expertise in Apache NiFi, including Java development for custom NiFi processors.
Experience managing NiFi on Red Hat OS environments.
Hands-on experience with GCP services, particularly Cloud Data Fusion (CDF) and Dataplex.
Strong understanding of platform performance tuning, monitoring, and troubleshooting.

Preferred Skills
Familiarity with cloud infrastructure tools and services.
Automation experience (scripting tools to streamline workflows).
Exposure to platform security: access control, encryption, and vulnerability mitigation.
Experience working in regulated, enterprise-scale environments.
Collaborative experience with cross-functional teams (data engineers, analysts, scientists).
Prior experience supporting similar platforms in a follow-the-sun support model.

Education
Required: Bachelor's Degree in Computer Science, Information Technology, Engineering, or a related field.
Preferred: Master's Degree is a plus.

Additional Notes
Ideal for professionals with at least 3 years of relevant experience engineering enterprise data platforms.
Prior experience supporting Informatica or Apache NiFi in production is essential.
Security, performance tuning, and automation experience is highly valued.
Strong communication and documentation skills are essential for collaboration across teams.
Posted 6 days ago
10.0 - 15.0 years
10 - 15 Lacs
Pune, Maharashtra, India
On-site
The Role: We are looking for an experienced Data Engineer to design and develop advanced data migration pipelines from traditional OLTP databases (e.g., Oracle) to modern big data platforms such as Cloudera and Databricks. The ideal candidate will possess expertise in technologies such as Python, Java, Spark, and NiFi, along with a proven track record in managing data pipelines for tasks including initial snapshot loading, building Change Data Capture (CDC) pipelines, exception management, reconciliation, data security, and retention. This role also demands proficiency in data modeling, cataloging, taxonomy creation, and ensuring robust data provenance and lineage to support governance and compliance requirements.

Key Responsibilities
Design, develop, and optimize data migration pipelines from OLTP databases like Oracle to big data platforms, including Cloudera CDP/CDH and Databricks.
Build scalable ETL workflows using tools like Python, Scala, Apache Spark, and Apache NiFi to support initial snapshots, CDC, exception handling, and reconciliation processes (see the CDC sketch below).
Implement data security measures, such as encryption, access controls, and compliance with data retention policies, across all migration pipelines.
Develop and maintain data models, taxonomy structures, and cataloging systems to ensure logical organization and easy accessibility of data.
Establish data lineage and provenance to ensure traceability and compliance with governance frameworks.
Collaborate with cross-functional teams to understand data migration requirements, ensuring high-quality and timely delivery of solutions.
Monitor and troubleshoot data pipelines to ensure performance, scalability, and reliability.
Stay updated on emerging technologies in data engineering and big data ecosystems, proposing improvements to existing systems and processes.

Required Skills and Qualifications
10+ years of experience in data engineering, with at least 2 years in a leadership or technical lead role.
Proficiency in OLTP databases, particularly Oracle, and data egress techniques.
Strong programming skills in Python, Scala, and Java.
Expertise in Apache Spark, Flink, and Kafka, and data integration tools like Apache NiFi.
Hands-on experience with Cloudera Data Platform (CDP/CDH) and Apache Ozone.
Familiarity with cloud-based big data ecosystems such as AWS (Databricks, S3, Glue, etc.).
Familiarity with patterns such as Medallion architectures, data layers, data lakes, and data warehouses, with experience building scalable ETL pipelines, optimizing data workflows, and leveraging platforms to integrate, transform, and store large datasets.
Knowledge of data security best practices, including encryption, data masking, and role-based access control.
Exceptional problem-solving and analytical abilities.
Strong communication and leadership skills, with the ability to navigate ambiguity and collaborate effectively across diverse teams.
Optional: awareness of regulatory compliance requirements for data handling and privacy.

Education: Bachelor's or Master's degree in Computer Science
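To make the CDC step concrete, here is a minimal PySpark sketch that applies a batch of change records to a Delta Lake table with MERGE, the common pattern on Databricks. The table paths, column names, and the `op` delete-flag convention are hypothetical assumptions, not details from the posting.

```python
# Minimal sketch: apply CDC records to a Delta table via MERGE (PySpark + delta-spark).
# Table paths, schema, and the `op` column convention are illustrative assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cdc-merge-demo").getOrCreate()

# `changes` is a batch of CDC rows extracted from the source OLTP database,
# e.g. columns: id, name, updated_at, op ('I'nsert, 'U'pdate, 'D'elete).
changes = spark.read.parquet("/landing/orders_changes/")  # hypothetical path

target = DeltaTable.forPath(spark, "/lake/orders")        # hypothetical Delta table

(
    target.alias("t")
    .merge(changes.alias("s"), "t.id = s.id")
    .whenMatchedDelete(condition="s.op = 'D'")         # deletes from source
    .whenMatchedUpdateAll(condition="s.op = 'U'")      # in-place updates
    .whenNotMatchedInsertAll(condition="s.op != 'D'")  # fresh inserts
    .execute()
)
```

An initial snapshot load is typically a plain write of the full extract to the same table, after which this merge runs incrementally per CDC batch.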
Posted 6 days ago
9.0 - 15.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title- Snowflake Data Architect Experience- 9 to 15 Years Location- Gurugram Job Summary: We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making. Key Responsibilities: Design, develop, and implement scalable Snowflake-based data architectures. Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts. Optimize Snowflake performance through clustering, partitioning, and caching strategies. Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions. Ensure data quality, governance, integrity, and security across all platforms. Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake. Automate data workflows and support CI/CD deployment practices. Implement data modeling techniques including dimensional modeling, star/snowflake schema, normalization/denormalization. Support and promote metadata management and data governance best practices. Technical Skills (Hard Skills): Expertise in Snowflake: Architecture design, performance tuning, cost optimization. Strong proficiency in SQL, Python, and scripting for data engineering tasks. Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar. Proficient in data modeling (dimensional, relational, star/snowflake schema). Good knowledge of Cloud Platforms: AWS, Azure, or GCP. Familiar with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks. Experience with CI/CD tools and version control systems (e.g., Git). Knowledge of BI tools such as Tableau, Power BI, or Looker. Certifications (Preferred/Required): ✅ Snowflake SnowPro Core Certification – Required or Highly Preferred ✅ SnowPro Advanced Architect Certification – Preferred ✅ Cloud Certifications (e.g., AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate) – Preferred ✅ ETL Tool Certifications (e.g., Talend, Matillion) – Optional but a plus Soft Skills: Strong analytical and problem-solving capabilities. Excellent communication and collaboration skills. Ability to translate technical concepts into business-friendly language. Proactive, detail-oriented, and highly organized. Capable of multitasking in a fast-paced, dynamic environment. Passionate about continuous learning and adopting new technologies. Why Join Us? Work on cutting-edge data platforms and cloud technologies Collaborate with industry leaders in analytics and digital transformation Be part of a data-first organization focused on innovation and impact Enjoy a flexible, inclusive, and collaborative work culture
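As one concrete example of the Snowflake performance-tuning work described above, the sketch below uses the snowflake-connector-python package to define a clustering key and inspect clustering health. The account, credentials, table, and column names are placeholders, and defining clustering keys assumes an edition that supports them.

```python
# Minimal sketch: set a clustering key on a large Snowflake table and check
# how well clustered it is. Connection details and names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",  # placeholder account identifier
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # Cluster the table on columns that dominate range filters in queries.
    cur.execute("ALTER TABLE orders CLUSTER BY (order_date, region)")

    # SYSTEM$CLUSTERING_INFORMATION reports depth/overlap statistics that
    # indicate whether the clustering key is effective.
    cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('orders')")
    print(cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```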
Posted 6 days ago
3.0 years
4 Lacs
Delhi
On-site
Job Description: Hadoop & ETL Developer
Location: Shastri Park, Delhi
Experience: 3+ years
Education: B.E./B.Tech/MCA/MSc (IT or CS)/MS
Salary: Up to 80k (final offer depends on the interview and experience)
Notice Period: Immediate joiners to 20 days
Only candidates from Delhi/NCR will be considered.

Job Summary: We are looking for a Hadoop & ETL Developer with strong expertise in big data processing, ETL pipelines, and workflow automation. The ideal candidate will have hands-on experience in the Hadoop ecosystem, including HDFS, MapReduce, Hive, Spark, HBase, and PySpark, as well as expertise in real-time data streaming and workflow orchestration. This role requires proficiency in designing and optimizing large-scale data pipelines to support enterprise data processing needs.

Key Responsibilities
Design, develop, and optimize ETL pipelines leveraging Hadoop ecosystem technologies.
Work extensively with HDFS, MapReduce, Hive, Sqoop, Spark, HBase, and PySpark for data processing and transformation.
Implement real-time and batch data ingestion using Apache NiFi, Kafka, and Airbyte (see the streaming sketch below).
Develop and manage workflow orchestration using Apache Airflow.
Perform data integration across structured and unstructured data sources, including MongoDB and Hadoop-based storage.
Optimize MapReduce and Spark jobs for performance, scalability, and efficiency.
Ensure data quality, governance, and consistency across the pipeline.
Collaborate with data engineering teams to build scalable and high-performance data solutions.
Monitor, debug, and enhance big data workflows to improve reliability and efficiency.

Required Skills & Experience
3+ years of experience in the Hadoop ecosystem (HDFS, MapReduce, Hive, Sqoop, Spark, HBase, PySpark).
Strong expertise in ETL processes, data transformation, and data warehousing.
Hands-on experience with Apache NiFi, Kafka, Airflow, and Airbyte.
Proficiency in SQL and handling structured and unstructured data.
Experience with NoSQL databases like MongoDB.
Strong programming skills in Python or Scala for scripting and automation.
Experience in optimizing Spark and MapReduce jobs for high-performance computing.
Good understanding of data lake architectures and big data best practices.

Preferred Qualifications
Experience in real-time data streaming and processing.
Familiarity with Docker/Kubernetes for deployment and orchestration.
Strong analytical and problem-solving skills, with the ability to debug and optimize data workflows.

If you have a passion for big data, ETL, and large-scale data processing, we'd love to hear from you!

Job Types: Full-time, Contractual / Temporary
Pay: From ₹400,000.00 per year
Work Location: In person
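As a sketch of the real-time ingestion described in this posting, here is a minimal PySpark Structured Streaming job that reads a Kafka topic and lands it in HDFS as parquet. The broker address, topic, and paths are hypothetical, and it assumes the Spark-Kafka connector package is on the classpath.

```python
# Minimal sketch: stream a Kafka topic into HDFS parquet with Spark
# Structured Streaming. Broker, topic, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to strings before writing.
parsed = events.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

query = (
    parsed.writeStream.format("parquet")
    .option("path", "hdfs:///data/events/")             # placeholder sink
    .option("checkpointLocation", "hdfs:///chk/events/")
    .start()
)
query.awaitTermination()
```

The checkpoint location is what gives the job exactly-once file output across restarts, which matters for the reliability goals listed above.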
Posted 1 week ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role Overview: We are looking for an experienced Solution Architect (AI/ML & Data Engineering) to lead the design and delivery of advanced data and AI/ML solutions for our clients. The ideal candidate will have a strong background in end-to-end data architecture, AI lifecycle management, cloud technologies, and emerging Generative AI.

Responsibilities:
Collaborate with clients to understand business requirements and design robust data solutions.
Lead the development of end-to-end data pipelines, including ingestion, storage, processing, and visualization.
Architect scalable, secure, and compliant data systems following industry best practices.
Guide data engineers, analysts, and cross-functional teams to ensure timely delivery of solutions.
Participate in pre-sales efforts: solution design, proposal creation, and client presentations.
Act as a technical liaison between clients and internal teams throughout the project lifecycle.
Stay current with emerging technologies in AI/ML, data platforms, and cloud services.
Foster long-term client relationships and identify opportunities for business expansion.
Understand and architect across the full AI lifecycle, from ingestion to inference and operations (see the MLflow sketch below).
Provide hands-on guidance for containerization and deployment using Kubernetes.
Ensure proper implementation of data governance, modeling, and warehousing.

Requirements:
Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
10+ years of experience as a Data Solution Architect or in a similar role.
Deep technical expertise in data architecture, engineering, and AI/ML systems.
Strong experience with Hadoop-based platforms, ideally Cloudera Data Platform or Data Fabric.
Proven pre-sales experience: technical presentations, solutioning, and RFP support.
Proficiency in cloud platforms (Azure preferred; also AWS or GCP) and cloud-native data tools.
Exposure to Generative AI frameworks and LLMs such as OpenAI and Hugging Face.
Experience in deploying and managing applications on Kubernetes (AKS, EKS, GKE).
Familiarity with data governance, data modeling, and large-scale data warehousing.
Excellent problem-solving, communication, and client-facing skills.

Tools & Technology:
Architecture & Engineering (Hadoop ecosystem): Cloudera Data Platform, Data Fabric, HDFS, Hive, Spark, HBase, Oozie.
ETL & Integration: Apache NiFi, Talend, Informatica, Azure Data Factory, AWS Glue.
Warehousing: Azure Synapse, Redshift, BigQuery, Snowflake, Teradata, Vertica.
Streaming: Apache Kafka, Azure Event Hubs, AWS.
Cloud Platforms: Azure (preferred), AWS, GCP.
Data Lakes: ADLS, AWS S3, Google Cloud.
Platforms: Data Fabric, AI Essentials, Unified Analytics, MLDM, MLDE.
AI/ML & GenAI Lifecycle Tools: MLflow, Kubeflow, Azure ML, SageMaker, Ray.
Inference: TensorFlow Serving, KServe, Seldon.
Generative AI: Hugging Face, LangChain, OpenAI API (GPT-4, etc.).
DevOps & Deployment (Kubernetes): AKS, EKS, GKE, open-source K8s, Helm.
CI/CD: Jenkins, GitHub Actions, GitLab CI, Azure DevOps.
(ref: hirist.tech)
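Since the role spans the AI lifecycle tools listed above, here is a minimal MLflow tracking sketch in Python showing the experiment/run/params/metrics pattern those platforms manage. The experiment name and the logged values are placeholders.

```python
# Minimal sketch: track one training run with MLflow.
# Experiment name, params, and metric values are placeholders.
import mlflow

mlflow.set_experiment("demo-churn-model")  # creates the experiment if missing

with mlflow.start_run():
    # Hyperparameters for this run.
    mlflow.log_param("n_estimators", 200)
    mlflow.log_param("max_depth", 8)

    # ... train the model here ...

    # Evaluation results, queryable later via the MLflow UI or API.
    mlflow.log_metric("auc", 0.91)
    mlflow.log_metric("accuracy", 0.87)
```

Runs logged this way can then be promoted through a model registry and served behind the inference tools named above (KServe, TensorFlow Serving, etc.).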
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You are an experienced Full-Stack Developer with 5+ years of experience in building scalable web applications using Python (FastAPI), React.js, and cloud-native technologies. In this role, you will be responsible for developing a low-code/no-code AI agent platform, implementing an intuitive workflow UI, and integrating with LLMs, enterprise connectors, and role-based access controls.

Your responsibilities will include backend development, where you will develop and optimize APIs using FastAPI, integrating with LangChain, vector databases (Pinecone/Weaviate), and enterprise connectors (Airbyte/NiFi). On the frontend, you will build an interactive drag-and-drop workflow UI using React.js (React Flow, D3.js, TailwindCSS). You will also implement OAuth2, Keycloak, and role-based access controls (RBAC) for multi-tenant environments (see the sketch below). Database design is a crucial part of this role: you will work with PostgreSQL (structured data), MongoDB (unstructured data), and Neo4j (knowledge graphs). DevOps and deployment tasks will involve deploying with Docker, Kubernetes, and Terraform across multiple clouds (Azure, AWS, GCP) to ensure smooth operations. Performance optimization is another key area, with a focus on improving API performance and frontend responsiveness for a seamless user experience. Collaboration with AI and Data Engineers is essential, as you will work closely with the Data Engineering team to ensure smooth AI model integration.

To be successful in this role, you are required to have 5+ years of experience in FastAPI, React.js, and cloud-native applications. Strong knowledge of REST APIs, GraphQL, and WebSockets is essential, along with experience in JWT authentication, OAuth2, and multi-tenant security. Proficiency in PostgreSQL, MongoDB, Neo4j, and Redis is expected. Knowledge of workflow automation tools (n8n, Node-RED, Temporal.io) and familiarity with containerization (Docker, Kubernetes) and CI/CD pipelines are also required. Bonus skills include experience with Apache Kafka, WebSockets, or AI-driven chatbots.
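As an illustration of the OAuth2/RBAC backend work this role describes, here is a minimal FastAPI sketch that guards an endpoint with a bearer token. The token validation is stubbed out and the role check, route, and names are hypothetical; a real deployment would verify a JWT issued by Keycloak.

```python
# Minimal sketch: a FastAPI route protected by an OAuth2 bearer token with a
# stubbed role check. Token parsing/verification is a placeholder; in practice
# a JWT from Keycloak would be decoded and its claims validated.
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import OAuth2PasswordBearer

app = FastAPI()
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")


def current_role(token: str = Depends(oauth2_scheme)) -> str:
    # Placeholder: decode the JWT and extract a role claim here.
    # This sketch pretends every presented token belongs to an "admin".
    if not token:
        raise HTTPException(status_code=401, detail="Not authenticated")
    return "admin"


@app.get("/workflows")
def list_workflows(role: str = Depends(current_role)):
    # Simple RBAC gate: only admins may list workflows in this sketch.
    if role != "admin":
        raise HTTPException(status_code=403, detail="Insufficient role")
    return {"workflows": []}  # placeholder payload
```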
Posted 1 week ago
6.0 years
0 Lacs
Mumbai Metropolitan Region
Remote
Business Area: Professional Services
Seniority Level: Mid-Senior level

Job Description: At Cloudera, we empower people to transform complex data into clear and actionable insights. With as much data under management as the hyperscalers, we're the preferred data partner for the top companies in almost every industry. Powered by the relentless innovation of the open source community, Cloudera advances digital transformation for the world's largest enterprises.

Team Description: Cloudera is seeking a Solutions Consultant to join its APAC Professional Services team. In this role you'll have the opportunity to develop massively scalable solutions to solve complex data problems using CDP, NiFi, Spark and related Big Data technology. This is a client-facing role that combines consulting skills with deep technical design and development in the Big Data space, and it will give the successful candidate the opportunity to work across multiple industries and large customer organizations.

As the Solutions Consultant you will:
Work directly with customers to implement Big Data solutions at scale using the Cloudera Data Platform and Cloudera DataFlow.
Design and implement CDP platform architectures and configurations for customers.
Perform platform installation and upgrades for advanced secured cluster configurations.
Analyze complex distributed production deployments and make recommendations to optimize performance.
Document and present complex architectures for the customers' technical teams.
Work closely with Cloudera teams at all levels to help ensure the success of project consulting engagements with customers.
Drive projects with customers to successful completion.
Write and produce technical documentation, blogs, and knowledge-base articles.
Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers' requirements.
Keep current with the Hadoop Big Data ecosystem technologies.
Work across different time zones.

We're excited about you if you have:
6+ years of Information Technology and system architecture experience.
4+ years of professional services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions.
3+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions.
Ability to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based and streaming data deployments.
Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
Ability to understand and translate customer requirements into technical requirements.
Experience implementing data transformation and processing solutions.
Experience designing data queries against data in an HDFS environment using tools such as Apache Hive (see the sketch below).
Experience setting up multi-node Hadoop clusters.
Experience configuring security (LDAP/AD, Kerberos/SPNEGO).
Cloudera software experience and/or HDP certification (HDPCA/HDPCD) is a plus.
Strong experience implementing software and/or solutions in an enterprise Linux environment.
Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos.
Strong understanding of network configuration, devices, protocols, speeds, and optimizations.
Strong understanding of the Java ecosystem, including debugging, logging, monitoring, and profiling tools.
Excellent verbal and written communication skills.

What you can expect from us:
Generous PTO policy.
Support for work-life balance with Unplugged Days.
Flexible WFH policy.
Mental and physical wellness programs.
Phone and internet reimbursement program.
Access to continued career development.
Comprehensive benefits and competitive packages.
Paid volunteer time.
Employee resource groups.
EEO/VEVRAA
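To illustrate the Hive-on-HDFS query work mentioned above, here is a minimal PySpark sketch that enables Hive support and runs a HiveQL aggregation; the database, table, and column names are hypothetical, and it assumes a cluster with a configured Hive metastore.

```python
# Minimal sketch: query a Hive table from Spark with Hive support enabled.
# Database/table/column names are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("hive-query-demo")
    .enableHiveSupport()  # reads the Hive metastore configured on the cluster
    .getOrCreate()
)

# HiveQL runs unchanged through spark.sql once Hive support is enabled.
daily_totals = spark.sql("""
    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM sales.orders
    GROUP BY order_date
    ORDER BY order_date
""")

daily_totals.show(20)
```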
Posted 1 week ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Hi, Greetings from Peoplefy Infosolutions!
We are hiring for one of our reputed MNC clients based in Pune. We are looking for candidates with 10+ years of experience who are currently working as Data Architects.

Job Description: We are seeking a highly skilled and experienced Cloud Data Architect to design, implement, and manage scalable, secure, and efficient cloud-based data solutions. The ideal candidate will possess a strong combination of technical expertise, analytical skills, and the ability to collaborate effectively with cross-functional teams to translate business requirements into technical solutions.

Key Responsibilities:
Design and implement data architectures, including data pipelines, data lakes, and data warehouses, on cloud platforms.
Develop and optimize data models (e.g., star schema, snowflake schema) to support business intelligence and analytics.
Leverage big data technologies (e.g., Hadoop, Spark, Kafka) to process and analyze large-scale datasets.
Manage and optimize relational and NoSQL databases for performance and scalability.
Develop and maintain ETL/ELT workflows using tools like Apache NiFi, Talend, or Informatica (see the sketch below).
Ensure data security and compliance with regulations such as GDPR and CCPA.
Automate infrastructure deployment using CI/CD pipelines and Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation).
Collaborate with analytics teams to integrate machine learning frameworks and visualization tools (e.g., Tableau, Power BI).
Provide technical leadership and mentorship to team members.

Interested candidates for the above position are requested to share their CVs at sneh.ne@peoplefy.com
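As a sketch of the ETL/data-lake responsibilities above, here is a minimal PySpark batch job that cleans a raw extract and writes it to a date-partitioned lake layout. The paths, schema, and partition column are placeholders; in practice tools like NiFi or Talend would typically feed the raw zone.

```python
# Minimal sketch: batch ETL from a raw zone to a curated, date-partitioned
# data lake zone. Paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("curate-orders").getOrCreate()

raw = spark.read.json("s3a://lake/raw/orders/")  # placeholder raw-zone path

curated = (
    raw.dropDuplicates(["order_id"])             # basic de-duplication
    .filter(col("amount") > 0)                   # drop invalid rows
    .withColumn("order_date", to_date(col("created_at")))
)

(
    curated.write.mode("overwrite")
    .partitionBy("order_date")                   # lake-friendly layout
    .parquet("s3a://lake/curated/orders/")       # placeholder curated path
)
```

Partitioning by date keeps downstream warehouse loads and analytical scans cheap, which is the usual motivation for this layout.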
Posted 1 week ago
12.0 - 18.0 years
0 Lacs
noida, uttar pradesh
On-site
As a seasoned Manager - Data Engineering with 12-18 years of total experience in data engineering, including 3-5 years in a leadership/managerial role, you will lead complex data platform implementations using Databricks or the Apache data stack. Your key responsibilities will include leading high-impact data engineering engagements for global clients, delivering scalable solutions, and driving digital transformation.

You must have hands-on experience in Databricks or the core Apache stack (Spark, Kafka, Hive, Airflow, NiFi, etc.) and expertise in one or more cloud platforms such as AWS, Azure, or GCP, ideally with Databricks on cloud. Strong programming skills in Python, Scala, and SQL are essential, along with experience in building scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is also required.

Your role will involve leading the architecture, development, and deployment of modern data platforms using Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will design and implement data pipelines (batch and real-time), data lakehouses, and large-scale ETL frameworks. Additionally, you will own delivery accountability for data engineering programs across BFSI, telecom, healthcare, and manufacturing clients. Collaboration with global stakeholders, product owners, architects, and business teams to understand requirements and deliver data-driven outcomes will be a key part of your responsibilities. Ensuring best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance is crucial. You will manage and mentor a team of 10-25 engineers, conducting performance reviews, capability building, and coaching. Supporting presales activities, including solutioning, technical proposals, and client workshops, will also be part of your role.

At GlobalLogic, we prioritize a culture of caring and continuous learning and development. You'll have the opportunity to work on interesting and meaningful projects that have a real impact. We offer balance and flexibility, ensuring that you can achieve the perfect equilibrium between work and life. As a high-trust organization, integrity is key, and you can trust that you are part of a safe, reliable, and ethical global company. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. As part of our team, you'll collaborate with forward-thinking companies to transform businesses and redefine industries through intelligent products, platforms, and services.
Posted 1 week ago
12.0 - 18.0 years
0 Lacs
noida, uttar pradesh
On-site
We are looking for an experienced Manager - Data Engineering with expertise in Databricks or the Apache data stack to lead complex data platform implementations. As the Manager - Data Engineering, you will play a crucial role in spearheading high-impact data engineering projects for global clients, delivering scalable solutions, and catalyzing digital transformation.

You should have a total of 12-18 years of experience in data engineering, with at least 3-5 years in a leadership or managerial capacity. Hands-on experience with Databricks or core Apache stack components such as Spark, Kafka, Hive, Airflow, NiFi, etc., is essential. Proficiency in one or more cloud platforms like AWS, Azure, or GCP is preferred, ideally with Databricks on the cloud. Strong programming skills in Python, Scala, and SQL are required, along with experience in building scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is advantageous.

Your responsibilities will include leading the architecture, development, and deployment of modern data platforms utilizing Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will design and implement data pipelines (batch and real-time), data lakehouses, and large-scale ETL frameworks, ensuring delivery accountability for data engineering programs across various industries. Collaboration with global stakeholders, product owners, architects, and business teams to understand requirements and deliver data-driven outcomes will be a key aspect of your role. Additionally, you will be responsible for ensuring best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance. Managing and mentoring a team of 10-25 engineers, conducting performance reviews, capability building, and coaching will also be part of your responsibilities.

At GlobalLogic, we prioritize a culture of caring where people come first. You will have opportunities for continuous learning and development, engaging in interesting and meaningful work that makes an impact. We believe in providing balance and flexibility to help you integrate your work and life effectively. GlobalLogic is a high-trust organization built on integrity and ethical values, providing a safe and reliable environment for your professional growth and success. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for collaborating with leading companies worldwide to create innovative digital products and experiences. Join us to be a part of transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 1 week ago
6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title: Senior Data Scientist, Product Data & Analytics

Our Vision: The Product Data & Analytics team builds internal analytic partnerships, strengthening focus on the health of the business, portfolio and revenue optimization opportunities, initiative tracking, new product development, and go-to-market strategies. We are a hands-on global team providing scalable end-to-end data solutions by working closely with the business, and we influence decisions across Mastercard through data-driven insights. We are a team of analytics engineers, data architects, BI developers, data analysts, and data scientists, and we fully manage our own data assets and solutions.

Are you excited about data assets and the value they bring to an organization? Are you an evangelist for data-driven decision making? Are you motivated to be part of a global analytics team that builds large-scale analytical capabilities supporting end users across the continents? Are you interested in proactively improving data-driven decisions for a global corporation?

Role:
Develop data-driven, innovative, scalable analytical solutions and identify opportunities to support business and client needs in a quantitative manner, facilitating informed recommendations and decisions.
Deliver high-quality project solutions and tools within agreed-upon timelines and budget parameters, and conduct post-implementation reviews.
Contribute to the development of custom analyses and solutions, deriving insights from extracted data to solve critical business questions. Activities include developing and creating predictive models, behavioural segmentation frameworks, profitability analyses, ad hoc reporting, and data visualizations.
Develop AI/ML capabilities, as needed, on large volumes of data to support analytics and reporting needs across products, markets, and services.
Build end-to-end reusable, multi-purpose AI models to drive automated insights and recommendations, leveraging open and closed source technologies to solve business problems.
Work closely with global and regional teams to architect, develop, and maintain advanced reporting and data visualization capabilities on large volumes of data.
Support initiatives in developing predictive models, behavioural segmentation frameworks, profitability analyses, ad hoc reporting, and data visualizations.
Translate client/stakeholder needs into technical analyses and/or custom solutions in collaboration with internal and external partners; derive insights and present findings and outcomes to clients/stakeholders to solve critical business questions.
Create repeatable processes to support the development of modelling and reporting.
Delegate and review work for junior-level colleagues to ensure downstream applications and tools are not compromised or delayed.
Serve as a mentor for junior-level colleagues, and develop talent via ongoing technical training, peer review, etc.

All About You:
6-8 years of experience in data management, data mining, data analytics, data reporting, data product development, and quantitative analysis.
Advanced SQL skills; ability to write optimized queries for large data sets.
Experience with platforms/environments such as Cloudera Hadoop, the big data technology stack, SQL Server, the Microsoft BI stack, cloud, Snowflake, and other relevant technologies.
Experience with data visualization tools (Tableau, Domo, and/or Power BI or similar) is a plus.
Experience with data validation, quality control, and cleansing processes for new and existing data sources.
Experience with classical and deep machine learning algorithms such as logistic regression, decision trees, clustering (k-means, hierarchical, and self-organizing maps), t-SNE, PCA, Bayesian models, time series (ARIMA/ARMA), random forests, GBM, KNN, SVM, text mining techniques, multilayer perceptrons, and neural networks (feedforward, CNN, NLP, etc.); see the sketch below.
Experience with deep learning techniques, open-source tools and technologies, statistical tools, and programming environments such as Python and R, and big data platforms such as Hadoop, Hive, Spark, and GPU clusters for deep learning.
Experience in automating and creating data pipelines via tools such as Alteryx and SSIS; NiFi is a plus.
Financial institution or payments experience is a plus.

Additional Competencies:
Excellent English, quantitative, technical, and communication (oral/written) skills.
Ownership of end-to-end project delivery and risk mitigation.
Virtual team management, with the ability to manage stakeholders by influence.
Analytical/problem-solving mindset; able to prioritize and perform multiple tasks simultaneously; able to work across varying time zones.
Strong attention to detail and quality; creativity and innovation.
Self-motivated, operating with a sense of urgency.
In-depth technical knowledge, drive, and ability to learn new technologies.
Must be able to interact with management and internal stakeholders.

Corporate Security Responsibility: All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
#AI
R-244065
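As a small illustration of the classical-ML work listed above, here is a minimal scikit-learn sketch that fits a random forest on synthetic data and reports AUC. The dataset and hyperparameters are placeholders standing in for real payments or portfolio features.

```python
# Minimal sketch: train/evaluate a random forest classifier with scikit-learn.
# Synthetic data stands in for real payments/portfolio features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, max_depth=8, random_state=42)
model.fit(X_train, y_train)

# Score with AUC, a standard metric for binary segmentation/propensity models.
probs = model.predict_proba(X_test)[:, 1]
print(f"test AUC: {roc_auc_score(y_test, probs):.3f}")
```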
Posted 1 week ago