Experience: 10 - 15 years
Salary: 20 - 25 Lacs
Posted: 1 week ago
Work from Office
Full Time
Lead Software Engineer - Apache Nifi, Python, PySpark, Hadoop, Cloudera platforms, and Airflow
Title and Summary
Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships, and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Overview
Mastercard is looking for a Lead Software Engineer to join the Account Level Management (ALM) team in our Pune office, which is focused on building ALM services with data warehouse skills. The Mastercard Account Level Management platform empowers real-time, card-level decisioning. As consumers progress through their life stages as cardholders, with increasing disposable income and more refined preferences, ALM provides services to issuers so they can effectively offer more relevant benefits and rewards at each stage, to drive loyalty and spend.
Role
- Develop high-quality, secure, and scalable data pipelines using Spark with Scala/Python/Java on Hadoop or object storage such as MinIO.
- Leverage technologies and solutions to innovate with increasingly large data sets.
- Drive automation and efficiency in data ingestion, data movement, and data access workflows through innovation and collaboration.
- Understand, implement, and enforce software development standards and engineering principles in the Big Data space.
- Contribute ideas to help ensure that required standards and processes are in place, and actively look for opportunities to enhance standards and improve process efficiency.
- Perform assigned tasks and handle production incidents independently.

Skill Requirements (MUST Technical)
- 10+ years of experience in Data Warehouse/Data Lake/Lakehouse projects in a product- or service-based organization.
- Solid experience building complex data pipelines with PySpark and Scala/Python on Hadoop or object storage.
- Experience building NiFi pipelines (preferred).
- Proficiency in Cloudera platforms and Airflow.
- Expertise in data engineering, with multiple end-to-end DW projects implemented in Big Data environments handling petabyte-scale data.
- Experience working with databases such as Oracle and Netezza, with strong SQL knowledge.
- Proficient in working within an Agile/Scrum framework, including creating user stories with well-defined acceptance criteria and participating in sprint planning and reviews.
- Write and maintain Unix shell scripts, Oracle SQL, and PL/SQL, and perform SQL tuning.
- Optimize and troubleshoot Spark applications for performance, scalability, and fault tolerance.
- Use Git-based version control systems and CI/CD pipelines (e.g., Jenkins).
- Implement and manage Hive external tables, partitions, and various file formats.
- Work across on-premises and cloud environments (AWS, Azure, Databricks).
- Strong experience with the Hadoop ecosystem and Cloudera Data Platform (CDP).

Optional Technical
- Strong analytical skills for debugging production issues, providing root-cause analysis, and implementing mitigation plans.
- Strong communication skills, both verbal and written.
- Ability to multi-task across multiple projects and to interface with external and internal resources.
- Proactive, detail-oriented, and able to function under pressure in an independent environment, with a high degree of initiative and self-motivation to drive results.
- Willingness to quickly learn and implement new technologies, and to participate in POCs to explore the best solution for a problem statement.
- Experience working with diverse and geographically distributed project teams.

Education
Bachelor's degree in Information Systems, Information Technology, Computer Science, or Engineering, or equivalent work experience.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must: