About NetApp
NetApp is the intelligent data infrastructure company, turning a world of disruption into opportunity for every customer. No matter the data type, workload, or environment, we help our customers identify and realize new business possibilities. And it all starts with our people.

If this sounds like something you want to be part of, NetApp is the place for you. You can help bring new ideas to life, approaching each challenge with fresh eyes. Of course, you won't be doing it alone. At NetApp, we're all about asking for help when we need it, collaborating with others, and partnering across the organization - and beyond.
Job Summary
As a Data Lake Engineer in NetApp India’s R&D division, you will be responsible for the design, development, and support of multi-petabyte-scale data lakes for on-prem and, soon, cloud environments. You will be part of a highly skilled technical team named NetApp Active IQ.

The Active IQ DataHub platform processes over 10 trillion data points per month that feed a multi-petabyte data lake. The platform is built using Kafka, a serverless platform running on Kubernetes, Spark, and various NoSQL databases. It enables the use of advanced AI and ML techniques to uncover opportunities to proactively protect and optimize NetApp storage, and then provides the insights and actions to make it happen. We call this “actionable intelligence”.

You will work closely with a team that includes an Architect, Product Management, and a Technical Director. You will contribute to the architecture, design, development, and testing of the Data Lake solution, which will be used by our internal product teams and other internal NetApp organizations.

We are looking for a Data Lake Engineer who is familiar with cloud data platforms (preferably data lakes), data modeling, and Python.
Job Requirements
- Develop modules to implement a multi-petabyte-scale data lake.
- Be the first line of defence for any issues in the data lake.
- Develop best-practice sample deployments for use cases (to guide end users).
- Assist users in developing their own queries/jobs to address their use cases (a brief illustrative sketch follows this list).
- Hands-on experience working on cloud data platforms (preferably data lakes).
- Hands-on coding experience (preferably Python).
- Understanding of SQL and NoSQL systems.
- Strong data analysis and analytical problem-solving skills.
- Ability to work successfully in a global team environment.
- Good to have: experience with Atlassian tools (Jira, Bitbucket, etc.).
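For illustration only, here is a minimal sketch of the kind of query a data lake user might develop with PySpark. The storage path, table, and column names (for example system_id and event_ts) are hypothetical assumptions, not taken from the Active IQ platform.

```python
# Illustrative sketch only: a simple PySpark aggregation over a hypothetical
# Parquet-backed data lake path. Paths and column names are assumptions,
# not NetApp's actual schema.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("datalake-sample-query").getOrCreate()

# Read telemetry records from a (hypothetical) data lake location.
telemetry = spark.read.parquet("s3a://example-datalake/telemetry/")

# Count data points per system per day -- the kind of query an internal
# consumer of the platform might run against the data lake.
daily_counts = (
    telemetry
    .groupBy("system_id", F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("data_points"))
)

daily_counts.show(10)
```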
 
At NetApp, we embrace a hybrid working environment designed to strengthen connection, collaboration, and culture for all employees. This means that most roles will have some level of in-office and/or in-person expectations, which will be shared during the recruitment process.
Equal Opportunity Employer:
NetApp is firmly committed to Equal Employment Opportunity (EEO) and to compliance with all laws that prohibit employment discrimination based on age, race, color, gender, sexual orientation, gender identity, national origin, religion, disability or genetic information, pregnancy, and any protected classification.
Why NetApp?
We are all about helping customers turn challenges into business opportunity. It starts with bringing new thinking to age-old problems, like how to use data most effectively to run better - but also to innovate. We tailor our approach to the customer's unique needs with a combination of fresh thinking and proven approaches.

We enable a healthy work-life balance. Our volunteer time off program is best in class, offering employees 40 hours of paid time off each year to volunteer with their favourite organizations. We provide comprehensive benefits, including health care, life and accident plans, emotional support resources for you and your family, legal services, and financial savings programs to help you plan for your future. We support professional and personal growth through educational assistance and provide access to various discounts and perks to enhance your overall quality of life.

If you want to help us build knowledge and solve big problems, let's talk.