Data Management Lead

4 - 9 years

11 - 15 Lacs

Kochi

Posted: 4 weeks ago | Platform: Naukri


Skills Required

Java, Azure, Scala, PostgreSQL, Cassandra, Hadoop, Big Data, Kafka, Flink, SQL, Apache Spark, Jenkins, Data Science, Hive, MySQL, Agile, CI/CD, MongoDB, Scrum, AWS, GitLab, Python

Work Mode

Work from Office

Job Type

Full Time

Job Description

We are looking for a highly skilled and experienced Data Management Lead (Architect) with 4 to 9 years of experience to design, implement, and manage data lake environments. The ideal candidate will have a strong background in data management, architecture, and analytics.

### Roles and Responsibilities

- Design and implement scalable, secure, and high-performing data lake architectures.
- Select appropriate technologies and platforms for data storage, processing, and analytics.
- Define and enforce data governance, metadata management, and data quality standards.
- Collaborate with IT security teams to establish robust security measures.
- Develop and maintain data ingestion and integration processes from various sources.
- Provide architectural guidance and support to data scientists and analysts.
- Monitor the performance of the data lake and recommend improvements.
- Stay updated on industry trends and advancements in data lake technologies.
- Liaise with business stakeholders to understand their data needs and translate requirements into technical specifications.
- Create documentation and architectural diagrams to provide a clear understanding of the data lake structure and processes.
- Lead the evaluation and selection of third-party tools and services to enhance the data lake's capabilities.
- Mentor and provide technical leadership to the data engineering team.
- Manage the full lifecycle of the data lake, including capacity planning, cost management, and decommissioning of legacy systems.

### Job Requirements

- At least 4 years of hands-on experience designing, implementing, and managing data lakes or large-scale data warehousing solutions.
- Proficiency with data lake technologies such as Hadoop, Apache Spark, Apache Hive, or Azure Data Lake Storage.
- Experience with cloud services such as AWS (Amazon Web Services), Microsoft Azure, or Google Cloud Platform, especially their data storage and analytics offerings.
- Knowledge of SQL and NoSQL database systems, including relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Expertise in data modeling techniques and tools for both structured and unstructured data.
- Experience with ETL (Extract, Transform, Load) tools and processes, and an understanding of data integration and transformation best practices.
- Proficiency in programming languages commonly used for data processing and analytics, such as Python, Scala, or Java.
- Familiarity with data governance frameworks and data quality management practices to ensure the integrity and security of data within the lake.
- Knowledge of data security principles, including encryption, access controls, and compliance with data protection regulations (e.g., GDPR, HIPAA).
- Experience with big data processing frameworks and systems, such as Apache Kafka for real-time data streaming and Apache Flink or Apache Storm for stream processing.
- Familiarity with data pipeline orchestration tools such as Apache Airflow, Luigi, or AWS Data Pipeline.
- Understanding of DevOps practices, including continuous integration/continuous deployment (CI/CD) pipelines and automation tools such as Jenkins or GitLab CI.
- Skills in monitoring data lake performance, diagnosing issues, and optimizing storage and processing for efficiency and cost-effectiveness.
- Ability to manage projects through planning, execution, monitoring, and closing, often using methodologies such as Agile or Scrum.
- Self-starter and independent thinker; curious and creative, with ambition and passion.

### Education and Certifications

- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field is typically required; this foundational education provides the theoretical knowledge necessary for understanding complex data systems.
- Master's degree (optional) in a relevant field such as Computer Science, Data Science, or Information Systems can be beneficial; it indicates advanced knowledge and may be preferred for more senior positions.
- Certifications (optional): industry-recognized certifications can enhance a candidate's qualifications. Examples include AWS Certified Solutions Architect, Azure Data Engineer Associate, Google Professional Data Engineer, Cloudera Certified Professional (CCP), or certifications in specific technologies such as Apache Hadoop or Spark.
- Experience with Power BI or another reporting platform is a must; knowledge of Power Automate, QlikView, or other reporting platforms is an added advantage.
- ITIL Foundation certification is preferred.

EY

Professional Services

London

300,000+ Employees

7902 Jobs

Key People

- Carmine Di Sibio, Global Chairman and CEO
- Kate Barton, Global Vice Chair, Tax
