About The Role
The Data Architect plays a critical role in designing, implementing, and managing the enterprise data architecture for ELGi, ensuring that data platforms, systems, and solutions are scalable, efficient, and aligned with business needs. 
Role & responsibilities
 
Data Architecture Design and Strategy
- Define and implement the enterprise data architecture strategy, ensuring alignment with business and IT objectives.
- Design scalable, secure, and cost-effective data platforms, including data lakes, data warehouses, and real-time streaming solutions.
- Develop data models and integration frameworks to support analytics, reporting, and business intelligence (BI) initiatives.
- Drive the adoption of cloud-based data platforms (e.g., AWS, Azure, GCP) and modern data technologies to enable enterprise-wide analytics.
 
Data Governance and Standards
- Establish and enforce data architecture standards, frameworks, and best practices.
- Collaborate with data governance teams to ensure high data quality, consistency, security, and compliance (e.g., DPDP, GDPR, CCPA).
- Oversee metadata management, lineage tracking, and the development of data dictionaries.
 
Collaboration and Solution Delivery
- Partner with business stakeholders, data scientists, and IT teams to understand requirements and deliver fit-for-purpose data solutions.
- Support the design and implementation of data pipelines for ingesting, processing, and transforming structured and unstructured data.
- Act as a subject matter expert for data architecture across key IT and business initiatives, ensuring seamless integration and performance.
 
Innovation and Emerging Technologies
- Evaluate and recommend emerging data technologies, tools, and frameworks (e.g., AI/ML, IoT data integration, edge computing) to drive innovation.
- Lead proof-of-concept (PoC) initiatives to explore advanced data solutions and improve the existing architecture.
- Incorporate sustainability (Green IT) principles into data architecture design.
 
Team Leadership and Mentoring
- Provide technical leadership and mentorship to a team of data engineers and analysts.
- Work closely with cross-functional teams to enable data-driven decision-making through modern, sustainable data strategies and technologies.
- Foster a culture of continuous improvement, collaboration, and technical excellence.
 
Preferred candidate profile
Skills
- Proficiency in designing and implementing enterprise data architectures for large, complex organizations.
- Strong hands-on experience with cloud platforms (e.g., AWS Redshift, Azure Synapse, Google BigQuery), big data tools (e.g., Spark, Hadoop), and relational and NoSQL databases.
- Expertise in data modeling (conceptual, logical, and physical), ETL/ELT pipelines, and data integration tools.
- Familiarity with modern data platforms such as data lakes, lakehouses, and data mesh architectures.
- Experience with real-time data streaming tools (e.g., Apache Kafka, AWS Kinesis) and API integrations.
- Strong leadership and problem-solving abilities, with excellent collaboration skills to work cross-functionally with IT, business units, and senior stakeholders.
- Ability to translate technical concepts into business-friendly language for non-technical stakeholders.
 
Experience
- 8-10 years of experience in data architecture, data engineering, or related roles.
- Demonstrated experience designing and implementing enterprise-scale data platforms in a manufacturing or industrial environment.
- Proven track record of delivering successful data solutions that enable analytics and decision-making.
 
Education
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field (Master's degree preferred).
 
Certifications
- Cloud Data Architect certification (e.g., AWS Certified Data Analytics - Specialty, Azure Data Engineer Associate, GCP Professional Data Engineer).
- DAMA Certified Data Management Professional (CDMP) preferred.
- Certifications in big data tools or data platforms (e.g., Snowflake, Databricks) are a plus.