The Enterprise Data organization has developed an integrated and intuitive data and analytics platform. This platform enables Lilly team members to quickly ingest, transform, consume, and analyze data sets in statistical environments, advanced analytics environments, and BI/visualization tools. Contributors can easily ingest, prepare (cleanse, enhance), and publish new data sets for others to use.
Reporting to the Manager, LCCI TechLilly, you will work closely with data engineers, business analysts, quality, data owners, and stakeholders to efficiently manage, monitor, and optimize the ongoing flow of quality data to consumers for data sharing and analytics.
Responsibilities:
- Monitor Data Pipelines: Ensure the timely and accurate flow of data through pipelines.
- Incident Management: Detect, troubleshoot, and resolve issues in data pipelines to maintain data quality and integrity.
- End-User Communication: Inform downstream teams of incidents or anomalies affecting data, quality, availability, or performance, along with the expected resolution.
- Root Cause Analysis: Review incidents and problems to learn from them and improve future processes.
- Performance Optimization: Continuously optimize data pipeline performance, ensuring timely availability of data.
- Cloud FinOps: Continuously monitor data pipeline costs, identifying and implementing cost-saving opportunities without compromising performance.
- Data Quality Assurance: Implement measures to ensure data accuracy, consistency, and reliability.
- Lifecycle Management: Assess, execute, and test any necessary product upgrades to enabling services.
- Cybersecurity: Apply any required patches or changes for identified security vulnerabilities.
- Configuration Changes: Execute configuration changes for configurable core components.
- Automation: Develop and implement automation for monitoring and incident management processes.
- Collaboration: Work closely with data engineers, data scientists, and other stakeholders to understand data requirements and improve pipeline performance.
- Documentation: Maintain comprehensive documentation of data operations processes, monitoring procedures, and issue resolution protocols.
- Security and Compliance: Ensure data security and compliance with relevant processes and standard operating procedures.
- Validation: Execute periodic reviews to ensure the system remains secure and in a validated state.
- Consult and Advise: Advise stakeholders on the use of data products.
- Demonstrate strong decision-making capabilities and the ability to drive initiatives with clarity and purpose.
Qualifications / Skills:
- Bachelor's degree in Information Technology or a related field, or equivalent.
- 5+ years of work experience, including Information Technology experience across multiple technical areas and roles.
- Willingness to work in rotational shifts.
- Strong analytical skills to troubleshoot and resolve issues quickly and efficiently.
- Strong collaboration skills to work effectively with cross-functional teams, including data engineers, business analysts, data scientists, and business stakeholders.
- Strong communication skills to articulate technical concepts to non-technical stakeholders and to document processes.
- Flexibility to adapt to new technologies and methodologies as the data and technical landscape evolves.
- Mastery of ETL processes and tools, and of SQL.
- Minimum of 3 years of hands-on experience with AWS services and security (S3, RDS, Lambda, Glue, EC2, Redshift, CloudWatch, CloudTrail, IAM).
- Experience with CI/CD, GitHub Actions, and Apache Airflow.
- ITIL Foundation certification, or experience with incident, problem, event, and change management best practices.
- AWS foundational certification and/or AWS Certified DevOps Engineer certification.
- Experience with agile frameworks (e.g., Kanban, SAFe) and a solid understanding of associated practices and tools.
- A high level of intellectual curiosity, an external perspective, strong learning agility, and an interest in innovation.