Hybrid
Full Time
Hi, we have an urgent opening for TechOps-DE-Cloudops-Senior at our Bangalore location.

The opportunity
As a Senior Data Engineer, you will play a pivotal role in managing and optimizing the large-scale data architectures that deliver valuable insights to business users and downstream systems. We are looking for an innovative, experienced professional who can oversee data flow from diverse sources and keep production systems running continuously. Your expertise will be instrumental in maintaining the data platforms that power front-end analytics, contributing to the effectiveness of Takeda's dashboards and reporting tools. As a key member of the Analytics Production Support team, you will ensure seamless end-to-end data flow and coordinate with stakeholders and team members across various regions, including India and Mexico. The ability to manage major incidents effectively, including handling Major Incident Management (MIM) bridges, is crucial, as is flexibility to work in a 24x7x365 support model.

Your key responsibilities
- Manage and maintain the data pipeline (ETL/ELT layer) to guarantee high availability and performance.
- Resolve data quality issues within Service Level Agreement (SLA) parameters by coordinating with cross-functional teams and stakeholders.
- Proactively monitor the system and take pre-emptive action on alerts such as Databricks job failures and data quality issues.
- Monitor and maintain AWS data services, including S3, DMS, Step Functions, and Lambda, to ensure efficient and reliable data loading processes.
- Conduct thorough analyses of code repositories to understand Databricks job failures and determine appropriate corrective actions.
- Take ownership of support tickets, ensuring timely and effective resolution.
- Manage major incidents with meticulous attention to detail, ensuring compliance with regulatory requirements and effective data presentation.
- Perform root cause analysis for major and recurring incidents, and propose solutions for permanent resolution.
- Identify and execute automation opportunities to enhance operational efficiency.
- Escalate complex issues to the next level of support to ensure swift resolution.
- Mentor junior team members, providing a structured training plan for skill enhancement and professional growth.

Skills and attributes for success
- 3 to 8 years of experience in data analytics, with a focus on maintaining and supporting ETL data pipelines using Databricks and AWS services.
- Proficiency in Databricks and PySpark for code debugging and root cause analysis.
- Proven experience in a production support environment and readiness to work in a 24x7 support model.
- Strong understanding of:
  - Relational SQL databases
  - Data engineering programming languages (e.g., Python)
  - Distributed data technologies (e.g., PySpark)
  - Cloud platform deployment and tools (e.g., Kubernetes)
  - AWS cloud services and technologies (e.g., Lambda, S3, DMS, Step Functions, EventBridge, CloudWatch, RDS)
  - Databricks/ETL processes
- Familiarity with ITIL principles.
- Effective communication skills for collaborating with multifunctional teams and strategic partners.
- Strong problem-solving and troubleshooting abilities.
- Capacity to thrive in a dynamic environment and adapt to evolving business needs.
- Commitment to continuous integration and delivery principles to automate code deployment and improve code quality.
- Familiarity with the following tools and technologies will be considered an added advantage: Power BI, Informatica Intelligent Cloud Services (IICS), Tidal.
- A proactive learner, eager to cross-skill and advance within our innovative team environment.
- Databricks Associate Certification is required; ITIL 4 Foundation Level Certification is a plus.

To qualify for the role, you must have
- Databricks Associate Certification
- Relational SQL databases
- Data engineering programming languages (e.g., Python)
- Distributed data technologies (e.g., PySpark)
- Cloud platform deployment and tools (e.g., Kubernetes)
- AWS cloud services and technologies (e.g., Lambda, S3, DMS, Step Functions, EventBridge, CloudWatch, RDS)
- Databricks/ETL processes

What we offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations: Argentina, China, India, the Philippines, Poland and the UK, and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.
EY
Bengaluru
4.0 - 9.0 Lacs P.A.