Data Systems Engineer

Experience: 5 - 10 years

Salary: Not disclosed

Posted: 3 days ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description


As a Systems Data Engineer, you will be part of the EBS Reporting Team within Qorvo's IT Enterprise Business Applications organization. Your primary focus will be leading and overseeing the design, development, and management of the data infrastructure on the Databricks platform within an AWS government cloud. You will manage key client technical projects and workstreams, coordinating the work of more junior engineers and often working alongside them. You will also partner closely with business analysts and project managers to complete projects on time, within budget and scope, and with high customer satisfaction.

This is an onsite position based in Bangalore, India, with the current expectation of working five (5) days a week in the office.

Qualifications

  • B.S. in Computer Science/Engineering or a relevant field; Master's degree preferred
  • 5+ years of experience in the IT industry
  • 5+ years of hands-on experience in data engineering/ETL using Databricks on AWS/Azure cloud infrastructure and functions
  • Expert understanding of data warehousing concepts (dimensional/star-schema, SCD2, Data Vault, denormalized) and experience implementing highly performant data ingestion pipelines from multiple sources
  • Expert-level skills with Python/PySpark and SQL
  • Experience with CI/CD on Databricks using tools such as Unity Catalog, Jenkins, GitHub Actions, and Databricks CLI
  • Experience integrating end-to-end Databricks pipelines to take data from source systems to target data repositories, ensuring data quality and consistency are always maintained
  • Strong understanding of Data Management principles (quality, governance, security, privacy, life cycle management, cataloging)
  • Experience evaluating the performance and applicability of multiple tools against customer requirements
  • Experience working within an Agile delivery/DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints
  • Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT)
  • Hands-on experience developing batch and streaming data pipelines
  • Able to work independently
  • Energetic and self-motivated, with a willingness to learn and openness to change
  • Ability to work in a fast-paced, changing environment, to work with all levels of the organization, and to cope with rapidly changing information
  • Experience with SAP ECC or S/4, AWS Redshift, Power BI
  • Experience consuming CDS views from SAP S/4
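To illustrate one of the data-warehousing concepts listed above, here is a minimal, pure-Python sketch of a Slowly Changing Dimension Type 2 (SCD2) merge. In a Databricks pipeline this would typically be expressed as a Delta Lake MERGE on a dimension table; the field names (`customer_id`, `city`, `valid_from`, `valid_to`, `is_current`) are illustrative assumptions, not taken from any actual Qorvo schema.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional "open-ended" end date

def scd2_upsert(dimension, incoming, key="customer_id", today=None):
    """SCD Type 2 merge: expire changed current rows and append new
    versions, so the dimension preserves full change history."""
    today = today or date.today()
    new_by_key = {r[key]: r for r in incoming}
    result, handled = [], set()
    for row in dimension:
        k = row[key]
        new = new_by_key.get(k)
        changed = (
            row["is_current"]
            and new is not None
            and any(row.get(c) != v for c, v in new.items() if c != key)
        )
        if changed:
            # Expire the old version, then append the new current version
            result.append({**row, "is_current": False, "valid_to": today})
            result.append({**new, "is_current": True,
                           "valid_from": today, "valid_to": HIGH_DATE})
            handled.add(k)
        else:
            result.append(row)
            if row["is_current"] and new is not None:
                handled.add(k)  # unchanged: keep the existing version
    # Brand-new keys get an initial current row
    for k, new in new_by_key.items():
        if k not in handled:
            result.append({**new, "is_current": True,
                           "valid_from": today, "valid_to": HIGH_DATE})
    return result
```

The same expire-and-insert pattern is what a Delta Lake `MERGE INTO ... WHEN MATCHED ... WHEN NOT MATCHED` statement performs at table scale.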

Responsibilities

  • Establish and grow a data engineering framework to ensure the reliability, scalability, quality, and efficiency of data pipelines, storage, processing, and integration
  • Establish data pipelines to ingest and curate data containing SAP business content from S/4 to Databricks
  • Improve, maintain and execute the data strategy at Qorvo including governance, project prioritization, resourcing, and value delivery
  • Follow the Medallion Architecture (Bronze, Silver, Gold) to logically organize data in a lakehouse, incrementally and progressively improving the structure and quality of data as it flows through each layer
  • Work effectively in an Agile Scrum environment
  • Create technical, functional, and operational documentation for data pipelines and applications
  • Use business requirements to drive the design of data solutions/applications and technical architecture
  • Work with other developers, designers, and architects to ensure data applications meet requirements and performance, data security, and analytics goals
  • Work with the test team to efficiently and effectively structure requirements, define test scenarios, and validate changes
  • Anticipate, identify, track, and resolve issues and risks affecting delivery
  • Coordinate and participate in structured peer reviews/walkthroughs/code reviews
  • Provide application/technical support
  • Maintain and/or update technical and/or industry knowledge and skills through continuous learning activities
  • Adhere to lean principles and standard processes to ensure continuous improvement
  • Communicate clearly and effectively
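The Medallion (Bronze/Silver/Gold) flow described in the responsibilities can be sketched in simplified pure-Python form. A real implementation would use PySpark and Delta tables on Databricks; the record fields (`order_id`, `amount`, `region`) are illustrative assumptions.

```python
def to_silver(bronze):
    """Silver layer: validate and de-duplicate raw Bronze records."""
    seen, silver = set(), []
    for rec in bronze:
        if rec.get("order_id") is None or rec.get("amount") is None:
            continue  # drop malformed rows
        if rec["order_id"] in seen:
            continue  # drop duplicate order IDs
        seen.add(rec["order_id"])
        silver.append({**rec, "amount": float(rec["amount"])})
    return silver

def to_gold(silver):
    """Gold layer: aggregate curated data for reporting
    (here, total revenue by region)."""
    gold = {}
    for rec in silver:
        gold[rec["region"]] = gold.get(rec["region"], 0.0) + rec["amount"]
    return gold
```

Each layer only reads from the one before it, which is the point of the architecture: data quality and structure improve incrementally, and any layer can be rebuilt from the layer upstream.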
