Senior Data Engineer

3 - 7 years

20 - 25 Lacs

Posted: 1 day ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Duties for this role include, but are not limited to: supporting the design, building, testing, and maintenance of data pipelines at big-data scale; assisting with updating data from multiple data sources; working on batch processing of collected data and matching its format to the stored data, so that the data is ready to be processed and analyzed; helping keep the ecosystem and the pipeline optimized and efficient; troubleshooting standard performance and data-related problems and providing L3 support; implementing parsers, validators, transformers, and correlators to reformat, update, and enhance the data; providing recommendations for highly complex problems; and providing guidance to those in less senior positions.

Data Engineers play a pivotal role within Dataworks, focused on creating and driving engineering innovation and facilitating the delivery of key business initiatives. Acting as a universal translator between IT, business, software engineers and data scientists, data engineers collaborate across multi-disciplinary teams to deliver value. Data Engineers will work on those aspects of the Dataworks platform that govern the ingestion, transformation, and pipelining of data assets, both to end users within FedEx and into data products and services that may be externally facing. Day-to-day, they will be deeply involved in code reviews and large-scale deployments.
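
The duties above center on parsing, validating, and transforming collected data so it matches the stored format. As an illustration only, here is a minimal PySpark sketch of such a batch curation step; the paths, column names, and target schema are hypothetical assumptions, not details from this posting.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("batch-curation-sketch").getOrCreate()

# Hypothetical schema the curated store expects.
target_schema = StructType([
    StructField("event_id", StringType(), nullable=False),
    StructField("event_ts", TimestampType(), nullable=False),
    StructField("payload", StringType(), nullable=True),
])

# Parser: read semi-structured JSON collected from multiple sources (path is illustrative).
raw = spark.read.json("/data/raw/events/")

# Validator: keep records that carry the required keys; quarantine the rest.
valid = raw.filter(F.col("event_id").isNotNull() & F.col("event_ts").isNotNull())
rejected = raw.subtract(valid)

# Transformer: cast and reorder columns so the output matches the stored format.
curated = (
    valid
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .select([F.col(f.name).cast(f.dataType) for f in target_schema.fields])
)

curated.write.mode("append").parquet("/data/curated/events/")   # ready to be analyzed
rejected.write.mode("append").json("/data/quarantine/events/")  # kept for troubleshooting / L3 support
```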

Essential Job Duties & Responsibilities

  • Understanding in depth both the business and technical problems Dataworks aims to solve
  • Building tools, platforms and pipelines to enable teams to clearly and cleanly analyze data, build models and drive decisions
  • Scaling up from laptop-scale to cluster-scale problems, in terms of both infrastructure and problem structure and technique
  • Collaborating across teams to drive the generation of data-driven operational insights that translate to high-value, optimized solutions
  • Delivering tangible value very rapidly, collaborating with diverse teams of varying backgrounds and disciplines
  • Codifying best practices for future reuse in the form of accessible, reusable patterns, templates, and code bases
  • Interacting with senior technologists from the broader enterprise and outside of FedEx (partner ecosystems and customers) to create synergies and ensure smooth deployments to downstream operational systems

Skill/Knowledge Considered a Plus

  • Technical background in computer science, software engineering, database systems, distributed systems
  • Fluency with distributed and cloud environments and a deep understanding of optimizing computational considerations with theoretical properties
  • Experience in building robust cloud-based data engineering and curation solutions to create data products useful for numerous applications
  • Detailed knowledge of the Microsoft Azure tooling for large-scale data engineering efforts and deployments is highly preferred. Experience with any combination of the following Azure tools: Azure Databricks, Azure Data Factory, Azure SQL DB, Azure Synapse Analytics
  • Experience developing and operationalizing capabilities and solutions, including under near-real-time, high-volume streaming conditions
  • Hands-on development skills, with the ability to work at the code level and help debug hard-to-resolve issues
  • A compelling track record of designing and deploying large scale technical solutions, which deliver tangible, ongoing value
    • Direct experience having built and deployed robust, complex production systems that implement modern, data processing methods at scale
    • Ability to context-switch, to provide support to dispersed teams which may need an expert hacker to unblock an especially challenging technical obstacle, and to work through problems as they are still being defined
    • Demonstrated ability to deliver technical projects with a team, often working under tight time constraints to deliver value
    • An engineering mindset, willing to make rapid, pragmatic decisions to improve performance, accelerate progress or magnify impact
    • Comfort with working with distributed teams on code-based deliverables, using version control systems and code reviews
  • Ability to conduct data analysis, investigation, and lineage studies to document and enhance data quality and access
  • Use of agile and DevOps practices for project and software management, including continuous integration and continuous delivery (CI/CD)
  • Demonstrated expertise working with some of the following common languages and tools:
    • Spark (Scala and PySpark), Kafka, and other high-volume data tools
    • SQL and NoSQL storage tools, such as MySQL, Postgres, and MongoDB/Cosmos DB
    • Java and Python data tools
    • Azure DevOps experience to track work, develop using git-integrated version control patterns, and build and utilize CI/CD pipelines
  • Working knowledge of and experience implementing data architecture patterns to support varying business needs
  • Experience with different data types (JSON, XML, Parquet, Avro, unstructured) for both batch and streaming ingestions
  • Use of Azure Kubernetes Service, Azure Event Hubs, or other related technologies to implement streaming ingestions (see the sketch after this list)
  • Experience developing and implementing alerting and monitoring frameworks
  • Working knowledge of Infrastructure as Code (IaC) through Terraform to create and deploy resources
  • Implementation experience across different data stores, messaging systems, and data processing engines
  • Data integration through APIs and/or REST services
  • Power Platform (Power BI, Power Apps, Power Automate) development experience a plus
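
As a sketch of the streaming-ingestion pattern referenced in the list above: Azure Event Hubs exposes a Kafka-compatible endpoint, so a Spark Structured Streaming job can consume it with the standard Kafka source. The namespace, topic, message schema, and paths below are placeholder assumptions, authentication options are omitted for brevity, and the job requires the spark-sql-kafka connector on the classpath.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType

spark = SparkSession.builder.appName("streaming-ingestion-sketch").getOrCreate()

# Hypothetical shape of each message's JSON body.
event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("status", StringType()),
])

# Event Hubs' Kafka-compatible endpoint listens on port 9093 of the namespace
# host; the SASL/TLS auth options it also needs are omitted here for brevity.
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
    .option("subscribe", "telemetry")        # placeholder topic / event hub name
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers raw bytes; decode and parse the JSON payload.
events = (
    stream
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Land the stream as Parquet; the checkpoint lets the query recover after failures.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/stream/telemetry/")
    .option("checkpointLocation", "/chk/telemetry/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```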

Analytical Skills, Accuracy & Attention to Detail, Planning & Organizing Skills, Influencing & Persuasion Skills, Presentation Skills

FedEx

Logistics and Transportation

Memphis
