Associate III - Data Engineering

3 - 5 years

5 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Role Proficiency:

This role requires proficiency in data pipeline development, including coding and testing pipelines that ingest, wrangle, transform, and join data from various sources. Must be adept at using ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding skills in Python, PySpark, and SQL. Works independently and demonstrates proficiency in at least one data-related domain, with a solid understanding of SCD concepts and data warehousing principles.
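
As an illustration of the SCD concepts and PySpark skills this role calls for, the sketch below shows a minimal Type 2 slowly-changing-dimension update. It is a sketch only: the table names, the customer_id key, the address attribute, and the is_current/valid_from/valid_to bookkeeping columns are all hypothetical assumptions, not part of this role's actual stack.

    # Minimal SCD Type 2 update in PySpark (illustrative sketch only;
    # all table, key, and column names below are hypothetical).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

    dim = spark.table("dw.dim_customer")        # existing dimension rows
    incoming = spark.table("staging.customer")  # freshly ingested snapshot

    current = dim.filter(F.col("is_current"))
    history = dim.filter(~F.col("is_current"))

    # Rows whose tracked attribute changed must be closed out and re-inserted.
    changed = (current.alias("d")
               .join(incoming.alias("s"), "customer_id")
               .filter(F.col("d.address") != F.col("s.address"))
               .select("d.*"))

    closed = (changed
              .withColumn("is_current", F.lit(False))
              .withColumn("valid_to", F.current_date()))

    new_versions = (incoming.join(changed.select("customer_id"), "customer_id")
                    .withColumn("is_current", F.lit(True))
                    .withColumn("valid_from", F.current_date())
                    .withColumn("valid_to", F.lit(None).cast("date")))

    # Untouched current rows + history + closed-out rows + new versions
    # form the updated dimension.
    unchanged = current.join(changed.select("customer_id"), "customer_id", "left_anti")
    updated = history.unionByName(unchanged).unionByName(closed).unionByName(new_versions)
    updated.write.mode("overwrite").saveAsTable("dw.dim_customer_v2")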

Outcomes:

  1. Collaborate closely with data analysts, data scientists, and other stakeholders to ensure data accessibility, quality, and security across various data sources. Design, develop, and maintain data pipelines that collect, process, and transform large volumes of data from various sources.
  2. Implement ETL (Extract, Transform, Load) processes to facilitate efficient data movement and transformation.
  3. Integrate data from multiple sources, including databases, APIs, cloud services, and third-party data providers.
  4. Establish data quality checks and validation procedures to ensure data accuracy, completeness, and consistency (a sketch follows this list).
  5. Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
  6. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
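
Outcome 4 above is the kind of check that can be codified. A minimal PySpark data-quality gate might look like the sketch below; the source path, column names, and checks are hypothetical assumptions, not project standards.

    # Minimal data-quality gate (sketch; path and column names are hypothetical).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq_sketch").getOrCreate()
    df = spark.read.parquet("s3://bucket/orders/")  # hypothetical source

    checks = {
        # completeness: no null business keys
        "null_order_id": df.filter(F.col("order_id").isNull()).count() == 0,
        # uniqueness: order_id should be a primary key
        "duplicate_order_id": df.groupBy("order_id").count()
                                .filter(F.col("count") > 1).count() == 0,
        # validity: amounts must be non-negative
        "negative_amount": df.filter(F.col("amount") < 0).count() == 0,
        # volume: guard against an empty or truncated load
        "row_count": df.count() > 0,
    }

    failed = [name for name, ok in checks.items() if not ok]
    if failed:
        raise ValueError(f"Data quality checks failed: {failed}")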

Measures of Outcomes:

  1. Adherence to engineering processes and standards
  2. Adherence to schedule / timelines
  3. Adherence to SLAs where applicable
  4. Number of defects post-delivery
  5. Number of non-compliance issues
  6. Reduced recurrence of known defects
  7. Quick turnaround of production bugs
  8. Completion of applicable technical/domain certifications
  9. Completion of all mandatory training requirements
  10. Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
  11. Average time to detect, respond to, and resolve pipeline failures or data issues

Outputs Expected:

Code Development:

  1. Develop data processing code independently, ensuring it meets performance and scalability requirements.


Documentation:

  1. Create documentation for personal work and review deliverable documents, including source-target mappings, test cases, and results.


Configuration:

  1. Follow configuration processes diligently.


Testing:

  1. Create and conduct unit tests for data pipelines and transformations to ensure data quality and correctness (a sample test follows this list).
  2. Validate the accuracy and performance of data processes.
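
A unit test of the kind item 1 describes might look like the sketch below; the dedupe_latest transformation under test is hypothetical, written here only to give the test something concrete to exercise.

    # pytest-style unit test for a pipeline transformation (sketch;
    # the dedupe_latest function under test is hypothetical).
    from pyspark.sql import SparkSession, functions as F, Window

    def dedupe_latest(df, key, ts):
        """Keep only the most recent row per key, ordered by a timestamp column."""
        w = Window.partitionBy(key).orderBy(F.col(ts).desc())
        return (df.withColumn("_rn", F.row_number().over(w))
                  .filter("_rn = 1").drop("_rn"))

    def test_dedupe_latest_keeps_newest_row():
        spark = SparkSession.builder.master("local[1]").appName("t").getOrCreate()
        df = spark.createDataFrame(
            [("a", "2024-01-01", 10), ("a", "2024-02-01", 20), ("b", "2024-01-15", 5)],
            ["id", "updated_at", "value"],
        )
        out = {r["id"]: r["value"]
               for r in dedupe_latest(df, "id", "updated_at").collect()}
        assert out == {"a": 20, "b": 5}  # newest value wins per key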


Domain Relevance:

  1. Develop features and components with a solid understanding of the business problems being addressed for the client.
  2. Understand data schemas in relation to domain-specific contexts, such as EDI formats.


Defect Management:

  1. Raise, fix, and retest defects in accordance with project standards.


Estimation:

  1. Estimate time, effort, and resource dependencies for personal work.


Knowledge Management:

  1. Consume and contribute to project-related documents, SharePoint libraries, and client universities.


Design Understanding:

  1. Understand the design and low-level design (LLD) and link them to requirements and user stories.


Certifications:

  1. Obtain relevant technology certifications to enhance skills and knowledge.

Skill Examples:

  1. Proficiency in SQL, Python, or other programming languages used for data manipulation.
  2. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
  3. Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
  4. Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications.
  5. Experience in performance tuning of data processes (see the sketch after this list).
  6. Proficiency in querying data warehouses.
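
For item 5, typical PySpark performance levers include broadcast joins, caching, and partition-aware writes, as in the sketch below; the table names and the S3 path are hypothetical.

    # Common PySpark performance-tuning levers (sketch; names are hypothetical).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("tuning_sketch").getOrCreate()

    facts = spark.table("dw.fact_sales")
    stores = spark.table("dw.dim_store")  # small lookup table

    # Broadcast the small dimension to avoid a shuffle-heavy sort-merge join.
    joined = facts.join(F.broadcast(stores), "store_id")

    # Cache a DataFrame that several downstream aggregations will reuse.
    joined.cache()

    # Align partitioning with the write layout to cut small files and skew.
    (joined.repartition("sale_date")
           .write.mode("overwrite")
           .partitionBy("sale_date")
           .parquet("s3://bucket/marts/sales/"))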

Knowledge Examples:

  1. Knowledge of various ETL services provided by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF/ADLF.
  2. Understanding of data warehousing principles and practices.
  3. Proficiency in SQL for analytics, including windowing functions (see the example after this list).
  4. Familiarity with data schemas and models.
  5. Understanding of domain-related data and its implications.
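
Item 3's windowing functions are the same whether they run in a warehouse or through Spark SQL; the query below is a hedged example against a hypothetical sales.orders table.

    # Windowing-function example via Spark SQL (table and columns hypothetical;
    # the same ANSI SQL works in most warehouses).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("window_sketch").getOrCreate()

    ranked = spark.sql("""
        SELECT customer_id,
               order_id,
               amount,
               -- rank each customer's orders by spend, largest first
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY amount DESC) AS rnk,
               -- running total of spend per customer in date order
               SUM(amount) OVER (PARTITION BY customer_id
                                 ORDER BY order_date) AS running_spend
        FROM sales.orders
    """)
    ranked.filter("rnk <= 3").show()  # top three orders per customer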

Additional Comments:

Migration Consultant Role

Responsibilities required during the migration:

  1. Meet with all of our customers to help them re-set up their instance in AWS. This individual will work with the PM to understand the customer's unique integration setup; the PM will provide a checklist of items unique to the customer that must be reconfigured to complete the migration.
  2. Work with the project team to identify the customer's custom integrations and, with the Migration Engineer, develop a plan for migration.
  3. Educate customers on how to set up SSO (ADFS, Azure, Okta, OneLogin, Google SAML, etc.) and third-party integrations (e.g. SharePoint, Confluence, Google Drive, Slack, Box, etc.).
  4. Assist in managing enterprise-level customer relationships and address customer escalations.
  5. Demonstrate migration-process best practices applied across a breadth of technologies to solve problems; articulate and translate technical customer issues to the engineering team.
  6. Validate the recon report to confirm migration success with the customer.
  7. Advise customers on the differences and business process changes between the two Simpplr platforms (SFDC and AWS).
  8. Assist the customer with technical troubleshooting in both SFDC and AWS environments during the migration process.
  9. Collaborate with the customer to understand technical issues and translate requirements to the internal Simpplr program manager, product teams, and engineering teams.
  10. Identify areas to improve process efficiency and communicate items to the Program Manager to resolve.

Migration Consultant Qualifications:

  1. 5+ years of hands-on technical implementation or consulting experience
  2. 3+ years of experience in customer/partner-facing technical positions
  3. Bachelor's degree in a technical discipline or related field
  4. Experience with large-scale / enterprise customers
  5. Experience leading the process of migrating from SFDC to AWS cloud systems
  6. Strong verbal and written communication skills; high degree of comfort with technical and customer audiences
  7. Excellent problem-solving skills and the ability to troubleshoot complex technical issues
  8. Has led a SaaS-based product migration
  9. Strong interpersonal and communication skills; able to quickly establish a trusted-advisor relationship with customers

Required Skills

SSO, AWS, SFDC, Migration

UST

IT Services and IT Consulting

Aliso Viejo, CA
