Associate II - Data Engineering

2 - 5 years

7 - 11 Lacs

Posted: 2 months ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

Skills: SQL, Data Analysis, MS Excel, Dashboards

Role Proficiency: This role requires proficiency in data pipeline development, including coding, testing, and implementing pipelines that ingest, wrangle, transform, and join data from various sources. Must be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, along with coding skills in Python, PySpark, and SQL. Works independently according to work allocation.

Outcomes:
- Operate with minimal guidance to develop error-free code, test applications, and document the development process.
- Understand application features and component designs, and develop them in accordance with user stories and requirements.
- Code, debug, test, document, and communicate the stages of product, component, or feature development.
- Independently develop optimized code using appropriate approaches and algorithms while adhering to standards and security guidelines.
- Complete foundational-level certifications in Azure, AWS, or GCP.
- Demonstrate proficiency in writing advanced SQL queries.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction in recurrence of known defects
- Quick turnaround on production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements

Outputs Expected:
- Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements.
- Documentation: Create comprehensive documentation for personal work and ensure it aligns with project standards.
- Configuration: Follow the configuration process diligently.
- Testing: Create and run unit tests for data pipelines and transformations to ensure data quality and correctness.
- Domain Relevance: Develop features and components with a solid understanding of the business problems being addressed for the client.
- Defect Management: Raise, fix, and retest defects in accordance with project standards.
- Estimation: Estimate time, effort, and resource dependencies for personal work.
- Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities.
- Release Management: Adhere to the release management process for seamless deployment.
- Design Understanding: Understand the design and low-level design (LLD) and link them to requirements and user stories.
- Certifications: Obtain relevant technology certifications to enhance skills and knowledge.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, DataProc, and Azure ADF.
- Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud, particularly their data-related services (e.g. AWS Glue, BigQuery).
- Ability to test data pipelines and evaluate results against data quality and performance specifications.

Knowledge Examples:
- Knowledge of the ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF/ADLF.
- Proficiency in SQL for analytics, including windowing functions (see the second SQL sketch below).
- Understanding of data schemas and models.

Additional Comments:

Requirements:
- Strong written and verbal communication skills in English.
- Ability to work in 24x7 shift schedules, including night shifts for extended periods.
- Analytical and problem-solving skills to diagnose and address data-related issues.
- Proficiency in writing SQL queries for data extraction and analysis.
- Hands-on experience with MS Excel for data analysis.
- Ability to work independently under minimal supervision while following SOPs.
- Strong attention to detail and ability to manage multiple monitoring tasks effectively.

As an L1 Data Ops Analyst, you will be responsible for monitoring data pipelines, dashboards, and databases to ensure smooth operations. You will follow Standard Operating Procedures (SOPs) and runbooks to identify, escalate, and resolve issues with minimal supervision. Strong analytical skills, attention to detail, and the ability to work in a fast-paced, 24x7 environment are critical for this role.

Key Responsibilities:
- Monitor various dashboards, pipelines, and databases continuously during a 9-hour shift.
- Identify and escalate system or data anomalies based on predefined thresholds.
- Follow SOPs and runbooks to troubleshoot and resolve basic data issues.
- Work closely with L2 and L3 support teams for issue escalation and resolution.
- Write and execute basic SQL queries for data validation and troubleshooting (see the first SQL sketch below).
- Analyze and interpret data using MS Excel to identify trends or anomalies.
- Maintain detailed logs of incidents, resolutions, and escalations.
- Communicate effectively with stakeholders, both verbally and in writing.
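As a minimal illustration of the data-validation queries this role calls for, the first sketch below reconciles row counts between a source and a target table. The table and column names (source_orders, target_orders, load_date) are hypothetical placeholders, not systems named in this posting.

    -- Minimal sketch: flag load dates where source and target row counts diverge.
    -- source_orders, target_orders, and load_date are hypothetical placeholders.
    SELECT
        s.load_date,
        s.source_rows,
        t.target_rows,
        s.source_rows - t.target_rows AS row_difference
    FROM (SELECT load_date, COUNT(*) AS source_rows
          FROM source_orders
          GROUP BY load_date) AS s
    JOIN (SELECT load_date, COUNT(*) AS target_rows
          FROM target_orders
          GROUP BY load_date) AS t
        ON s.load_date = t.load_date
    WHERE s.source_rows <> t.target_rows
    ORDER BY s.load_date;

Any non-zero row_difference would typically be logged and escalated per the runbook thresholds described above.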

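As an illustration of the SQL windowing proficiency listed under Knowledge Examples, the second sketch computes a rolling average with an analytic function; again, pipeline_run_stats and its columns are hypothetical placeholders.

    -- Minimal sketch: rolling 7-day average of daily record counts per pipeline.
    -- pipeline_run_stats and its columns are hypothetical placeholders.
    SELECT
        pipeline_name,
        run_date,
        record_count,
        AVG(record_count) OVER (
            PARTITION BY pipeline_name
            ORDER BY run_date
            ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
        ) AS rolling_7day_avg
    FROM pipeline_run_stats
    ORDER BY pipeline_name, run_date;

Comparing record_count against rolling_7day_avg is one simple way to surface the volume anomalies the monitoring responsibilities describe.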