Sr. Programmer Analyst / Data Engineer / IT EDAAP

4 - 9 years

8 - 18 Lacs P.A.

Hyderabad, Bengaluru

Posted: 1 week ago | Platform: Naukri


Skills Required

Airflow, Python, Data Engineering, GCP, BigQuery, AWS, Google Cloud

Work Mode

Work from Office

Job Type

Full Time

Job Description

Role & responsibilities Job Description: We are looking for an independent contributor experienced in Data Engineering space. Primary responsibilities include implementation of large-scale data processing (Structural, Statistical etc.,) Pipelines, creates production inference pipelines, associated APIs and analytics that support/provide insights for data driven decision making. Designs and develops data models, APIs, and pipelines to handle analytical workloads, data sharing, and movement across multiple systems at various grains in a large-scale data processing environment. Designs and maintains data systems and data structures for optimal read/write performance. Implements machine learning or statistical/heuristic learning in data pipelines based on input from Data Scientists. Roles and Responsibilities: Work in data streaming, movement, data modelling and data pipeline development Develop pipelines and data model changes in support of rapidly emerging business and project requirements Develop code and maintain systems to support analytics Infrastructure & Data Lake Partner/Contribute to data analysis and machine learning pipelines Design data recovery processes, alternate pipelines to check data quality. Create and maintain continuous data quality evaluation processes Optimize performance of the analytics platform and develop self-healing workflows Be a part of a global team and collaborate and co-develop solutions Qualifying Criteria: Bachelors degree in computer science, information technology, or engineering 5+ Years of prior experience in Data Engineering and Databases Experience with code based ETL framework like Airflow/Prefect Experience with Google Big Query, Google Pub Sub, Google Dataflow Experience building data pipelines on AWS or GCP Experience developing data APIs and pipelines using Python Experience with databases like MySQL/Postgres Experience with intermediate Python programming Experience with advanced SQL (analytical queries) "" Preferred Qualifications: Experience with Visualization tools like Tableau/QlikView/Looker Experience with building Machine Learning pipelines. Mandatory Skills: Data Engineering, Python, Airflow, AWS/ Google Cloud / GCP, Data Streaming, Data Lake, Data Pipelines, Google, Bigquerry, ETL, Google Pub sub, Google Data Flow, Rest API, MySQL, Postgre, SQL Analytics

Information Technology
Innovate City
