Posted: 1 hour ago
Platform:
Work from Office
Full Time
We are looking for an energetic, self-motivated, and exceptional Data Engineer to work on extraordinary enterprise products based on AI and Big Data engineering, leveraging the AWS/Databricks tech stack. You will work with a star team of Architects, Data Scientists/AI Specialists, Data Engineers, and Integration specialists.

Skills and Qualifications:
- 5+ years of experience in the DWH/ETL domain; Databricks/AWS tech stack
- 2+ years of experience building data pipelines with Databricks/PySpark/SQL
- Experience writing and interpreting SQL queries, and designing data models and data standards
- Experience with SQL Server, Oracle, and/or cloud databases
- Experience with data warehouses and data marts, Star and Snowflake models
- Experience loading data into databases from other databases and files
- Experience analyzing and drawing design conclusions from data profiling results
- Understanding of business processes and the relationships between systems and applications
- Must be comfortable conversing with end-users
- Must be able to manage multiple projects/clients simultaneously
- Excellent analytical and verbal communication skills

Role and Responsibilities:
- Work with business stakeholders to build data solutions that address analytical and reporting requirements
- Work with application developers and business analysts to implement and optimise Databricks/AWS-based implementations that meet data requirements
- Design, develop, and optimize data pipelines using Databricks (Delta Lake, Spark SQL, PySpark), AWS Glue, and Apache Airflow
- Implement and manage ETL workflows using Databricks notebooks, PySpark, and AWS Glue for efficient data transformation
- Develop and optimize SQL scripts, queries, views, and stored procedures to enhance data models and improve query performance on managed databases
- Conduct root cause analysis and resolve production problems and data issues
- Create and maintain up-to-date documentation of the data model, data flows, and field-level mappings
- Provide support for production problems and daily batch processing
- Provide ongoing maintenance and optimization of database schemas, data lake structures (Delta tables, Parquet), and views to ensure data integrity and performance
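For illustration only, below is a minimal sketch of the kind of Databricks/PySpark pipeline this role involves: reading raw files, applying transformations, and writing a Delta table. The file path, table name, and columns are hypothetical examples, not details from this posting.

    from pyspark.sql import SparkSession, functions as F

    # On Databricks, a `spark` session already exists; getOrCreate() reuses it.
    spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

    # Hypothetical raw CSV landing zone (path and schema are illustrative only).
    raw = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("s3://example-bucket/landing/orders/")
    )

    # Basic cleansing: de-duplicate, type the date column, derive a total column.
    orders = (
        raw.dropDuplicates(["order_id"])
        .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
        .withColumn("total_amount", F.col("quantity") * F.col("unit_price"))
        .filter(F.col("total_amount") > 0)
    )

    # Write as a managed Delta table (Delta Lake is built into Databricks;
    # outside Databricks this requires the delta-spark package).
    (
        orders.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("analytics.orders_clean")
    )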
Wits Innovation Lab