Posted: 1 day ago
Work from Office
Full Time
Responsibilities:
- Design, develop, and maintain scalable and efficient ETL/ELT pipelines using appropriate tools and technologies.
- Develop and optimize complex SQL queries for data extraction, transformation, and loading.
- Implement data quality checks and validation processes to ensure data integrity.
- Automate data pipelines and workflows for efficient data processing.
- Integrate data from diverse sources, including databases, APIs, and flat files.
- Manage and maintain data warehouses and data lakes.
- Implement data modeling and schema design.
- Ensure data security and compliance with relevant regulations.
- Provide data support for BI and reporting tools (MicroStrategy, Power BI, Tableau, Jaspersoft, etc.).
- Collaborate with BI developers to ensure data availability and accuracy.
- Optimize data queries and performance for reporting applications.
- Provide technical guidance and mentorship to junior data engineers.
- Lead code reviews and ensure adherence to coding standards and best practices.
- Contribute to the development of technical documentation and knowledge sharing.
- Design and implement data solutions on cloud platforms (AWS preferred).
- Utilize AWS data integration technologies such as Airflow and Glue.
- Manage and optimize cloud-based data infrastructure.
- Develop data processing applications using Python, Java, or Scala.
- Implement data transformations and algorithms using programming languages.
- Identify and resolve complex data-related issues.
- Proactively seek opportunities to improve data processes and technologies.
- Stay up to date with the latest data engineering trends and technologies.

Requirements:

Experience:
- 5 to 10 years of experience in Business Intelligence and Data Engineering.
- Proven experience in designing and implementing ETL/ELT processes.
- Expert-level proficiency in SQL (advanced/complex queries).
- Strong understanding of ETL concepts and experience with ETL/data integration tools (Informatica, ODI, Pentaho, etc.).
- Familiarity with one or more reporting tools (MicroStrategy, Power BI, Tableau, Jaspersoft, etc.).
- Knowledge of Python and cloud infrastructure (AWS preferred).
- Experience with AWS data integration technologies (Airflow, Glue).
- Programming experience in Java or Scala.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Proven ability to take initiative and be innovative.
- Ability to work independently and as part of a team.

Education:
- B.Tech / M.Tech / MCA (must-have).
Bizacuity Solutions
Information Technology / Data Analytics
51-200 Employees