Posted: 5 months ago
Work from Office | Full Time
ETL Development :
- Design, develop, and maintain efficient ETL processes for handling multi-scale datasets.
- Implement and optimize data transformation and validation processes to ensure data accuracy and consistency.
- Collaborate with cross-functional teams to gather data requirements and translate business logic into ETL workflows.
Data Pipeline Architecture :
- Architect, build, and maintain scalable and high-performance data pipelines to enable seamless data flow.
- Evaluate and implement modern technologies to enhance the efficiency and reliability of data pipelines.
- Build pipelines for extracting data via web scraping to source sector-specific datasets on an ad hoc basis.
Data Modeling :
- Design and implement data models to support analytics and reporting needs across teams.
- Optimize database structures to enhance performance and scalability.
Data Quality and Governance :
- Develop and implement data quality checks and governance processes to ensure data integrity.
- Collaborate with stakeholders to define and enforce data quality standards across the organization.
Documentation and Communication :
- Maintain detailed documentation of ETL processes, data models, and other key workflows.
- Effectively communicate complex technical concepts to non-technical stakeholders and business users.
Cross-Functional Collaboration :
- Work closely with the Quant team and developers to design and optimize data pipelines.
- Collaborate with external stakeholders to understand business requirements and translate them into technical solutions.
Requirements :
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Familiarity with big data technologies such as Hadoop, Spark, and Kafka.
- Experience with data modeling tools and techniques.
- Excellent problem-solving, analytical, and communication skills.
- Proven experience as a Data Engineer with expertise in ETL techniques (minimum years).
- 3-6 years of strong programming experience in languages such as Python, Java, or Scala.
- Hands-on experience in web scraping to extract and transform data from publicly available web sources.
- Proficiency with cloud-based data platforms such as AWS, Azure, or GCP.
- Strong knowledge of SQL and experience with relational and non-relational databases.
- Deep understanding of data warehousing concepts and architectures.
- Master's degree in Computer Science or Data Science.
- Knowledge of data streaming and real-time processing frameworks.
- Familiarity with data governance and security best practices.
Emperen Technologies
Location: Indore, Madhya Pradesh
Salary: Not disclosed