Posted: 3 months ago
Remote
Full Time
We are looking for an ETL Architect (immediate joiner). Remote work, 5 working days. Duration: permanent, full-time.

We are seeking a highly skilled and experienced ETL (Extract, Transform, Load) Technical Architect with expertise in AWS (Amazon Web Services) and PySpark to join our team. As the ETL Technical Architect, you will play a crucial role in designing and implementing ETL solutions that support data integration, transformation, and loading processes. Your deep knowledge of AWS services and PySpark will be instrumental in building scalable and efficient data pipelines.

Responsibilities:
- Architect ETL solutions: Design end-to-end ETL solutions that align with business requirements, ensuring scalability, reliability, and performance.
- Data modeling: Define and implement data models, schemas, and structures to support efficient data processing and storage.
- ETL pipeline development: Develop, optimize, and maintain ETL pipelines using PySpark, leveraging AWS Glue or other relevant AWS services (a minimal sketch follows the qualifications below).
- AWS integration: Integrate ETL processes with AWS services such as S3, Redshift, Athena, and EMR to create a comprehensive data ecosystem.
- Performance optimization: Identify and resolve performance bottlenecks in ETL processes, optimizing data extraction, transformation, and loading for efficiency.
- Data quality and governance: Implement data quality checks and governance policies to ensure data accuracy and compliance with industry standards and regulations.
- Monitoring and troubleshooting: Set up monitoring and alerting systems to proactively identify and address issues within the ETL pipelines.
- Collaboration: Work closely with data engineers, data scientists, and other stakeholders to understand their data requirements and ensure successful data delivery.
- Documentation: Create and maintain technical documentation, including architecture diagrams, data flow diagrams, and standard operating procedures.
- Stay current: Keep up to date with the latest ETL, AWS, and PySpark developments and best practices, and apply them to enhance existing processes.

Qualifications:
1) Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
2) Proven experience as an ETL Technical Architect with a strong focus on AWS and PySpark.
3) In-depth knowledge of AWS services, including but not limited to S3, Redshift, Athena, EMR, and Glue.
4) Proficiency in PySpark and related big data technologies for ETL processing.
5) Strong SQL skills for data manipulation and querying.
6) Familiarity with data warehousing concepts and dimensional modeling.
7) Experience with data governance, data quality, and data security practices.
8) Excellent problem-solving skills and attention to detail.
9) Strong communication and collaboration skills to work effectively with cross-functional teams.
10) AWS certifications (e.g., AWS Certified Data Analytics, AWS Certified Big Data) are a plus.
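For illustration only, below is a minimal PySpark ETL sketch of the kind of pipeline this role covers: extract raw data from S3, apply transformations and a basic data-quality filter, and load partitioned Parquet back to S3 for downstream querying. The bucket paths, column names, and transformations are assumptions made for the example, not details from this posting.

# Minimal PySpark ETL sketch. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-orders-example").getOrCreate()

# Extract: raw CSV landed in S3 (illustrative bucket/path)
orders = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Transform: type casting, a simple data-quality filter, and a derived partition column
cleaned = (
    orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet for consumption by Athena, Redshift Spectrum, etc.
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")
)

spark.stop()

In practice the same logic could run as an AWS Glue job or on EMR; the Spark code itself stays largely unchanged, which is one reason PySpark is called out throughout the requirements.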
IGT Solutions