Posted: 23 hours ago
Platform: On-site
Full Time
Job Role: Data Engineer
Experience: 7+ Years
Mode: Hybrid
Key Responsibilities:
• Design and implement enterprise-grade Data Lake solutions using AWS (e.g., S3, Glue, Lake Formation).
• Define data architecture patterns, best practices, and frameworks for handling large-scale data ingestion, storage, compute, and processing.
• Optimize cloud infrastructure for performance, scalability, and cost-effectiveness.
• Develop and maintain ETL pipelines using tools such as AWS Glue or similar platforms, and manage CI/CD pipelines within a DevOps workflow.
• Create and manage robust Data Warehousing solutions using technologies such as Redshift.
• Ensure high data quality and integrity across all pipelines.
• Design and deploy dashboards and visualizations using tools like Tableau, Power BI, or Qlik.
• Collaborate with business stakeholders to define key metrics and deliver actionable insights.
• Implement best practices for data encryption, secure data transfer, and role-based access control.
• Lead audits and compliance certifications to maintain organizational standards.
• Work closely with cross-functional teams, including Data Scientists, Analysts, and DevOps engineers.
• Mentor junior team members and provide technical guidance for complex projects.
• Partner with stakeholders to define and align data strategies that meet business objectives.
Qualifications & Skills:
• Strong experience building Data Lakes on the AWS cloud tech stack.
• Proficiency with AWS technologies such as S3, EC2, Glue/Lake Formation (or EMR), QuickSight, Redshift, Athena, Airflow (or Lambda + Step Functions + EventBridge), and IAM.
• Expertise in AWS tools covering data lake storage, compute, security, and data governance.
• Advanced skills in ETL processes, SQL databases (e.g., Cloud SQL, Aurora, Postgres), NoSQL databases (e.g., DynamoDB, MongoDB, Cassandra), and programming languages (e.g., Python, Spark, or Scala). Experience building real-time streaming applications, preferably with Spark, Kafka, or other streaming platforms.
• AWS Data Security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager.
• Hands-on experience with Data Warehousing solutions and modern architectures such as Lakehouse or Delta Lake.
• Proficiency in visualization tools such as Tableau, Power BI, or Qlik.
• Strong problem-solving skills and the ability to debug and optimize applications for performance.
• Strong understanding of Database/SQL for database operations and data management.
• Familiarity with CI/CD pipelines and version control systems like Git.
• Strong understanding of Agile methodologies and working within scrum teams.
Preferred Qualifications:
• Bachelor of Engineering degree in Computer Science, Information Technology, or a related field.
• AWS Certified Solutions Architect – Associate.
• Experience with Agile/Scrum methodologies and design sprints.
Allnessjobs