About Us:
Zeza is a vibrant team of programmers, data engineers, and data scientists passionate about innovation. We specialize in using data to drive business growth, solve complex challenges, and optimize operations. Through advanced analytics, AI, and machine learning, we transform raw data into actionable insights, helping businesses make informed decisions and achieve lasting success.

Job Description:
We are seeking a skilled Data Engineer with strong expertise in PySpark and AWS to design and build robust data pipelines, leverage AWS SageMaker for machine learning workflows, and develop BI dashboards. The ideal candidate has proven experience handling large datasets, optimizing data processing, and delivering efficient, scalable solutions.

Key Responsibilities:
· Design, develop, and maintain scalable data pipelines using PySpark and AWS data services (see the sketch after this section).
· Implement efficient ETL processes to handle large, complex datasets.
· Integrate and manage AWS SageMaker workflows for machine learning models.
· Collaborate with data scientists, analysts, and stakeholders to deliver accurate and timely data solutions.
· Develop and maintain BI dashboards to support business decision-making.
· Ensure data quality, reliability, and security across all platforms.
· Optimize data storage and processing for performance and cost-efficiency.

Required Skills & Qualifications:
· Experience: 4+ years in data engineering, ETL pipeline creation, and cloud technologies.
· Technical skills: hands-on experience with AWS services (S3, Lambda, Glue, SageMaker, etc.).
· Strong knowledge of SQL, Python, or Spark.
· Expertise in BI tools (Power BI, Tableau, or similar).
· Understanding of data warehousing and modeling.
· Ability to work independently and meet project deadlines.
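For illustration only, below is a minimal sketch of the kind of PySpark ETL job this role involves: reading raw data from S3, applying basic cleaning and a daily aggregate, and writing curated Parquet back to S3 for BI dashboards. The bucket paths, column names, and application name are hypothetical placeholders, not details taken from this posting.

# Minimal PySpark ETL sketch; all paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV data from S3 (placeholder bucket and prefix).
raw = spark.read.option("header", True).csv("s3a://example-raw-bucket/orders/")

# Transform: deduplicate, fix types, and drop invalid rows.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Aggregate: daily revenue and order counts for a BI dashboard.
daily_revenue = (
    cleaned.groupBy(F.to_date("order_ts").alias("order_date"))
           .agg(F.sum("amount").alias("revenue"),
                F.countDistinct("order_id").alias("orders"))
)

# Load: write partitioned Parquet back to S3 (placeholder bucket).
daily_revenue.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3a://example-curated-bucket/daily_revenue/")

spark.stop()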
As an AWS Developer at our company, you will be responsible for designing, developing, and deploying cloud-native applications using AWS services. With over 4 years of hands-on experience, you should be well-versed in serverless architecture, infrastructure as code, and building scalable and secure cloud solutions.

Your key responsibilities will include:
- Developing and deploying scalable applications using AWS services such as Lambda, EC2, S3, API Gateway, RDS, DynamoDB, etc. (see the sketch after this section).
- Implementing and managing CI/CD pipelines using AWS CodePipeline, CodeBuild, and related tools.
- Working with infrastructure as code (IaC) tools like CloudFormation or Terraform.
- Ensuring cloud architecture is secure, cost-optimized, and aligned with industry best practices.
- Collaborating with DevOps, QA, and product teams to deliver high-quality solutions.
- Troubleshooting, optimizing, and maintaining AWS-based applications and environments.

Qualifications required for this position include:
- Strong proficiency in at least one programming language (e.g., Python, Node.js, Java).
- Hands-on experience with AWS services and cloud-native architectures.
- Good understanding of networking, security, and monitoring within AWS.
- Experience with container services like ECS or EKS is a plus.
- Familiarity with Agile methodologies and tools like Git, Jira, etc.
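For illustration only, below is a minimal sketch of one serverless building block of the kind described above: a Python AWS Lambda handler behind an API Gateway proxy integration that fetches an item from DynamoDB via boto3. The table name, key attribute, and TABLE_NAME environment variable are hypothetical placeholders, not details taken from this posting.

# Minimal Lambda handler sketch; table name and key schema are hypothetical placeholders.
import json
import os

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "example-items"))


def lambda_handler(event, context):
    # With an API Gateway proxy integration, path parameters arrive in the event.
    item_id = (event.get("pathParameters") or {}).get("id")
    if not item_id:
        return {"statusCode": 400, "body": json.dumps({"error": "missing id"})}

    # Look up the item by its partition key (placeholder attribute "id").
    response = table.get_item(Key={"id": item_id})
    item = response.get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    return {"statusCode": 200, "body": json.dumps(item, default=str)}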