Posted: 7 hours ago
On-site
Part Time
About the Role:
We are looking for a skilled Full Stack Data Engineer to join our growing data platform team. You will be responsible for designing, building, and maintaining scalable data pipelines and APIs that power internal tools and analytics, with deep integration across AWS services and DevOps best practices.
This is a cross-functional role that requires a strong foundation in backend engineering, cloud-native data workflows, and API development. You will work closely with data scientists, analysts, and product stakeholders to build robust, secure, and reliable data systems.
Key Responsibilities:
Design and build scalable ETL/ELT pipelines using AWS-native tools (Glue, Lambda, S3, DMS, Redshift, etc.)
Develop high-performance RESTful APIs using FastAPI, enabling secure and efficient access to data assets
Architect data platforms and microservices for ingestion, transformation, access control, and monitoring
Implement authentication & authorization mechanisms using OAuth2/JWT (see the FastAPI sketch after this list)
Manage infrastructure and deployment using CI/CD pipelines (GitHub Actions, CodePipeline, etc.)
Write modular, testable, and well-documented Python code across backend and data workflows
Monitor, debug, and tune performance across cloud services and APIs
Collaborate with DevOps and Security teams to enforce best practices for data security, access controls, and compliance
Contribute to a culture of technical excellence, knowledge sharing, and continuous improvement
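To give a concrete flavour of the API responsibilities above, here is a minimal sketch of a FastAPI endpoint protected by JWT bearer auth. It is illustrative only: the use of PyJWT, the secret handling, and the route and function names are assumptions, not details from this posting.

```python
# Minimal sketch: a JWT-protected FastAPI data-access endpoint (hypothetical names).
from datetime import datetime, timedelta, timezone

import jwt  # PyJWT
from fastapi import Depends, FastAPI, HTTPException, status
from fastapi.security import OAuth2PasswordBearer

SECRET_KEY = "change-me"   # in practice, loaded from a secrets manager
ALGORITHM = "HS256"

app = FastAPI()
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")

def create_access_token(subject: str, expires_minutes: int = 30) -> str:
    """Issue a signed JWT for an authenticated user."""
    payload = {
        "sub": subject,
        "exp": datetime.now(timezone.utc) + timedelta(minutes=expires_minutes),
    }
    return jwt.encode(payload, SECRET_KEY, algorithm=ALGORITHM)

def current_user(token: str = Depends(oauth2_scheme)) -> str:
    """Validate the bearer token and return the caller's identity."""
    try:
        payload = jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
        return payload["sub"]
    except jwt.PyJWTError:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Invalid or expired token",
        )

@app.get("/datasets/{name}")
def read_dataset(name: str, user: str = Depends(current_user)) -> dict:
    """Hypothetical data-access route, reachable only with a valid JWT."""
    return {"dataset": name, "requested_by": user}
```

A real service would issue tokens from a proper login/token endpoint and keep the signing key in a secrets manager; the sketch only shows the dependency-injection pattern for validating tokens on data-access routes.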
Must-Have Skills:
Strong experience with AWS data stack: S3, Glue, Athena, Redshift, Lambda, DMS, IAM, CloudWatch
Proficiency in Python, especially with frameworks like FastAPI and Pandas
Hands-on experience building and deploying RESTful APIs with JWT-based auth
Experience building data ingestion/transformation pipelines (structured/unstructured data)
Expertise in designing CI/CD workflows for automated testing, deployment, and rollback
Knowledge of SQL and performance tuning for cloud data warehouses
Familiarity with containerization and infrastructure-as-code tools such as Docker and AWS CDK (see the CDK sketch after this list)
Version control experience with Git, and Agile/Scrum methodologies
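As an illustration of the infrastructure-as-code item above, here is a minimal sketch using AWS CDK v2 (aws-cdk-lib) in Python. The stack, bucket, and function names are hypothetical, and the Lambda asset path is a placeholder rather than anything specified in the posting.

```python
# Minimal CDK v2 sketch: an S3 landing bucket plus a placeholder transform Lambda.
from aws_cdk import App, Duration, Stack, aws_lambda as _lambda, aws_s3 as s3
from constructs import Construct

class DataPipelineStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Landing zone for raw ingested files.
        landing = s3.Bucket(self, "LandingBucket", versioned=True)

        # Placeholder transformation Lambda; real handler code would live in ./lambda.
        transform = _lambda.Function(
            self, "TransformFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="handler.main",
            code=_lambda.Code.from_asset("lambda"),
            timeout=Duration.minutes(5),
        )

        # Allow the function to read objects from the landing bucket.
        landing.grant_read(transform)

app = App()
DataPipelineStack(app, "DataPipelineStack")
app.synth()
```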
Good-to-Have:
Experience with orchestration tools like Airflow, Prefect, or AWS Step Functions (see the Airflow sketch after this list)
Experience with Terraform for infrastructure as code
Exposure to data quality, observability, or lineage tools
Understanding of data security and compliance (GDPR, HIPAA, etc.)
Familiarity with ML model deployment pipelines is a plus
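For the orchestration item above, here is a minimal Airflow sketch of a daily extract-then-transform chain (the `schedule` argument assumes Airflow 2.4+). The DAG id and task logic are placeholders, not details from this posting.

```python
# Minimal Airflow 2.x sketch: a daily extract -> transform DAG with placeholder tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    """Pull source data (placeholder)."""
    print("extracting...")

def transform(**context):
    """Clean and reshape the extracted data (placeholder)."""
    print("transforming...")

with DAG(
    dag_id="example_data_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```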
Srijan Technologies PVT LTD
Experience: Not specified
7.5 - 9.7 Lacs P.A.