Posted: 13 hours ago
Remote
Full Time
Total Experience - 10+ Years
Work Location - Trivandrum, Kochi & Remote
The selected candidate must work from the office during the first month.
Job Overview
We are seeking an experienced Senior Data Engineer to lead the development of a scalable data ingestion framework while ensuring high data quality and validation. The successful candidate will also be responsible for designing and implementing robust APIs for seamless data integration. This role is ideal for someone with deep expertise in building and managing big data pipelines using modern AWS-based technologies, and who is passionate about driving quality and efficiency in data processing systems.
Required Qualifications
• Experience & Technical Skills:
o Professional Background: At least 5 years of relevant experience in data engineering with a strong emphasis on analytical platform development.
o Programming Skills: Proficiency in Python and/or PySpark, as well as SQL, for developing ETL processes and handling large-scale data manipulation.
o AWS Expertise: Extensive experience using AWS services including AWS Glue, Lambda, Step Functions, and S3 to build and manage data ingestion frameworks.
o Data Platforms: Familiarity with big data systems (e.g., AWS EMR, Apache Spark, Apache Iceberg) and databases like DynamoDB, Aurora, Postgres, or Redshift.
o API Development: Proven experience in designing and implementing RESTful APIs and integrating them with external and internal systems.
o CI/CD & Agile: Hands-on experience with CI/CD pipelines (preferably with GitLab) and Agile development methodologies.
• Soft Skills:
o Strong problem-solving abilities and attention to detail.
o Excellent communication and interpersonal skills, with the ability to work independently and collaboratively.
o Capacity to quickly learn and adapt to new technologies and evolving business requirements.
Preferred Qualifications
• Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
• Experience with additional AWS services such as Kinesis, Firehose, and SQS.
• Familiarity with data lakehouse architectures and modern data quality frameworks.
• Prior experience in a role that required proactive data quality management and API-driven integrations in complex, multi-cluster environments.
• Willingness to adhere to Information Security Management policies and procedures.
Velodata Global Pvt Ltd