Posted: 12 hours ago
Remote | Full Time
Experience: Data Engineering

- Programming & Scripting: Strong programming skills in Python and Linux Bash for automation and data workflows; hands-on experience with Snowflake.
- AWS Cloud Services: In-depth knowledge of AWS EC2, S3, RDS, and EMR for deploying and managing data solutions.
- Framework Proficiency: Hands-on experience with Luigi for orchestrating complex data workflows.
- Data Processing & Storage: Expertise in Hadoop ecosystem tools and in managing SQL databases for data storage and query optimization.
- Security Practices: Understanding of data security, data governance, and compliance requirements for secure data processing.
- Automation & CI/CD: Familiarity with CI/CD tools to support automated deployment and testing.
- Big Data Technologies: Knowledge of big data processing tools such as Spark, Hive, or related AWS services.
- Good communication skills.

Regards,
Rajan
You can also WhatsApp your CV to 9270558628.
3Pillar