Posted: 3 weeks ago
Work from Office
Full Time
1)Possesses 2-4 years of experience in data engineering, particularly using AWS services such as S3, Glue, Lambda, SNS, EC2, IAM, and KMS, and Snowflake features such as Snowpipe, streams, tasks, and stored procedures.
2)Capable of handling well-defined tasks and completing them by writing readable, maintainable code, while adhering to coding practices and data protection standards.
3)Should have a solid understanding of BI/DW, ETL, big data, cloud, data security and protection, SQL, and data visualization concepts.
4)Develops code in line with established controls, policies, and recommended strategies for development work.
5)Understands basic data management process designs, models, and architectures by engaging in requirement analysis, testing, debugging, documentation, and quality review.
Qualifications
Must-have Skills: Extensive experience with AWS services (S3, Glue, Lambda, SNS, EC2, IAM, and KMS); proficiency with CI/CD pipelines in GitHub; and Snowflake features such as streams, tasks, stored procedures, Snowpipe, and data sharing.
Good-to-have Skills: Hands-on experience with, or familiarity with, one or more of the following technologies: databases, data visualization (Power BI), scripting, ETL (Informatica PowerCenter). AWS Cloud certification is a plus.
Principal Global Services