Posted: 3 months ago
Platform: Hybrid
Full Time
Hands-on experience with AWS Glue or Databricks, PySpark, and Python.
Minimum of 2 years of hands-on expertise in PySpark, including Spark job performance optimization techniques.
Minimum of 2 years of hands-on experience with AWS Cloud, including Step Functions, Lambda, S3, Secrets Manager, Snowflake/Redshift, RDS, and CloudWatch.
Proficiency in crafting low-level designs for data warehousing solutions on the AWS cloud.
Proven track record of implementing big-data solutions within the AWS ecosystem, including Data Lakes.
Familiarity with data warehousing, data quality assurance, and monitoring practices.
Demonstrated capability in constructing scalable data pipelines and ETL processes.
Proficiency in testing methodologies and validating data pipelines.
Experience with, or working knowledge of, DevOps environments.
Practical experience with data security services.
Understanding of data modeling, integration, and design principles.
Strong communication and analytical skills.
A dedicated team player with a goal-oriented mindset, committed to delivering quality work with attention to detail.
D-TechWorks Pvt Ltd
Pune, Hybrid
20.0 - 25.0 Lacs P.A.