Posted: 3 months ago
Work from Office
Full Time
About the Company: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by optimizing their IT capabilities, practices, and operations, drawing on our experience in retail, high technology, and manufacturing. With five global delivery centers and 1,900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.

Role: Snowflake Developer with DBT
Experience: 5+ years
Location: Hyderabad / Pune / Gurgaon
Technical Skills: Snowflake, DBT, Spark/PySpark, Java/Python, Scala, any cloud platform

Responsibilities:
- Apply hands-on working knowledge of Snowflake architecture (access control, provisioning, etc.).
- Transform and process data using dbt (Data Build Tool); a SnowPro data engineering certification is a plus.
- Bring Teradata and Snowflake experience.
- Apply professional experience with source control, merging strategies, and coding standards, specifically Bitbucket/Git and deployment through Jenkins pipelines.
- Develop in a continuous integration/continuous delivery (CI/CD) environment using tools such as Jenkins and CircleCI.
- Maintain the build and deployment process using build integration tools.
- Design instrumentation into code and integrate with logging and analysis tools such as log4Python, New Relic, SignalFx, and/or Splunk.
- Conduct knowledge-sharing sessions and publish case studies.
- Take accountability for maintaining program or project documents in a knowledge-base repository.
- Identify accelerators and innovations.
- Understand complex interdependencies to identify the right team composition for delivery.

Required Skills:
- Working experience communicating with business stakeholders and architects.
- Industry experience developing big data/ETL data warehouse solutions and building cloud-native data pipelines.
- Experience in Python, PySpark, Scala, Java, and SQL.
- Strong object-oriented and functional programming experience in Python.
- Experience working with REST and SOAP-based APIs to extract data for data pipelines.
- Extensive experience with Hadoop and related processing frameworks such as Spark, Hive, and Sqoop.
- Experience working in a public cloud environment; AWS in particular is mandatory.
- Ability to implement solutions with AWS Virtual Private Cloud, EC2, AWS Data Pipeline, AWS CloudFormation, Auto Scaling, Amazon Simple Storage Service (S3), EMR, and other AWS products, plus Hive and Athena.
- Experience working with real-time data streams and the Kafka platform.
- Working knowledge of workflow orchestration tools such as Apache Airflow, including designing and deploying DAGs (see the sketch after this list).
- Hands-on experience with performance and scalability tuning.
- Professional experience in Agile/Scrum application development using JIRA.
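To illustrate the dbt-plus-Airflow work this role calls for, here is a minimal sketch of a daily Airflow DAG that runs dbt transformations against Snowflake and then executes dbt's built-in tests. This is not GSPANN's actual pipeline: the DAG ID, owner, and project path are hypothetical, and it assumes Airflow 2.x with the dbt CLI installed on the worker and a Snowflake profile already configured.

```python
# Minimal sketch: a daily Airflow DAG that builds dbt models on Snowflake,
# then validates them with dbt tests. All names and paths are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",   # hypothetical team name
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="snowflake_dbt_daily",  # hypothetical DAG ID
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # Build the dbt models that transform raw Snowflake tables into marts.
    # Assumes the dbt project lives at this (hypothetical) path.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Validate the freshly built models with dbt's built-in tests.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test  # run transformations before tests
```

In a Jenkins-driven CI/CD setup like the one the posting describes, a DAG file such as this would typically be linted and deployed to the Airflow dags/ folder by the pipeline rather than edited in place.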