Posted: 2 weeks ago
Hybrid | Full Time
Job Overview

We are looking for a highly capable and motivated Data Engineer to join our growing data team. The ideal candidate will be responsible for designing and implementing scalable data pipelines, enabling efficient migration of large data workloads into Snowflake, and integrating with various AWS services. This role requires deep knowledge of SQL, cloud data platforms, and a strong understanding of modern data engineering practices.

Key Responsibilities

- Design and implement robust, scalable, and secure data pipelines for ingesting, transforming, and storing data in Snowflake
- Execute data migration strategies from on-prem or legacy systems (e.g., SQL Server, Oracle, Teradata) to Snowflake
- Integrate Snowflake with AWS components such as S3, Glue, Lambda, and Step Functions
- Automate data ingestion using Snowpipe, Streams, and Tasks
- Write clean, efficient, and reusable SQL for transformations and data quality validations
- Monitor and tune Snowflake performance, including warehouse usage and query optimization
- Implement and enforce data governance, access control, and security best practices
- Collaborate with data analysts, architects, and business stakeholders to define data requirements
- Support development of data models (star and snowflake schemas) and metadata documentation

Required Skills & Experience

- 3+ years of experience in a Data Engineering role
- Strong hands-on experience with Snowflake in production environments
- Proficiency in SQL (complex joins, CTEs, window functions, performance tuning)
- Solid experience with AWS services: S3, Glue, Lambda, IAM, Step Functions
- Proven experience in data migration projects
- Familiarity with ETL/ELT processes and data orchestration tools (e.g., Airflow, dbt, Informatica, Matillion)
- Strong understanding of data warehousing, data modeling, and big data concepts
- Knowledge of version control (Git) and CI/CD pipelines for data workflows

Preferred Qualifications

- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field
- Snowflake SnowPro Certification
- AWS Certified Data Analytics or Solutions Architect certification
- Experience with scripting languages (e.g., Python, Shell)
- Exposure to BI/visualization tools (e.g., Tableau, Power BI)
GlobalLogic