Jobs
Interviews

4 Time Travel Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

35 - 45 Lacs

Noida, Pune, Bengaluru

Hybrid

Role & responsibilities:
* Implement data management solutions as a certified Snowflake cloud data warehouse Architect.
* Deep understanding of star and snowflake schemas and dimensional modelling.
* Design and implement data pipelines and ETL processes using Snowflake.
* Optimize data models for performance and scalability.
* Collaborate with technical and business stakeholders to define data requirements.
* Ensure data quality and governance best practices are followed.
* Experience with data security and data access controls in Snowflake.
* Expertise in complex SQL, Python scripting, and performance tuning.
* Expertise in Snowflake concepts such as resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, and Time Travel, and in automating them.
* Experience handling semi-structured data (JSON, XML) and columnar PARQUET using the VARIANT data type in Snowflake.
* Experience re-clustering data in Snowflake, with a good understanding of micro-partitions.
* Experience with migration to Snowflake from on-premises database environments.
* Experience designing and building manual or auto-ingest data pipelines using Snowpipe.
* SnowSQL experience developing stored procedures and writing queries to analyze and transform data.

Must-have skills: Certified Snowflake Architect, Snowflake architecture, Snowpipe, SnowSQL, SQL, CI/CD, and Python.

Perks and benefits:
* Competitive compensation package.
* Opportunity to work with industry leaders.
* Collaborative and innovative work environment.
* Professional growth and development opportunities.
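Two of the Snowflake features this role names, Time Travel and zero-copy clone, are one-statement operations. A minimal sketch of the SQL, generated as strings so it can be checked without a Snowflake connection; the table names (`orders`, `orders_backup`) are hypothetical placeholders:

```python
# Sketch of Snowflake Time Travel and zero-copy clone statements.
# Table names are hypothetical; the SQL syntax follows Snowflake's
# documented AT(OFFSET => ...) and CREATE TABLE ... CLONE forms.

def time_travel_query(table: str, offset_seconds: int) -> str:
    """SELECT the state of a table as it was `offset_seconds` ago."""
    return f"SELECT * FROM {table} AT(OFFSET => -{offset_seconds});"

def zero_copy_clone(source: str, target: str) -> str:
    """Clone a table without duplicating storage (metadata-only copy)."""
    return f"CREATE TABLE {target} CLONE {source};"

print(time_travel_query("orders", 3600))
print(zero_copy_clone("orders", "orders_backup"))
```

Because a clone shares micro-partitions with its source, it is a common low-cost way to create dev/test copies, which is why the JD pairs it with micro-partition knowledge.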

Posted 3 weeks ago

Apply

6.0 - 11.0 years

20 - 30 Lacs

Mumbai, Pune, Bengaluru

Hybrid

Key Responsibilities:
* Support Snowflake Native Apps built by developers across the enterprise.
* Review the design, development, and optimization of native apps using Snowflake and Snowpark Container Services.
* Troubleshoot complex SQL queries, UDFs, and Python scripts for data processing in client environments.
* Engage directly with clients to understand business needs, present technical designs, and resolve application-use issues.
* Ensure data quality, security, and performance across all stages of the data lifecycle.
* Coordinate with support engineers and provide a bridge to data engineering.
* Collaborate with cross-functional teams including Product, Analytics, and DevOps.

Required Skills & Experience:
* 6+ years of experience in Data Engineering or a related field.
* 3+ years in client-facing roles, including requirement gathering, solutioning, and demos.
* Hands-on expertise with Snowflake (warehouse management, resource monitoring, Snowpipe, etc.).
* Strong SQL programming and performance tuning skills.
* Proficiency in Python, including creating and managing UDFs.
* Experience building or supporting Snowflake native applications.
* Familiarity with Snowpark Container Services and deploying containerized workloads in Snowflake.

Good to Have:
* Strong understanding of data modelling, ETL/ELT processes, and cloud data architecture.
* Excellent problem-solving, communication, and leadership skills.

Preferred Qualifications:
* SnowPro certifications.
* Experience with CI/CD pipelines.
* Exposure to Tableau, Power BI, or other visualisation tools.
* Experience leading a client-support or escalation team.
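The "creating and managing UDFs" requirement above usually means Python UDFs registered in Snowflake. A sketch, assuming a hypothetical `mask_email` function: the handler is plain Python, so it can be unit-tested locally, and the surrounding DDL (Snowflake's documented `LANGUAGE PYTHON ... HANDLER` form) wraps it for deployment:

```python
# Sketch: a Snowflake Python UDF. The handler logic is ordinary Python,
# testable without Snowflake; the DDL string shows how it would be
# registered. Function name and masking rule are hypothetical examples.

def mask_email(email: str) -> str:
    """Keep the domain, mask the local part (e.g. for client demos)."""
    local, _, domain = email.partition("@")
    return ("*" * len(local)) + "@" + domain

UDF_DDL = """
CREATE OR REPLACE FUNCTION mask_email(email STRING)
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.10'
HANDLER = 'mask_email'
AS $$
def mask_email(email):
    local, _, domain = email.partition("@")
    return ("*" * len(local)) + "@" + domain
$$;
"""

print(mask_email("jane.doe@example.com"))  # ********@example.com
```

Keeping the handler importable outside Snowflake is what makes the troubleshooting duties in this role practical: you can reproduce a client-reported UDF bug locally before touching their account.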

Posted 1 month ago

Apply

5.0 - 10.0 years

0 - 1 Lacs

Ahmedabad, Chennai, Bengaluru

Hybrid

Job Summary: We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics.

Key Responsibilities:
* Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake.
* Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools.
* Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching.
* Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives.
* Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions.
* Manage Snowflake environments including security (roles, users, privileges), performance tuning, and resource monitoring.
* Integrate data from multiple sources including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data.
* Ensure data quality, reliability, and governance through testing and validation strategies.
* Document data flows, definitions, processes, and architecture.

Required Skills and Qualifications:
* 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems.
* 2+ years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel.
* Strong experience in SQL and performance tuning for complex queries and large datasets.
* Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts.
* Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.).
* Experience with cloud platforms (AWS, Azure, or GCP), particularly services like S3, Redshift, Lambda, and Azure Data Factory.
* Familiarity with Python, Java, or Scala for data manipulation and pipeline development.
* Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps.
* Knowledge of data governance, data quality, and data security best practices.
* Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications:
* Snowflake SnowPro Core Certification or Advanced Architect Certification.
* Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake.
* Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.).
* Knowledge of Data Vault 2.0 or other advanced data modeling methodologies.
* Experience with data cataloging and metadata management tools (e.g., Alation, Collibra).
* Exposure to machine learning pipelines and data science workflows is a plus.
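The ingestion-from-cloud-storage responsibility above is typically covered by Snowpipe. A minimal sketch of an auto-ingest pipe, again as a generated SQL string; the stage, pipe, and table names are hypothetical, and in a real setup the stage would point at an S3/Azure container with event notifications enabled:

```python
# Sketch: Snowpipe auto-ingest DDL for continuously loading files from a
# cloud-storage stage. Object names are hypothetical placeholders; the
# syntax follows Snowflake's documented CREATE PIPE ... AS COPY INTO form.

def create_pipe(pipe: str, table: str, stage: str,
                file_format: str = "(TYPE = 'JSON')") -> str:
    return (
        f"CREATE OR REPLACE PIPE {pipe} AUTO_INGEST = TRUE AS\n"
        f"  COPY INTO {table}\n"
        f"  FROM @{stage}\n"
        f"  FILE_FORMAT = {file_format};"
    )

print(create_pipe("raw.events_pipe", "raw.events", "raw.events_stage"))
```

With `AUTO_INGEST = TRUE`, storage event notifications trigger the `COPY INTO`, so no orchestrator polling is needed for this leg of the pipeline.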

Posted 2 months ago

Apply

5.0 - 10.0 years

14 - 19 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Role & responsibilities:
Urgent hiring for a reputed MNC.
Experience: 5+ years | Location: Pan India | Immediate joiners only.
Skills: Snowflake development, PySpark, Python, APIs, CI/CD, cloud services (Azure), Azure DevOps. Strong PySpark experience required.

Job Description:
* Strong hands-on experience in Snowflake development, including Streams, Tasks, and Time Travel.
* Deep understanding of Snowpark for Python and its application to data engineering workflows.
* Proficient in PySpark, Spark SQL, and distributed data processing.
* Experience with API development.
* Proficiency in cloud services (preferably Azure, but AWS/GCP also acceptable).
* Solid understanding of CI/CD practices and tools such as Azure DevOps, GitHub Actions, GitLab, or Jenkins for Snowflake.
* Knowledge of Delta Lake, Data Lakehouse principles, and schema evolution is a plus.
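The Streams and Tasks combination this JD asks for is Snowflake's native pattern for incremental processing: a stream records changes to a table, and a scheduled task consumes them only when there is something to do. A sketch with hypothetical object names and a deliberately simple task body:

```python
# Sketch: stream + task pair for incremental processing in Snowflake.
# Object names (orders_stream, process_orders, transform_wh) are
# hypothetical; syntax follows Snowflake's documented CREATE STREAM /
# CREATE TASK forms.

STREAM_DDL = "CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders;"

TASK_DDL = """
CREATE OR REPLACE TASK process_orders
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  INSERT INTO analytics.orders_delta
  SELECT * FROM orders_stream;
"""

print(STREAM_DDL)
print(TASK_DDL)
```

The `WHEN SYSTEM$STREAM_HAS_DATA(...)` guard skips task runs when the stream is empty, so the warehouse is not resumed for no-op cycles.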

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies