
2 AWS G Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

Experience: 10.0 - 14.0 years
Salary: 0 Lacs
Location: Delhi
Work mode: On-site

As a Partner Solution Engineer at Snowflake, you will play a crucial role in technically onboarding and enabling partners to re-platform their Data and AI applications onto the Snowflake AI Data Cloud. You will collaborate with partners to develop Snowflake solutions in customer engagements, working with them to create assets and demos, build hands-on POCs, and pitch Snowflake solutions. You will also assist Solution Providers/Practice Leads with the technical strategies that enable them to sell their offerings on Snowflake.

Your responsibilities include keeping partners up to date on key Snowflake product updates and future roadmaps so they can represent the latest technology solutions and benefits to their clients, and running technical enablement programs that provide best practices and solution design workshops to help partners create effective solutions. Success in this position requires driving strategic engagements by quickly grasping new concepts and articulating their business value, showcasing the impact of Snowflake through compelling customer success stories and case studies, and demonstrating a strong understanding of how partners generate revenue given the industry priorities and complexities they face.

Preferred skills and experience:
- 10+ years of relevant experience
- Experience working with Tech Partners, ISVs, and System Integrators (SIs) in India
- Developing data domain thought leadership within the partner community
- Presales or hands-on experience with Data Warehouse, Data Lake, or Lakehouse platforms
- Experience with partner integration ecosystems such as Alation, FiveTran, Informatica, dbtCloud, etc.
- Hands-on experience and strong knowledge of Docker, including how to containerize Python-based applications
- Knowledge of container networking and Kubernetes
- Proficiency in Agile development practices and Continuous Integration/Continuous Deployment (CI/CD), including DataOps and MLOps
- Experience in the AI/ML domain is a plus

Snowflake is rapidly expanding, and as part of the team, you will help enable and accelerate the company's growth. If you share Snowflake's values, challenge ordinary thinking, and push the pace of innovation while building a future for yourself and Snowflake, this role could be the perfect fit for you. Please visit the Snowflake Careers Site for salary and benefits information if the job is located in the United States.
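As an illustrative aside on the hands-on POC work mentioned above, here is a minimal sketch of querying Snowflake from Python with the snowflake-connector-python package. The account, credentials, warehouse, and table names are placeholder assumptions, not details from this listing.

# Minimal Snowflake POC sketch using snowflake-connector-python.
# Account, credentials, and table names are placeholders -- substitute your own.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account_identifier",   # e.g. an account locator plus region
    user="your_user",
    password="your_password",
    warehouse="COMPUTE_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Simple aggregation to verify connectivity and data access.
    cur.execute("SELECT COUNT(*) FROM SAMPLE_TABLE")
    print("Row count:", cur.fetchone()[0])
finally:
    conn.close()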

Posted 2 days ago


Experience: 10.0 - 15.0 years
Salary: 30 - 40 Lacs
Location: Bengaluru
Work mode: Hybrid

We are looking for a Cloud Data Engineer with strong hands-on experience in data pipelines, cloud-native services (AWS), and modern data platforms like Snowflake or Databricks. Alternatively, we're open to Data Visualization Analysts with strong BI experience and exposure to data engineering or pipelines. You will collaborate with technology and business leads to build scalable data solutions, including data lakes, data marts, and virtualization layers using tools like Starburst. This is an exciting opportunity to work with modern cloud tech in a dynamic, enterprise-scale financial services environment.

Key Responsibilities:
- Design and develop data pipelines for structured/unstructured data in AWS.
- Build semantic layers and virtualization layers using Starburst or similar tools.
- Create intuitive dashboards and reports using Power BI/Tableau.
- Collaborate on ETL designs and support testing (SIT/UAT).
- Optimize Spark jobs and ETL performance.
- Implement data quality checks and validation frameworks.
- Translate business requirements into scalable technical solutions.
- Participate in design reviews and documentation.

Skills & Qualifications:

Must-Have:
- 10+ years in Data Engineering or related roles.
- Hands-on with AWS Glue, Redshift, Athena, EMR, Lambda, S3, Kinesis.
- Proficient in HiveQL, Spark, Python, Scala.
- Experience with modern data platforms (Snowflake/Databricks).
- 3+ years in ETL tools (Informatica, SSIS) and recent experience in cloud-based ETL.
- Strong understanding of Data Warehousing, Data Lakes, and Data Mesh.

Preferred:
- Exposure to Data Virtualization tools like Starburst or Denodo.
- Experience in the financial services or banking domain.
- AWS Certification (Data specialty) is a plus.
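To illustrate the pipeline and data quality responsibilities listed above, here is a minimal PySpark sketch under assumed inputs; the S3 paths and column names are hypothetical placeholders, not part of this listing.

# Minimal PySpark ETL sketch with a basic data quality check.
# S3 paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sample-etl").getOrCreate()

# Read raw data from a landing zone (placeholder path).
raw = spark.read.parquet("s3://example-bucket/landing/transactions/")

# Data quality check: drop rows missing required keys and report how many were removed.
required = ["transaction_id", "account_id", "amount"]
clean = raw.dropna(subset=required)
dropped = raw.count() - clean.count()
print(f"Dropped {dropped} rows failing completeness checks")

# Light transformation: derive a date column to use as the partition key.
clean = clean.withColumn("event_date", F.to_date("event_ts"))

# Write to a curated zone, partitioned by date (placeholder path).
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/transactions/"
)

spark.stop()

Dropping incomplete rows is only one possible validation strategy; quarantining failed records for later review is an equally common design choice.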

Posted 2 weeks ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.
