8.0 - 12.0 years
10 - 20 Lacs
Gurugram
Work from Office
Job Summary:
We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making.

Key Responsibilities:
- Design, develop, and implement scalable Snowflake-based data architectures.
- Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts.
- Optimize Snowflake performance through clustering, partitioning, and caching strategies.
- Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions.
- Ensure data quality, governance, integrity, and security across all platforms.
- Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake.
- Automate data workflows and support CI/CD deployment practices.
- Implement data modeling techniques including dimensional modeling, star/snowflake schemas, and normalization/denormalization.
- Support and promote metadata management and data governance best practices.

Technical Skills (Hard Skills):
- Expertise in Snowflake: architecture design, performance tuning, cost optimization.
- Strong proficiency in SQL, Python, and scripting for data engineering tasks.
- Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar.
- Proficiency in data modeling (dimensional, relational, star/snowflake schemas).
- Good knowledge of cloud platforms: AWS, Azure, or GCP.
- Familiarity with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks.
- Experience with CI/CD tools and version control systems (e.g., Git).
- Knowledge of BI tools such as Tableau, Power BI, or Looker.

Certifications (Preferred/Required):
- Snowflake SnowPro Core Certification – Required or Highly Preferred
- SnowPro Advanced Architect Certification – Preferred
- Cloud Certifications (e.g., AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate) – Preferred
- ETL Tool Certifications (e.g., Talend, Matillion) – Optional but a plus

Soft Skills:
- Strong analytical and problem-solving capabilities.
- Excellent communication and collaboration skills.
- Ability to translate technical concepts into business-friendly language.
- Proactive, detail-oriented, and highly organized.
- Capable of multitasking in a fast-paced, dynamic environment.
- Passionate about continuous learning and adopting new technologies.

Why Join Us?
- Work on cutting-edge data platforms and cloud technologies.
- Collaborate with industry leaders in analytics and digital transformation.
- Be part of a data-first organization focused on innovation and impact.
- Enjoy a flexible, inclusive, and collaborative work culture.
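The modeling duties above mention dimensional modeling and star/snowflake schemas. As a minimal, hypothetical sketch (table contents and column names are invented for illustration, not taken from the posting), denormalizing normalized source tables into a star-schema-style fact record in plain Python looks like this:

```python
# Hypothetical sketch: joining normalized fact rows to their dimensions to
# produce denormalized records for a reporting (star-schema) fact table.
# All table contents and column names below are illustrative assumptions.

customers = {101: {"name": "Acme Corp", "region": "EMEA"}}   # dimension
products = {"SKU-1": {"category": "Hardware"}}               # dimension
orders = [  # normalized fact rows carrying only foreign keys
    {"order_id": 1, "customer_id": 101, "sku": "SKU-1", "amount": 250.0},
]

def build_fact_rows(orders, customers, products):
    """Join each fact row to its dimensions, yielding wide records."""
    for o in orders:
        cust = customers[o["customer_id"]]
        prod = products[o["sku"]]
        yield {
            "order_id": o["order_id"],
            "amount": o["amount"],
            "customer_name": cust["name"],  # denormalized attributes
            "region": cust["region"],
            "category": prod["category"],
        }

rows = list(build_fact_rows(orders, customers, products))
```

In a real warehouse the same join would typically run as SQL inside Snowflake; the sketch only shows the shape of the transformation.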
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. As the ideal candidate, you should possess a robust background in data architecture, cloud data platforms, and Snowflake implementation. Hands-on experience in end-to-end data pipeline and data warehouse design is essential for this role.

Your responsibilities will include leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will be tasked with defining data modeling standards, best practices, and governance frameworks. Designing and optimizing ETL/ELT pipelines using tools such as Snowpipe, Azure Data Factory, Informatica, or DBT will be a key aspect of your role. Collaboration with stakeholders to understand data requirements and translate them into robust architectural solutions will also be expected.

Additionally, you will be responsible for implementing data security, privacy, and role-based access controls within Snowflake. Guiding development teams on performance tuning, query optimization, and cost management in Snowflake is crucial. Ensuring high availability, fault tolerance, and compliance across data platforms will also fall under your purview. Mentoring developers and junior architects on Snowflake capabilities is another important aspect of this role.

In terms of skills and experience, we are looking for candidates with at least 8+ years of overall experience in data engineering, BI, or data architecture, and a minimum of 3+ years of hands-on Snowflake experience. Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization is highly desirable. Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP) is required. Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion is also necessary.
A good understanding of data lakes, data mesh, and modern data stack principles is preferred. Experience with CI/CD for data pipelines, DevOps, and data quality frameworks is a plus. Solid knowledge of data governance, metadata management, and cataloging is beneficial.

Preferred qualifications include holding a Snowflake certification (e.g., SnowPro Core/Advanced Architect), familiarity with Apache Airflow, Kafka, or event-driven data ingestion, knowledge of data visualization tools such as Power BI, Tableau, or Looker, and experience in healthcare, BFSI, or retail domain projects.

If you meet these requirements and are ready to take on a challenging and rewarding role as a Snowflake Architect, we encourage you to apply.
Posted 2 weeks ago
12.0 - 15.0 years
37 - 40 Lacs
Pune
Work from Office
We are looking for an immediate joiner who can join us within 30 days for the position below.

Primary Role:
We are looking for a seasoned Snowflake Architect/Lead to design, build, and optimize enterprise-scale data warehouse solutions on the Snowflake Data Cloud. The ideal candidate will have extensive experience in Snowflake architecture, cloud data platforms, and enterprise data warehousing, with a strong background in the banking domain and Azure cloud. This role requires leadership skills to guide technical teams and deliver robust, scalable data solutions aligned with business needs.

Key Responsibilities:
- Architect and lead the implementation of scalable, secure, and high-performance Snowflake data warehouse solutions, including enterprise data warehouse (EDW) projects.
- Develop and oversee ETL/ELT pipelines using tools such as DBT, ensuring data quality, transformation, and orchestration.
- Manage Snowflake environments on Azure cloud, leveraging cloud-native features for compute and storage scalability.
- Implement data governance, security policies, and compliance standards suitable for the banking domain.
- Collaborate with cross-functional teams including data engineers, analysts, and business stakeholders to translate requirements into technical architecture.
- Lead performance tuning, query optimization, and cost management strategies within Snowflake.
- Mentor and lead data engineering teams, enforcing best practices and coding standards.
- Stay updated with Snowflake platform advancements and cloud data trends to drive continuous improvement.
- Support solution design, code reviews, and architectural governance.

Desired Skills & Qualification:
- Bachelor's degree in Science, Engineering, or related disciplines.

Work Experience:
- 12+ years of experience in data engineering, data architecture, or related roles.
- Minimum 3-5 years of hands-on experience architecting and implementing Snowflake solutions.
- Proven experience implementing enterprise data warehouses (EDW) on Snowflake.
- Strong experience with the Azure cloud platform and its integration with Snowflake.
- Expertise in DBT for data transformations and pipeline orchestration.
- Deep knowledge of banking domain data requirements, compliance, and security.
- Proficiency in SQL and ETL.
- Experience with cloud data platform concepts such as separation of compute and storage, multi-cluster architecture, and scalability.
- Solid understanding of data modeling (star/snowflake schemas) and data governance.
- Strong leadership, communication, and stakeholder management skills.
- Experience leading and mentoring technical teams.
Posted 1 month ago
6.0 - 11.0 years
17 - 30 Lacs
Kolkata, Hyderabad/Secunderabad, Bangalore/Bengaluru
Hybrid
Inviting applications for the role of Lead Consultant – Snowflake Data Engineer (Snowflake + Python + Cloud)!

In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job Description:
- Experience in the IT industry.
- Working experience building productionized data ingestion and processing pipelines in Snowflake.
- Strong understanding of Snowflake architecture.
- Fully versed in data warehousing concepts.
- Expertise in and excellent understanding of Snowflake features and of integrating Snowflake with other data processing systems.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem-solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Able to work independently on business problems and generate meaningful insights.
- Experience or knowledge of Snowpark, Streamlit, or GenAI is good to have but not mandatory.
- Should have experience implementing Snowflake best practices.
- Snowflake SnowPro Core Certification will be an added advantage.

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, the query optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Should have good experience in Python/PySpark integration with Snowflake and the cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
- Should have some experience with Snowflake RBAC and data security.
- Should have good experience implementing CDC or SCD type-2.
- Should have good experience implementing Snowflake best practices.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Should have experience building data ingestion pipelines.
- Able to optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience with deployment using CI/CD tools and with repositories such as Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience relevant to the Snowflake Data Engineer role.

Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts.
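The posting above asks for experience implementing SCD type-2 (slowly changing dimensions). As a minimal, hypothetical sketch of the core idea, expire the current dimension row when a tracked attribute changes and open a new versioned row. All column names (`valid_from`, `valid_to`, `is_current`, `tier`) are illustrative; in Snowflake this logic would usually be a MERGE statement rather than Python:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key, tracked, today):
    """SCD type-2 sketch: dim_rows is a list of dicts carrying
    'valid_from', 'valid_to', and 'is_current'; incoming maps each
    business-key value to its latest source attributes."""
    for kv, attrs in incoming.items():
        current = next((r for r in dim_rows
                        if r[key] == kv and r["is_current"]), None)
        if current and all(current[c] == attrs[c] for c in tracked):
            continue  # nothing changed; keep the current version
        if current:  # expire the superseded version
            current["is_current"] = False
            current["valid_to"] = today
        dim_rows.append({key: kv, **attrs, "valid_from": today,
                         "valid_to": None, "is_current": True})
    return dim_rows

# Illustrative run: the customer's tier changes, producing two versions.
dim = []
apply_scd2(dim, {"C1": {"tier": "gold"}}, "customer_id", ["tier"], date(2024, 1, 1))
apply_scd2(dim, {"C1": {"tier": "silver"}}, "customer_id", ["tier"], date(2024, 6, 1))
```

The same pattern maps onto Snowflake Streams (to capture changes) feeding a Task that runs the MERGE.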
Posted 1 month ago
10.0 - 15.0 years
3 - 6 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
We are seeking a highly skilled Snowflake Architect with extensive experience in Snowflake, Azure, and Azure Data Factory (ADF) to join our growing team. This role will involve designing and implementing large-scale data solutions on the cloud platform, leading the architecture of the data ecosystem, and ensuring high levels of data integration, optimization, and governance. The ideal candidate will have a deep understanding of Snowflake data warehouse architecture and be ready to work closely with clients to deliver scalable data solutions. The role requires a leader who can guide teams and manage projects effectively while ensuring seamless integration of data systems across different platforms.

Key Responsibilities:
- Lead the design and architecture of data solutions using Snowflake and Azure technologies.
- Oversee the development and optimization of ETL pipelines using Azure Data Factory (ADF), ensuring smooth data movement, transformation, and integration.
- Architect data workflows and ensure alignment with best practices for data governance, security, and compliance.
- Monitor, troubleshoot, and optimize data pipelines, ensuring performance, scalability, and reliability in Snowflake and Azure environments.
- Work closely with cross-functional teams to design end-to-end data solutions that meet business needs and ensure seamless integration with existing systems.
- Lead data validation efforts and ensure data integrity and quality across multiple systems.
- Collaborate with stakeholders and clients to understand business requirements and deliver innovative solutions in line with project goals and timelines.
- Mentor and provide technical leadership to junior developers and team members, fostering a culture of knowledge sharing and best practices.
- Act as the primary architect for Snowflake solutions, ensuring they are scalable, secure, and perform optimally.
- Travel to client locations as necessary to support project delivery and engage with stakeholders on-site.
Skills and Qualifications:
- 10+ years of hands-on experience in Snowflake architecture, Azure technologies, and data management.
- Deep expertise in Snowflake architecture, data modeling, and data warehousing best practices.
- Extensive experience with Azure Data Services (Azure Data Lake, Azure Synapse Analytics, etc.) and Azure Data Factory (ADF).
- Strong experience designing and developing ETL processes, ensuring high performance and scalability in cloud environments.
- Proficiency in SQL, data modeling, and working with both structured and semi-structured data.
- Strong understanding of data governance, data privacy, and security in cloud-based solutions.
- Proven ability to solve complex problems and optimize data workflows for large-scale cloud environments.
- Ability to collaborate and communicate effectively with both technical and non-technical teams.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Excellent leadership and mentoring skills, with a proven track record of managing teams and projects successfully.

Preferred Skills:
- Experience with other Azure services such as Azure SQL Database, Power BI, and Azure Synapse Analytics.
- Familiarity with data governance tools, data pipeline orchestration, and advanced data integration techniques.
- Strong expertise in performance tuning and query optimization in Snowflake.

Why Join Us:
- Work with cutting-edge technologies like Snowflake, Azure, and ADF in the cloud data space.
- Opportunity to work on complex, large-scale data projects across diverse industries.
- Collaborative and innovative work environment that encourages continuous learning and professional growth.
- Competitive salary and benefits package.
- Opportunities for career advancement and leadership roles within the organization.

Note: The candidate must be willing to travel to client locations as part of project delivery and engagement.
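The responsibilities above include leading data validation efforts and ensuring data integrity across systems. As a minimal, hypothetical sketch (thresholds, row shapes, and column names are invented for illustration), a post-load validation step often boils down to simple row-count and required-column checks:

```python
# Hypothetical sketch of post-load data validation: check that a load
# produced enough rows and that required columns are populated.
# All thresholds and column names here are illustrative assumptions.

def validate_load(rows, required_cols, min_rows=1):
    """Return a list of human-readable issues; an empty list means pass."""
    issues = []
    if len(rows) < min_rows:
        issues.append(f"expected at least {min_rows} row(s), got {len(rows)}")
    for i, row in enumerate(rows):
        for col in required_cols:
            if row.get(col) is None:
                issues.append(f"row {i}: missing value for '{col}'")
    return issues

# Illustrative run: one row is missing a required value.
issues = validate_load(
    [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}],
    required_cols=["id", "amount"],
)
```

In practice these checks would run against query results from Snowflake or ADF pipeline outputs, with failures routed to alerting.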
Posted 1 month ago
9 - 14 years
19 - 32 Lacs
Gurugram
Remote
ONLY Immediate Joiners

Requirement: Data Architect & Business Intelligence
Experience: 9+ years
Location: Remote

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.
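One common pattern behind the pipeline-optimization duties listed above is incremental (watermark-based) extraction: each run moves only the rows that changed since the previous run. A minimal, hypothetical sketch (row shapes and the `updated_at` column are illustrative assumptions, not from the posting):

```python
# Hypothetical sketch of watermark-based incremental extraction: return
# only rows newer than the stored watermark, plus the new watermark to
# persist for the next run. Row shapes here are illustrative.

def extract_increment(source_rows, last_watermark):
    """Filter rows by a monotonically increasing 'updated_at' value."""
    fresh = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in fresh),
                        default=last_watermark)
    return fresh, new_watermark

# Illustrative run: only the row updated after the watermark is extracted.
rows = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 25}]
fresh, wm = extract_increment(rows, last_watermark=10)
```

With tools like Informatica, IICS, or DBT the same idea appears as incremental models or delta loads; the watermark is typically stored in a control table.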
Posted 2 months ago
8 - 13 years
12 - 22 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Greetings of the day!

We have an urgent on-rolls opening for the position of "Snowflake Architect" with one of our reputed clients (work from home).

Name of the Company – Confidential
Rolls – On-rolls
Mode of Employment – FTE / Sub-Con / Contract
Job Location – Remote
Job Work Timings – Night shift, 06:00 pm to 03:00 am IST
Nature of Work – Work from home
Working Days – 5 days weekly
Educational Qualification – Bachelor's degree in computer science, BCA, engineering, or a related field
Salary – Maximum CTC would be 23 LPA (salary and benefits package will be commensurate with experience and qualifications; PF and medical insurance cover available)
Languages Known – English, Hindi, and the local language
Experience – 9+ years of relevant experience in the same domain

Job Summary:
We are seeking a highly skilled and experienced Snowflake Architect to lead the design, development, and implementation of scalable, secure, and high-performance data warehousing solutions on the Snowflake platform. The ideal candidate will possess deep expertise in data modelling, cloud architecture, and modern ELT frameworks. You will be responsible for architecting robust data pipelines, optimizing query performance, and ensuring enterprise-grade data governance and security. In this role, you will collaborate with data engineers, analysts, and business stakeholders to deliver efficient data solutions that drive informed decision-making across the organization.

Key Responsibilities:
- Manage and maintain the Snowflake platform to ensure optimal performance and reliability.
- Collaborate with data engineers and analysts to design and implement data pipelines.
- Develop and optimize SQL queries for efficient data retrieval and manipulation.
- Create custom scripts and functions using JavaScript and Python to automate platform tasks.
- Troubleshoot platform issues and provide timely resolutions.
- Implement security best practices to protect data within the Snowflake platform.
- Stay updated on the latest Snowflake features and best practices to continuously improve platform performance.

Required Qualifications:
- Bachelor's degree in computer science, engineering, or a related field.
- Minimum of nine years of experience managing any database platform.
- Proficiency in SQL for data querying and manipulation.
- Strong programming skills in JavaScript and Python.
- Experience optimizing and tuning Snowflake for performance.

Preferred Skills:
- Technical expertise
- Cloud & integration
- Performance & optimization
- Security & governance
- Soft skills

Candidates should be willing to join within 7-10 days, or be immediate joiners.

Interested candidates, please share your updated resume with us at executivehr@monalisammllp.com; you can also call or WhatsApp us at 9029895581. Please include the following details:
- Current/last net in-hand salary (the offer will be based on the interview/technical evaluation process) -
- Notice period & last working day (was/will be) -
- Reason for changing jobs -
- Total years of experience in the specific field -
- Location you are from -
- Do you hold an offer from any other organisation? -

Regards,
Monalisa Group of Services
HR Department
9029895581 – Call / WhatsApp
executivehr@monalisammllp.com
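The role above calls for Python scripts that automate Snowflake platform tasks. As a minimal, hypothetical sketch, one such task is generating a parameterized COPY INTO statement for bulk-loading staged files; the table, stage, and path names below are invented for illustration, and in practice the string would be executed through the Snowflake connector:

```python
# Hypothetical sketch: building a parameterized Snowflake COPY INTO
# statement for a bulk load. Table, stage, and path names are
# illustrative assumptions, not values from the posting.

def build_copy_statement(table, stage, path, file_type="CSV", skip_header=1):
    """Return a COPY INTO statement for loading staged files into a table."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage}/{path} "
        f"FILE_FORMAT = (TYPE = '{file_type}' SKIP_HEADER = {skip_header})"
    )

# Illustrative call with hypothetical object names.
sql = build_copy_statement("raw.orders", "etl_stage", "orders/2024/")
```

Generating the statement as a pure function keeps it easy to unit-test before it ever touches a live warehouse.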
Posted 2 months ago