10.0 - 15.0 years
3 - 6 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
We are seeking a highly skilled Snowflake Architect with extensive experience in Snowflake, Azure, and Azure Data Factory (ADF) to join our growing team. This role involves designing and implementing large-scale data solutions on the cloud platform, leading the architecture of the data ecosystem, and ensuring high levels of data integration, optimization, and governance. The ideal candidate will have a deep understanding of Snowflake data warehouse architecture and will work closely with clients to deliver scalable data solutions. The role requires a leader who can guide teams and manage projects effectively while ensuring seamless integration of data systems across platforms.

Key Responsibilities:
- Lead the design and architecture of data solutions using Snowflake and Azure technologies.
- Oversee the development and optimization of ETL pipelines using Azure Data Factory (ADF), ensuring smooth data movement, transformation, and integration.
- Architect data workflows and ensure alignment with best practices for data governance, security, and compliance.
- Monitor, troubleshoot, and optimize data pipelines, ensuring performance, scalability, and reliability in Snowflake and Azure environments.
- Work closely with cross-functional teams to design end-to-end data solutions that meet business needs and integrate seamlessly with existing systems.
- Lead data validation efforts and ensure data integrity and quality across multiple systems.
- Collaborate with stakeholders and clients to understand business requirements and deliver innovative solutions in line with project goals and timelines.
- Mentor and provide technical leadership to junior developers and team members, fostering a culture of knowledge sharing and best practices.
- Act as the primary architect for Snowflake solutions, ensuring they are scalable, secure, and performant.
- Travel to client locations as necessary to support project delivery and engage with stakeholders on-site.
Skills and Qualifications:
- 10+ years of hands-on experience in Snowflake architecture, Azure technologies, and data management.
- Deep expertise in Snowflake architecture, data modeling, and data warehousing best practices.
- Extensive experience with Azure Data Services (Azure Data Lake, Azure Synapse Analytics, etc.) and Azure Data Factory (ADF).
- Strong experience in designing and developing ETL processes, ensuring high performance and scalability in cloud environments.
- Proficiency in SQL, data modeling, and working with both structured and semi-structured data.
- Strong understanding of data governance, data privacy, and security in cloud-based solutions.
- Proven ability to solve complex problems and optimize data workflows for large-scale cloud environments.
- Ability to collaborate and communicate effectively with both technical and non-technical teams.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Excellent leadership and mentoring skills, with a proven track record of managing teams and projects successfully.

Preferred Skills:
- Experience with other Azure services such as Azure SQL Database, Power BI, and Azure Synapse Analytics.
- Familiarity with data governance tools, data pipeline orchestration, and advanced data integration techniques.
- Strong expertise in performance tuning and query optimization in Snowflake.

Why Join Us:
- Work with cutting-edge technologies like Snowflake, Azure, and ADF in the cloud data space.
- Opportunity to work on complex, large-scale data projects across diverse industries.
- Collaborative and innovative work environment that encourages continuous learning and professional growth.
- Competitive salary and benefits package.
- Opportunities for career advancement and leadership roles within the organization.

Note: The candidate must be willing to travel to client locations as part of project delivery and engagement.
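The "structured and semi-structured data" proficiency asked for above typically means flattening nested JSON into columnar form, which is what Snowflake's FLATTEN does on a VARIANT column. A minimal Python sketch of the same transformation (the record shape is invented for illustration):

```python
import json

def flatten(record, prefix=""):
    """Recursively flatten a nested dict into dot-separated column names,
    similar in spirit to querying a Snowflake VARIANT with FLATTEN."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=name + "."))
        else:
            flat[name] = value
    return flat

# A semi-structured event as it might arrive in a raw landing table
raw = json.loads('{"id": 1, "user": {"name": "a", "geo": {"city": "Gurgaon"}}}')
print(flatten(raw))
```

Each nested path becomes one flat column, so downstream SQL can treat the record as a normal row.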
Posted 1 week ago
6.0 - 11.0 years
17 - 30 Lacs
Kolkata, Hyderabad/Secunderabad, Bangalore/Bengaluru
Hybrid
Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Experience in the IT industry.
- Working experience with building productionized data ingestion and processing pipelines in Snowflake.
- Strong understanding of Snowflake architecture.
- Fully well-versed in data warehousing concepts.
- Expertise and excellent understanding of Snowflake features and of integrating Snowflake with other data processing platforms.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem-solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Able to work independently on business problems and generate meaningful insights.
- Experience/knowledge of Snowpark, Streamlit, or GenAI is good to have but not mandatory.
- Should have experience implementing Snowflake best practices.
- Snowflake SnowPro Core certification will be an added advantage.

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to the customer, and working with the offshore team.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Should have good experience in Python/PySpark integration with Snowflake and the cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
- Should have some experience with Snowflake RBAC and data security.
- Should have good experience implementing CDC or SCD Type 2.
- Should have good experience implementing Snowflake best practices.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Should have experience building data ingestion pipelines.
- Able to optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience in deployment using CI/CD tools and experience with repositories such as Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant experience as a Snowflake Data Engineer.

Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and Data Warehousing concepts
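The SCD Type 2 experience asked for above comes down to one rule: when a tracked attribute changes, expire the current dimension row and insert a new versioned one. A minimal pure-Python sketch of that logic (table and field names are hypothetical):

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """Apply one change to an SCD Type 2 dimension kept as a list of dicts.
    Each row carries valid_from / valid_to / is_current version metadata."""
    for row in dim_rows:
        if row["customer_id"] == incoming["customer_id"] and row["is_current"]:
            if row["city"] == incoming["city"]:
                return dim_rows          # no change: nothing to version
            row["valid_to"] = today      # expire the old version
            row["is_current"] = False
            break
    dim_rows.append({**incoming, "valid_from": today,
                     "valid_to": None, "is_current": True})
    return dim_rows

dim = [{"customer_id": 1, "city": "Kolkata",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, {"customer_id": 1, "city": "Hyderabad"}, date(2024, 6, 1))
print(len(dim), dim[0]["is_current"], dim[1]["city"])
```

In Snowflake itself the same effect is usually achieved with a MERGE fed by a Stream, but the expire-then-insert shape is identical.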
Posted 2 weeks ago
7 - 12 years
9 - 19 Lacs
Bengaluru, Hyderabad, Kolkata
Hybrid
Inviting applications for the role of Senior Principal Consultant - Snowflake Architect! In this role, the Snowflake Architect is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Responsibilities:
- Should have experience in the IT industry.
- Strong experience in building/designing data warehouses, data lakes, and data marts, with end-to-end implementation experience focusing on large enterprise-scale Snowflake implementations on any of the hyperscalers.
- Strong understanding of Snowflake architecture.
- Able to create the design and data modelling independently.
- Able to create high-level and low-level design documents based on requirements.
- Strong experience with building productionized data ingestion and data pipelines in Snowflake.
- Prior experience as an architect interacting with customers, team/delivery leaders, and vendor teams.
- Strong experience in migration/greenfield projects in Snowflake.
- Experience implementing Snowflake best practices for network policies, storage integrations, data governance, cost optimization, resource monitoring, data ingestion, transformation, and consumption layers.
- Good experience with Snowflake RBAC and data security.
- Good experience implementing a strategy for CDC or SCD Type 2.
- Strong experience with Snowflake features, including new Snowflake features.
- Good experience in Python.
- Experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and some Azure services (Blob Storage, ADLS, ADF).
- Experience with DBT.
- Must have Snowflake SnowPro Core or SnowPro Advanced Architect certification.
- Experience/knowledge of orchestration and scheduling tools.
- Good understanding of ETL processes and ETL tools.
- Good understanding of agile methodology.
- Good to have some understanding of GenAI.
- Good to have exposure to other databases such as Redshift, Databricks, SQL Server, Oracle, PostgreSQL, etc.
- Able to create POCs, roadmaps, solution architectures, estimations, and implementation plans.
- Experience with Snowflake integrations with other data processing platforms.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and strong experience as a Snowflake Architect.

Skill Matrix: Snowflake, Python, AWS/Azure, Data Modeling, Design Patterns, DBT, ETL process, and Data Warehousing concepts.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
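The Snowflake RBAC experience named above usually means building a role hierarchy: an access role owns object privileges, a functional role inherits it, and users get only functional roles. A small Python sketch that emits the corresponding GRANT statements (database, schema, role, and user names are all hypothetical):

```python
def grant_hierarchy(db, schema, access_role, functional_role, user):
    """Build the GRANT statements for a simple two-tier Snowflake role
    hierarchy: access role -> functional role -> user."""
    return [
        f"GRANT USAGE ON DATABASE {db} TO ROLE {access_role};",
        f"GRANT USAGE ON SCHEMA {db}.{schema} TO ROLE {access_role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {db}.{schema} TO ROLE {access_role};",
        f"GRANT ROLE {access_role} TO ROLE {functional_role};",
        f"GRANT ROLE {functional_role} TO USER {user};",
    ]

for stmt in grant_hierarchy("ANALYTICS", "MARTS", "MARTS_READ", "ANALYST", "JDOE"):
    print(stmt)
```

Keeping privileges on access roles only makes audits and revocation tractable: dropping one grant edge removes a whole bundle of access.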
Posted 2 months ago
12 - 18 years
35 - 40 Lacs
Chennai, Bengaluru, Trivandrum
Work from Office
Job Role: Sr. Data Architect - Snowflake
Experience: 12+ years
Notice period: Immediate
Location: Trivandrum/Bangalore/Chennai

We are looking for candidates with 12+ years of experience for this role.

Job Description:
A minimum of 12+ years of experience in data engineering, encompassing the development and scaling of data warehouse and data lake platforms.
Working hours: 8 hours, 12 PM - 9 PM

Responsibilities include:
• Lead the design and architecture of data solutions leveraging Snowflake, ensuring scalability, performance, and reliability.
• Collaborate with stakeholders to understand business requirements and translate them into technical specifications and data models.
• Develop and maintain data architecture standards, guidelines, and best practices, including data governance principles and DataOps methodologies.
• Oversee the implementation of data pipelines, ETL processes, and data governance frameworks within Snowflake environments.
• Provide technical guidance and mentorship to data engineering teams, fostering skill development and knowledge sharing.
• Conduct performance tuning and optimization of Snowflake databases and queries.
• Stay updated on emerging trends and advancements in Snowflake, cloud data technologies, data governance, and DataOps practices.

Certifications:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• Certifications related to Snowflake (e.g., SnowPro Core, SnowPro Advanced Architect, SnowPro Advanced Data Engineer) are desirable but not mandatory.
Primary Skills:
• Snowflake experience, data architecture experience, ETL process experience, and large data migration solutioning experience
• Extensive experience in designing and implementing data solutions using Snowflake and DBT
• Proficiency in data modeling, schema design, and optimization within Snowflake environments
• Strong understanding of cloud data warehousing concepts and best practices, particularly with Snowflake
• Expertise in Python/Java/Scala, SQL, ETL processes, and data integration techniques, with a focus on Snowflake
• Familiarity with AWS, Azure, and GCP
• Demonstrated experience in implementing data governance frameworks and DataOps practices
• Working experience in SAP environments
• Familiarity with real-time streaming technologies and Change Data Capture (CDC) mechanisms
• Knowledge of data governance principles and DataOps methodologies
• Proven track record of architecting and delivering complex data solutions on cloud platforms/Snowflake

Secondary Skills:
• Experience with data visualization tools (e.g., Tableau, Power BI) is a plus
• Knowledge of data security and compliance standards
• Excellent communication and presentation skills, with the ability to convey complex technical concepts to juniors and non-technical stakeholders
• Strong problem-solving and analytical skills
• Ability to work effectively in a collaborative team environment and lead cross-functional initiatives
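The data modeling and schema design proficiency listed above usually centers on star schemas: a fact table of measures joined to conformed dimensions. A tiny Python sketch of a fact-to-dimension rollup (all table contents are invented for illustration):

```python
# Star-schema rollup: join a fact table to a dimension on its surrogate key
# and aggregate a measure by a dimension attribute.
dim_product = {101: {"category": "Books"}, 102: {"category": "Electronics"}}
fact_sales = [
    {"product_id": 101, "amount": 50},
    {"product_id": 102, "amount": 300},
    {"product_id": 101, "amount": 25},
]

def sales_by_category(facts, dim):
    totals = {}
    for row in facts:
        cat = dim[row["product_id"]]["category"]  # the dimension join
        totals[cat] = totals.get(cat, 0) + row["amount"]
    return totals

print(sales_by_category(fact_sales, dim_product))
```

The same shape in SQL is a GROUP BY over a fact-to-dimension join, which is the query pattern star schemas are optimized for.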
Posted 3 months ago
8 - 13 years
12 - 22 Lacs
Bengaluru
Remote
Job Description: Snowflake Architect - Remote Opportunity (8+ Years of Experience)

Position Overview:
We are seeking an experienced and highly skilled Snowflake Architect to join our team in a fully remote capacity. As a Snowflake Architect, you will be responsible for designing, implementing, and optimizing data solutions leveraging the Snowflake Data Cloud. You will collaborate with cross-functional teams to deliver scalable, high-performance data architectures that support analytics, reporting, and data-driven decision-making. This is a strategic role that requires deep expertise in Snowflake, data warehousing, and cloud architecture, along with strong leadership and communication skills.

Roles and Responsibilities:
Solution Architecture:
- Design and implement end-to-end data architecture solutions using Snowflake.
- Define best practices for Snowflake architecture, including schema design, security, and performance optimization.
- Develop scalable data pipelines and workflows to process and transform data efficiently.
Implementation and Development:
- Lead the migration of legacy data warehouses to Snowflake.
- Implement Snowflake features such as Virtual Warehouses, Secure Data Sharing, and Snowpipe for real-time data ingestion.
- Create and optimize SQL queries, stored procedures, and user-defined functions for analytics and reporting.
Performance and Optimization:
- Monitor and improve Snowflake performance through workload management, clustering, and caching strategies.
- Optimize data models and query execution plans to ensure high performance and scalability.
Integration and Automation:
- Integrate Snowflake with ETL/ELT tools such as Informatica, Talend, DBT, or Apache Airflow.
- Enable seamless data exchange with other platforms like AWS, Azure, GCP, and BI tools (e.g., Tableau, Power BI).
Data Governance and Security:
- Implement robust data governance policies, including access control, data masking, and encryption.
- Ensure compliance with regulatory standards and best practices for data security.
Collaboration and Leadership:
- Collaborate with data engineers, analysts, and business stakeholders to define requirements and deliver solutions.
- Provide technical guidance and mentorship to junior team members.
Documentation and Support:
- Document data architecture, processes, and workflows to ensure clarity and maintainability.
- Provide ongoing support for Snowflake environments and troubleshoot issues as needed.

Required Technical Skill Set:
- Snowflake Expertise: Strong experience with the Snowflake Data Cloud (minimum 3-5 years). Proficiency in Snowflake features like SnowSQL, Time Travel, Data Sharing, and Materialized Views.
- Data Warehousing and Modeling: Extensive experience in data warehousing concepts, star and snowflake schemas, and dimensional modeling. Strong understanding of ETL/ELT processes and tools.
- Programming and Scripting: Proficiency in SQL and scripting languages like Python or Java. Experience with data orchestration frameworks like Apache Airflow or DBT.
- Cloud Platforms: Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, and their integration with Snowflake.
- Performance Tuning: Expertise in query optimization, clustering strategies, and workload management in Snowflake.
- Data Integration: Experience integrating Snowflake with BI tools (e.g., Tableau, Power BI, Qlik) and data lakes.

Preferred Skills and Certifications:
- Certifications: Snowflake SnowPro Core Certification (highly preferred); AWS Certified Data Analytics, Google Professional Data Engineer, or Azure Data Engineer certifications.
- Additional Skills: Familiarity with machine learning workflows using Snowflake. Knowledge of NoSQL databases and Big Data technologies like Hadoop or Spark.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 8+ years of experience in data architecture, with a focus on Snowflake and cloud data platforms.
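The clustering strategies mentioned above pay off through partition pruning: when data is well ordered on the filter key, most micro-partitions can be skipped using only their min/max metadata. A toy Python model of that effect (partition size and data are made up; real Snowflake micro-partitions are sized by storage, not row count):

```python
def prunable_partitions(values, partition_size, lo, hi):
    """Split `values` into fixed-size partitions and count how many must be
    scanned for a range filter [lo, hi], using only per-partition min/max
    metadata -- the way Snowflake prunes micro-partitions on a clustered key."""
    parts = [values[i:i + partition_size]
             for i in range(0, len(values), partition_size)]
    scanned = sum(1 for p in parts if min(p) <= hi and max(p) >= lo)
    return len(parts), scanned

clustered = list(range(100))        # table sorted on the filter key
total, scanned = prunable_partitions(clustered, 10, 42, 44)
print(total, scanned)               # only the one overlapping partition is scanned
```

On an unclustered key the same filter would overlap most partitions' min/max ranges, so nearly everything gets scanned; that contrast is what clustering depth metrics quantify.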
Posted 3 months ago
9 - 14 years
19 - 32 Lacs
Gurugram
Remote
ONLY Immediate Joiners

Requirement: Data Architect & Business Intelligence
Experience: 9+ Years
Location: Remote

Job Summary:
We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.
Posted 1 month ago
8 - 13 years
12 - 22 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Greetings of the day!

We have an URGENT on-rolls opening for the position of "Snowflake Architect" at one of our reputed clients, working from home.

Name of the Company: Confidential
Rolls: Onrolls
Mode of Employment: FTE / Sub-Con / Contract
Job Location: Remote
Job Work Timings: Night shift, 06.00 pm to 03.00 am IST
Nature of Work: Work from Home
Working Days: 5 days weekly
Educational Qualification: Bachelor's degree in computer science, BCA, engineering, or a related field
Salary: Maximum CTC would be 23 LPA (salary and benefits package will be commensurate with experience and qualifications; PF and medical insurance cover available)
Languages Known: English, Hindi, and the local language
Experience: 9+ years of relevant experience in the same domain

Job Summary:
We are seeking a highly skilled and experienced Snowflake Architect to lead the design, development, and implementation of scalable, secure, and high-performance data warehousing solutions on the Snowflake platform. The ideal candidate will possess deep expertise in data modelling, cloud architecture, and modern ELT frameworks. You will be responsible for architecting robust data pipelines, optimizing query performance, and ensuring enterprise-grade data governance and security. In this role, you will collaborate with data engineers, analysts, and business stakeholders to deliver efficient data solutions that drive informed decision-making across the organization.

Key Responsibilities:
- Manage and maintain the Snowflake platform to ensure optimal performance and reliability.
- Collaborate with data engineers and analysts to design and implement data pipelines.
- Develop and optimize SQL queries for efficient data retrieval and manipulation.
- Create custom scripts and functions using JavaScript and Python to automate platform tasks.
- Troubleshoot platform issues and provide timely resolutions.
- Implement security best practices to protect data within the Snowflake platform.
- Stay updated on the latest Snowflake features and best practices to continuously improve platform performance.

Required Qualifications:
- Bachelor's degree in computer science, engineering, or a related field.
- A minimum of nine years of experience managing any database platform.
- Proficiency in SQL for data querying and manipulation.
- Strong programming skills in JavaScript and Python.
- Experience in optimizing and tuning Snowflake for performance.

Preferred Skills: technical expertise, cloud & integration, performance & optimization, security & governance, soft skills.

THE PERSON SHOULD BE WILLING TO JOIN IN 07-10 DAYS' TIME OR BE AN IMMEDIATE JOINER.

Interested candidates, please share your updated resume with us at executivehr@monalisammllp.com, or call / WhatsApp us at 9029895581, along with the following details:
- Current/last net in-hand salary (an offer will be made based on the interview/technical evaluation process)
- Notice period and last working day (was/will be)
- Reason for changing jobs
- Total years of experience in the specific field
- The location you are from
- Do you hold any offer from any other organisation?

Regards,
Monalisa Group of Services
HR Department
9029895581 (call / WhatsApp)
executivehr@monalisammllp.com
Posted 1 month ago