7.0 - 11.0 years
0 Lacs
maharashtra
On-site
As a skilled Snowflake Developer with over 7 years of experience, you will be responsible for designing, developing, and optimizing Snowflake data solutions. Your expertise in Snowflake SQL, ETL/ELT pipelines, and cloud data integration will be crucial in building scalable data warehouses, implementing efficient data models, and ensuring high-performance data processing in Snowflake.

Your key responsibilities will include:
- Designing and developing Snowflake databases, schemas, tables, and views following best practices.
- Writing complex SQL queries, stored procedures, and UDFs for data transformation.
- Optimizing query performance using clustering, partitioning, and materialized views.
- Implementing Snowflake features such as Time Travel, Zero-Copy Cloning, and Streams & Tasks (example below).
- Building and maintaining ETL/ELT pipelines using Snowflake, Snowpark, Python, or Spark.
- Integrating Snowflake with cloud storage (S3, Blob) and data ingestion tools (Snowpipe).
- Developing CDC (Change Data Capture) and real-time data processing solutions.
- Designing star schema, snowflake schema, and data vault models in Snowflake.
- Implementing data sharing, secure views, and dynamic data masking.
- Ensuring data quality, consistency, and governance across Snowflake environments.
- Monitoring and optimizing Snowflake warehouse performance (scaling, caching, resource usage).
- Troubleshooting data pipeline failures, latency issues, and query bottlenecks.
- Collaborating with data analysts, BI teams, and business stakeholders to deliver data solutions.
- Documenting data flows, architecture, and technical specifications.
- Mentoring junior developers on Snowflake best practices.

Required Skills & Qualifications:
- 7+ years in database development, data warehousing, or ETL.
- 4+ years of hands-on Snowflake development experience.
- Strong SQL or Python skills for data processing.
- Experience with Snowflake utilities (SnowSQL, Snowsight, Snowpark).
- Knowledge of cloud platforms (AWS/Azure) and data integration tools (Coalesce, Airflow, DBT).
- Certifications: SnowPro Core Certification (preferred).

Preferred Skills:
- Familiarity with data governance and metadata management.
- Familiarity with DBT, Airflow, SSIS, and IICS.
- Knowledge of CI/CD pipelines (Azure DevOps).
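By way of illustration, a minimal Snowflake SQL sketch of the Time Travel and Zero-Copy Cloning features named above; the table and schema names are hypothetical:

```sql
-- Query a table as it existed 30 minutes ago (Time Travel).
-- ORDERS is a hypothetical table name.
SELECT *
FROM orders AT(OFFSET => -60 * 30);

-- Restore a dropped table within the retention window.
UNDROP TABLE orders;

-- Zero-Copy Clone: instant, storage-free copy for dev/testing.
CREATE TABLE orders_dev CLONE orders;

-- Clone an entire schema as of a specific timestamp.
CREATE SCHEMA analytics_backup CLONE analytics
  AT(TIMESTAMP => '2024-01-01 00:00:00'::TIMESTAMP_LTZ);
```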
Posted 1 day ago
8.0 - 12.0 years
0 Lacs
indore, madhya pradesh
On-site
We are seeking a highly skilled ETL Developer with expertise in data ingestion and extraction to join our team. With 8-12 years of experience, you specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows specifically for Snowflake. You will collaborate with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance.

Your responsibilities will include:
- Designing and implementing processes to extract data from various sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications.
- Ensuring seamless data ingestion into Snowflake, using tools like SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran); an example appears below.
- Developing robust solutions for data ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies.
- Performing complex data transformations within Snowflake using SQL-based ELT methodologies, implementing incremental loading strategies, and tracking data changes using Change Data Capture (CDC) techniques.
- Optimizing transformation processes for performance and scalability, leveraging Snowflake's native capabilities such as clustering, materialized views, and UDFs.
- Designing and maintaining ETL pipelines capable of efficiently processing terabytes of data; optimizing jobs for performance, parallelism, and data compression; and ensuring error logging, retry mechanisms, and real-time monitoring for robust pipeline operation.
- Implementing mechanisms for data validation, integrity checks, duplicate handling, and consistency verification.
- Collaborating with stakeholders to ensure adherence to data governance standards and compliance requirements.
- Working closely with data engineers, analysts, and business stakeholders to define requirements and deliver high-quality solutions, and documenting data workflows, technical designs, and operational procedures.

Your expertise should include 8-12 years of experience in ETL development and data engineering, with significant experience in Snowflake. You should be proficient in Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP) for data extraction. Strong SQL skills, performance optimization techniques, data transformation expertise, and soft skills such as strong analytical thinking, problem-solving, and excellent communication are essential for this role.

Location: Bhilai, Indore
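A rough sketch of the COPY INTO and Snowpipe ingestion path described above, assuming a hypothetical S3 bucket, stage, and target table:

```sql
-- Hypothetical external stage over an S3 bucket.
CREATE STAGE raw_stage
  URL = 's3://example-bucket/landing/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- One-off batch load.
COPY INTO raw.customers
FROM @raw_stage/customers/
ON_ERROR = 'CONTINUE';

-- Continuous ingestion: Snowpipe auto-loads new files
-- as S3 event notifications arrive.
CREATE PIPE raw.customers_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.customers
  FROM @raw_stage/customers/;
```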
Posted 2 days ago
5.0 - 8.0 years
10 - 14 Lacs
Gurugram
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Apache Spark
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will be responsible for designing, building, and configuring applications. Acting as the primary point of contact, you will lead the development team, oversee the delivery process, and ensure successful project execution.

Roles & Responsibilities:
- Act as a Subject Matter Expert (SME) in application development
- Lead and manage a development team to achieve performance goals
- Make key technical and architectural decisions
- Collaborate with cross-functional teams and stakeholders
- Provide technical solutions to complex problems across multiple teams
- Oversee the complete application development lifecycle
- Gather and analyze requirements in coordination with stakeholders
- Ensure timely and high-quality delivery of projects

Professional & Technical Skills:
Must-Have Skills:
- Proficiency in Apache Spark
- Strong understanding of big data processing
- Experience with data streaming technologies
- Hands-on experience in building scalable, high-performance applications
- Knowledge of cloud computing platforms
Must-Have Additional Skills:
- PySpark
- Spark SQL / SQL
- AWS

Additional Information:
- This is a full-time, on-site role based in Gurugram
- Candidates must have a minimum of 5 years of hands-on experience with Apache Spark
- A minimum of 15 years of full-time formal education is mandatory

Qualification: 15 years full time education
Posted 4 days ago
7.0 - 12.0 years
22 - 27 Lacs
Hyderabad, Pune, Mumbai (All Areas)
Work from Office
Job Description - Snowflake Developer
Experience: 7+ years
Location: India, Hybrid
Employment Type: Full-time

Job Summary: We are looking for a Snowflake Developer with 7+ years of experience to design, develop, and maintain our Snowflake data platform. The ideal candidate will have strong expertise in Snowflake SQL, data modeling, and ETL/ELT processes to build efficient and scalable data solutions.

Key Responsibilities:
1. Snowflake Development & Implementation
- Design and develop Snowflake databases, schemas, tables, and views
- Write and optimize complex SQL queries, stored procedures, and UDFs
- Implement Snowflake features (Time Travel, Zero-Copy Cloning, Streams & Tasks; example below)
- Manage virtual warehouses, resource monitors, and cost optimization
2. Data Pipeline & Integration
- Build and maintain ETL/ELT pipelines using Snowflake and tools like Snowpark, Python, or Spark
- Integrate Snowflake with cloud storage (S3, Blob Storage) and data sources (APIs)
- Develop data ingestion processes (batch and real-time) using Snowpipe
3. Performance Tuning & Optimization
- Optimize query performance through clustering, partitioning, and indexing
- Monitor and troubleshoot data pipelines and warehouse performance
- Implement caching strategies and materialized views for faster analytics
4. Data Modeling & Governance
- Design star schema, snowflake schema, and normalized data models
- Implement data security (RBAC, dynamic data masking, row-level security)
- Ensure data quality, documentation, and metadata management
5. Collaboration & Support
- Work with analysts, BI teams, and business users to deliver data solutions
- Document technical specifications and data flows
- Provide support and troubleshooting for Snowflake-related issues

Required Skills & Qualifications:
- 7+ years in database development, data warehousing, or ETL
- 3+ years of hands-on Snowflake development experience
- Strong SQL and scripting (Python, Bash) skills
- Experience with Snowflake utilities (SnowSQL, Snowsight)
- Knowledge of cloud platforms (AWS, Azure) and data integration tools
- SnowPro Core Certification (preferred but not required)
- Experience with Coalesce, DBT, Airflow, or other data orchestration tools
- Familiarity with CI/CD pipelines and DevOps practices
- Knowledge of data visualization tools (Power BI, Tableau)
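A minimal sketch of the Streams & Tasks pattern referenced above for incremental (CDC-style) processing; table, task, and warehouse names are hypothetical:

```sql
-- Stream records inserts/updates/deletes on the source table.
CREATE STREAM orders_stream ON TABLE raw.orders;

-- Task runs on a schedule, but only when the stream has data,
-- and merges changes into the curated table.
CREATE TASK merge_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO curated.orders t
  USING orders_stream s ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status
  WHEN NOT MATCHED THEN INSERT (order_id, status)
    VALUES (s.order_id, s.status);

ALTER TASK merge_orders RESUME;  -- tasks are created suspended
```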
Posted 1 week ago
5.0 - 10.0 years
6 - 16 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Responsibilities - A day in the life of an Infoscion:
- As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
- You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements.
- You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
- You would be a key contributor to building efficient programs/systems.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of project domain
- Ability to translate functional/nonfunctional requirements to systems requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of latest technologies and trends
- Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
- Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills:
- Technology->Data on Cloud-DataStore->Snowflake
Posted 1 week ago
3.0 - 5.0 years
3 - 7 Lacs
Gurugram
Work from Office
About your role:
An Expert Engineer is a seasoned technology expert, highly skilled in programming, engineering, and problem-solving, who delivers value to the business quickly and with superlative quality. Their code and designs meet business, technical, non-functional, and operational requirements most of the time without defects or incidents. If a relentless drive towards technical and engineering excellence and adding value to the business excites you, this is absolutely the role for you. If technical discussions and whiteboarding with peers excite you, and pair programming and code reviews add fuel to your tank, we are looking for you. You will understand system requirements, then analyse, design, develop, and test application systems following the defined standards. The candidate is expected to display professional ethics in their approach to work and exhibit a high level of ownership within a demanding working environment.

About you - Essential Skills:
- Excellent software designing, programming, engineering, and problem-solving skills.
- Strong experience working on data ingestion, transformation, and distribution using AWS or Snowflake.
- Exposure to SnowSQL, Snowpipe, role-based access controls (example below), and ETL/ELT tools like NiFi, Matillion, and DBT.
- Hands-on working knowledge of EC2, Lambda, ECS/EKS, DynamoDB, and VPCs.
- Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, and with integrating common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality).
- Experience with designing, implementing, and overseeing the integration of data systems and ETL processes through SnapLogic.
- Designing data ingestion and orchestration pipelines using AWS and Control-M.
- Establishing strategies for data extraction, ingestion, transformation, automation, and consumption.
- Experience in data lake concepts with structured, semi-structured, and unstructured data.
- Experience in creating CI/CD processes for Snowflake.
- Experience in strategies for data testing, data quality, code quality, and code coverage.
- Ability, willingness, and openness to experiment with, evaluate, and adopt new technologies.
- Passion for technology, problem solving, and teamwork.
- Go-getter: able to navigate across roles, functions, and business units to collaborate and drive agreements and changes from drawing board to live systems.
- Lifelong learner who can bring contemporary practices, technologies, and ways of working to the organization.
- Effective collaborator, adept at using all effective modes of communication and collaboration tools.
- Experience delivering on data-related non-functional requirements, including:
  - Hands-on experience dealing with large volumes of historical data across markets/geographies.
  - Manipulating, processing, and extracting value from large, disconnected datasets.
  - Building water-tight data quality gates on investment management data.
  - Generic handling of standard business scenarios in case of missing data, holidays, out-of-tolerance errors, etc.

Experience and Qualification:
- B.E./B.Tech. or M.C.A. in Computer Science from a reputed university
- Total 7 to 10 years of relevant experience

Personal Characteristics:
- Good interpersonal and communication skills; strong team player.
- Ability to work at a strategic and tactical level.
- Ability to convey strong messages in a polite but firm manner.
- Self-motivation is essential; should demonstrate commitment to high-quality design and development.
- Ability to develop and maintain working relationships with several stakeholders.
- Flexibility and an open attitude to change.
- Problem-solving skills with the ability to think laterally, and to think with a medium-term and long-term perspective.
- Ability to learn and quickly get familiar with a complex business and technology environment.
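A minimal sketch of the role-based access controls this role calls for, with hypothetical role, warehouse, schema, and user names:

```sql
-- Functional read-only role for analysts.
CREATE ROLE analyst_ro;
GRANT USAGE ON WAREHOUSE reporting_wh TO ROLE analyst_ro;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro;
-- Cover tables created later, too.
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro;
GRANT ROLE analyst_ro TO USER jane_doe;
```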
Posted 1 week ago
6.0 - 10.0 years
5 - 8 Lacs
Greater Noida
Work from Office
Job Description:
- Experience implementing Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.
- Expertise in deploying Snowflake features such as data sharing (example below), events, and lake-house patterns.
- Proficiency in RDBMS, complex SQL, PL/SQL, performance tuning, and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline related problems.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Experience with data security, data access controls, and design.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Experience in Snowflake modelling - roles, schemas, databases.
- Extensive hands-on expertise with creation of stored procedures and advanced SQL.
- Collaborate with data engineers, analysts, and stakeholders to understand data requirements and translate them into DBT models.
- Develop and enforce best practices for version control, testing, and documentation of DBT models.
- Build and manage data quality checks and validation processes within the DBT pipelines.
- Ability to optimize SQL queries for performance and efficiency.
- Good to have experience in Azure services such as ADF and Databricks, and in data pipeline building.
- Excellent analytical and problem-solving skills.
- Working experience in an Agile methodology.
- Knowledge of DevOps processes (including CI/CD) and Power BI.
- Excellent communication skills.
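For the data sharing item above, a minimal sketch using a secure view and an outbound share; all object names and the consumer account locator are hypothetical:

```sql
-- Secure view limits what the consumer can see.
CREATE SECURE VIEW analytics.marts.v_orders_shared AS
  SELECT order_id, order_date, region, amount
  FROM analytics.marts.orders
  WHERE region = 'EU';

-- Share the view with another Snowflake account; the reader
-- gets live, zero-copy access and no data is physically moved.
CREATE SHARE orders_share;
GRANT USAGE ON DATABASE analytics TO SHARE orders_share;
GRANT USAGE ON SCHEMA analytics.marts TO SHARE orders_share;
GRANT SELECT ON VIEW analytics.marts.v_orders_shared TO SHARE orders_share;
ALTER SHARE orders_share ADD ACCOUNTS = xy12345;  -- hypothetical account locator
```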
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
You should have a strong combination of ETL and automation QA skills, with a focus on scripting in Python or Java Selenium. As a QA professional with 2-4 years of experience, you must be proficient in SQL or SnowSQL, ETL, and database testing (sample validation queries appear below).

Your technical expertise should include a solid understanding of programming languages like Java or Python, along with automation frameworks. You should have hands-on experience in creating and running Python-based automation scripts and developing end-to-end automated testing solutions using Python. Your expertise in automation frameworks such as Selenium, PyTest, or Robot Framework will be valuable. It would be beneficial to have experience with CI/CD pipelines and integrating automation scripts into them, as well as exposure to Big Data platforms and Azure environments. A good understanding of Agile methodology and experience using tools like JIRA is essential.

Your additional responsibilities will include reviewing requirements, creating test plans, and preparing estimations. You will be responsible for planning, authoring, and executing test cases for compliance products in client projects, as well as performing test iterations, defect tracking, and result reporting. Collaboration with developers, technical leads, and stakeholders throughout the Software Development Life Cycle (SDLC) is crucial, and you will need to coordinate with clients and development teams to resolve any roadblocks effectively.

Regular documentation, tracking, and escalation of issues/feedback are part of your responsibilities. Running both manual and automated test scripts and identifying areas for automation are key tasks. You should proactively seek managerial support for risks and mitigation plans to ensure the successful implementation of testing processes and procedures.
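For the SQL/ETL testing side of this role, a couple of sample validation queries of the kind such a suite might automate; the staging and warehouse tables are hypothetical:

```sql
-- Row-count reconciliation between staging and target tables;
-- a non-zero diff indicates a load defect.
SELECT
    (SELECT COUNT(*) FROM staging.claims)     AS src_rows,
    (SELECT COUNT(*) FROM warehouse.claims)   AS tgt_rows,
    (SELECT COUNT(*) FROM staging.claims)
  - (SELECT COUNT(*) FROM warehouse.claims)   AS row_diff;

-- Spot orphaned foreign keys introduced by a bad transform.
SELECT f.claim_id
FROM warehouse.claims f
LEFT JOIN warehouse.members m ON f.member_id = m.member_id
WHERE m.member_id IS NULL;
```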
Posted 1 week ago
8.0 - 13.0 years
15 - 25 Lacs
Chennai, Coimbatore, Bengaluru
Work from Office
Role Overview: We are looking for a skilled Data Engineer with hands-on expertise in Snowflake, Python, and AWS services to support end-to-end data migration and warehousing solutions. The ideal candidate will design, develop, and maintain scalable Snowflake data structures, implement efficient ETL pipelines using Python and AWS (S3, Lambda, Step Functions), and collaborate with cross-functional teams to deliver high-performance data solutions. Proficiency in Power BI for dashboarding and data visualization is essential. Experience in healthcare data is a plus.

Key Responsibilities:
- Design, develop, test, deploy, and maintain Snowflake tables, views, stored procedures, streams, and tasks using SnowSQL.
- Develop ETL processes using Python and AWS services such as S3, Lambda, and Step Functions to migrate data from various sources into Snowflake.
- SQL expertise: write optimized queries for data extraction from Snowflake.
- Experience in data modeling: star schema, snowflake schema, and dimensional modeling (example below).
- Collaborate with cross-functional teams to gather requirements and design solutions for complex data warehousing projects.
- Troubleshoot issues related to Snowpipe performance optimization and implement improvements.
- Ensure high availability of the system by implementing redundancy measures.
- Proficiency in Power BI: dashboard creation, data visualization, DAX, Power Query, and SQL (T-SQL, stored procedures).

Job Requirements:
- Strong understanding of data migration concepts, including ETL process development using Python and AWS services like S3, Lambda, and Step Functions. Healthcare experience is an additional advantage.
- Strong proficiency in ReactJS, NodeJS, NextJS, and VueJS.
- Experience leading teams as a Tech Lead or Engineering Lead for at least 2 years.
- Solid understanding of RESTful APIs, microservices architecture, and web application security.
- Good exposure to CI/CD, version control systems (Git), and containerization tools (Docker; Kubernetes is a plus).
- Excellent problem-solving, communication, and leadership skills.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Ready to lead with technology? Send your resume to Praveen.thangavel@concertidc.com.
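A minimal star-schema sketch for the dimensional-modeling item above; table and column names are hypothetical, and note that Snowflake records but does not enforce PK/FK constraints:

```sql
-- Dimension table (one row per customer).
CREATE TABLE dw.dim_customer (
    customer_sk  NUMBER AUTOINCREMENT PRIMARY KEY,
    customer_id  VARCHAR NOT NULL,   -- natural/business key
    name         VARCHAR,
    region       VARCHAR
);

-- Fact table referencing the dimension by surrogate key.
CREATE TABLE dw.fact_claims (
    claim_id     VARCHAR NOT NULL,
    customer_sk  NUMBER REFERENCES dw.dim_customer (customer_sk),
    claim_date   DATE,
    amount       NUMBER(12,2)
);
```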
Posted 1 week ago
4.0 - 6.0 years
20 - 25 Lacs
Noida, Pune, Chennai
Work from Office
We are seeking a skilled and detail-oriented Data Engineer with 4 to 6 years of hands-on experience in Microsoft Fabric, Snowflake, and Matillion. The ideal candidate will play a key role in supporting MS Fabric and migrating from MS Fabric to Snowflake and Matillion.

Roles and Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines using Matillion and integrate data from various sources.
- Architect and optimize Snowflake data warehouses, ensuring efficient data storage, querying, and performance tuning.
- Leverage Microsoft Fabric for end-to-end data engineering tasks, including data ingestion, transformation, and reporting.
- Collaborate with data analysts, scientists, and business stakeholders to deliver high-quality, consumable data products.
- Implement data quality checks, monitoring, and observability across pipelines.
- Automate data workflows and support CI/CD practices for data deployments.
- Troubleshoot performance bottlenecks and data pipeline failures with a root-cause analysis mindset.
- Maintain thorough documentation of data processes, pipelines, and architecture.

Required Skills:
- Strong expertise with:
  - Microsoft Fabric (Dataflows, Pipelines, Lakehouse, Notebooks, etc.)
  - Snowflake (warehouse sizing, SnowSQL, performance tuning)
  - Matillion (ETL/ELT orchestration, job optimization, connectors)
- Proficiency in SQL and data modeling (dimensional/star schema, normalization).
- Experience with Python or other scripting languages for data manipulation.
- Familiarity with version control tools (e.g., Git) and CI/CD workflows.
- Solid understanding of cloud data architecture (Azure preferred).
- Strong problem-solving and debugging skills.
Posted 1 week ago
3.0 - 7.0 years
5 - 9 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Description: KPI Partners is seeking an experienced Senior Snowflake Administrator to join our dynamic team. In this role, you will be responsible for managing and optimizing our Snowflake environment to ensure performance, reliability, and scalability. Your expertise will contribute to designing and implementing best practices to facilitate efficient data warehousing solutions.

Key Responsibilities:
- Administer and manage the Snowflake platform, ensuring optimal performance and security (a cost-control example appears below).
- Monitor system performance, troubleshoot issues, and implement necessary solutions.
- Collaborate with data architects and engineers to design data models and optimal ETL processes.
- Conduct regular backups and recovery procedures to protect data integrity.
- Implement user access controls and security measures to safeguard data.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Participate in the planning and execution of data migration to Snowflake.
- Provide support for data governance and compliance initiatives.
- Stay updated with Snowflake features and best practices, and provide recommendations for continuous improvement.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in database administration, with a strong focus on Snowflake.
- Hands-on experience with SnowSQL, SQL, and data modeling.
- Familiarity with data ingestion tools and ETL processes.
- Strong problem-solving skills and the ability to work independently.
- Excellent communication skills and the ability to collaborate with technical and non-technical stakeholders.
- Relevant certifications in Snowflake or cloud data warehousing are a plus.

If you are a proactive, detail-oriented professional with a passion for data and experience in Snowflake administration, we would love to hear from you. Join KPI Partners and be part of a team that is dedicated to delivering exceptional data solutions for our clients.
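On the cost-control side of Snowflake administration, a minimal sketch of a resource monitor plus warehouse auto-suspend; the monitor name, quota, and warehouse are hypothetical:

```sql
-- Monthly credit quota with notify/suspend thresholds.
CREATE RESOURCE MONITOR monthly_cap
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 80 PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;

-- Attach the monitor and suspend idle compute to control spend.
ALTER WAREHOUSE reporting_wh SET
  RESOURCE_MONITOR = monthly_cap
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;
```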
Posted 1 week ago
6.0 - 10.0 years
15 - 25 Lacs
Hyderabad, Chennai
Hybrid
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes using Snowflake.
- Write complex SQL queries for data extraction, transformation, and analysis.
- Optimize performance of Snowflake data models and queries.
- Implement data warehouse solutions and integrate data from multiple sources.
- Create and manage Snowflake objects such as tables, views, schemas, stages, and file formats.
- Monitor and manage Snowflake compute resources and storage usage.
- Collaborate with data analysts, engineers, and business teams to understand data requirements.
- Ensure data quality, integrity, and security across all layers.
- Participate in code reviews and follow Snowflake best practices.

Required Skills:
- 7+ years of experience as a Data Engineer or Snowflake Developer.
- Strong hands-on experience with the Snowflake cloud data platform.
- Hands-on experience with Matillion ETL.
- Expert-level knowledge of SQL (joins, subqueries, CTEs, window functions, performance tuning; example below).
- Proficient in data modeling and warehousing concepts (star/snowflake schema, normalization, etc.).
- Experience with ETL tools (e.g., Informatica, Talend, Matillion, or custom scripts).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with version control tools (e.g., Git).
- Good understanding of data governance and data security best practices.
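As an example of the CTE and window-function skills listed above, a common dedup pattern (latest row per key); table and column names are hypothetical:

```sql
-- CTE + window function: keep only the latest record per claim,
-- a common dedup step when upstream sends multiple versions.
WITH ranked AS (
    SELECT
        c.*,
        ROW_NUMBER() OVER (
            PARTITION BY claim_id
            ORDER BY updated_at DESC
        ) AS rn
    FROM staging.claims c
)
SELECT * EXCLUDE rn   -- Snowflake's EXCLUDE drops the helper column
FROM ranked
WHERE rn = 1;
```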
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Hyderabad
Hybrid
Job Title: Snowflake Developer
Experience: 5+ years
Location: Hyderabad (Hybrid)
Job Type: Full-time

About Us: We're seeking an experienced Snowflake Developer to join our team in Pune and Hyderabad. As a Snowflake Developer, you will be responsible for designing, developing, and implementing data warehousing solutions using Snowflake. You will work closely with cross-functional teams to ensure seamless data integration and analytics.

Key Responsibilities:
- Design, develop, and deploy Snowflake-based data warehousing solutions
- Collaborate with stakeholders to understand data requirements and develop data models
- Optimize Snowflake performance, scalability, and security
- Develop and maintain Snowflake SQL scripts, stored procedures, and user-defined functions
- Troubleshoot data integration and analytics issues
- Ensure data quality, integrity, and compliance with organizational standards
- Work with data engineers, analysts, and scientists to ensure seamless data integration and analytics
- Stay up-to-date with Snowflake features and best practices

Requirements:
- 5+ years of experience in Snowflake development and administration
- Strong expertise in Snowflake architecture, data modeling, and SQL
- Experience with data integration tools (e.g., Informatica, Talend, Informatica PowerCenter)
- Proficiency in Snowflake security features and access control
- Strong analytical and problem-solving skills
- Excellent communication and collaboration skills
- Experience working in hybrid or remote teams
- Bachelor's degree in Computer Science, Engineering, or related field

Nice to Have:
- Experience with cloud platforms (AWS, Azure, GCP)
- Knowledge of data governance and data quality frameworks
- Experience with ETL/ELT tools (e.g., Informatica PowerCenter, Talend, Microsoft SSIS)
- Familiarity with data visualization tools (e.g., Tableau, Power BI)
- Experience working with agile methodologies
Posted 1 week ago
6.0 - 8.0 years
1 - 4 Lacs
Chennai
Hybrid
- 3+ years of experience as a Snowflake Developer or Data Engineer.
- Strong knowledge of SQL, SnowSQL, and Snowflake schema design.
- Experience with ETL tools and data pipeline automation.
- Basic understanding of US healthcare data (claims, eligibility, providers, payers).
- Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP).
- Familiarity with data governance, security, and compliance (HIPAA, HITECH).
Posted 1 week ago
2.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
You will be joining Birlasoft, a global leader in Cloud, AI, and Digital technologies, known for seamlessly blending domain expertise with enterprise solutions. As part of the diversified CKA Birla Group, with over 12,000 professionals, you will contribute to the Group's 170-year heritage of building sustainable communities.

We are currently seeking a Matillion Lead with 6-10 years of experience to join our team in PAN India. The ideal candidate should hold a B.E/B.Tech degree and have the following skills and qualifications:
- Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud.
- 10+ years of experience in DWH, with 2-4 years of experience implementing DWH on Snowflake using Matillion.
- Design, develop, and maintain ETL processes using Matillion for data extraction, transformation, and loading into Snowflake.
- Collaborate with data architects and business stakeholders to understand data requirements and provide technical solutions.
- Lead the end-to-end system and architecture design for applications and infrastructure.
- Conduct data validation, end-to-end testing of ETL objects, source data analysis, and data profiling.
- Troubleshoot and resolve issues related to Matillion development and data integration.
- Work with business users to create architecture aligned with business needs.
- Develop project requirements for end-to-end data integration processes using ETL for structured, semi-structured, and unstructured data.
- Strong understanding of ELT/ETL and integration concepts, as well as design best practices.
- Experience in performance tuning of Matillion Cloud data pipelines and quick issue resolution.
- Knowledge of SnowSQL and Snowpipe will be an added advantage.
- Follow best practices for Matillion development to ensure maintainability, scalability, and reusability of ETL processes.

In addition to the technical skills, the ideal candidate should possess excellent presentation and communication skills to interact with both technical and non-technical stakeholders. Resilience and the ability to thrive in a dynamic, fast-paced work environment are also essential for this role.
Posted 1 week ago
6.0 - 11.0 years
10 - 20 Lacs
Noida, Hyderabad, Pune
Work from Office
Minimum 6 years of experience in Snowflake development, including data modeling, performance tuning, and ELT pipelines. Strong proficiency in writing complex SQL queries, Snowflake procedures, and working with cloud data platforms.
Posted 2 weeks ago
6.0 - 8.0 years
1 - 4 Lacs
Chennai
Hybrid
Job Title: Snowflake Developer
Experience: 6-8 Years
Location: Chennai - Hybrid

Job Description:
- 3+ years of experience as a Snowflake Developer or Data Engineer.
- Strong knowledge of SQL, SnowSQL, and Snowflake schema design.
- Experience with ETL tools and data pipeline automation.
- Basic understanding of US healthcare data (claims, eligibility, providers, payers).
- Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP).
- Familiarity with data governance, security, and compliance (HIPAA, HITECH).
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
You are an experienced BI Architect with a strong background in Power BI and the Microsoft Azure ecosystem. Your main responsibility will be to design, implement, and enhance business intelligence solutions that aid strategic decision-making within the organization. You will lead the BI strategy, architecture, and governance processes, while also guiding a team of BI developers and data analysts.

Your key responsibilities will include:
- Designing and implementing scalable BI solutions using Power BI and Azure services.
- Defining BI architecture, data models, security models, and best practices for enterprise reporting.
- Collaborating closely with business stakeholders to gather requirements and transform them into data-driven insights.
- Overseeing data governance, metadata management, and Power BI workspace design, and optimizing Power BI datasets, reports, and dashboards for performance and usability.
- Establishing standards for data visualization, the development lifecycle, version control, and deployment.
- Mentoring BI developers and ensuring adherence to coding and architectural standards.
- Integrating Power BI with other applications using APIs, Power Automate, or embedded analytics.
- Monitoring and troubleshooting production BI systems to maintain high availability and data accuracy.

To qualify for this role, you should have a minimum of 12 years of overall experience, with at least 7 years of hands-on experience with Power BI, including expertise in data modeling, DAX, M/Power Query, custom visuals, and performance tuning. Strong familiarity with Azure services such as Azure SQL Database, Azure Data Lake, Azure Functions, and Azure DevOps is essential. You must also possess a solid understanding of data warehousing, ETL, and dimensional modeling concepts, along with proficiency in SQL, data transformation, and data governance principles.

Experience managing enterprise-level Power BI implementations with large user bases and complex security requirements, excellent communication and stakeholder management skills, and the ability to lead cross-functional teams and influence BI strategy across departments are also prerequisites for this role. Knowledge of Microsoft Fabric architecture and its components, a track record of managing BI teams of 6 or more, and the capability to provide technical leadership and team development are highly desirable. In addition, holding the Microsoft Fabric certification (DP-600) and PL-300 would be considered a bonus for this position.
Posted 2 weeks ago
6.0 - 11.0 years
20 - 25 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
- DBT: designing and developing technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data (a sample dbt model appears below).
- Ensure data quality in a big data environment.
- Very strong on PL/SQL: queries, procedures, JOINs.
- Snowflake SQL: writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to perform Extract, Load, and Transform operations.
- Good to have Talend knowledge and hands-on experience; candidates who have worked in PROD support would be preferred.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
- Perform data analysis, troubleshoot data issues, and provide technical support to end-users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Complex problem-solving capability and a continuous-improvement approach.
- Talend/Snowflake certification is desirable.
- Excellent SQL coding skills, and excellent communication and documentation skills.
- Familiar with the Agile delivery process.
- Must be analytical, creative, and self-motivated.
- Work effectively within a global team environment.
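Given the DBT focus above, a minimal sketch of an incremental dbt model; the model, source, and column names are hypothetical:

```sql
-- models/marts/fct_orders.sql (hypothetical dbt model)
-- Incremental materialization: only new rows are merged on each
-- run, keyed on order_id.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_date,
    amount
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- On incremental runs, limit the scan to rows newer than the target.
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
```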
Posted 2 weeks ago
6.0 - 11.0 years
7 - 17 Lacs
Gurugram
Work from Office
We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP. As a Data Engineer, you will develop end-to-end ETL/ELT pipelines.
Posted 3 weeks ago
8.0 - 13.0 years
35 - 45 Lacs
Noida, Pune, Bengaluru
Hybrid
Role & responsibilities:
- Implementing data management solutions.
- Certified Snowflake cloud data warehouse architect.
- Deep understanding of star and snowflake schemas and dimensional modelling.
- Experience in the design and implementation of data pipelines and ETL processes using Snowflake.
- Optimize data models for performance and scalability.
- Collaborate with various technical and business stakeholders to define data requirements.
- Ensure data quality and governance best practices are followed.
- Experience with data security and data access controls in Snowflake.
- Expertise in complex SQL, Python scripting, and performance tuning.
- Expertise in Snowflake concepts such as setting up resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, and time travel, and automating them.
- Experience in handling semi-structured data (JSON, XML) and columnar PARQUET using the VARIANT attribute in Snowflake (example below).
- Experience in re-clustering data in Snowflake with a good understanding of micro-partitions.
- Experience in migration processes to Snowflake from an on-premises database environment.
- Experience in designing and building manual or auto-ingestion data pipelines using Snowpipe.
- SnowSQL experience in developing stored procedures and writing queries to analyze and transform data.

Must-have skills: Certified Snowflake Architect, Snowflake architecture, Snowpipe, SnowSQL, SQL, CI/CD, and Python.

Perks and benefits:
- Competitive compensation package.
- Opportunity to work with industry leaders.
- Collaborative and innovative work environment.
- Professional growth and development opportunities.
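For the semi-structured (VARIANT) item above, a minimal sketch of landing and querying JSON; table, column, and path names are hypothetical:

```sql
-- Land raw JSON in a VARIANT column, then query it with path
-- notation and FLATTEN.
CREATE TABLE raw.events (payload VARIANT, loaded_at TIMESTAMP_LTZ);

SELECT
    payload:user.id::NUMBER     AS user_id,
    payload:event_type::STRING  AS event_type,
    f.value:sku::STRING         AS sku
FROM raw.events,
     LATERAL FLATTEN(input => payload:items) f   -- one row per array element
WHERE payload:event_type = 'purchase';
```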
Posted 3 weeks ago
3.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Educational: Bachelor of Engineering, BCS, BBA, BCom, MCA, MSc
Service Line: Data & Analytics Unit

Responsibilities:
Has good knowledge of Snowflake architecture, including:
- Virtual warehouses - multi-cluster warehouses, autoscaling (example below)
- Metadata and system objects - query history, grants to users, grants to roles, users
- Micro-partitions, table clustering, auto-reclustering
- Materialized views and their benefits
- Data protection with Time Travel in Snowflake (extremely important)
- Analyzing queries using Query Profile / explain plans (extremely important)
- Cache architecture
- Named stages, direct loading, Snowpipe, data sharing, streams, JavaScript procedures, and tasks

Strong ability to design and develop workflows in Snowflake on at least one cloud technology (preferably AWS). Apply Snowflake programming and ETL experience to write Snowflake SQL and maintain a complex, internally developed reporting system. Preferably, knowledge of ETL activities such as data processing from multiple source systems, and extensive knowledge of query performance tuning. Apply knowledge of BI tools. Manage time effectively: accurately estimate effort for tasks, meet agreed-upon deadlines, and effectively juggle ad-hoc requests and longer-term projects.

Snowflake performance specialist:
- Familiar with zero-copy cloning and with using Time Travel features to clone tables.
- Familiar with the Snowflake query profile, what each step does, and identifying performance bottlenecks from the query profile.
- Understanding of when a table needs to be clustered, and choosing the right cluster key as part of table design to help query optimization.
- Working with materialized views and the benefits-vs-cost trade-off.
- How Snowflake micro-partitions are maintained and the performance implications with respect to micro-partitions, pruning, etc.
- Horizontal vs. vertical scaling, and when to do what; the concept of multi-cluster warehouses and autoscaling.
- Advanced SQL knowledge, including window functions and recursive queries, and the ability to understand and rewrite complex SQLs as part of performance optimization.

Additional Responsibilities:
Domain: Data Warehousing, Business Intelligence
Precise work location: Bhubaneswar, Bangalore, Hyderabad, Pune

Technical and Professional:
Mandatory skills: Snowflake
Desired skills: Teradata/Python (not mandatory)

Preferred Skills:
Cloud Platform - Snowflake
Technology - OpenSystem - Python - OpenSystem
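A minimal sketch of the multi-cluster warehouse and autoscaling concepts above (multi-cluster warehouses assume Enterprise edition); names and sizes are hypothetical:

```sql
-- Multi-cluster warehouse: scales OUT (1..4 clusters) for
-- concurrency spikes; resizing (scale UP) handles heavier queries.
CREATE WAREHOUSE bi_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD'
  AUTO_SUSPEND = 120
  AUTO_RESUME = TRUE;

-- Surface slow statements as candidates for Query Profile analysis.
SELECT query_id, total_elapsed_time, bytes_scanned
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
ORDER BY total_elapsed_time DESC
LIMIT 10;
```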
Posted 3 weeks ago
10.0 - 17.0 years
35 - 45 Lacs
Pune, Bangalore Rural, Chennai
Work from Office
We are a search and staffing firm catering to the hiring needs of top IT giants, product companies, and captives. One such client, a leading IT services company in the healthcare domain, is looking for:

Job Summary: We are looking for an experienced Snowflake Architect to lead the design and implementation of scalable, high-performance data platforms on Snowflake. The ideal candidate should have a strong background in data architecture, data modeling, cloud platforms, and data warehousing, with deep expertise in Snowflake.

Key Responsibilities:
- Architect and design Snowflake-based data warehouse solutions for large-scale enterprise environments.
- Collaborate with business and data teams to understand data requirements and translate them into scalable architecture.
- Define best practices for data loading, transformation (ELT/ETL), and performance tuning in Snowflake.
- Lead the migration of existing data warehouses (e.g., Teradata, Oracle, Netezza) to Snowflake.
- Implement security and governance practices, including role-based access control, masking (example below), and data auditing.
- Optimize Snowflake storage and compute usage to ensure cost-effective solutions.
- Develop reusable frameworks and automation for data ingestion and transformation using tools like DBT, Airflow, or custom scripts.
- Guide and mentor development teams on Snowflake usage, query optimization, and design standards.
- Evaluate emerging data technologies and provide recommendations for adoption.

Required Skills and Experience:
- 10+ years in data architecture, data engineering, or related roles.
- 3+ years of hands-on experience with the Snowflake data platform.
- Expertise in SQL, performance tuning, and data modeling (3NF, dimensional, etc.).
- Experience with ELT tools (e.g., Informatica, Talend, DBT, Matillion).
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and native integration with Snowflake.
- Familiarity with Python or scripting for automation and orchestration.
- Knowledge of CI/CD pipelines and version control using Git.
- Excellent communication and stakeholder management skills.

Please note: We are looking for candidates who can join on an immediate basis. Please be assured that your resume will be kept strictly confidential and will only be taken ahead with your consent.
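For the masking item under security and governance, a minimal dynamic data masking sketch; the policy, role, table, and column names are hypothetical:

```sql
-- Masking policy: privileged roles see clear text,
-- everyone else sees a masked value.
CREATE MASKING POLICY mask_ssn AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PHI_ADMIN') THEN val
    ELSE 'XXX-XX-' || RIGHT(val, 4)
  END;

-- Attach the policy to a column; enforcement is automatic at query time.
ALTER TABLE members MODIFY COLUMN ssn
  SET MASKING POLICY mask_ssn;
```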
Posted 3 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Work Location: Bangalore, Chennai, Hyderabad, Pune, Bhubaneshwar, Kochi
Experience: 5-10 years

Job Description:
- Hands-on experience in Snowflake
- Experience in Snowpipe and SnowSQL
- Strong data warehouse experience

Please share your updated profile with suganya@spstaffing.in if you are actively looking for a change.
Posted 3 weeks ago
0.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Principal Consultant - Sr. Snowflake Data Engineer (Python + Cloud)!

In this role, the Sr. Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Experience in the IT industry.
- Working experience with building productionized data ingestion and processing pipelines in Snowflake.
- Strong understanding of Snowflake architecture; fully well-versed in data warehousing concepts.
- Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing tools.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem-solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Works independently on business problems and generates meaningful insights.
- Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory.
- Should have experience implementing Snowflake best practices.
- Snowflake SnowPro Core Certification is a must.

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to the customer, working with the offshore team, etc.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, tasks, streams, Time Travel, cloning, Optimizer, Metadata Manager, data sharing, stored procedures, UDFs, and Snowsight.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark.
- Should have some experience with Snowflake RBAC and data security.
- Should have good experience implementing CDC or SCD Type 2 (example below).
- Should have good experience implementing Snowflake best practices.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Should have experience building data ingestion pipelines, and optimizing and tuning data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience in deployment using CI/CD tools and experience with repositories such as Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant experience as a Senior Snowflake Data Engineer.

Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, Data Modeling and Data Warehousing concepts.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
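Since the role calls out CDC/SCD Type 2, a minimal two-statement sketch of an SCD Type 2 load; table and column names are hypothetical, and statement order matters (expire first, then insert):

```sql
-- 1. Expire the current dimension row when tracked attributes change.
UPDATE dim_customer d
SET valid_to = CURRENT_TIMESTAMP(),
    is_current = FALSE
FROM staging.customers s
WHERE d.customer_id = s.customer_id
  AND d.is_current
  AND d.address <> s.address;

-- 2. Insert the new version (covers changed customers, whose current
--    row was just expired, as well as brand-new customers).
INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, CURRENT_TIMESTAMP(), NULL, TRUE
FROM staging.customers s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL;
```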
Posted 4 weeks ago