4.0 - 9.0 years
9 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We have an opportunity for an AWS Snowflake DBT Developer at PwC AC.

Position: AWS Snowflake DBT Developer
Experience Required: 4-8 Years
Notice Period: Immediate to 60 Days
Locations: Bangalore, Hyderabad, Kolkata, Chennai, Pune, Gurgaon & Mumbai
Work Mode: Hybrid

Must Have Skills:
1) AWS
2) Databricks
3) Snowflake data warehousing, including SQL and Snowpipe
4) SnowSQL, Snowpipe
5) dbt (Data Build Tool)

Must Have:
- Experience in architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions
- Strong expertise in the end-to-end implementation of cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modelling techniques using Python / Java
- Experience in loading disparate data sets and translating complex functional and technical requirements into detailed design
- Awareness of Snowflake features such as data sharing, events, and lake-house patterns
- Experience with data security and data access controls and design
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modeling)
- Proficient in Lambda and Kappa architectures
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Good knowledge of Big Data frameworks and related technologies; experience in Hadoop and Spark is mandatory
- Strong experience with AWS compute services such as EMR, Glue, and SageMaker, and storage services such as S3, Redshift, and DynamoDB
- Good experience with at least one AWS streaming service: Kinesis, SQS, or MSK
- Troubleshooting and performance tuning experience in the Spark framework (Spark Core, Spark SQL, and Spark Streaming)
- Experience with one of the flow tools: Airflow, NiFi, or Luigi
- Good knowledge of application DevOps tools (Git, CI/CD frameworks); experience in Jenkins or GitLab, with rich experience in source code management tools such as CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules
- Strong understanding of cloud data migration processes, methods, and project lifecycle
- Good analytical and problem-solving skills
- Good communication and presentation skills

Desired Knowledge / Skills:
- Experience in building stream-processing systems using solutions such as Storm or Spark Streaming
- Experience with Big Data ML toolkits such as Mahout, SparkML, or H2O
- Knowledge of Python
- Experience in offshore / onsite engagements
- Experience with AWS services like STEP (Step Functions) and Lambda

Interested candidates, kindly share profiles at dhamma.b.bhawsagar@pwc.com, MB 8459621589
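To ground the SnowSQL, Snowpipe, and Python-based ETL pipeline skills this posting asks for, here is a minimal, illustrative sketch of the kind of Snowflake session such work starts from, using the snowflake-connector-python package; the account, credentials, warehouse, and table names are hypothetical placeholders rather than details from the advertisement.

```python
# Minimal sketch: open a Snowflake session from Python and run a query,
# the starting point for SnowSQL / ETL pipeline work.
# Account, credentials, warehouse, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1", user="ETL_USER", password="***",  # placeholders
    warehouse="DEV_WH", database="ANALYTICS", schema="SALES",
)
try:
    cur = conn.cursor()
    cur.execute("""
        SELECT region, SUM(amount) AS total_amount
        FROM orders
        GROUP BY region
        ORDER BY total_amount DESC
    """)
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```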
Posted 15 hours ago
6.0 - 10.0 years
9 - 13 Lacs
Mumbai, Bengaluru
Work from Office
Job Title: Snowflake Developer with Oracle GoldenGate / Data Engineer

About Oracle FSGIU - Finergy:
The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.

Responsibilities:
- Snowflake Data Modeling & Architecture: Design and implement scalable Snowflake data models using best practices such as the Snowflake Data Vault methodology.
- Real-Time Data Replication & Ingestion: Utilize Oracle GoldenGate for Big Data to manage real-time data streaming and optimize Snowpipe for automated data ingestion.
- Cloud Integration & Management: Work with AWS services (S3, EC2, Lambda) to integrate and manage Snowflake-based solutions.
- Data Sharing & Security: Implement SnowShare for data sharing and enforce security measures such as role-based access control (RBAC), data masking, and encryption.
- CI/CD Implementation: Develop and manage CI/CD pipelines for Snowflake deployment and data transformation processes.
- Collaboration & Troubleshooting: Partner with cross-functional teams to address data-related challenges and optimize performance.
- Documentation & Best Practices: Maintain detailed documentation for data architecture, ETL processes, and Snowflake configurations.
- Performance Optimization: Continuously monitor and enhance the efficiency of Snowflake queries and data pipelines.

Mandatory Skills:
- 4 years of experience as a Data Engineer
- Strong expertise in Snowflake architecture, data modeling, and query optimization
- Proficiency in SQL for writing and optimizing complex queries
- Hands-on experience with Oracle GoldenGate for Big Data for real-time data replication
- Knowledge of Snowpipe for automated data ingestion
- Familiarity with AWS cloud services (S3, EC2, Lambda, IAM) and their integration with Snowflake
- Experience with CI/CD tools (e.g., Jenkins, GitLab) for automating workflows
- Working knowledge of the Snowflake Data Vault methodology

Good to Have Skills:
- Exposure to Databricks for data processing and analytics
- Knowledge of Python or Scala for data engineering tasks
- Familiarity with Terraform or CloudFormation for infrastructure as code (IaC)
- Experience in data governance and compliance best practices
- Understanding of ML and AI integration with data pipelines

Self-Test Questions:
- Do I have hands-on experience in designing and optimizing Snowflake data models?
- Can I confidently set up and manage real-time data replication using Oracle GoldenGate?
- Have I worked with Snowpipe to automate data ingestion processes?
- Am I proficient in SQL and capable of writing optimized queries in Snowflake?
- Do I have experience integrating Snowflake with AWS cloud services?
- Have I implemented CI/CD pipelines for Snowflake development?
- Can I troubleshoot performance issues in Snowflake and optimize queries effectively?
- Have I documented data engineering processes and best practices for team collaboration?
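The responsibilities above call out Snowpipe-based ingestion and role-based access control. Below is a minimal sketch of both, assuming a pre-existing external stage and using illustrative object names (none taken from the posting).

```python
# Minimal sketch: a Snowpipe for automated ingestion plus simple RBAC grants.
# Account, stage, table, and role names are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.us-east-1", user="DATA_ENG", password="***",  # placeholders
    role="SYSADMIN", warehouse="INGEST_WH", database="STAGING", schema="TXN",
)
cur = conn.cursor()

# Pipe that auto-ingests new files landing in an (assumed) external stage.
cur.execute("""
    CREATE PIPE IF NOT EXISTS staging.txn.payments_pipe
      AUTO_INGEST = TRUE
      AS COPY INTO staging.txn.payments
         FROM @staging.txn.payments_stage
         FILE_FORMAT = (TYPE = JSON)
""")

# Role-based access control: read-only access for an analyst role.
cur.execute("CREATE ROLE IF NOT EXISTS analyst_ro")
cur.execute("GRANT USAGE ON DATABASE staging TO ROLE analyst_ro")
cur.execute("GRANT USAGE ON SCHEMA staging.txn TO ROLE analyst_ro")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA staging.txn TO ROLE analyst_ro")

cur.close()
conn.close()
```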
Posted 19 hours ago
3.0 - 8.0 years
5 - 15 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Educational Requirements: Bachelor of Engineering, BSc, BCA, MCA, MTech, MSc
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion
• As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand client requirements in a detailed manner and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of the project domain
• Ability to translate functional / non-functional requirements to system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of the latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology -> Data on Cloud - DataStore -> Snowflake

Preferred Skills:
Technology -> Data on Cloud - DataStore -> Snowflake
Posted 1 day ago
3.0 - 8.0 years
5 - 8 Lacs
Pune, Chennai, Mumbai (All Areas)
Hybrid
Educational Requirements: Bachelor of Engineering, BSc, BCA, MCA, MTech, MSc
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion
• As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand client requirements in a detailed manner and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of the project domain
• Ability to translate functional / non-functional requirements to system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of the latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology -> Data on Cloud - DataStore -> Snowflake

Preferred Skills:
Technology -> Data on Cloud - DataStore -> Snowflake
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
Maharashtra
On-site
Role Overview:
As an AWS Senior Developer at PwC - AC, you will collaborate with the Offshore Manager and Onsite Business Analyst to understand the requirements and take charge of implementing end-to-end cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS. Your role will involve strong proficiency in AWS cloud technology, exceptional planning and organizational skills, and the ability to work as a cloud developer/lead in an agile team to deliver automated cloud solutions.

Key Responsibilities:
- Proficient in Azure Data Services, Databricks, and Snowflake
- Deep understanding of traditional and modern data architecture and processing concepts
- Proficient in Azure ADLS, Databricks, Data Flows, HDInsight, Azure Analysis Services
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and ETL data pipelines using Python / Java
- Building stream-processing systems with solutions like Storm or Spark Streaming
- Designing and implementing scalable ETL/ELT pipelines using Databricks and Apache Spark
- Optimizing data workflows and ensuring efficient data processing
- Understanding big data use cases and Lambda/Kappa design patterns
- Implementing Big Data solutions using the Microsoft Data Platform and Azure Data Services
- Exposure to open-source technologies such as Apache Spark, Hadoop, NoSQL, Kafka, Solr/Elasticsearch
- Driving adoption and rollout of Power BI dashboards for finance stakeholders
- Well-versed with quality processes and implementation
- Experience in application DevOps tools such as Git, CI/CD frameworks, Jenkins, or GitLab
- Guiding the assessment and implementation of finance data marketplaces
- Good communication and presentation skills

Qualifications Required:
- BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA
- Certification in AWS architecture desirable
- Experience in building stream-processing systems with Storm or Spark Streaming
- Experience in Big Data ML toolkits like Mahout, SparkML, or H2O
- Knowledge of Python
- Experience in offshore / onsite engagements
- Experience in AWS services like STEP (Step Functions) and Lambda
- Good project management skills with consulting experience in complex program delivery
- Good to have knowledge of cloud platforms (AWS, GCP), Informatica Cloud, Oracle Cloud, and cloud data warehouses (Snowflake & DBT)

Additional Details:
- Travel Requirements: Travel to client locations may be required as per project requirements.
- Line of Service: Advisory
- Horizontal: Technology Consulting
- Designation: Senior Associate
- Location: Anywhere in India

Apply now if you believe you can contribute to PwC's innovative and collaborative environment and be a part of a globally recognized firm dedicated to excellence and diversity.
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
Bangalore, Karnataka
On-site
Role Overview:
You will be responsible for architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions. Your role will involve designing scalable data architectures with Snowflake, integrating cloud technologies such as AWS, Azure, and GCP, and ETL/ELT tools like DBT. Additionally, you will guide teams in proper data modeling, transformation, security, and performance optimization.

Key Responsibilities:
- Architect and deliver highly scalable, distributed, cloud-based enterprise data solutions
- Design scalable data architectures with Snowflake and integrate cloud technologies like AWS, Azure, GCP, and ETL/ELT tools such as DBT
- Guide teams in proper data modeling (star and snowflake schemas), transformation, security, and performance optimization
- Load data from disparate data sets and translate complex functional and technical requirements into detailed design
- Deploy Snowflake features such as data sharing, events, and lake-house patterns
- Implement data security, data access controls, and design
- Understand relational and NoSQL data stores, methods, and approaches (star and snowflake, dimensional modeling)
- Utilize AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
- Implement Lambda and Kappa architectures
- Utilize Big Data frameworks and related technologies, with mandatory experience in Hadoop and Spark
- Utilize AWS compute services like AWS EMR, Glue, and SageMaker, as well as storage services like S3, Redshift, and DynamoDB
- Work with AWS streaming services like AWS Kinesis, AWS SQS, and AWS MSK
- Troubleshoot and perform performance tuning in the Spark framework (Spark Core, SQL, and Spark Streaming)
- Experience with flow tools like Airflow, NiFi, or Luigi
- Knowledge of application DevOps tools (Git, CI/CD frameworks); experience in Jenkins or GitLab with rich experience in source code management tools such as CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules

Qualifications Required:
- 8-12 years of relevant experience
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modelling techniques using Python/Java
- Strong expertise in the end-to-end implementation of cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS
- Proficiency in AWS, Databricks, and Snowflake data warehousing, including SQL and Snowpipe
- Experience in data security, data access controls, and design
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Good knowledge of Big Data frameworks and related technologies, with mandatory experience in Hadoop and Spark
- Good experience with AWS compute services like AWS EMR, Glue, and SageMaker, and storage services like S3, Redshift, and DynamoDB
- Experience with AWS streaming services like AWS Kinesis, AWS SQS, and AWS MSK
- Troubleshooting and performance tuning experience in the Spark framework (Spark Core, SQL, and Spark Streaming)
- Experience in one of the flow tools: Airflow, NiFi, or Luigi
- Good knowledge of application DevOps tools (Git, CI/CD frameworks); experience in Jenkins or GitLab with rich experience in source code management tools such as CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules

Kindly share your profile at dhamma.b.bhawsagar@pwc.com if you are interested in this opportunity.
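Among the features this role expects candidates to have deployed is Snowflake data sharing. Here is a minimal sketch of Secure Data Sharing, with a placeholder consumer account and object names (all assumptions, not details from the posting).

```python
# Minimal sketch: expose a table to another Snowflake account via a secure share.
# The consumer account identifier and all object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="ADMIN_USER", password="***",  # placeholders
    role="ACCOUNTADMIN",
)
cur = conn.cursor()
cur.execute("CREATE SHARE IF NOT EXISTS sales_share")
cur.execute("GRANT USAGE ON DATABASE analytics TO SHARE sales_share")
cur.execute("GRANT USAGE ON SCHEMA analytics.mart TO SHARE sales_share")
cur.execute("GRANT SELECT ON TABLE analytics.mart.daily_revenue TO SHARE sales_share")
# Attach a (hypothetical) consumer account to the share.
cur.execute("ALTER SHARE sales_share ADD ACCOUNTS = xy54321")
cur.close()
conn.close()
```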
Posted 3 days ago
4.0 - 9.0 years
5 - 15 Lacs
Ahmedabad, Chennai, Bengaluru
Work from Office
Educational Requirements: Bachelor of Engineering, BSc, BCA, MCA, MTech, MSc
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion
• As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand client requirements in a detailed manner and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of the project domain
• Ability to translate functional / non-functional requirements to system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of the latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology -> Data on Cloud - DataStore -> Snowflake

Preferred Skills:
Technology -> Data on Cloud - DataStore -> Snowflake
Posted 4 days ago
9.0 - 14.0 years
20 - 35 Lacs
Mumbai, Pune, Chennai
Work from Office
We're Hiring: Snowflake Developer

Primary Skills: Snowflake, SnowSQL, Snowpipe, SQL
Location: Chennai / Mumbai / Pune
Apply/Share your profile: VimalaP3@hexaware.com / https://forms.office.com/r/sU5bhsquHS

Experience Required:
- 8+ years of experience as a Snowflake developer or data engineer with a focus on data warehousing and ETL
- Snowflake certification(s) is a plus

Key Skills:
- Strong SQL skills and proficiency in data modeling and database design
- Knowledge of cloud data warehousing concepts and best practices
- Familiarity with data integration tools and technologies
- Solid understanding of data governance, data security, and compliance requirements
- Experience with version control systems and deployment processes
- Excellent problem-solving and troubleshooting skills
- Strong communication and collaboration abilities
- Ability to work in an Agile or iterative development environment

Responsibilities:
- Collaborate with data architects to design and develop Snowflake data models and schemas
- Write complex SQL queries, stored procedures, and user-defined functions (UDFs) to support data analytics and reporting needs
- Ensure SQL code follows best practices for readability and performance
- Develop ETL (Extract, Transform, Load) processes to ingest data from various sources into Snowflake
- Design and implement data pipelines using Snowflake features like tasks and streams
- Strong knowledge of Snowpipe and SnowSQL

#Hiring #Snowflake #DataEngineer #SQL #ETL #DataWarehousing #Careers
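One responsibility above is building pipelines with Snowflake tasks and streams. Below is a minimal sketch of that pattern, with illustrative table, stream, and task names (assumptions made for the example, not the employer's schema).

```python
# Minimal sketch: a stream captures changes on a landing table and a scheduled
# task applies the captured rows to a curated table. All names are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="DEV_USER", password="***",  # placeholders
    role="SYSADMIN", warehouse="XFORM_WH", database="DW", schema="CORE",
)
cur = conn.cursor()

# Change-capture stream on the landing table.
cur.execute("CREATE STREAM IF NOT EXISTS dw.core.orders_stream ON TABLE dw.core.orders_raw")

# Task that runs every 15 minutes, but only when the stream has new rows.
cur.execute("""
    CREATE TASK IF NOT EXISTS dw.core.orders_apply_task
      WAREHOUSE = XFORM_WH
      SCHEDULE  = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('dw.core.orders_stream')
    AS
      INSERT INTO dw.core.orders_curated
      SELECT order_id, customer_id, amount, updated_at
      FROM dw.core.orders_stream
""")
cur.execute("ALTER TASK dw.core.orders_apply_task RESUME")  # tasks are created suspended
cur.close()
conn.close()
```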
Posted 4 days ago
5.0 - 10.0 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Exp: 5 - 12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, SQL Queries, Snowpipe, SnowSQL, Performance Tuning, Query Optimization, Stored Procedures, DBT, Matillion and ETL

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Snowflake utilities, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities:
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java or Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.
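The responsibilities mention Slowly Changing Dimension Type-2 modeling, which dbt normally implements through its snapshot feature. The sketch below only illustrates the underlying SCD Type-2 pattern in plain SQL issued from Python; the table and column names are assumptions, not the employer's schema.

```python
# Minimal sketch of SCD Type-2: expire changed dimension rows, then insert
# fresh "current" versions. All object and column names are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="DEV_USER", password="***",  # placeholders
    warehouse="XFORM_WH", database="DW", schema="DIM",
)
cur = conn.cursor()

# 1) Close out current rows whose tracked attributes changed in the staging feed.
cur.execute("""
    MERGE INTO dw.dim.customer_scd2 AS d
    USING dw.stg.customer AS s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND (d.email <> s.email OR d.segment <> s.segment) THEN
      UPDATE SET valid_to = CURRENT_TIMESTAMP(), is_current = FALSE
""")

# 2) Insert a new "current" version for customers with no open row
#    (brand-new customers and the ones just expired above).
cur.execute("""
    INSERT INTO dw.dim.customer_scd2
        (customer_id, email, segment, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.email, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM dw.stg.customer AS s
    LEFT JOIN dw.dim.customer_scd2 AS d
           ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL
""")
cur.close()
conn.close()
```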
Posted 5 days ago
8.0 - 13.0 years
32 - 45 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
We have an opportunity for a Snowflake + Dremio + Databricks role at PwC AC.

Position: Snowflake + Dremio + Databricks
Experience Required: 8-12 Years
Notice Period: Immediate to 60 Days
Locations: Bangalore, Hyderabad, Kolkata, Chennai, Pune, Gurgaon & Mumbai
Work Mode: Hybrid
Education: BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA

Must Have Skills:
1) Azure Data Services
2) Databricks
3) Snowflake data warehousing, including SQL and Snowpipe
4) SnowSQL, Snowpipe
5) Dremio

Requirements:
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modelling techniques using Python / Java
- Proven experience architecting and implementing large-scale data solutions on Snowflake, including data ingestion, transformation, and optimization
- Proficiency in Azure Databricks, including Spark architecture and optimization
- Experience migrating data from relational databases to Snowflake and optimizing Snowflake features such as data sharing, events, and lakehouse patterns
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse
- Hands-on expertise with Dremio for data lake query acceleration, data virtualization, and managing diverse data formats (e.g., JSON, XML, CSV); handling large and complex sets of XML, JSON, and CSV from various sources and databases
- Rich experience working in Azure ADLS, Databricks, Data Flows, HDInsight, and Azure Analysis Services
- Experience loading disparate data sets and translating complex functional and technical requirements into detailed design
- Knowledge of data security, access controls, and governance within cloud-native data platforms like Snowflake and Dremio
- Exposure to AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
- Good understanding of data quality processes, methods, and project lifecycle
- Experience validating ETL and writing SQL queries
- Strong knowledge of DWH/ODS, ETL concepts, and modeling structure principles
- Clear understanding of the DW lifecycle, with contributions to technical design documents and test plans

Interested candidates, kindly share profiles at dhamma.b.bhawsagar@pwc.com, MB 8459621589
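The skills above include handling semi-structured JSON alongside relational data. Here is a minimal sketch of Snowflake's VARIANT type and LATERAL FLATTEN, using an illustrative sample document and table name (assumptions for the example only).

```python
# Minimal sketch: land a JSON document in a VARIANT column, then flatten the
# nested array into one row per element. Names and the document are illustrative.
import json
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="DEV_USER", password="***",  # placeholders
    warehouse="DEV_WH", database="RAW", schema="EVENTS",
)
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS raw.events.clickstream (payload VARIANT)")

doc = {"session": "s-001",
       "clicks": [{"page": "/home", "ms": 120}, {"page": "/pricing", "ms": 340}]}
cur.execute(
    "INSERT INTO raw.events.clickstream SELECT PARSE_JSON(%s)",
    (json.dumps(doc),),
)

# Flatten the nested array into one row per click.
cur.execute("""
    SELECT payload:session::STRING AS session_id,
           c.value:page::STRING    AS page,
           c.value:ms::NUMBER      AS dwell_ms
    FROM raw.events.clickstream,
         LATERAL FLATTEN(input => payload:clicks) c
""")
for row in cur.fetchall():
    print(row)
cur.close()
conn.close()
```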
Posted 5 days ago
6.0 - 10.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Job Description

Position Title: Data Engineer (Snowflake Lead)
Experience: 7+ Years
Shift Schedule: Rotational Shifts
Location: Hyderabad

Role Overview:
We are seeking an experienced Data Engineer with strong expertise in Snowflake to join the Snowflake Managed Services team. This role involves data platform development, enhancements, and production support across multiple clients. You will be responsible for ensuring stability, performance, and continuous improvement of Snowflake environments.

Key Responsibilities:
- Design, build, and optimize Snowflake data pipelines, data models, and transformations.
- Provide L2/L3 production support for Snowflake jobs, queries, and integrations.
- Troubleshoot job failures, resolve incidents, and perform root cause analysis (RCA).
- Monitor warehouses, tune queries, and optimize Snowflake performance and costs.
- Manage service requests such as user provisioning, access control, and role management.
- Create and maintain documentation, runbooks, and standard operating procedures.

Required Skills & Experience:
- 5+ years of hands-on experience in Snowflake development and support.
- Strong expertise in SQL, data modeling, and query performance tuning.
- Experience with ETL/ELT development and orchestration tools (e.g., Azure Data Factory).
- Familiarity with CI/CD pipelines and scripting (Python or PySpark).
- Strong troubleshooting and incident resolution skills.

Preferred Skills:
- SnowPro Core Certification.
- Experience with ticketing systems (ServiceNow, Jira).
- Hands-on experience with Azure cloud services.
- Knowledge of ITIL processes.
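A core duty in this role is monitoring warehouses and tuning query performance and cost. Below is a minimal sketch that pulls the slowest recent queries from ACCOUNT_USAGE and resizes a warehouse; the warehouse name and look-back window are assumptions for illustration.

```python
# Minimal sketch: find the slowest queries of the last 24 hours and scale a
# warehouse up for a heavy batch window. Names and thresholds are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="OPS_USER", password="***",  # placeholders
    role="ACCOUNTADMIN",
)
cur = conn.cursor()

# Top 10 slowest queries over the last 24 hours (ACCOUNT_USAGE has some latency).
cur.execute("""
    SELECT query_id, user_name, warehouse_name,
           total_elapsed_time / 1000 AS elapsed_s
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for row in cur.fetchall():
    print(row)

# Temporarily scale a (hypothetical) warehouse up for a heavy batch window.
cur.execute("ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'LARGE'")
cur.close()
conn.close()
```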
Posted 6 days ago
3.0 - 8.0 years
10 - 19 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
CitiusTech is conducting a drive this weekend, on 13th Sep 2025, for the below skill set.

Required Skillset: Snowflake + Matillion or Snowflake + ADF
Total Years of Experience: 3 to 12 years
Relevant Experience: minimum 3 to 4 years
Work Location: Chennai, Mumbai, Pune, Bengaluru
Work Mode: Hybrid
Interview Mode: Virtual
Interview Date: 13th Sep 2025

Interested candidates, kindly share your updated resume to gopinath.r@citiustech.com.

Thanks & Regards,
Gopinath R
Posted 6 days ago
8.0 - 13.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Hybrid
Role & Responsibilities (Snowflake JD):
- Must have 8-15 years of experience in Data Warehouse, ETL, and BI projects
- Must have at least 5+ years of experience in Snowflake and 3+ years in DBT
- Expertise in Snowflake architecture is a must
- Must have at least 3+ years of experience and a strong hold in Python/PySpark
- Must have experience implementing complex stored procedures and standard DWH and ETL concepts
- Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Good to have experience with AWS services and creating DevOps templates for various AWS services
- Experience in using GitHub and Jenkins
- Good communication and analytical skills
- Snowflake certification is desirable
Posted 1 week ago
6.0 - 11.0 years
7 - 17 Lacs
Gurugram
Work from Office
We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP. As a Data Engineer, you will develop end-to-end ETL/ELT pipelines.
Posted 1 week ago
4.0 - 8.0 years
5 - 15 Lacs
Bengaluru
Work from Office
Primary Roles and Responsibilities:
- Develop Modern Data Warehouse solutions using Snowflake, Databricks, and ADF.
- Provide solutions that are forward-thinking in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop a data model to fulfill reporting requirements.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with client architects and team members.
- Orchestrate the data pipelines in the scheduler via Airflow.

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience.
- Must have 6+ years of total IT experience and 3+ years of experience in Data Warehouse/ETL projects.
- Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
- Deep understanding of Star and Snowflake dimensional modeling.
- Strong knowledge of Data Management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL and Spark (PySpark).
- Experience in building ETL / data warehouse transformation processes.
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting, and query optimization.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Mandatory Skills: Snowflake / Azure Data Factory
Required Skills: Snowflake AND SQL
Posted 1 week ago
7.0 - 12.0 years
20 - 25 Lacs
Noida
Remote
Job Title: Snowflake Solution Architect
Experience: 7+ Years
Location: 100% Remote
Job Type: Contract

Job Summary:
We are seeking a skilled and detail-oriented Snowflake developer to design, develop, and maintain scalable data solutions using the Snowflake platform. The ideal candidate will have experience in data warehousing, ETL/ELT processes, cloud-based data architecture, and Agile-based development, with strong problem-solving skills and proficiency in writing performant SQL queries/scripts to generate business insights and drive better organizational decision-making.

Key Responsibilities:
• Design and implement data pipelines using Snowflake, SQL, and ETL tools.
• Develop and optimize complex SQL queries for data extraction and transformation.
• Create and manage Snowflake objects such as databases, schemas, tables, views, and stored procedures.
• Integrate Snowflake with various data sources and third-party tools.
• Monitor and troubleshoot performance issues in Snowflake environments.
• Collaborate with data engineers, analysts, and business stakeholders to understand data requirements.
• Ensure data quality, security, and governance standards are met.
• Automate data workflows and implement best practices for data management.

Required Skills and Qualifications:
• Proficiency in Snowflake SQL and Snowflake architecture.
• Experience with ETL/ELT tools (e.g., Informatica, Talend, dbt, Matillion).
• Strong knowledge of cloud platforms (AWS, Azure, or GCP).
• Familiarity with data modeling and data warehousing concepts.
• Experience with Python, Java, or shell scripting is a plus.
• Understanding of data security, role-based access control, and data sharing in Snowflake.
• Excellent problem-solving and communication skills.

Preferred Qualifications:
• Snowflake certification (e.g., SnowPro Core).
• Experience with CI/CD pipelines and DevOps practices.
• Knowledge of BI tools like Tableau, Power BI, or Looker.

Share resume to: Vishal Kumar, Vishal@akaasa.com
Posted 1 week ago
5.0 - 7.0 years
12 - 20 Lacs
Pune
Work from Office
About the Role:
We are looking for an experienced Data Engineer to lead the migration of our data platform from Amazon Redshift to Snowflake. This role involves re-engineering existing data logic, building efficient pipelines, and ensuring seamless performance optimization in Snowflake.

Key Responsibilities:
- Analyze and extract existing data logic, queries, and transformations from Redshift.
- Rewrite and optimize SQL queries and data transformations in Snowflake.
- Design and implement ETL/data pipelines to migrate and sync data (S3 to Snowflake using Snowpipe, bulk copy, etc.).
- Ensure high performance through Snowflake-specific optimizations (clustering, caching, warehouse scaling).
- Collaborate with cross-functional teams to validate data accuracy and business requirements.
- Monitor, troubleshoot, and improve ongoing data workflows.

Required Skills & Experience:
- 5-8 years of experience in Data Engineering.
- Strong SQL expertise in both Redshift and Snowflake.
- Proven experience in data migration projects, specifically Redshift to Snowflake.
- Hands-on experience with ETL/data pipeline development (using Python, Airflow, Glue, dbt, or similar tools).
- Solid understanding of the AWS ecosystem, particularly S3-to-Snowflake ingestion.
- Experience in performance tuning and optimization within Snowflake.
- Strong problem-solving skills and ability to work independently.

Nice to Have:
- Experience with dbt, Airflow, AWS Glue, or other orchestration tools.
- Knowledge of modern data architecture and best practices.

Work Mode: Initially remote for 12 months, then onsite in Pune.
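The migration responsibilities above revolve around bulk-loading Redshift UNLOAD output from S3 into Snowflake. Here is a minimal sketch of a file format, an external stage, and a COPY INTO; the bucket, credentials, and table names are placeholders, not details from the posting.

```python
# Minimal sketch: bulk-load pipe-delimited Redshift UNLOAD files from S3 into
# Snowflake. All identifiers and credentials are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="MIG_USER", password="***",  # placeholders
    role="SYSADMIN", warehouse="MIGRATION_WH", database="DW", schema="STAGE",
)
cur = conn.cursor()

cur.execute("""
    CREATE FILE FORMAT IF NOT EXISTS dw.stage.redshift_unload_fmt
      TYPE = CSV FIELD_DELIMITER = '|' NULL_IF = ('')
""")
cur.execute("""
    CREATE STAGE IF NOT EXISTS dw.stage.orders_unload_stage
      URL = 's3://example-migration-bucket/orders/'
      CREDENTIALS = (AWS_KEY_ID = '***' AWS_SECRET_KEY = '***')
      FILE_FORMAT = dw.stage.redshift_unload_fmt
""")
cur.execute("""
    COPY INTO dw.stage.orders
    FROM @dw.stage.orders_unload_stage
    PATTERN = '.*orders_part_.*'
    ON_ERROR = 'CONTINUE'
""")
print(cur.fetchall())  # one result row per loaded file
cur.close()
conn.close()
```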
Posted 1 week ago
5.0 - 8.0 years
5 - 14 Lacs
Gurugram, Bengaluru, Delhi / NCR
Work from Office
Position: ETL Tester
Experience: 5-8 years
Location: Gurugram / Bangalore
Joining: Immediate / [Specify Notice Period]

Job Summary:
We are looking for a skilled Snowflake ETL Tester to ensure the quality, integrity, and performance of our data integration processes. The role involves validating ETL jobs, verifying data accuracy between source and target systems, and ensuring adherence to business requirements.

Key Responsibilities:
- Analyze business and technical requirements to design comprehensive ETL test cases and scenarios.
- Validate ETL data flows, transformations, and load processes.
- Perform source-to-target data validation to ensure data accuracy, completeness, and consistency.
- Execute functional, integration, regression, and performance testing for ETL processes.
- Identify, document, and track defects; work closely with developers to ensure timely resolution.
- Create and maintain test plans, test cases, and test data for ETL processes.
- Validate job scheduling, monitoring, and error handling for ETL workflows.
- Collaborate with Business Analysts, Data Engineers, and DBAs to understand data requirements.
- Ensure compliance with data governance and quality standards.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of experience in ETL/Data Warehouse testing.
- Strong experience with SQL for complex data validation queries.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, SSIS, DataStage, Snowflake ETL).
- Proficient in analyzing data models, mapping documents, and transformation logic.
- Experience with Unix/Linux commands for job execution and log analysis.
- Familiarity with data warehouse concepts (star schema, snowflake schema, fact/dimension tables).
- Strong analytical, problem-solving, and debugging skills.
- Excellent communication and documentation skills.
- AWS or GCP.

Good to Have:
- Knowledge of cloud-based ETL/data platforms (AWS Glue, Azure Data Factory, GCP Dataflow).
- Experience with automation testing tools for ETL validation (e.g., QuerySurge, Selenium with DB, Python scripts).
- Exposure to BI tools (Tableau, Power BI, Qlik) for data validation at the reporting layer.
- Understanding of Big Data technologies (Hive, Spark, Hadoop).
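Source-to-target validation, the heart of this ETL testing role, typically comes down to comparing counts and checksums between staging and target tables. Below is a minimal sketch of that idea; the table and column names are assumptions for illustration only.

```python
# Minimal sketch: compare row count, a numeric sum, and the latest load timestamp
# between a source staging table and the loaded target. Names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="QA_USER", password="***",  # placeholders
    warehouse="QA_WH", database="DW", schema="PUBLIC",
)
cur = conn.cursor()

checks = {
    "row_count": "SELECT COUNT(*) FROM {t}",
    "amount_sum": "SELECT COALESCE(SUM(amount), 0) FROM {t}",
    "max_loaded_at": "SELECT MAX(updated_at) FROM {t}",
}
source, target = "dw.stg.orders", "dw.mart.orders"

for name, template in checks.items():
    cur.execute(template.format(t=source))
    src_val = cur.fetchone()[0]
    cur.execute(template.format(t=target))
    tgt_val = cur.fetchone()[0]
    status = "PASS" if src_val == tgt_val else "FAIL"
    print(f"{name}: source={src_val} target={tgt_val} -> {status}")

cur.close()
conn.close()
```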
Posted 1 week ago
3.0 - 7.0 years
5 - 15 Lacs
Hyderabad
Work from Office
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using Snowflake as the primary database engine.
- Collaborate with cross-functional teams to gather requirements and design solutions that meet business needs.
- Develop complex SQL queries to extract insights from large datasets stored in Snowflake tables.
- Troubleshoot issues related to data quality, performance tuning, and security compliance.
- Participate in code reviews to ensure adherence to coding standards and best practices.

Desired Candidate Profile:
- 3-7 years of experience working with Snowflake as a Data Engineer or in a similar role.
- Strong understanding of the SQL programming language, with the ability to write efficient queries for large datasets.
- Proficiency in the Python scripting language, with experience working with popular libraries such as Pandas and NumPy.
Posted 1 week ago
6.0 - 11.0 years
22 - 35 Lacs
New Delhi, Pune, Bengaluru
Work from Office
Job Description:
We are looking for an experienced Data Engineer / Technical Lead with mandatory expertise in Snowflake and dbt (data build tool), along with strong skills in Data Warehousing, Data Modeling, and Data Engineering. The ideal candidate will have hands-on experience in AWS data services (Aurora, RDS, DMS, DynamoDB), Oracle, PL/SQL, and SQL, coupled with leadership experience in customer-facing roles for requirement gathering and solution design.

Roles & Responsibilities:

Technical Leadership & Data Engineering:
- Lead end-to-end data warehousing projects using Snowflake and dbt (mandatory skills)
- Design and implement scalable data models, ETL/ELT pipelines, and data architecture in Snowflake
- Develop and optimize dbt models, transformations, and data workflows for efficient analytics
- Work with AWS data services (Aurora, RDS, DMS, DynamoDB) for data integration and migration
- Optimize SQL and PL/SQL queries and database performance (Oracle, PostgreSQL, etc.)
- Implement CDC (Change Data Capture) and real-time data processing solutions
- Ensure data governance, quality, and best practices in data engineering

Requirement Analysis & Customer Handling:
- Collaborate with stakeholders to gather, analyze, and document business requirements
- Handle Change Requests (CRs) and provide technical solutions aligned with business needs
- Act as a bridge between business teams and technical teams for data-driven decision-making

Team Leadership & Mentorship:
- Guide and mentor junior data engineers in Snowflake, dbt, and modern data stack best practices
- Ensure adherence to data security, compliance, and scalability principles

Mandatory Skills & Experience:
- 6-13 years of hands-on experience in Data Engineering, Data Warehousing, and Data Modeling
- Expert-level proficiency in Snowflake and dbt (data build tool) - mandatory
- Strong expertise in:
  - AWS data services (Aurora, RDS, DMS, DynamoDB)
  - Oracle, PL/SQL, SQL, and database optimization
  - ETL/ELT pipelines and data integration
  - Data architecture and dimensional modeling (Star Schema, Kimball, etc.)
- Experience in customer handling, requirement gathering, and solution design

Preferred Qualifications:
- Certification in Snowflake, dbt, or AWS data technologies
- Experience with Python, PySpark, or other scripting languages
- Knowledge of CI/CD pipelines for data workflows
Posted 1 week ago
3.0 - 5.0 years
10 - 20 Lacs
Hyderabad
Hybrid
Job Title: Snowflake Data Engineer
Key Skills: Snowflake, Snowpipe, Data Warehousing, Advanced SQL

Job Description:
- Design, develop, and maintain robust ELT/ETL data pipelines to load structured and semi-structured data into Snowflake.
- Implement data ingestion workflows using tools like Azure Data Factory, Informatica, DBT, or custom Python/SQL scripts.
- Write and optimize complex SQL queries, stored procedures, views, and UDFs within Snowflake.
- Use Snowpipe for continuous data ingestion and manage tasks, streams, and file formats for near real-time processing.
- Optimize query performance using techniques like clustering keys, result caching, materialized views, and pruning strategies.
- Monitor and tune warehouse sizing and usage to balance cost and performance.
- Design and implement data models (star, snowflake, normalized, or denormalized) suitable for analytical workloads.
- Create logical and physical data models for reporting and analytics use cases.
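The description above lists clustering keys and materialized views among the expected optimization techniques. Here is a minimal sketch of both, applied to illustrative objects (names are assumptions, not from the posting).

```python
# Minimal sketch: add a clustering key to a large fact table and create a
# materialized view for a hot aggregate. Object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="DEV_USER", password="***",  # placeholders
    role="SYSADMIN", warehouse="XFORM_WH", database="DW", schema="MART",
)
cur = conn.cursor()

# Cluster the fact table on the columns most queries filter by, so
# partition pruning can skip irrelevant micro-partitions.
cur.execute("ALTER TABLE dw.mart.fct_orders CLUSTER BY (order_date, region)")

# Materialized view for a frequently requested daily rollup.
cur.execute("""
    CREATE MATERIALIZED VIEW IF NOT EXISTS dw.mart.mv_daily_revenue AS
    SELECT order_date, region, SUM(amount) AS revenue
    FROM dw.mart.fct_orders
    GROUP BY order_date, region
""")
cur.close()
conn.close()
```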
Posted 1 week ago
3.0 - 8.0 years
5 - 15 Lacs
Hyderabad
Hybrid
Experience: 3 to 10 years
Location Preference: Hyderabad

Requirements:
- Hands-on experience with Snowflake data warehousing
- Strong proficiency in SQL for data manipulation and analysis
- Working knowledge of Snowpipe for real-time data ingestion
- Experience with data pipelines, ETL processes, and cloud platforms

If you're ready to take the next step in your data career and want to showcase your skills, please email us; our team is eager to meet passionate professionals like you! For any queries, feel free to reach out at priyanka.tiwari@brillio.com
Posted 1 week ago
6.0 - 10.0 years
15 - 25 Lacs
Hyderabad, Chennai
Hybrid
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes using Snowflake.
- Write complex SQL queries for data extraction, transformation, and analysis.
- Optimize performance of Snowflake data models and queries.
- Implement data warehouse solutions and integrate data from multiple sources.
- Create and manage Snowflake objects such as tables, views, schemas, stages, and file formats.
- Monitor and manage Snowflake compute resources and storage usage.
- Collaborate with data analysts, engineers, and business teams to understand data requirements.
- Ensure data quality, integrity, and security across all layers.
- Participate in code reviews and follow Snowflake best practices.

Required Skills:
- 7+ years of experience as a Data Engineer or Snowflake Developer.
- Strong hands-on experience with the Snowflake cloud data platform.
- Hands-on experience with Matillion ETL.
- Expert-level knowledge of SQL (joins, subqueries, CTEs, window functions, performance tuning).
- Proficient in data modeling and warehousing concepts (star/snowflake schema, normalization, etc.).
- Experience with ETL tools (e.g., Informatica, Talend, Matillion, or custom scripts).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with version control tools (e.g., Git).
- Good understanding of data governance and data security best practices.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a DBT professional, you will be responsible for designing, developing, and defining the technical architecture for data pipelines and performance scaling in a big data environment. Your expertise in PL/SQL, including queries, procedures, and JOINs, will be crucial for the integration of Talend data and ensuring data quality. You will also be proficient in Snowflake SQL, writing SQL queries against Snowflake, and developing scripts in Unix, Python, etc., to facilitate Extract, Load, and Transform operations. Hands-on experience and knowledge of Talend would be advantageous, and candidates with previous experience in production (PROD) support will be given preference.

Your role will involve working with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the query Optimizer, Metadata Manager, data sharing, and stored procedures. You will be responsible for data analysis, troubleshooting data issues, and providing technical support to end users.

In this position, you will develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Your problem-solving skills will be put to the test, and you will be expected to take a continuous-improvement approach. A Talend/Snowflake certification would be considered desirable. Excellent SQL coding, communication, and documentation skills are essential, and knowledge of the Agile delivery process is preferred. You must be analytical, creative, and self-motivated to excel in this role.

Collaboration within a global team environment is key, necessitating excellent communication skills. Your contribution to Virtusa will be valued, as teamwork, quality of life, and professional development are the core values the company upholds. By joining a global team of 27,000 professionals, you will have access to exciting projects and opportunities to work with cutting-edge technologies throughout your career. Virtusa provides an environment that nurtures new ideas, fosters excellence, and encourages personal and professional growth.
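Time Travel is one of the Snowflake utilities named above. Below is a minimal sketch of querying a table as of an earlier point and recovering a dropped table; the table names and the offset are illustrative assumptions.

```python
# Minimal sketch: Snowflake Time Travel. Query a table's state from one hour
# ago and restore a dropped table. Names and offsets are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="DEV_USER", password="***",  # placeholders
    warehouse="DEV_WH", database="DW", schema="CORE",
)
cur = conn.cursor()

# Read the table as it was 60 minutes ago (offset is in seconds).
cur.execute("""
    SELECT COUNT(*)
    FROM dw.core.orders AT(OFFSET => -60 * 60)
""")
print("rows one hour ago:", cur.fetchone()[0])

# Recover a (hypothetical) table dropped within the retention window.
cur.execute("UNDROP TABLE dw.core.orders_backup")
cur.close()
conn.close()
```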
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a skilled Snowflake Data Engineer with over 4 years of experience, you will be responsible for designing and implementing scalable data solutions using Snowflake as a cloud data platform. Your expertise in data modelling, ETL processes, and performance optimization will be crucial for the success of our projects.

Your key responsibilities will include developing and optimizing data pipelines on Snowflake, managing databases and schemas, and integrating data from various sources using ETL/ELT tools like Talend, Informatica, or Matillion. You will also be expected to monitor and optimize query performance, design data models based on business requirements, and collaborate with data analysts and other stakeholders to deliver effective data solutions.

To excel in this role, you must have at least 2 years of experience working specifically with Snowflake, strong SQL skills, and knowledge of data warehousing concepts. Familiarity with cloud platforms like AWS, Azure, or GCP, hands-on experience with Snowflake utilities, and proficiency in ETL tools and scripting languages are also essential.

Preferred qualifications for this position include a SnowPro certification, experience in Agile/Scrum development methodologies, and knowledge of data governance and compliance standards such as GDPR and HIPAA. Strong problem-solving, analytical, communication, and teamwork skills are also required to succeed in this role.

If you are passionate about data engineering, have a proven track record in Snowflake development, and possess the technical and soft skills necessary to thrive in a collaborative environment, we encourage you to apply for this full-time, permanent position with us. The work schedule is day shift, Monday to Friday, and the location is remote.

Kindly respond to the following application questions:
1. Do you have a SnowPro certification or equivalent credentials?
2. How many years of experience do you have in Agile/Scrum development methodologies?
3. How many years of experience do you have as a Snowflake Data Engineer?
4. What is your current monthly salary?
5. What is your expected salary?
6. What is your notice period?

Join us in revolutionizing data solutions and making a real impact in the industry!
Posted 2 weeks ago