
1714 Snowflake Jobs - Page 6

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 5.0 years

13 - 17 Lacs

Chennai

Work from Office


InfoCepts is looking for a Data Architect - Snowflake & DBT to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
- Design and Development: Create and implement data warehouse solutions using Snowflake, including data modeling, schema design, and ETL (Extract, Transform, Load) processes.
- Performance Optimization: Optimize queries, performance-tune databases, and ensure efficient use of Snowflake resources for faster data retrieval and processing.
- Data Integration: Integrate data from various sources, ensuring compatibility, consistency, and accuracy.
- Security and Compliance: Implement security measures and ensure compliance with data governance and regulatory requirements, including access control and data encryption.
- Monitoring and Maintenance: Monitor system performance, troubleshoot issues, and perform routine maintenance tasks to ensure system health and reliability.
- Collaboration: Collaborate with other teams, such as data engineers, analysts, and business stakeholders, to understand requirements and deliver effective data solutions.

Skills and Qualifications:
- Snowflake Expertise: In-depth knowledge and hands-on experience with Snowflake's architecture, features, and functionality.
- SQL and Database Skills: Proficiency in SQL querying and database management, with a strong understanding of relational databases and data warehousing concepts.
- Data Modeling: Experience designing and implementing effective data models for optimal performance and scalability.
- ETL Tools and Processes: Familiarity with ETL tools and processes to extract, transform, and load data into Snowflake.
- Performance Tuning: Ability to identify and resolve performance bottlenecks, optimize queries, and improve overall system performance.
- Data Security and Compliance: Understanding of data security best practices, encryption methods, and compliance standards (such as GDPR and HIPAA).
- Problem-Solving and Troubleshooting: Strong analytical and problem-solving skills to diagnose and resolve issues within the Snowflake environment.
- Communication and Collaboration: Good communication skills to interact with cross-functional teams and translate business requirements into technical solutions.
- Scripting and Automation: Knowledge of scripting languages such as Python and experience automating processes within Snowflake.
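For illustration, the ETL responsibilities above can be sketched in miniature with stdlib Python; the record shape and function names (`clean_row`, `transform`) are invented for the example, not taken from the posting:

```python
# Minimal extract-transform-load sketch: normalize raw records before
# loading them into a warehouse staging table. Names are illustrative.

def clean_row(raw: dict) -> dict:
    """Trim strings, standardise casing, and coerce the amount to float."""
    return {
        "customer_id": str(raw["customer_id"]).strip(),
        "city": str(raw.get("city", "")).strip().title(),
        "amount": float(raw.get("amount", 0) or 0),
    }

def transform(rows):
    """Drop rows without a customer_id, clean the rest."""
    return [clean_row(r) for r in rows if str(r.get("customer_id", "")).strip()]

raw = [
    {"customer_id": " C001 ", "city": "chennai", "amount": "120.5"},
    {"customer_id": "", "city": "Pune", "amount": "10"},       # rejected: no key
    {"customer_id": "C002", "city": " mumbai ", "amount": None},
]
staged = transform(raw)
```

In a real Snowflake pipeline this cleaning step would run inside an ETL tool or a DBT model rather than hand-written Python, but the shape of the work, reject invalid keys and standardise values before load, is the same.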

Posted 1 week ago


3.0 - 5.0 years

4 - 8 Lacs

Pune

Work from Office


Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your Role
- Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
- Experience with cloud storage, cloud databases, cloud data warehousing and data lake solutions such as Snowflake, BigQuery, AWS Redshift, ADLS, S3.
- Good knowledge of cloud compute services and load balancing.
- Good knowledge of cloud identity management, authentication and authorization.
- Proficiency with cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, Azure Functions.
- Experience using cloud data integration services for structured, semi-structured and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, Dataproc.

Your Profile
- Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, balancing infra investment against performance and scaling.
- Able to contribute to architectural choices using various cloud services and solution methodologies.
- Expertise in programming using Python.
- Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud.
- Must understand networking, security, design principles and best practices in cloud.

What you will love about working here
We recognize the significance of flexible work arrangements. Be it remote work or flexible work hours, you will get an environment that supports a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, from strategy and design to engineering, fueled by market-leading capabilities in AI, cloud and data, combined with deep industry expertise and a partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
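As a side note on the cloud utility functions this role mentions (AWS Lambda, Cloud Functions and the like), a common pattern is to keep the handler a thin wrapper around a pure function so the logic is testable offline. A hedged sketch, with an invented event shape:

```python
# Handler-pattern sketch for serverless functions: business logic lives
# in a pure function; the entry point only unpacks the event and wraps
# the response. The event/response shapes here are assumptions.

def summarise_orders(orders):
    """Pure logic: total amount per currency."""
    totals = {}
    for o in orders:
        cur = o.get("currency", "INR")
        totals[cur] = totals.get(cur, 0.0) + float(o["amount"])
    return totals

def handler(event, context=None):
    """Thin entry point, shaped like a Lambda/Cloud Function handler."""
    totals = summarise_orders(event.get("orders", []))
    return {"statusCode": 200, "body": totals}

resp = handler({"orders": [
    {"amount": 10, "currency": "INR"},
    {"amount": 2.5, "currency": "USD"},
    {"amount": 5, "currency": "INR"},
]})
```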

Posted 1 week ago


6.0 - 11.0 years

40 Lacs

Chennai

Hybrid


Data Architect/Engineer: design and implement data solutions across the retail industry (SCM, marketing, sales, and customer service) using technologies such as DBT, Snowflake, and Azure/AWS/GCP.

Responsibilities:
- Design and optimize data pipelines that integrate various data sources (1st party, 3rd party, operational) to support business intelligence and advanced analytics.
- Develop data models and data flows that enable personalized customer experiences and support omnichannel marketing and customer engagement.
- Lead efforts to ensure data governance, data quality, and data security, adhering to regulations such as GDPR and CCPA.
- Implement and maintain data warehousing solutions in Snowflake to handle large-scale data processing and analytics needs.
- Optimize workflows using DBT to streamline data transformation and modeling processes.
- Leverage Azure for cloud infrastructure, data storage, and real-time data analytics, while ensuring the architecture supports scalability and performance.
- Collaborate with cross-functional teams, including data engineers, analysts, and business stakeholders, to ensure data architectures meet business needs.
- Support both real-time and batch data integration, ensuring data is accessible for actionable insights and decision-making.
- Continuously assess and integrate new data technologies and methodologies to enhance the organization's data capabilities.

Qualifications:
- 6+ years of experience in Data Architecture or Data Engineering, with specific expertise in DBT, Snowflake, and Azure/AWS/GCP.
- Strong understanding of data modeling, ETL/ELT processes, and modern data architecture frameworks.
- Experience designing scalable data architectures for personalization and customer analytics across marketing, sales, and customer service domains.
- Expertise with cloud data platforms (Azure preferred) and big data technologies for large-scale data processing.
- Hands-on experience with Python for data engineering tasks and scripting.
- Proven track record of building and managing data pipelines and data warehousing solutions using Snowflake.
- Familiarity with Customer Data Platforms (CDP), Master Data Management (MDM), and Customer 360 architectures.
- Strong problem-solving skills and ability to work with cross-functional teams to translate business requirements into scalable data solutions.
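The incremental-load pattern behind DBT/Snowflake pipelines like these (a Snowflake `MERGE`, or a DBT incremental model) can be sketched in stdlib Python; this is a hedged illustration of the pattern, not the posting's actual stack:

```python
# Merge-by-key (upsert) sketch: existing keys are updated in place,
# new keys are appended, which is what a warehouse MERGE statement or
# an incremental model materialisation does. Names are illustrative.

def merge_by_key(target: dict, batch: list, key: str = "id") -> dict:
    """Upsert each batch row into target, keyed by `key`."""
    merged = dict(target)              # leave the input untouched
    for row in batch:
        merged[row[key]] = row         # last write wins per key
    return merged

current = {
    1: {"id": 1, "status": "new"},
    2: {"id": 2, "status": "shipped"},
}
batch = [
    {"id": 2, "status": "delivered"},  # update of an existing key
    {"id": 3, "status": "new"},        # brand-new key, appended
]
updated = merge_by_key(current, batch)
```

The design choice worth noting is idempotence: replaying the same batch yields the same result, which is what makes incremental pipelines safe to re-run after a failure.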

Posted 1 week ago


5.0 - 10.0 years

2 - 5 Lacs

Bengaluru

Work from Office


Contract duration: 6 months. Experience: 5+ years. Location: WFH (a good internet connection is required).

- Snowflake knowledge (must have)
- SQL knowledge (must have)
- Data modeling (must have)
- Data warehouse concepts and DW design best practices (must have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Informatica IDMC (good to have)
- Able to work autonomously
- Good communication skills, team player, self-motivated, strong work ethic
- Flexibility in working hours up to 12pm Central time (overlap with the US team)
- Confidence, proactiveness, and the ability to propose alternatives to mitigate tool/expertise gaps (fast learner)

Posted 1 week ago


12.0 - 15.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Java Enterprise Edition
Good-to-have skills: Enterprise Architecture Framework
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
The Principal Engineer is a key leadership role within the engineering team, responsible for overseeing and guiding complex technical projects, driving innovation, and ensuring the successful delivery of high-quality software products. This role involves a combination of hands-on technical work, strategic planning, and team mentorship. If you have the drive to lead and mentor engineering teams, the ability to set technical direction for greenfield SaaS products, and excellent communication skills for working with cross-functional teams, including product management and other stakeholders, this position is for you.

Roles & Responsibilities:
- Prepare technical design specifications based on functional requirements and analysis documents.
- Provide written knowledge-transfer material.
- Review functional requirements, analysis, and design documents and provide feedback.
- Implement, test, maintain and support software based on technical design specifications.
- Improve system quality by identifying issues and common patterns and developing standard operating procedures.
- Enhance applications by identifying opportunities for improvement, making recommendations, and designing and implementing systems.
- Maintain and improve existing codebases and peer-review code changes.
- Liaise with colleagues to implement technical designs.
- Investigate and use new technologies where relevant.
- Assist customer support with technical problems and questions.
- Work independently with wide latitude for independent decision-making.
- Experience leading the work of others and mentoring less experienced developers in the context of a project is a plus.
- Listen to and understand information and communicate it clearly.
- Participate in architecture and code reviews.
- Lead or participate in other projects or duties as the need arises.

Professional & Technical Skills:
The Winning Way behaviors that all employees need in order to meet the expectations of each other, our customers, and our partners:
- Communicate with Clarity: Be clear, concise and actionable. Be relentlessly constructive. Seek and provide meaningful feedback.
- Act with Urgency: Adopt an agile mentality, with frequent iterations, improved speed, and resilience. Apply the 80/20 rule: better is the enemy of done. Don't spend hours when minutes are enough.
- Work with Purpose: Exhibit a "we can" mindset. Results outweigh effort. Everyone understands how their role contributes. Set aside personal objectives for team results.
- Drive to Decision: Cut the swirl with defined deadlines and decision points. Be clear on individual accountability and decision authority. Be guided by a commitment to, and accountability for, customer outcomes.
- Own the Outcome: Defined milestones, commitments and intended results. Assess your work in context; if you're unsure, ask. Demonstrate unwavering support for decisions.

Requirements:
- (MUST HAVE) 10+ years of experience developing systems/software for large business environments.
- (MUST HAVE) Strong OOD and SOA principles, with the ability to implement them in a language of choice (Java preferred).
- (MUST HAVE) Strong experience leading architecture, design and implementation of robust and highly scalable web services.
- (MUST HAVE) Experience working with AWS and/or Azure SaaS infrastructure and CI/CD DevOps technologies, and extensive debugging experience.
- (MUST HAVE) An understanding of unit testing, test-driven development, functional testing, and performance testing.
- Experience building front ends with React is a big plus.
- Knowledge of database systems (SQL, NoSQL) and data architecture.
- Experience working and integrating with an event bus like Pulsar is a big plus.
- Experience working and integrating with cloud-based big data solutions like Snowflake is a big plus.
- Working experience with software security-enhancing tools and best practices.
- Knowledge of at least one shell scripting language.
- Understanding of industry-leading technology and solutions in big data and machine learning.
- Ability to operate at highly varying levels of abstraction, from business strategy to product strategy to high-level technical design to detailed technical design to implementation.
- Ability to work effectively in a fast-paced, complex technical environment.
- Experience driving for results across cross-functional teams while maintaining effective working relationships.
- Strong interpersonal, organizational, presentation and facilitation skills.
- Results-oriented and customer-focused, with an ability to make successful trade-offs that balance short- and long-term product goals.
- High-energy self-starter with a positive, can-do attitude.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Java Enterprise Edition.
- Bachelor's degree in computer science, information systems, or a related field, or an equivalent combination of education and experience; a master's degree is a plus.
- 10 or more years of extensive experience developing mission-critical, low-latency solutions for large business environments.
- At least 5 years of experience developing and debugging distributed systems and working with big data systems in the cloud.
- This position is based at our Hyderabad office.

Qualification: 15 years of full-time education

Posted 1 week ago


3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation and contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago


3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will engage in problem-solving discussions, contribute innovative ideas, and refine applications based on user feedback, all while maintaining a focus on quality and efficiency in your work.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation and contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Collaborate with cross-functional teams to ensure alignment on project goals.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Good-to-have skills: Experience with data integration tools and ETL processes.
- Strong understanding of SQL and database management.
- Familiarity with cloud computing concepts and services.
- Experience in application development methodologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago


3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Data Engineering
Good-to-have skills: NA
Minimum 12 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
We are seeking a hands-on Senior Engineering Manager of Data Platform to spearhead the development of capabilities that power Vertex products while providing a connected experience for our customers. This role demands a deep engineering background with hands-on experience in building and scaling production-level systems. The ideal candidate will excel in leading teams to deliver high-quality data products and will provide mentorship, guidance, and leadership. In this role, you will work to increase domain data coverage and adoption of the Data Platform by promoting a connected user experience through data. You will increase data literacy and trust by leading our Data Governance and Master Data Management initiatives. You will contribute to the vision and roadmap of self-serve capabilities through the Data Platform.

Roles & Responsibilities:
- Be hands-on in leading the development of features that enhance the self-service capabilities of our data platform, ensuring the platform is scalable, reliable, and fully aligned with business objectives, and defining and implementing best practices in data architecture, data modeling, and data governance.
- Work closely with Product, Engineering, and other departments to ensure the data platform meets business requirements.
- Influence cross-functional initiatives related to data tools, governance, and cross-domain data sharing.
- Ensure technical designs are thoroughly evaluated and aligned with business objectives.
- Determine appropriate staffing to achieve goals and objectives; interview, recruit, develop and retain top talent.
- Manage and mentor a team of engineers, fostering a collaborative, high-performance culture and encouraging a growth mindset and accountability for outcomes. Interpret how the business strategy links to individual roles and responsibilities.
- Provide career development opportunities and establish processes and practices for knowledge sharing and communication.
- Partner with external vendors to address issues and technical challenges.
- Stay current with emerging technologies and industry trends to ensure the platform remains cutting-edge.

Professional & Technical Skills:
- 12+ years of hands-on experience in software development (preferably in the data space), with 3+ years of people-management experience, demonstrating success in building, growing, and managing multiple teams.
- Extensive experience architecting and building complex data platforms and products.
- In-depth knowledge of cloud-based services and data tools such as Snowflake, AWS and Azure, with expertise in data ingestion, normalization, and modeling.
- Strong experience building and scaling production-level cloud-based data systems using data ingestion tools like Fivetran, data quality and observability tools like Monte Carlo, data catalogs like Atlan, and master data tools like Reltio or Informatica.
- Thorough understanding of best practices in agile software development and software testing.
- Experience deploying cloud-based applications using automated CI/CD processes and container technologies.
- Understanding of security best practices when architecting SaaS applications on cloud infrastructure.
- Ability to understand complex business systems and a willingness to learn and apply new technologies as needed.
- Proven ability to influence and deliver high-impact initiatives.
- Forward-thinking mindset with the ability to define and drive the team's mission, vision, and long-term strategies.
- Excellent leadership skills with a track record of managing teams and collaborating effectively across departments. Strong written and verbal communication skills.
- Proven ability to work with and lead remote teams to achieve sustainable long-term success.
- A "work together and get stuff done" attitude without losing sight of quality, and a sense of responsibility to customers and the team.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Engineering.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 week ago


2.0 - 5.0 years

7 - 11 Lacs

Pune

Work from Office


Responsibilities:
- Provide expertise in analysis, requirements gathering, design, coordination, customization, testing and support of reports in the client's environment.
- Develop and maintain a strong working relationship with business and technical members of the team.
- Maintain a relentless focus on quality and continuous improvement.
- Perform root cause analysis of report issues.
- Development and evolutionary maintenance of the environment: performance, capability and availability.
- Assist in defining technical requirements and developing solutions.
- Effective content and source-code management, troubleshooting and debugging.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Tableau Desktop Specialist; strong understanding of SQL for querying databases.
- Good to have: Python, Snowflake, statistics, ETL experience.
- Extensive knowledge of creating impactful visualizations using Tableau.
- Thorough understanding of SQL and advanced SQL (joins and relationships).
- Experience working with different databases and blending and creating relationships in Tableau.
- Extensive knowledge of creating custom SQL to pull the desired data from databases.
- Troubleshooting capabilities to debug data controls.

Preferred technical and professional experience:
- Capable of converting business requirements into a workable model.
- Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.
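The "joins and relationships" skill this posting stresses is the relational operation behind both SQL JOINs and Tableau's data blending; it can be sketched in stdlib Python over lists of dicts (table and column names are invented for the illustration):

```python
# Inner-join sketch: build a hash index on the right table's key, then
# probe it for each left row, the same logic a SQL hash join performs.
# Column names are invented for the example.

def inner_join(left, right, key):
    """Return combined rows where left[key] == right[key]."""
    index = {}
    for r in right:                      # index the right side by key
        index.setdefault(r[key], []).append(r)
    joined = []
    for l in left:
        for r in index.get(l[key], []):  # unmatched left rows are dropped
            joined.append({**l, **r})
    return joined

orders = [{"cust": "A", "amount": 100}, {"cust": "B", "amount": 50},
          {"cust": "C", "amount": 75}]
customers = [{"cust": "A", "city": "Pune"}, {"cust": "B", "city": "Chennai"}]
report = inner_join(orders, customers, "cust")
```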

Posted 1 week ago


1.0 - 3.0 years

2 - 6 Lacs

Indore, Pune, Bengaluru

Work from Office


Locations: Pune, Bangalore, Indore
Work mode: Work from Office

Skills:
- Informatica Data Quality (IDQ)
- Azure Databricks
- Azure Data Lake
- Azure Data Factory
- API integration

Posted 1 week ago


1.0 - 4.0 years

1 - 5 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Informatica TDM:
1. Data discovery, data subsetting and data masking
2. Data generation
3. Complex masking and generation rule creation
4. Performance tuning for Informatica mappings
5. Debugging with Informatica PowerCenter
6. Data migration

Skills and Knowledge:
1. Informatica TDM development experience in data masking, discovery, data subsetting and data generation is a must.
2. Should have experience working with flat files, MS SQL, Oracle and Snowflake.
3. Debugging using Informatica PowerCenter.
4. Experience with Tableau is an added advantage.
5. Should have basic knowledge of IICS.
6. Must-have and good-to-have skills: Informatica TDM, SQL, Informatica PowerCenter, GDPR.
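Deterministic masking, the core of the TDM work described above, maps equal inputs to equal tokens so joins still work after masking. A simplified stdlib-Python stand-in for such a masking rule (the salt and field names are invented; this is not the Informatica tool itself):

```python
import hashlib

# Deterministic masking sketch: the same input always masks to the same
# token, so referential integrity across tables survives masking.

SALT = "demo-salt"   # invented; a real setup would manage this as a secret

def mask_value(value: str, length: int = 8) -> str:
    """Hash-based token: stable for equal inputs, irreversible in practice."""
    digest = hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()
    return "MASK_" + digest[:length]

def mask_column(rows, column):
    """Mask one column across a table of dicts, leaving the rest intact."""
    return [{**r, column: mask_value(str(r[column]))} for r in rows]

customers = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": "b@y.com"}]
orders = [{"order": 10, "email": "a@x.com"}]
masked_customers = mask_column(customers, "email")
masked_orders = mask_column(orders, "email")
```

Because the masking is deterministic, `a@x.com` gets the same token in both tables, so the customer-to-order relationship is preserved in the masked test data.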

Posted 1 week ago


2.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office


As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.

Preferred technical and professional experience:
- Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
- Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.
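The "data validation, cleansing, and governance frameworks within ETL processes" point can be illustrated with a small rule-based validator in stdlib Python; the rules and field names are invented for the sketch:

```python
# Rule-based validation sketch: each rule is (name, predicate). Rows
# failing any rule are routed to a reject list carrying the names of
# the failed rules, the pattern ETL quality gates follow.

RULES = [
    ("id_present", lambda r: bool(str(r.get("id", "")).strip())),
    ("amount_nonneg", lambda r: isinstance(r.get("amount"), (int, float))
                                and r["amount"] >= 0),
]

def validate(rows):
    """Split rows into (clean, rejects); rejects carry the failed rules."""
    clean, rejects = [], []
    for row in rows:
        failed = [name for name, ok in RULES if not ok(row)]
        if failed:
            rejects.append({**row, "_failed": failed})
        else:
            clean.append(row)
    return clean, rejects

rows = [{"id": "1", "amount": 10},
        {"id": "",  "amount": 5},
        {"id": "3", "amount": -2}]
clean, rejects = validate(rows)
```

Routing rejects to a side table with the reason attached, rather than dropping them silently, is what makes such a gate auditable for governance purposes.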

Posted 1 week ago


5.0 - 10.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Contract duration: 6 months. Experience: 5+ years. Location: WFH (a good internet connection is required).

Informatica ETL role:
- Informatica IDMC (must have)
- SQL knowledge (must have)
- Data warehouse concepts and ETL design best practices (must have)
- Data modeling (must have)
- Snowflake knowledge (good to have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Good communication skills, team player, self-motivated, strong work ethic
- Flexibility in working hours up to 12pm Central time (overlap with the US team)
- Confidence, proactiveness, and the ability to propose alternatives to mitigate tool/expertise gaps (fast learner)

Posted 1 week ago


2.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office


As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Good hands-on experience with DBT is required; ETL with DataStage and Snowflake is preferred.
- Ability to use programming languages such as Java, Python and Scala to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
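The "build pipelines to extract and transform data" requirement usually reduces to composing small transform steps in order, the structure shared by ETL tools and (linearised) DBT model DAGs. A stdlib sketch with invented step and field names:

```python
# Pipeline-of-steps sketch: compose small per-row transforms into one
# pipeline and apply them in order. Step and field names are invented.

def parse_amount(row):
    """Coerce the raw string amount into a float."""
    return {**row, "amount": float(row["amount"])}

def add_tax(row, rate=0.18):
    """Derive a total including tax (rate is an illustrative constant)."""
    return {**row, "total": round(row["amount"] * (1 + rate), 2)}

def run_pipeline(rows, steps):
    """Apply each step to every row, in order."""
    for step in steps:
        rows = [step(r) for r in rows]
    return rows

out = run_pipeline(
    [{"sku": "A", "amount": "100"}, {"sku": "B", "amount": "50"}],
    [parse_amount, add_tax],
)
```

Keeping each step a pure function of one row is what makes such pipelines easy to reuse and unit-test, the "efficient and reusable manner" the posting asks for.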

Posted 1 week ago


2.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office

Naukri logo

As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You’ll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you’ll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you’ll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities
- Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements

Preferred technical and professional experience:
- Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes
- Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks
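The workload-optimization items this listing names (clustering keys, query profiling) map to short Snowflake SQL statements. Below is a hedged sketch collecting illustrative forms; the table and column names (`sales`, `region`, `sale_date`) are invented for the example, not taken from the listing:

```python
# Illustrative Snowflake tuning statements for the workload-optimization
# items in the listing. Object names (sales, region, sale_date) are
# hypothetical; the SQL syntax follows Snowflake's documented forms.
tuning_statements = [
    # Define a clustering key so micro-partitions are co-located by the
    # columns most queries filter on.
    "ALTER TABLE sales CLUSTER BY (region, sale_date);",
    # Check how well the table is clustered on those columns.
    "SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(region, sale_date)');",
    # Profile recent queries (elapsed time, bytes scanned) via the
    # QUERY_HISTORY table function.
    "SELECT query_id, total_elapsed_time, bytes_scanned "
    "FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY()) "
    "ORDER BY total_elapsed_time DESC LIMIT 10;",
]

for stmt in tuning_statements:
    print(stmt)
```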

Posted 1 week ago

Apply

6.0 - 11.0 years

13 - 17 Lacs

Bengaluru

Work from Office


6+ years of industry work experience. Experience extracting data from a variety of sources, and a desire to expand those skills. Worked on the Google Looker tool. Worked on BigQuery and GCP technologies. Strong SQL and Spark knowledge. Excellent data analysis skills; must be comfortable querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark. Knowledge of financial accounting is a bonus. Able to work independently with cross-functional teams and drive towards resolution. Experience with object-oriented programming using Python and its design patterns. Experience handling Unix systems for optimal hosting of enterprise web applications. GCP certifications preferred. A payments industry background is good to have. Candidates who have been part of a Google Cloud migration are an ideal fit.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: 3-5 years of experience; an intuitive individual with an ability to manage change and proven time management; proven interpersonal skills while contributing to team effort by accomplishing related results as needed; up-to-date technical knowledge from attending educational workshops and reviewing publications.

Preferred technical and professional experience: 6+ years of industry work experience; experience extracting data from a variety of sources, and a desire to expand those skills; worked on the Google Looker tool.

Posted 1 week ago

Apply

6.0 - 11.0 years

13 - 17 Lacs

Gurugram

Work from Office


6+ years of industry work experience. Experience extracting data from a variety of sources, and a desire to expand those skills. Worked on the Google Looker tool. Worked on BigQuery and GCP technologies. Strong SQL and Spark knowledge. Excellent data analysis skills; must be comfortable querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark. Knowledge of financial accounting is a bonus. Able to work independently with cross-functional teams and drive towards resolution. Experience with object-oriented programming using Python and its design patterns. Experience handling Unix systems for optimal hosting of enterprise web applications. GCP certifications preferred. A payments industry background is good to have. Candidates who have been part of a Google Cloud migration are an ideal fit.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: 3-5 years of experience; an intuitive individual with an ability to manage change and proven time management; proven interpersonal skills while contributing to team effort by accomplishing related results as needed; up-to-date technical knowledge from attending educational workshops and reviewing publications.

Preferred technical and professional experience: 6+ years of industry work experience; experience extracting data from a variety of sources, and a desire to expand those skills; worked on the Google Looker tool.

Posted 1 week ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Pune

Work from Office


As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You’ll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you’ll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you’ll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities
- Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements
- Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes
- Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Tableau Desktop Specialist; strong understanding of SQL for querying databases
- Good to have: Python, Snowflake, statistics, and ETL experience
- Extensive knowledge of creating impactful visualizations using Tableau
- Must have a thorough understanding of SQL and advanced SQL (joins and relationships)

Preferred technical and professional experience:
- Experience working with different databases and how to blend data and create relationships in Tableau
- Extensive knowledge of creating custom SQL to pull desired data from databases
- Troubleshooting capabilities to debug data controls
- Capable of converting business requirements into a workable model
- Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude

Posted 1 week ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office


As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and solving them as per the defined SLAs
- Continuous learning and technology integration: eagerness to learn new technologies and implement them in feature development

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- End-to-end functional knowledge of the data pipeline/transformation implementation the candidate has done; should understand the purpose/KPIs for which the data transformation was done
- Expert in SQL; can do data analysis and investigation using SQL queries
- Implementation knowledge of advanced SQL functions such as regular expressions, aggregation, pivoting, ranking, and deduplication
- BigQuery and BigQuery transformations (using stored procedures)
- Data modelling concepts: star and snowflake schemas, fact and dimension tables, joins, cardinality, etc.

Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices
- Troubleshoot and debug issues, and deploy applications to the cloud platform
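The advanced SQL functions this listing names (ranking, deduplication) are often combined: `ROW_NUMBER()` over a partition keeps only the latest record per key. A small sketch using Python's bundled sqlite3 (which supports window functions) in place of BigQuery; the table and sample data are invented:

```python
import sqlite3

# Deduplication via ranking: keep the most recent event per user.
# Sample data is invented; the same ROW_NUMBER() pattern works in
# BigQuery and Snowflake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id TEXT, ts INTEGER, payload TEXT);
    INSERT INTO events VALUES
        ('u1', 1, 'a'), ('u1', 2, 'b'),
        ('u2', 1, 'c'), ('u2', 1, 'c');  -- duplicate row
""")
latest = conn.execute("""
    SELECT user_id, ts, payload FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY user_id ORDER BY ts DESC
        ) AS rn
        FROM events
    ) WHERE rn = 1
    ORDER BY user_id
""").fetchall()
print(latest)  # one row per user: [('u1', 2, 'b'), ('u2', 1, 'c')]
```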

Posted 1 week ago

Apply

2.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office


As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You’ll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you’ll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you’ll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities
- Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements

Preferred technical and professional experience:
- Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes
- Proficiency in SQL and/or Shell scripting for custom transformations and automation tasks

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Contract duration: 6 months. Locations: Pune/Bangalore/Hyderabad/Indore

Responsibilities:
- Must have experience working as a Snowflake admin/developer on Data Warehouse, ETL, and BI projects
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse, and end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts
- Expertise in advanced Snowflake concepts: setting up resource monitors, RBAC controls, virtual warehouse sizing, and query performance tuning
- Understanding of zero-copy clone and time travel, and how to use these features
- Expertise in deploying Snowflake features such as data sharing
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python
- Experience in data migration from an RDBMS to the Snowflake cloud data warehouse
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling)
- Experience with data security and data access controls and design
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Provide resolution to an extensive range of complicated data-pipeline-related problems, proactively and as issues surface
- Must have experience with Agile development methodologies

Good to have:
- CI/CD in Talend using Jenkins and Nexus
- TAC configuration with LDAP, job servers, log servers, and databases
- Job conductor, scheduler, and monitoring
- GIT repository: creating users and roles and providing access to them
- Agile methodology and 24/7 admin and platform support
- Estimation of effort based on requirements
- Strong written communication skills; effective and persuasive in both written and oral communication
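Several of the Snowflake features this listing names (zero-copy clone, time travel, resource monitors) correspond to one-line SQL statements. A hedged sketch collecting illustrative forms; the object names (`orders`, `orders_dev`, `etl_monitor`) are invented:

```python
# Illustrative Snowflake statements for the admin features the listing
# names. Object names (orders, orders_dev, etl_monitor) are hypothetical;
# the syntax follows Snowflake's documented forms.
admin_statements = {
    # Zero-copy clone: duplicates metadata, not storage.
    "zero_copy_clone": "CREATE TABLE orders_dev CLONE orders;",
    # Time travel: query the table as it was one hour (3600 s) ago.
    "time_travel": "SELECT * FROM orders AT(OFFSET => -3600);",
    # Resource monitor: suspend warehouses at 90% of a 100-credit quota.
    "resource_monitor": (
        "CREATE RESOURCE MONITOR etl_monitor WITH CREDIT_QUOTA = 100 "
        "TRIGGERS ON 90 PERCENT DO SUSPEND;"
    ),
}

for name, sql in admin_statements.items():
    print(f"{name}: {sql}")
```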

Posted 1 week ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office


Bachelor's degree plus at least 5-7 years of experience, with a minimum of 3+ years in SQL development. Strong working knowledge of advanced SQL capabilities such as analytics and windowing functions. 3+ years of working knowledge of at least one RDBMS is a must-have. Exposure to shell scripts for invoking SQL calls. Exposure to ETL tools is good to have. Working knowledge of Snowflake is good to have. Location: Hyderabad, Pune, Bangalore
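The analytics and windowing functions mentioned above can be exercised without a full RDBMS install: Python's bundled sqlite3 supports window functions. A small sketch computing a per-account running total with `SUM() OVER`; the table and sample data are invented:

```python
import sqlite3

# Windowing-function demo: per-account running total with SUM() OVER.
# The same syntax works in most modern RDBMSs, including Snowflake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txns (id INTEGER, account TEXT, amount REAL);
    INSERT INTO txns VALUES
        (1, 'A', 100), (2, 'A', 50), (3, 'B', 200), (4, 'B', 25);
""")
rows = conn.execute("""
    SELECT account, amount,
           SUM(amount) OVER (
               PARTITION BY account ORDER BY id
           ) AS running_total
    FROM txns ORDER BY id
""").fetchall()
for account, amount, total in rows:
    print(account, amount, total)
# A 100.0 100.0
# A 50.0 150.0
# B 200.0 200.0
# B 25.0 225.0
```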

Posted 1 week ago

Apply

1.0 - 3.0 years

2 - 5 Lacs

Chennai

Work from Office


Mandatory skills: AWS, Python, SQL, Spark, Airflow, Snowflake

Responsibilities:
- Create and manage cloud resources in AWS
- Ingest data from different sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems
- Implement data ingestion and processing with the help of big data technologies
- Process and transform data using technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it using the language supported by the base data platform
- Develop automated data quality checks to make sure the right data enters the platform, and verify the results of the calculations
- Develop an infrastructure to collect, transform, combine, and publish/distribute customer data
- Define process improvement opportunities to optimize data collection, insights, and displays
- Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated, actionable analytical results to relevant stakeholders
- Participate as a key member in regular Scrum ceremonies with the agile teams
- Be proficient at developing queries, writing reports, and presenting findings
- Mentor junior members and bring best industry practices
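The automated data-quality checks mentioned in this listing can start as a simple rule function applied before load. A minimal sketch under stated assumptions: the field names (`id`, `ts`, `amount`) and rules are invented for illustration, not taken from the listing:

```python
# Minimal data-quality gate: flag rows with missing required fields or
# negative amounts before they enter the platform. Field names are
# hypothetical examples.
def quality_issues(records, required=("id", "ts"), non_negative=("amount",)):
    """Return a list of (row_index, issue) pairs for failing rows."""
    issues = []
    for i, row in enumerate(records):
        for field in required:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        for field in non_negative:
            value = row.get(field)
            if isinstance(value, (int, float)) and value < 0:
                issues.append((i, f"negative {field}"))
    return issues

batch = [
    {"id": 1, "ts": 1700000000, "amount": 10.5},
    {"id": 2, "ts": None, "amount": -3.0},   # two violations
]
print(quality_issues(batch))  # [(1, 'missing ts'), (1, 'negative amount')]
```

In a pipeline, a non-empty result would typically fail the task or divert the offending rows to a quarantine table rather than silently loading them.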

Posted 1 week ago

Apply

10.0 - 15.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Role & responsibilities: The Application Head is responsible for overseeing the deployment and management of enterprise applications, with a strong focus on S/4HANA and retail industry solutions. Extensive experience in both SAP and non-SAP enterprise solution deployments is required, ensuring seamless integration and operational efficiency across the organization. Oversee the deployment, management, and optimization of S/4HANA and other enterprise applications. Ensure seamless integration of SAP and non-SAP solutions to support business processes.

Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 10+ years of experience in application management and deployment roles. Strong knowledge of S/4HANA, retail industry solutions, and non-SAP enterprise applications.

Preferred skills: Experience in retail (brand retailing is a plus). Familiarity with tools like Power BI, Tableau, Databricks, Snowflake, or similar. Understanding of ethical AI and responsible data use.

Posted 1 week ago

Apply

7.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


7+ years of experience in ETL testing, Snowflake, and DWH concepts. Strong SQL knowledge and debugging skills are a must. Experience with Azure and Snowflake testing is a plus. Experience with the Qlik Replicate and Compose (Change Data Capture) tools is considered a plus. Strong data warehousing concepts and experience with ETL tools like Talend Cloud Data Integration and Pentaho/Kettle. Experience with the JIRA and Xray defect management tools is good to have. Exposure to financial domain knowledge is considered a plus. Testing data readiness (data quality) and addressing code or data issues. Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions. Strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution. Prior experience with State Street and Charles River Development (CRD) is considered a plus. Experience with tools such as PowerPoint, Excel, and SQL. Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.

Posted 1 week ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
