10.0 - 20.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Detailed JD (Roles and Responsibilities):
- 8+ years of working experience in Power BI and its advanced components: Power BI Desktop, Power BI Service, Power View, Power Query, Power Pivot, Power BI dashboards, and Power BI Gateway.
- Experience writing DAX queries; Power Automate in Power BI; Power BI AI features; strong data-analysis skills.
- Creating reports and dashboards using BI tools such as MS Power BI to visualize data and key performance indicators (KPIs).
- Knowledge of cloud-based data platforms and services such as Snowflake, AWS, GCP, and Azure.
- Good experience consuming data from different sources and designing Power BI data models.
- Hands-on performance optimization of Power BI reports and dashboards; analyze performance and suggest best approaches to the team to improve it.
- Excellent knowledge of data modeling and ETL.
- Expert in providing BI architecture solutions.
- Collaborate with the team and mentor team members.
Mandatory skills: Power BI (very good in advanced concepts) and BI solution architecture.
Desired skills: Snowflake SQL, ETL, and data modelling.
Posted 1 month ago
6.0 - 8.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Job Location: Bangalore
Job Title: Module Lead - Snowflake
Experience: 6-8 years
Job Description: Sr. Snowflake Developer
- Design and develop our Snowflake data platform, including data pipeline building, data transformation, and access management.
- Minimum 4+ years of experience in Snowflake; strong in SQL.
- Develop data warehouse and data mart solutions for business teams.
- Accountable for designing robust, scalable database and extract, transform, and load (ETL) solutions.
- Understand and evaluate business requirements that impact the Caterpillar enterprise.
- Liaise with data creators to support project planning, training, guidance on standards, and the efficient creation and maintenance of high-quality data.
- Contribute to policies, procedures, and standards as well as technical requirements.
- Ensure compliance with the latest data standards supported by the company, including brand, legal, and information security (data security and privacy) requirements.
- Document data models for the domains to be deployed, including a logical data model, candidate source lists, and canonical formats.
- Create, update, and enhance metadata policies, processes, and catalogs.
- Good communication skills and experience interacting with client SMEs.
- Capable of leading a team of 4-5 members.
- Snowflake certification is mandatory.
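The pipeline-building and transformation duties above are described only at a high level. As a purely illustrative sketch (not part of the listing; all table names and rules are invented), an ELT flow decomposes into staging, transformation, and an idempotent load:

```python
# Minimal ELT sketch: stage raw rows, transform, load into a "mart".
# All names and business rules here are hypothetical illustrations.

def extract(source_rows):
    """Stage raw records as-is (ELT keeps the raw copy untouched)."""
    return [dict(row) for row in source_rows]

def transform(staged):
    """Apply business rules: drop incomplete records, normalize types."""
    out = []
    for row in staged:
        if row.get("order_id") is None:
            continue  # reject rows that fail the quality rule
        out.append({"order_id": row["order_id"],
                    "amount": round(float(row["amount"]), 2)})
    return out

def load(target, rows):
    """Idempotent load keyed by primary key, as a SQL MERGE would behave."""
    for row in rows:
        target[row["order_id"]] = row
    return target

mart = {}
raw = [{"order_id": 1, "amount": "10.456"},
       {"order_id": None, "amount": "5"},
       {"order_id": 1, "amount": "12.00"}]
load(mart, transform(extract(raw)))
print(mart)  # the later duplicate of order_id 1 wins; the bad row is dropped
```

The keyed overwrite in `load` is what makes re-running the pipeline safe, mirroring the access-managed, repeatable loads the role calls for.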
Posted 1 month ago
4.0 - 9.0 years
1 - 2 Lacs
Hyderabad
Hybrid
We are seeking an experienced Snowflake Developer to design, develop, and optimize data solutions on the Snowflake cloud data platform. The ideal candidate will have strong SQL skills, experience with ETL/ELT processes, and a deep understanding of cloud data warehousing concepts.
Key Responsibilities:
- Develop and maintain scalable data pipelines and workflows using Snowflake.
- Design and implement complex SQL queries, stored procedures, and views.
- Optimize Snowflake performance, including query tuning and resource management.
- Collaborate with data engineers, analysts, and business teams to understand data requirements.
- Implement data security and governance best practices within Snowflake.
- Integrate Snowflake with various ETL tools and data sources.
- Monitor and troubleshoot data pipelines and Snowflake environments.
Required Skills and Qualifications:
- 3+ years of experience working with Snowflake or similar cloud data warehouses (Redshift, BigQuery).
- Expertise in writing advanced SQL queries, stored procedures, and scripts.
- Hands-on experience with ETL/ELT tools such as Talend, Informatica, Matillion, dbt, or Apache Airflow.
- Familiarity with cloud platforms like AWS, Azure, or GCP.
- Knowledge of data modeling, data warehousing concepts, and performance tuning.
- Strong analytical and problem-solving skills.
- Experience with version control systems like Git.
Preferred (Nice to Have):
- Experience with data visualization tools like Tableau, Power BI, or Looker.
- Knowledge of scripting languages such as Python or JavaScript.
- Understanding of DevOps and CI/CD pipelines for data engineering.
Posted 1 month ago
3.0 - 8.0 years
9 - 16 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Share CV at: neha.mandal@mounttalent.com
Summary: As an Application Lead for Packaged Application Development, you will be responsible for leading the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve working with Snowflake Data Warehouse and collaborating with cross-functional teams to deliver high-quality solutions.
Roles & Responsibilities:
- Lead the design, development, and implementation of applications using Snowflake Data Warehouse.
- Collaborate with cross-functional teams to ensure the delivery of high-quality solutions that meet business requirements.
- Act as the primary point of contact for all application-related issues, providing technical guidance and support to team members.
- Ensure that all applications are designed and developed in accordance with industry best practices and standards.
- Provide technical leadership and mentorship to team members, ensuring that they have the necessary skills and knowledge to deliver high-quality solutions.
Professional & Technical Skills:
- Must-have: Strong experience in Snowflake Data Warehouse.
- Good to have: Experience in other data warehousing technologies such as Redshift, BigQuery, or Azure Synapse Analytics.
- Experience in designing and developing applications using Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts and best practices.
- Experience working with cross-functional teams to deliver high-quality solutions.
- Excellent communication and interpersonal skills.
Posted 1 month ago
5.0 - 6.0 years
3 - 7 Lacs
Hyderabad, Bengaluru
Work from Office
Role & responsibilities
Job Title: Developer
Work Locations: Hyderabad, TG and Bangalore, KA
Skill Required: Utilities - Digital: Snowflake
Experience Range in Required Skills: 4-6 years
Job Description: Snowflake
Essential Skills: Snowflake
Desirable Skills: Snowflake
Posted 1 month ago
8.0 - 13.0 years
14 - 24 Lacs
Bengaluru
Remote
Key Responsibilities:
- Design and implement data solutions using MS Fabric, including data pipelines, data warehouses, and data lakes.
- Lead and mentor a team of data engineers, providing technical guidance and oversight.
- Collaborate with stakeholders to understand data requirements and deliver data-driven solutions.
- Develop and maintain large-scale data systems, ensuring data quality, integrity, and security.
- Troubleshoot data pipeline issues and optimize data workflows for performance and scalability.
- Stay up to date with MS Fabric features and best practices, applying that knowledge to improve data solutions.
Requirements:
- 8+ years of experience in data engineering, with expertise in MS Fabric, Azure Data Factory, or similar technologies.
- Strong programming skills in languages such as Python, SQL, or C#.
- Experience with data modeling, data warehousing, and data governance.
- Excellent problem-solving skills, with the ability to troubleshoot complex data pipeline issues.
- Strong communication and leadership skills, with experience leading teams.
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Pune
Hybrid
Role & responsibilities
- Design and implement end-to-end data pipelines using dbt and Snowflake.
- Create and structure dbt models (staging, transformation, marts), YAML configurations for models and tests, and dbt seeds.
- Hands-on experience with dbt Jinja templating, macro development, dbt jobs, and snapshot management for slowly changing dimensions (SCD).
- Develop Python scripts for data cleaning, transformation, and automation of repetitive tasks.
- Experience loading structured and semi-structured data from AWS S3 into Snowflake by designing file formats, configuring storage integrations, and automating data loads using Snowpipe.
- Design scalable incremental models for handling large datasets, reducing resource usage.
Preferred candidate profile
- Candidate must have 5+ years of experience.
- Early joiner who can join within a month.
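The snapshot management for slowly changing dimensions mentioned above is what a dbt snapshot automates. As a hypothetical sketch outside dbt itself (all names invented), type-2 history tracking closes the current row and appends a new version whenever a tracked column changes:

```python
# Illustrative SCD type-2 logic: the pattern a dbt snapshot implements.
def apply_snapshot(history, incoming, key="id", tracked="status", ts="2024-01-02"):
    """Close out changed rows and append new current versions."""
    current = {r[key]: r for r in history if r["valid_to"] is None}
    for row in incoming:
        cur = current.get(row[key])
        if cur is None:  # brand-new key: open its first version
            history.append({key: row[key], tracked: row[tracked],
                            "valid_from": ts, "valid_to": None})
        elif cur[tracked] != row[tracked]:
            cur["valid_to"] = ts  # close the old version
            history.append({key: row[key], tracked: row[tracked],
                            "valid_from": ts, "valid_to": None})
    return history

hist = [{"id": 1, "status": "open", "valid_from": "2024-01-01", "valid_to": None}]
apply_snapshot(hist, [{"id": 1, "status": "closed"}, {"id": 2, "status": "open"}])
print(len(hist))  # 3 rows: closed-out v1 of id 1, its new v2, and new id 2
```

Unchanged rows are deliberately left alone, which is what keeps such snapshots cheap on large dimensions.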
Posted 1 month ago
7.0 - 12.0 years
19 - 34 Lacs
Hyderabad
Hybrid
Data Scientist
Job Description
Responsibilities:
- Work with team members across multiple disciplines to understand the data behind product features, user behaviors, the security landscape, and our goals.
- Analyze data from several large sources, then automate solutions using scheduled processes, models, and alerts.
- Work with partners to design and improve metrics that guide our decisions for the product.
- Detect patterns associated with fraudulent accounts and anomalous behavior.
- Solve scientific problems and create new methods independently.
- Translate requirements and security questions into data insights.
- Set up alerting mechanisms so our leadership is always aware of the security posture.
Qualifications:
- Postgraduate degree with specialization in machine learning, artificial intelligence, statistics, or related fields, or 2 years of equivalent work experience in applied machine learning and analytics.
- Experience with SQL, Snowflake, and NoSQL databases.
- Proficiency in Python programming.
- Familiarity with statistics, modeling, and data visualization.
Experience:
- Experience building statistical and machine learning models, applying techniques such as regression, classification, clustering, and anomaly detection.
- Time series and classical ML modeling.
- Familiarity with Snowflake SQL.
- Familiarity with cloud platforms such as AWS.
- Some exposure to software development or data engineering.
- Ability to analyze business problems or research questions, identify relevant data points, and extract meaningful insights.
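The anomaly-detection work above is described only in outline. As a minimal sketch (not the team's actual method; the metric and data are invented), a z-score rule over a usage metric is one of the simplest forms such a detector can take:

```python
import statistics

def zscore_anomalies(values, threshold=2.0):
    """Flag indices more than `threshold` population std-devs from the mean."""
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    if sd == 0:
        return []  # no spread, nothing to flag
    return [i for i, v in enumerate(values) if abs(v - mean) / sd > threshold]

# Hypothetical daily login counts; the final day is a clear outlier.
logins = [12, 11, 13, 12, 11, 12, 95]
print(zscore_anomalies(logins))  # [6]
```

In practice the same flagging logic would run as a scheduled process feeding the alerting mechanisms the posting mentions, with the threshold tuned per metric.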
Posted 1 month ago
5.0 - 8.0 years
5 - 8 Lacs
Chennai, Tamil Nadu, India
On-site
Talend: Designing, developing, and documenting existing Talend ETL processes, technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment.
AWS / Snowflake:
- Design, develop, and maintain data models using SQL and Snowflake / AWS Redshift-specific features.
- Collaborate with stakeholders to understand the requirements of the data warehouse.
- Implement data security, privacy, and compliance measures.
- Perform data analysis, troubleshoot data issues, and provide technical support to end-users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Stay current with new AWS/Snowflake services and features and recommend improvements to existing architecture.
- Design and implement scalable, secure, and cost-effective cloud solutions using AWS / Snowflake services.
- Collaborate with cross-functional teams to understand requirements and provide technical guidance.
Posted 1 month ago
2.0 - 6.0 years
2 - 6 Lacs
Chennai, Tamil Nadu, India
On-site
DBT: Designing and developing technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment.
- Very strong in PL/SQL: queries, procedures, JOINs.
- Snowflake SQL: writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to perform Extract, Load, and Transform operations.
- Talend knowledge and hands-on experience is good to have; candidates who have worked in PROD support are preferred.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
- Perform data analysis, troubleshoot data issues, and provide technical support to end-users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Complex problem-solving capability and a continuous-improvement approach.
- Talend / Snowflake certification is desirable.
- Excellent SQL coding, communication, and documentation skills.
- Familiar with the Agile delivery process.
- Must be analytical, creative, and self-motivated, and work effectively within a global team environment.
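The Streams and Tasks utilities named above form Snowflake's change-data-capture pattern: a stream records inserts, updates, and deletes since its last consumption, and a task periodically merges them into a target. As a hypothetical sketch outside Snowflake (all names invented):

```python
# Illustrative stream-style CDC: apply captured changes to a target table.
def merge_stream(target, stream):
    """Apply queued changes keyed by primary key, then consume the stream."""
    for change in stream:
        action, row = change["action"], change["row"]
        if action == "DELETE":
            target.pop(row["pk"], None)
        else:  # INSERT and UPDATE both upsert here
            target[row["pk"]] = row
    stream.clear()  # consuming the stream advances its offset
    return target

table = {1: {"pk": 1, "qty": 5}}
stream = [{"action": "UPDATE", "row": {"pk": 1, "qty": 7}},
          {"action": "INSERT", "row": {"pk": 2, "qty": 3}},
          {"action": "DELETE", "row": {"pk": 1}}]
merge_stream(table, stream)
print(table)  # {2: {'pk': 2, 'qty': 3}}
```

Clearing the stream after a successful merge mirrors how reading a Snowflake stream inside a completed transaction advances its offset, so the same changes are never applied twice.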
Posted 1 month ago
9.0 - 14.0 years
15 - 20 Lacs
Hyderabad
Work from Office
Job Description:
SQL & Database Management: Deep knowledge of relational databases (PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses such as Snowflake.
ETL/ELT Tools: Extensive experience building and maintaining data pipelines with tools such as SnapLogic, StreamSets, or DBT.
Data Modeling & Optimization: Strong understanding of data modeling, OLAP systems, query optimization, and performance tuning.
Cloud & Security: Familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE).
Data Warehousing: Experience managing large datasets and data marts and optimizing databases for performance.
Agile & CI/CD: Knowledge of Agile methodologies and CI/CD automation tools.
Role & responsibilities:
- Build data pipelines for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL and cloud database technologies.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data needs.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Quickly analyze existing SQL code and improve it to enhance performance, take advantage of new SQL features, close security gaps, and increase robustness and maintainability.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery for greater scalability, etc.
- Unit test databases and perform bug fixes.
- Develop best practices for database design and development activities.
- Take on technical leadership of database projects across various scrum teams.
- Manage exploratory data analysis to support dashboard development (desirable).
Required Skills:
- Strong experience in SQL, with expertise in a relational database (preferably PostgreSQL, cloud-hosted on AWS/Azure/GCP) or any cloud-based data warehouse (such as Snowflake or Azure Synapse).
- Competence in data preparation and/or ETL/ELT tools such as SnapLogic, StreamSets, and DBT (preferably strong working experience in one or more) to build and maintain complex data pipelines and flows handling large volumes of data.
- Understanding of data modelling techniques and working knowledge of OLAP systems.
- Deep knowledge of databases, data marts, data warehouse enterprise systems, and handling of large datasets.
- In-depth knowledge of ingestion techniques, data cleaning, de-duplication, etc.
- Ability to fine-tune report-generating queries.
- Solid understanding of normalization and denormalization, database exception handling, query profiling, performance counters, debugging, and database and query optimization techniques.
- Understanding of index design and performance-tuning techniques.
- Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
- Experience understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting (desirable).
- Adherence to database standards, e.g., data models, data architecture, and naming conventions.
- Exposure to source control such as Git and Azure DevOps.
- Understanding of Agile methodologies (Scrum, Kanban).
- Experience with NoSQL databases and migrating data into other types of databases with real-time replication (desirable).
- Experience with CI/CD automation tools (desirable).
- Programming experience in Go, Python, or any other language, plus visualization tools (Power BI/Tableau) (desirable).
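The de-duplication named in the skills above typically means keeping the newest record per business key during ingestion, the same effect as a SQL `ROW_NUMBER() ... = 1` filter. A hypothetical sketch (column names invented):

```python
# Illustrative ingestion de-dupe: keep the most recent row per key.
def dedupe_latest(rows, key="customer_id", order_by="updated_at"):
    """Equivalent of filtering ROW_NUMBER() OVER (PARTITION BY key
    ORDER BY order_by DESC) = 1 in SQL."""
    best = {}
    for row in rows:
        k = row[key]
        if k not in best or row[order_by] > best[k][order_by]:
            best[k] = row  # newer version replaces the older one
    return sorted(best.values(), key=lambda r: r[key])

rows = [
    {"customer_id": 1, "updated_at": "2024-01-01", "email": "a@old"},
    {"customer_id": 1, "updated_at": "2024-03-01", "email": "a@new"},
    {"customer_id": 2, "updated_at": "2024-02-01", "email": "b@x"},
]
print(dedupe_latest(rows))  # one row per customer, newest email wins
```

ISO-8601 timestamps are used here so plain string comparison orders them correctly; with other formats a real parse would be needed.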
Posted 1 month ago
7.0 - 10.0 years
8 - 15 Lacs
Hyderabad, Bengaluru
Hybrid
Key Responsibilities:
- Use data mappings and models provided by the data modeling team to build robust Snowflake data pipelines.
- Design and implement pipelines adhering to 2NF/3NF normalization standards.
- Develop and maintain ETL processes for integrating data from multiple ERP and source systems.
- Build scalable and secure Snowflake data architecture supporting Data Quality (DQ) needs.
- Raise CAB requests via Carrier's change process and manage production deployments.
- Provide UAT support and ensure smooth transition of finalized pipelines to support teams.
- Create and maintain comprehensive technical documentation for traceability and handover.
- Collaborate with data modelers, business stakeholders, and governance teams to enable DQ integration.
- Optimize complex SQL queries, perform performance tuning, and follow DataOps best practices.
Requirements:
- Strong hands-on experience with Snowflake.
- Expert-level SQL skills and deep understanding of data transformation.
- Solid grasp of data architecture and 2NF/3NF normalization techniques.
- Experience with cloud-based data platforms and modern data pipeline design.
- Exposure to AWS data services such as S3, Glue, Lambda, and Step Functions (preferred).
- Proficiency with ETL tools and experience working in Agile environments.
- Familiarity with the Carrier CAB process or similar structured deployment frameworks.
- Proven ability to debug complex pipeline issues and enhance pipeline scalability.
- Strong communication and collaboration skills.
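The 2NF/3NF standards above mean splitting flat extracts so every non-key attribute depends only on its own table's key. A hypothetical sketch (entities invented) of normalising an order extract that repeats customer attributes:

```python
# Illustrative 3NF split: customer attributes move to their own table,
# and order facts keep only the customer foreign key.
def normalize(flat_rows):
    customers, orders = {}, []
    for r in flat_rows:
        # customer_name depends on customer_id, not on order_id
        customers[r["customer_id"]] = {"customer_id": r["customer_id"],
                                       "customer_name": r["customer_name"]}
        orders.append({"order_id": r["order_id"],
                       "customer_id": r["customer_id"],
                       "amount": r["amount"]})
    return list(customers.values()), orders

flat = [{"order_id": 10, "customer_id": 1, "customer_name": "Acme", "amount": 50},
        {"order_id": 11, "customer_id": 1, "customer_name": "Acme", "amount": 75}]
custs, orders = normalize(flat)
print(len(custs), len(orders))  # 1 2 -- the repeated customer collapses to one row
```

Removing the repetition is exactly what makes later updates to a customer's attributes a single-row change rather than a scan of every order.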
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Chennai
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity.
Job Description:
Experience: 5-12 years
Location: Chennai
Skill: Snowflake Developer
Desired skill sets:
- Strong experience in Snowflake
- Strong experience in AWS and Python
- Experience in ETL tools like Ab Initio and Teradata
Interested candidates can share their resume with sangeetha.spstaffing@gmail.com with the below inline details:
Full Name as per PAN:
Mobile No:
Alt No / WhatsApp No:
Total Exp:
Relevant Exp in Snowflake Development:
Rel Exp in AWS:
Rel Exp in Python/Ab Initio/Teradata:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable)/Reason:
Date of Birth:
PAN Number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for F2F interview on 14th June, Saturday, between 9 AM and 12 PM (please mention a time):
Current Residential Location:
Preferred Job Location:
Is your educational % in 10th, 12th, and UG all above 50%?
Do you have any gaps in your education or career? If so, please mention the duration in months/years:
Posted 1 month ago
5.0 - 7.0 years
10 - 12 Lacs
Bengaluru
Work from Office
Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.
Posted 1 month ago
4.0 - 7.0 years
7 - 17 Lacs
Gurugram
Hybrid
Job Title: Snowflake Data Engineer
Location: Gurgaon
Notice Period: Immediate to 30 days
Job Description & Summary
We are seeking a Snowflake Data Engineer to join our team and enhance our data solutions. The ideal candidate will be responsible for designing and maintaining efficient data structures, optimizing data storage and retrieval within Snowflake, and ensuring data integrity across various data sources. This role involves collaboration with cross-functional teams to deliver high-quality data solutions that support analytical and operational requirements. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities:
- Build and maintain robust ETL pipelines to integrate data from multiple sources into Snowflake, ensuring data integrity and consistency.
- Develop efficient stored procedures, with an understanding of complex business logic, to meet system requirements and enhance functionality effectively.
- Implement best practices in data security, role-based access control, and data masking within Snowflake to maintain compliance and data governance standards.
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Ability to work independently with minimal supervision, deliver projects within committed timelines, and excel in ambiguous environments with a strong problem-solving approach.
- Capable of performing root cause analysis (RCA) and implementing changes to address issues and improve processes effectively.
- Develop comprehensive documentation for data structures, ETL workflows, and system processes to ensure transparency and knowledge sharing within the team.
- Experienced in automating and monitoring Snowflake tasks, identifying issues, troubleshooting errors, and implementing fixes to ensure seamless operations.
Mandatory skill sets: Snowflake, SQL (intermediate/advanced), Python (elementary/intermediate)
Good-to-have skill sets: Azure Data Factory (elementary/intermediate), Power BI, JIRA
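The role-based data masking mentioned in the responsibilities can be pictured with a small sketch. Snowflake itself implements this declaratively through masking policies attached to columns; the fragment below is only a hypothetical illustration of the rule such a policy encodes (role names invented):

```python
# Illustrative role-based masking rule: privileged roles see the real
# value, everyone else sees a redacted form.
def mask_email(value, role):
    """Return the plain email for privileged roles, else redact the local part."""
    if role in {"ANALYST_FULL", "SECURITY_ADMIN"}:  # hypothetical role names
        return value
    local, _, domain = value.partition("@")
    return local[0] + "***@" + domain  # keep first char and domain for debugging

print(mask_email("jane.doe@example.com", "REPORT_VIEWER"))   # j***@example.com
print(mask_email("jane.doe@example.com", "SECURITY_ADMIN"))  # full address
```

Keeping the domain visible is a common compromise: joins and support triage still work while the personally identifying local part stays hidden from unprivileged roles.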
Posted 1 month ago
5.0 - 10.0 years
18 - 22 Lacs
Bengaluru
Hybrid
We are looking for a candidate seasoned in handling data warehousing challenges: someone who enjoys learning new technologies, does not hesitate to bring his/her perspective to the table, is enthusiastic about working in a team, and can own and deliver long-term projects to completion.
Responsibilities:
• Contribute to the team's vision and articulate strategies to have fundamental impact at our massive scale.
• You will need a product-focused mindset. It is essential for you to understand business requirements and architect systems that will scale and extend to accommodate those needs.
• Diagnose and solve complex problems in distributed systems, develop and document technical solutions, and sequence work to make fast, iterative deliveries and improvements.
• Build and maintain high-performance, fault-tolerant, and scalable distributed systems that can handle our massive scale.
• Provide solid leadership within your own problem space through a data-driven approach, robust software designs, and effective delegation.
• Participate in, or spearhead, design reviews with peers and stakeholders to adopt what's best suited among available technologies.
• Review code developed by other developers and provide feedback to ensure best practices (e.g., checking code in, accuracy, testability, and efficiency).
• Automate cloud infrastructure, services, and observability.
• Develop CI/CD pipelines and testing automation (nice to have).
• Establish and uphold best engineering practices through thorough code and design reviews and improved processes and tools.
• Groom junior engineers through mentoring and delegation.
• Drive a culture of trust, respect, and inclusion within your team.
Minimum Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent training, fellowship, or work experience.
• Minimum 5 years of experience curating data, with hands-on experience working on ETL/ELT tools.
• Strong overall programming skills and the ability to write modular, maintainable code, preferably in Python and SQL.
• Strong data warehousing concepts and SQL skills, including an understanding of dimensional modelling and at least one relational database.
• Experience with AWS.
• Exposure to Snowflake and ingesting data into it, or exposure to similar tools.
• Humble, collaborative team player, willing to step up and support colleagues.
• Effective communication, problem-solving, and interpersonal skills.
• Commitment to growing deeper in knowledge and understanding of how to improve our existing applications.
Preferred Qualifications:
• Experience with the following tools: DBT, Fivetran, Airflow.
• Knowledge and experience of Spark, Hadoop 2.0, and their ecosystems.
• Experience with automation frameworks/tools like Git and Jenkins.
Primary Skills: Snowflake, Python, SQL, DBT
Secondary Skills: Fivetran, Airflow, Git, Jenkins, AWS, SQL DBM
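The dimensional modelling called for above revolves around surrogate keys: fact rows reference dimension rows through warehouse-assigned integer keys rather than source-system identifiers. A hypothetical sketch (names invented):

```python
# Illustrative star-schema load: assign surrogate keys to dimension
# members, then resolve facts against them.
def load_dimension(dim, natural_keys):
    """Assign the next surrogate key to each previously unseen natural key."""
    for nk in natural_keys:
        if nk not in dim:
            dim[nk] = len(dim) + 1  # next surrogate key
    return dim

def load_facts(dim, events):
    """Replace natural keys with surrogate keys in the fact rows."""
    return [{"product_sk": dim[e["product"]], "qty": e["qty"]} for e in events]

events = [{"product": "P-100", "qty": 2}, {"product": "P-200", "qty": 1},
          {"product": "P-100", "qty": 5}]
dim_product = load_dimension({}, [e["product"] for e in events])
facts = load_facts(dim_product, events)
print(dim_product)  # {'P-100': 1, 'P-200': 2}
```

Decoupling facts from source identifiers this way is what lets a dimension later carry SCD history without rewriting the fact table.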
Posted 1 month ago
4.0 - 8.0 years
5 - 15 Lacs
Kolkata
Work from Office
Skills and Qualifications:
- Bachelor's and/or master's degree in computer science, or equivalent experience.
- Must have 3+ years of total IT experience, including experience in data warehouse/ETL projects.
- Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects.
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
- Deep understanding of Star and Snowflake dimensional modeling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL and Spark (PySpark).
- Experience in building ETL / data warehouse transformation processes.
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting, and query optimization.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.
Required Skills: Snowflake, SQL, ADF
Posted 1 month ago
6.0 - 11.0 years
18 - 22 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Greetings from Primus Global Technology!
We are hiring for a Snowflake Administrator role with a leading MNC for locations including Bangalore, Chennai, and Hyderabad. This is a contract position (6 months to 1 year) with potential for extension based on performance. The selected candidate will be on Primus Global payroll.
Experience Required: 6+ years (4.5+ in Snowflake)
Salary: 1,50,000 to 1,80,000 per month
Contract Duration: 6-12 months (extendable based on performance)
Payroll: Primus Global Technology
Note: Only candidates with experience as a Snowflake Administrator are eligible for this position. This opening is not for Snowflake Developers.
Key Responsibilities:
- Database Management: Snowflake account/user management, performance tuning, backups
- Security: Implement RBAC, encryption, and compliance policies
- Cost Management: Monitor and optimize Snowflake costs
- ETL & Integration: Support data pipelines and integration with other systems
- Performance Tuning: Improve query and system performance
- Support: Troubleshooting and vendor escalation
- Collaboration: Work with architects and stakeholders, and provide system health reports
Apply now: send your resume to npandya@primusglobal.com. Looking for immediate joiners.
Contact: Nidhi P Pandya, Sr. Associate, Talent Acquisition, Primus Global Technology Pvt. Ltd.
All the best, job seekers!
Posted 2 months ago
4.0 - 6.0 years
7 - 14 Lacs
Udaipur, Kolkata, Jaipur
Hybrid
Senior Data Engineer
Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency.
Role: Senior Data Engineer
Experience: 4-6 years
Location: Udaipur, Jaipur, Kolkata
Job Description:
We are looking for a highly skilled and experienced Data Engineer with 4-6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python
- Collaborate with data analysts, data scientists, and product teams to understand data needs
- Optimize queries and data models for performance and reliability
- Integrate data from various sources, including APIs, internal databases, and third-party systems
- Monitor and troubleshoot data pipelines to ensure data quality and integrity
- Document processes, data flows, and system architecture
- Participate in code reviews and contribute to a culture of continuous improvement
Required Skills:
- 4-6 years of experience in data engineering, data architecture, or backend development with a focus on data
- Strong command of SQL for data transformation and performance tuning
- Experience with Python (e.g., pandas, Spark, ADF)
- Solid understanding of ETL/ELT processes and data pipeline orchestration
- Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server)
- Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
- Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes)
- Basic programming skills
- Excellent problem-solving skills and a passion for clean, efficient data systems
Preferred Skills:
- Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc.
- Exposure to enterprise solutions (e.g., Databricks, Synapse)
- Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop)
- Background in real-time data streaming and event-driven architectures
- Understanding of data governance, security, and compliance best practices
- Prior experience working in an agile development environment
Educational Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
Visit us:
https://kadellabs.com/
https://in.linkedin.com/company/kadel-labs
https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm
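Incremental ETL/ELT pipelines like those described above commonly use a high-watermark pattern: each run pulls only rows newer than the timestamp recorded by the previous successful run. A purely illustrative sketch (field names invented):

```python
# Illustrative high-watermark incremental extract.
def incremental_extract(source, watermark):
    """Pull rows newer than the last run's watermark; return the new watermark."""
    new_rows = [r for r in source if r["updated_at"] > watermark]
    # Advance the watermark only as far as data actually seen.
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

source = [{"id": 1, "updated_at": "2024-05-01"},
          {"id": 2, "updated_at": "2024-05-03"},
          {"id": 3, "updated_at": "2024-05-07"}]
rows, wm = incremental_extract(source, "2024-05-02")
print(len(rows), wm)  # 2 2024-05-07
```

Persisting `wm` (in a control table, for instance) between runs is what keeps the pipeline both cheap and restartable; a failed run simply re-reads from the old watermark.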
Posted 2 months ago
7.0 - 8.0 years
8 - 18 Lacs
Pune
Hybrid
Warm Greetings From Dataceria !!!

Senior SQL Quality Assurance Tester / Senior ETL Tester. Immediate joiners: send your resume to careers@dataceria.com.

As a Senior SQL Quality Assurance Tester, you will be at the forefront of ensuring the quality and reliability of our data systems. You will play a critical role in analysing raw data, building test frameworks, and validating data products using Python. Collaborating closely with data analytics experts and stakeholders, you will contribute to the stability and functionality of our data pipelines. This role offers an exciting opportunity to work with cutting-edge technologies and make a significant impact on our data engineering processes.

Responsibilities:
- Analyse and organise raw data to meet business needs and objectives.
- Develop, update, and maintain SQL scripts and test cases as applications and business rules evolve, identifying areas for improvement.
- Delegate tasks effectively, ensuring timely and accurate completion of deliverables.
- Partner with stakeholders, including Product, Data, and Design teams, to address technical issues and support data engineering needs.
- Perform root cause analysis of existing models and propose effective solutions for improvement.
- Serve as a point of contact for cross-functional teams, ensuring the smooth integration of quality assurance practices into workflows.
- Demonstrate strong time management skills.
- Lead and mentor a team of SQL testers and data professionals, fostering a collaborative and high-performing environment.

What we're looking for in our applicants:
- 7+ years of relevant experience in data engineering and testing roles, including team management.
- Proven experience leading and mentoring teams, with strong organizational and interpersonal skills.
- Proficiency in SQL testing, with a focus on Snowflake, and experience with Microsoft SQL Server.
- Advanced skills in writing complex SQL queries.
- At least intermediate-level proficiency in Python programming, with experience using Python libraries for testing and ensuring data quality.
- Hands-on experience with the Git version control system (VCS).
- Working knowledge of cloud computing architecture (Azure ADO).
- Experience with data pipeline and workflow management tools like Airflow.
- Ability to perform root cause analysis of existing models and propose effective solutions.
- Strong interpersonal skills and a collaborative team player.

Nice to have:
1. Broader ETL testing knowledge and experience
2. Confluence
3. Strong in SQL queries
4. Data warehousing
5. Snowflake
6. Cloud platform (Azure ADO)

Joining: Immediate
Work location: Pune (hybrid)
Open positions: 1 - Senior SQL Quality Assurance Tester

If interested, please share your updated resume to careers@dataceria.com.

Dataceria Software Solutions Pvt Ltd
Follow our LinkedIn for more job openings: https://www.linkedin.com/company/dataceria/
Email: careers@dataceria.com
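A sketch of the kind of Python data-quality checks this role calls for, using plain functions rather than any particular testing framework (the sample rows and rule names are illustrative assumptions):

```python
def check_not_null(rows, column):
    """Return indices of rows where the column is null/missing."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_unique(rows, column):
    """Return values that appear more than once in the column."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

# Example: validate a small extract before reconciling it against warehouse output
rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@x.com"},
]
print(check_not_null(rows, "email"))  # [1]
print(check_unique(rows, "id"))       # [2]
```

In practice the same rules would be expressed as SQL assertions against Snowflake or SQL Server tables and wired into a pytest or Airflow run; the Python above just shows the checking logic itself.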
Posted 2 months ago
8.0 - 13.0 years
30 - 45 Lacs
Bengaluru
Hybrid
Role & responsibilities

1. Senior Snowflake Developer - Experience: 8+ Years
Location: Bangalore - Mahadevapura (Hybrid, UK shift) - 3 days office
Notice Period: Immediate to 15 days
CTC: 37 Lakhs

JD Summary: ThoughtFocus is looking for a senior Snowflake developer for our NYC- and London-based financial services client operating in Public/Private Loans, CLOs, and Long/Short Credit. You will play a pivotal role in the successful delivery of our strategic initiatives. You will be responsible for developing solutions using technologies like Snowflake, Coalesce, and Fivetran.

Location: Bengaluru, India

Requirements:
- IT experience of 8+ years, with a minimum of 3+ years of experience as a Snowflake Developer.
- Design, develop, and optimize Snowflake objects such as databases, schemas, tables, views, and stored procedures.
- Expertise in Snowflake utilities and features such as SnowSQL, Snowpipe, Stages, Tables, Zero-Copy Clone, Streams and Tasks, Time Travel, data sharing, data governance, and row access policies.
- Experience in migrating data from Azure Cloud to Snowflake, ensuring data integrity, performance optimization, and minimal disruption to business operations.
- Experience with Snowpipe for continuously loading and unloading data into Snowflake tables.
- Experience in using the COPY, PUT, LIST, GET, and REMOVE commands.
- Experience in Azure integration and data loading (batch and bulk loading).
- Experience in creating system roles, custom roles, and role hierarchies in Snowflake.
- Expertise in masking policies and network policies in Snowflake.
- Responsible for designing and maintaining ETL pipelines in Coalesce and Fivetran, including extracting data from the MS SQL Server database and transforming it per business requirements.
- Extensive experience in writing complex SQL queries, stored procedures, views, functions, triggers, indexes, and exception handling using MS SQL Server (T-SQL).
- Effective communication and interpersonal skills.
- Ability to influence and collaborate effectively with cross-functional teams.
- Exceptional problem-solving abilities.
- Experience working in an agile development environment.
- Experience working in a fast-paced, dynamic environment.
- Good to have: some prior experience with, or a high-level understanding of, hedge funds, private debt, and private equity.

What's on offer:
- Competitive and above-market salary.
- Hybrid work schedule.
- Opportunity to gain exposure and technology experience in global financial markets.

Education: Bachelor's degree in Computer Science / IT / Finance / Economics or equivalent.

2. Lead Snowflake Developer
Location: Bangalore (UK shift) - 3 days work from office
CTC: 45 Lakhs

- 13+ years of IT experience, with a proven track record of successfully leading a development team to deliver complex SQL and Snowflake projects.
- Strong communication and interpersonal skills.
- Ability to influence and collaborate effectively with cross-functional teams.
- Exceptional problem-solving and decision-making abilities.
- Experience working in an agile development environment.
- Experience working in a fast-paced, dynamic environment.
- Good to have: some prior experience with, or a high-level understanding of, hedge funds, private debt, and private equity.
- Expertise in SQL and Snowflake development across all aspects: analysis, understanding business requirements, taking an optimized approach to developing code, and ensuring data quality in the outputs presented.
- Advanced SQL to create and performance-optimize stored procedures and functions.
- An analytical approach to translating data into last-mile SQL objects for consumption in reports and dashboards.
- 5+ years of experience in MS SQL and Snowflake.
- 3+ years of experience on teams where SQL outputs were consumed via Power BI, Tableau, SSRS, or similar tools.
- Should be able to define and enforce best practices.
- Good communication skills, to discuss and deliver requirements effectively with the client.
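For illustration, the "last-mile SQL objects" mentioned above are typically views or procedures that shape warehouse data into exactly what a Power BI or Tableau report consumes. The sketch below uses Python's built-in sqlite3 as a stand-in for Snowflake (the schema and names are hypothetical; Snowflake syntax would differ only slightly):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (trade_id INTEGER, fund TEXT, notional REAL);
    INSERT INTO trades VALUES (1, 'credit', 100.0), (2, 'credit', 50.0), (3, 'loans', 75.0);
    -- A 'last mile' object: a view that reports and dashboards consume directly,
    -- so BI tools never re-implement the aggregation logic themselves
    CREATE VIEW fund_exposure AS
        SELECT fund, SUM(notional) AS total_notional, COUNT(*) AS trade_count
        FROM trades
        GROUP BY fund;
""")
rows = conn.execute(
    "SELECT fund, total_notional, trade_count FROM fund_exposure ORDER BY fund"
).fetchall()
print(rows)  # [('credit', 150.0, 2), ('loans', 75.0, 1)]
```

Keeping the aggregation in a view (rather than in each dashboard) is the practice the listing describes: one tuned SQL object, many consumers.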
Posted 2 months ago
5.0 - 9.0 years
5 - 9 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Job description

Professional & Technical Skills:
- Must-Have Skills: Strong experience in Snowflake Data Warehouse.
- Good-to-Have Skills: Experience in other data warehousing technologies such as Redshift, BigQuery, or Azure Synapse Analytics.
- Experience in designing and developing applications using Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts and best practices.
- Experience in working with cross-functional teams to deliver high-quality solutions.
- Excellent communication and interpersonal skills.
Posted 2 months ago
5.0 - 10.0 years
0 - 1 Lacs
Ahmedabad, Chennai, Bengaluru
Hybrid
Job Summary: We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics. --- Key Responsibilities: * Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake. * Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools. * Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching. * Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives. * Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions. * Manage Snowflake environments including security (roles, users, privileges), performance tuning, and resource monitoring. * Integrate data from multiple sources including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data. * Ensure data quality, reliability, and governance through testing and validation strategies. * Document data flows, definitions, processes, and architecture. --- Required Skills and Qualifications: * 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems. * 2+ years of hands-on experience with Snowflake including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel. * Strong experience in SQL and performance tuning for complex queries and large datasets. * Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts. * Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.). 
* Experience with cloud platforms (AWS, Azure, or GCP), particularly using services like S3, Redshift, Lambda, Azure Data Factory, etc. * Familiarity with Python or Java or Scala for data manipulation and pipeline development. * Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps. * Knowledge of data governance, data quality, and data security best practices. * Bachelor's degree in Computer Science, Information Systems, or a related field. --- Preferred Qualifications: * Snowflake SnowPro Core Certification or Advanced Architect Certification. * Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake. * Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.). * Knowledge of Data Vault 2.0 or other advanced data modeling methodologies. * Experience with data cataloging and metadata management tools (e.g., Alation, Collibra). * Exposure to machine learning pipelines and data science workflows is a plus.
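The dimensional (star-schema) modeling required above can be sketched in miniature: a fact table of measures joined to a dimension table of descriptive attributes via a surrogate key. The example uses sqlite3 as a stand-in for Snowflake, and all table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension: descriptive attributes keyed by a surrogate key
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    -- Fact: numeric measures plus foreign keys into the dimensions
    CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'Widget', 'hardware'), (2, 'Gadget', 'hardware');
    INSERT INTO fact_sales VALUES (1, 3, 30.0), (1, 2, 20.0), (2, 1, 15.0);
""")
# An analytics query slices the fact table by a dimension attribute
report = conn.execute("""
    SELECT p.category, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_product p USING (product_key)
    GROUP BY p.category
""").fetchall()
print(report)  # [('hardware', 65.0)]
```

A snowflake schema would further normalize the dimension (e.g., splitting category into its own table); the star form shown here is the one most BI tools consume directly.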
Posted 2 months ago
5.0 - 10.0 years
10 - 20 Lacs
Pune, Chennai
Hybrid
Snowflake + SQL | 5 to 15 Yrs | Pune / Chennai

If shortlisted, candidates should be available for a face-to-face (F2F) interview in Pune or Chennai.
Posted 2 months ago