6.0 - 9.0 years
25 - 35 Lacs
Kochi, Chennai, Bengaluru
Work from Office
Experienced Data Engineer (Python, PySpark, Snowflake)
Posted 3 weeks ago
8.0 - 12.0 years
20 - 27 Lacs
Hyderabad, Bengaluru
Hybrid
Role: Data Engineer | Years of Experience: 8-12 Yrs | Preferred Location: HYD 1 | Shift Timing (IST): 11:00 AM - 08:30 PM | Short Description: Data Engineer with ADF, Databricks | Anticipated Onboarding Date: 1-Apr-2025

Engagement & Project Overview: Multiple data applications under the Financial Accounting span are planned for migration from on-prem (Mainframe or DataStage) to Azure Cloud.

Primary Responsibilities: At least 8 to 12 years of relevant experience in data engineering technologies. Design, implement, and maintain data applications across all phases of software development. Interact with the business to gather requirements and convert them into design documents. Healthcare domain knowledge is good to have. Strong analytical and problem-solving skills. Well versed with agile processes and open to application support and flexible working hours.

Must-Have Skills: Azure Data Factory (ADF), Databricks, and PySpark. Experience with any database. Excellent communication.

Nice-to-Have Skills: Snowflake. Healthcare domain knowledge. Any cloud platform, preferably Azure.
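Since the role centers on ADF, Databricks, and PySpark for migrating on-prem workloads to Azure, here is a minimal, hedged PySpark sketch of the kind of curation step such a pipeline performs. Every path, table, and column name below is hypothetical, not taken from the actual project.

```python
# Minimal PySpark sketch: land an extract delivered by ADF and reshape it
# for the cloud warehouse. All paths and column names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fa-migration-demo").getOrCreate()

# Read a raw extract landed in a (hypothetical) ADLS container.
raw = spark.read.option("header", True).csv(
    "abfss://landing@account.dfs.core.windows.net/fa/ledger/"
)

# Typical cleanup: trim keys, cast amounts, stamp the load date, dedupe.
curated = (
    raw.withColumn("account_id", F.trim(F.col("account_id")))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("load_date", F.current_date())
       .dropDuplicates(["account_id", "txn_id"])
)

# Write to a curated Delta table for downstream consumption.
curated.write.format("delta").mode("append").saveAsTable("curated.fa_ledger")
```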
Posted 3 weeks ago
6.0 - 11.0 years
15 - 25 Lacs
Bengaluru
Remote
Location: Remote | Experience: 6-12 years | Immediate joiners preferred

Required Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. 3-5 years of experience in data engineering, cloud architecture, or Snowflake administration. Hands-on experience with Snowflake features: Snowpipe, Streams, Tasks, External Tables, and Secure Data Sharing. Proficiency in SQL, Python, and data movement tools (e.g., AWS CLI, Azure Data Factory, Google Cloud Storage Transfer). Experience with data pipeline orchestration tools such as Apache Airflow, dbt, or Informatica. Strong understanding of cloud storage services (S3, Azure Blob, GCS) and of working with external stages. Familiarity with network security, encryption, and data compliance best practices.
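For readers unfamiliar with the Snowflake features this posting names, here is a minimal sketch, assuming the snowflake-connector-python package and placeholder credentials, of wiring a Stream to a scheduled Task so newly staged rows are loaded automatically. All object names are illustrative.

```python
# Sketch: a Snowflake Stream feeding a scheduled Task. Names and
# credentials are placeholders, not real account details.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

# Track row-level changes on the staging table.
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders")

# Run every 5 minutes, but only when the stream actually has data.
cur.execute("""
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO curated_orders
      SELECT order_id, amount, updated_at FROM orders_stream
""")
cur.execute("ALTER TASK load_orders_task RESUME")  # tasks start suspended
```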
Posted 3 weeks ago
10.0 - 15.0 years
30 - 45 Lacs
New Delhi, Bengaluru
Hybrid
About ICF: ICF (NASDAQ:ICFI) is a global consulting services company with over 9,000 full- and part-time employees, but we are not your typical consultants. At ICF, business analysts and policy specialists work together with digital strategists, data scientists, and creatives. We combine unmatched industry expertise with cutting-edge engagement capabilities to help organizations solve their most complex challenges. Since 1969, public and private sector clients have worked with ICF to navigate change and shape the future. Learn more at icf.com.

Job Description: We are seeking a highly skilled and motivated Cloud DevOps Engineer with a strong background in computer science, software development, cloud networking, and security principles. The ideal candidate will have extensive experience in cloud technologies and DevOps practices and will be responsible for designing, implementing, and maintaining scalable and secure cloud infrastructure.

Key Responsibilities: Design, deploy, and manage scalable, secure, and reliable cloud infrastructure. Implement DevSecOps practices to enhance security across the development lifecycle. Automate the deployment, monitoring, and management of cloud-based applications. Collaborate with cross-functional teams to ensure seamless integration and continuous delivery. Build and manage CI/CD pipelines to streamline development and deployment processes. Monitor system performance, troubleshoot issues, and optimize infrastructure for cost and efficiency. Ensure adherence to security and compliance best practices across all cloud environments, including permission sets, Service Control Policies (SCPs), and other advanced security principles. Architect and implement robust cloud networking solutions, including VPCs, hub-and-spoke models, and cross-cloud provider connectivity. Leverage SecOps technologies like Rapid7, Splunk, or similar tools to enhance system security and visibility. Drive cost optimization efforts by analyzing infrastructure usage and implementing strategies to minimize costs while maintaining performance and reliability. Stay up-to-date with industry trends and emerging technologies to drive innovation.

Qualifications: Bachelor's degree in computer science or a related field. Experience in software development and cloud DevOps (internships, projects, or coursework). Strong knowledge of cloud platforms such as AWS, Azure, or Google Cloud. Proficiency in cloud networking concepts, including VPC design, subnetting, hub-and-spoke architectures, and hybrid or multi-cloud connectivity. Expertise in scripting and automation tools such as Python, Bash, Terraform, Ansible, CloudFormation, Azure ARM, or Pulumi. Experience with CI/CD tools like Jenkins, GitLab CI, CircleCI, or GitHub Actions. Familiarity with containerization and orchestration tools like Docker and Kubernetes. Strong understanding of security best practices, including permission sets and Service Control Policies (SCPs). Knowledge of SecOps technologies such as Rapid7, Splunk, or similar tools. Knowledge of cloud AI services like AI Foundry, SageMaker, or Bedrock is a plus. Knowledge of cloud data technologies like Azure Synapse, AWS Glue, Snowflake, or Databricks is a plus. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.

Preferred Qualifications: Certifications in cloud technologies (e.g., AWS Certified Solutions Architect, Azure DevOps Engineer Expert). Experience with infrastructure as code (IaC) and configuration management.
Knowledge of Service Catalogs is a plus. Exposure to AI/ML and Generative AI technologies is a plus.
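Of the IaC tools the posting lists, Pulumi is the one driven by general-purpose Python, so a minimal sketch follows, assuming the pulumi and pulumi_aws packages and an AWS backend. The resource names are illustrative only, not part of any real stack.

```python
# Sketch of IaC with Pulumi's Python SDK: a private, versioned S3 bucket
# with public access blocked, per security best practice.
import pulumi
import pulumi_aws as aws

# Versioned bucket for pipeline artifacts (name is illustrative).
bucket = aws.s3.Bucket(
    "artifact-bucket",
    acl="private",
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
)

# Block all public access on the bucket.
aws.s3.BucketPublicAccessBlock(
    "artifact-bucket-pab",
    bucket=bucket.id,
    block_public_acls=True,
    block_public_policy=True,
)

pulumi.export("bucket_name", bucket.id)
```

Running `pulumi up` would preview and apply this stack; the same pattern extends to VPCs and the hub-and-spoke networking the role describes.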
Posted 3 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Pune
Work from Office
Role Overview: The Senior Tech Lead - Snowflake leads the design, development, and optimization of advanced data warehousing solutions. The jobholder has extensive experience with Snowflake, data architecture, and team leadership, with a proven ability to deliver scalable and secure data systems.

Responsibilities: Lead the design and implementation of Snowflake-based data architectures and pipelines. Provide technical leadership and mentorship to a team of data engineers. Collaborate with stakeholders to define project requirements and ensure alignment with business goals. Ensure best practices in data security, governance, and compliance. Troubleshoot and resolve complex technical issues in Snowflake environments. Stay updated on the latest Snowflake technologies and industry trends.

Key Technical Skills & Responsibilities: Minimum 7+ years of experience designing and developing data warehouse / big data applications. Must be able to lead data product development using Streamlit and Cortex. Deep understanding of relational as well as NoSQL data stores and of data modeling methods and approaches (star and snowflake schemas, dimensional modeling). Good communication skills. Must have experience of solution architecture using Snowflake. Must have experience working with the Snowflake data platform, its utilities (SnowSQL, Snowpipe, etc.) and its features (Time Travel, support for semi-structured data, etc.). Must have experience migrating an on-premises data warehouse to the Snowflake cloud data platform. Must have experience with a cloud platform: AWS, Azure, or GCP. Experience developing accelerators (using Python, Java, etc.) to expedite migration to Snowflake. Must be good with Python and PySpark (including Snowpark) for data pipeline building. Must have experience working with streaming data sources and Kafka. Extensive experience developing ANSI SQL queries and Snowflake-compatible stored procedures. Snowflake certification is preferred.

Eligibility Criteria: Bachelor's degree in Computer Science, Data Engineering, or a related field. Extensive experience with Snowflake, SQL, and data modeling. Snowflake certification (e.g., SnowPro Core Certification). Experience with cloud platforms like AWS, Azure, or GCP. Strong understanding of ETL/ELT processes and cloud integration. Proven leadership experience in managing technical teams. Excellent problem-solving and communication skills.

Our Offering: Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment. Wellbeing programs and work-life balance, with integration and passion-sharing events. Attractive salary and company initiative benefits. Courses and conferences. Hybrid work culture.
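Since the posting calls out Snowpark for pipeline building, a small illustrative sketch follows, assuming the snowflake-snowpark-python package; connection parameters and table names are placeholders.

```python
# Minimal Snowpark sketch: push an aggregation down to Snowflake without
# pulling data out. Connection parameters and tables are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "my_account", "user": "my_user", "password": "...",
    "warehouse": "ANALYTICS_WH", "database": "SALES", "schema": "PUBLIC",
}).create()

# Lazily evaluated DataFrame over a warehouse table.
orders = session.table("ORDERS")

# The aggregation executes inside Snowflake when materialized.
by_region = (
    orders.filter(col("STATUS") == "SHIPPED")
          .group_by("REGION")
          .agg(sum_("AMOUNT").alias("TOTAL_AMOUNT"))
)
by_region.write.save_as_table("REGION_SALES", mode="overwrite")
```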
Posted 3 weeks ago
14.0 - 24.0 years
32 - 45 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Cloud Data Architect (Location: Chennai, Noida, Bangalore, Pune, Hyderabad, and Mumbai). Cloud Architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing. Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation, on AWS and Azure Cloud. Experience in Azure Databricks and ADF, Azure Synapse, PySpark, and Snowflake services. Participate in pre-sales activities, including RFP and proposal writing. Experience with integrating different data sources with data warehouses and data lakes is required. Experience in creating data warehouses and data lakes for reporting, AI, and machine learning. Understanding of data modelling and data architecture concepts. Participate in proposal and capability presentations. Able to clearly articulate the pros and cons of various technologies and platforms. Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms. Define and implement cloud governance and best practices. Identify and implement automation opportunities to increase operational efficiency. Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies.
Posted 3 weeks ago
11.0 - 21.0 years
25 - 37 Lacs
Noida, Pune, Bengaluru
Hybrid
Oversees and designs the information architecture for the data warehouse, including all information structures (staging area, data warehouse, data marts, operational data stores), oversees standardization of data definitions, and oversees development of physical and logical modelling. Deep understanding of data warehousing, enterprise architectures, dimensional modelling, star and snowflake schema design, reference DW architectures, ETL architecture, ETL (Extract, Transform, Load), data analysis, data conversion and transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support. Significant experience working as a Data Architect with depth in data integration and data architecture for Enterprise Data Warehouse implementations (conceptual, logical, physical, and dimensional models). Maintains in-depth and current knowledge of cloud architecture, data lakes, data warehouses, BI platforms, analytics model platforms, and ETL tools. Cloud knowledge, especially AWS, is necessary. Well versed with best practices around data governance, data stewardship, and overall data quality initiatives. Inventories existing data designs, including data flows and systems; designs data models that integrate new and existing environments; and conducts architecture meetings with team leads and solution architects to communicate complex ideas, issues, and system processes, along with architecture discussions in their current projects. Designs the data warehouse and provides guidance to the team in implementation using Snowflake SnowSQL. Good hands-on experience converting Source Independent Load, Post Load Process, stored procedures, and SQL to Snowflake. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Snowflake pipelines and the ability to troubleshoot issues quickly. At least 1 year of work experience in DBT, data modeling, and data integration. Advanced SQL skills for analysis and standardizing queries. Mandatory skillset: Snowflake, DBT, and data architecture design experience in data warehousing.
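Because dbt is part of the mandatory skillset, a short sketch of driving dbt from Python is shown below, assuming dbt-core 1.5 or later (which exposes the dbtRunner programmatic entry point); the model selector is hypothetical.

```python
# Sketch: programmatic dbt invocation (dbt-core 1.5+). Equivalent to
# running `dbt run --select staging+` on the command line.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()
res: dbtRunnerResult = dbt.invoke(["run", "--select", "staging+"])

if not res.success:
    raise RuntimeError(f"dbt run failed: {res.exception}")

# Per-model results: name and status of each executed node.
for r in res.result:
    print(r.node.name, r.status)
```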
Posted 3 weeks ago
7.0 - 11.0 years
20 - 30 Lacs
Hyderabad, Bengaluru
Hybrid
Responsibilities include working on MDM platforms, ETL, data modelling, and data warehousing, and managing database-related complex analysis, design, implementation, and support for moderate to large sized databases. The role will help provide production support, enhance existing data assets, and design and develop ETL processes.

Job Description: Responsible for the design and development of ETL processes for a large data warehouse.

Required Qualifications: Experience with a Master Data Management platform (ETL or EAI), data warehousing concepts, code management, and automated testing. Experience developing ETL design guidelines, standards, and procedures to ensure a manageable ETL infrastructure across the enterprise. Strong command of MS SQL/Oracle SQL, MongoDB, PL/SQL, and complex data analysis using SQL queries. Development experience in the Big Data ecosystem with the ability to design, develop, document, and architect Hadoop applications would be a plus. Experience in HDFS, Hive, Spark, and NoSQL (HBase). Strong knowledge of data architecture concepts. Strong knowledge of reporting and analytics concepts. Knowledge of software engineering best practices, with experience implementing CI/CD using Jenkins. Knowledge of the Agile methodology for delivering software solutions. Good-to-have skills: SQL Server DB, Windows Server file handling, and PowerShell scripting.

What will you do in this role? Manage and develop ETL processes for a large data warehouse. Provide analysis and design reviews with other development teams to avoid duplication of effort and inefficiency in solving the same application problem with different solutions. Work closely with business partners, business analysts, and software architects to create and operationalize common data products and consumption layers. Act as a developer providing application design guidance and consultation, utilizing a thorough understanding of applicable technology, tools, and existing designs. Develop simple to highly complex code. Verify program logic by overseeing the preparation of test data, testing, and debugging of programs.
Posted 3 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Big Data Engineer (Remote, Contract 6 Months+)

We are looking for a Senior Big Data Engineer with deep expertise in large-scale data processing technologies and frameworks. This is a remote, contract-based position suited for a data engineering expert with strong experience in the Big Data ecosystem, including Snowflake (Snowpark), Spark, MapReduce, Hadoop, and more.

#KeyResponsibilities Design, develop, and maintain scalable data pipelines and big data solutions. Implement data transformations using Spark, Snowflake (Snowpark), Pig, and Sqoop. Process large data volumes from diverse sources using Hadoop ecosystem tools. Build end-to-end data workflows for batch and streaming pipelines. Optimize data storage and retrieval processes in HBase, Hive, and other NoSQL databases. Collaborate with data scientists and business stakeholders to design robust data infrastructure. Ensure data integrity, consistency, and security in line with organizational policies. Troubleshoot and tune performance for distributed systems and applications.

#MustHaveSkills Data Engineering / Big Data tools: Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase. Data ingestion & ETL, data pipeline design, distributed computing. Strong understanding of Big Data architectures and performance tuning. Hands-on experience with large-scale data storage and query optimization.

#NiceToHave Apache Airflow / Oozie experience. Knowledge of cloud platforms (AWS, Azure, or GCP). Proficiency in Python or Scala. CI/CD and DevOps exposure.

#ContractDetails Role: Senior Big Data Engineer. Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote. Duration: 6+ Months (Contract). Apply via Email: navaneeta@suzva.com. Contact: 9032956160.

#HowToApply Send your updated resume with the subject: "Application for Remote Big Data Engineer Contract Role". Include in your email: Updated Resume, Current CTC, Expected CTC, Current Location, Notice Period / Availability.
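As a taste of where the Spark and Snowflake sides of this stack meet, here is a hedged sketch of writing a Spark DataFrame into Snowflake, assuming the spark-snowflake connector and its JDBC driver are on the classpath; all connection options and paths are placeholders.

```python
# Sketch: Spark DataFrame -> Snowflake via the spark-snowflake connector.
# Every option value below is a placeholder, not a real account.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-to-snowflake").getOrCreate()

sf_options = {
    "sfURL": "my_account.snowflakecomputing.com",
    "sfUser": "my_user",
    "sfPassword": "...",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "CURATED",
    "sfWarehouse": "ETL_WH",
}

events = spark.read.parquet("s3://my-bucket/events/")  # hypothetical source

(events.write
       .format("net.snowflake.spark.snowflake")
       .options(**sf_options)
       .option("dbtable", "EVENTS")
       .mode("overwrite")
       .save())
```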
Posted 3 weeks ago
13.0 - 22.0 years
30 - 40 Lacs
Bengaluru
Work from Office
Experience: 13+ Years. Location: Bangalore, Mahadevapura (Hybrid, UK shift, 3 days in office, 1 PM to 10 PM). Notice Period: immediate to 20 days.

Mandatory Skills: 1. Snowflake 2. SQL

13+ years of IT experience, with a proven track record of successfully leading a development team to deliver complex SQL and Snowflake projects. Strong communication and interpersonal skills. Ability to influence and collaborate effectively with cross-functional teams. Exceptional problem-solving and decision-making abilities. Experience working in an agile development environment. Experience working in a fast-paced, dynamic environment. Good to have: some prior experience or a high-level understanding of hedge funds, private debt, and private equity.
• SQL and Snowflake development expertise in all aspects of analysis: understanding the business requirement, taking an optimized approach to developing code, and ensuring data quality in the outputs presented
• Advanced SQL to create, optimize, and performance-tune stored procedures and functions
• An analytical approach to translating data into last-mile SQL objects for consumption in reports and dashboards
• 5+ years of experience in MS SQL and Snowflake
• 3+ years of experience in teams where SQL outputs were consumed via Power BI / Tableau / SSRS and similar tools
• Able to define and enforce best practices
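For context on the stored-procedure work this role emphasizes, a minimal sketch follows of a Snowflake Scripting procedure created through the Python connector; the schema and table names are invented for illustration.

```python
# Sketch: a Snowflake Scripting stored procedure that rebuilds a reporting
# table for BI consumption. Object names are illustrative only.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
)
conn.cursor().execute("""
CREATE OR REPLACE PROCEDURE refresh_position_summary()
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
  -- Rebuild the table consumed by Power BI / Tableau dashboards.
  CREATE OR REPLACE TABLE rpt.position_summary AS
  SELECT fund_id, SUM(market_value) AS total_mv, CURRENT_DATE() AS as_of
  FROM core.positions
  GROUP BY fund_id;
  RETURN 'refreshed';
END;
$$
""")
conn.cursor().execute("CALL refresh_position_summary()")
```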
Posted 3 weeks ago
6.0 - 10.0 years
12 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Requirements are as follows:
• Design and implement scalable data storage solutions using Snowflake
• Write SQL queries against Snowflake and develop scripts to Extract, Load, and Transform data
• Write, optimize, and troubleshoot complex SQL queries within Snowflake
• Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures, and UDFs
• Develop and maintain ETL processes using Informatica PowerCenter
• Integrate Snowflake with various data sources and third-party applications
• Experience in data lineage analysis, data profiling, ETL design and development, unit testing, production batch support, and UAT support
• SQL performance tuning; root-causing failures, separating them into distinct technical issues, and resolving them
• In-depth understanding of data warehouse and ETL concepts and data modelling
• Experience in requirement gathering, analysis, design, development, and deployment
• Good working knowledge of an ETL tool (preferably Informatica PowerCenter or DBT)
• Proficiency in SQL
• Experience on client-facing projects
• Experience with Snowflake best practices
• Experience with Unix shell scripting
• Working experience in Python is good to have

Expected skillset: 6-8 years of IT experience. Minimum 4+ years of experience designing and implementing a fully operational solution on the Snowflake data warehouse. Proven experience as a Snowflake and Informatica developer. Strong expertise in Snowflake architecture, design, and implementation. Proficient in SQL, ETL tools, and data modelling concepts. Excellent leadership, communication, and problem-solving skills. Certifications in Snowflake and relevant cloud platforms are desirable.
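Two of the utilities the listing names, Time Travel and zero-copy cloning, can be exercised in a few lines; the sketch below assumes snowflake-connector-python and uses placeholder names throughout.

```python
# Sketch: Snowflake Time Travel and zero-copy cloning via the Python
# connector. Account details and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
)
cur = conn.cursor()

# Time Travel: query the table as it looked one hour (3600 s) ago.
cur.execute("SELECT COUNT(*) FROM analytics.orders AT(OFFSET => -3600)")
print("row count an hour ago:", cur.fetchone()[0])

# Zero-copy clone: instant dev copy that shares storage with the source.
cur.execute("CREATE TABLE analytics.orders_dev CLONE analytics.orders")
```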
Posted 3 weeks ago
6.0 - 11.0 years
8 - 12 Lacs
Hyderabad
Work from Office
The impact you will have in this role: The Lead Platform Engineer is responsible for design analysis, documentation, testing, installation, implementation, optimization, maintenance, and support for the z/OS operating system, third-party products, the UNIX System Services environment, Mainframe WebSphere Application Server, and WebSphere Liberty. You will collaborate with application developers, middleware support, database administrators, and other IT professionals. Requires experience with z/OS, JES2, USS internals, SMP/E installations, and mainframe vendor product knowledge. Skill in creating and managing web sites using both common and advanced web programming languages is advantageous.

What You'll Do: Perform design analysis, documentation, testing, implementation, and support for the mainframe infrastructure environment. Install and manage mainframe software deployments in a highly granular SYSPLEX environment. Install and maintain WASz and Liberty. Enhance reporting and automation using supported mainframe tools such as JCL, REXX, SAS, SQL, Python, and Java/JavaScript. Complete assignments by due dates without detailed supervision. Responsible for incident, problem, and change management for all assigned products. Ensure incidents and problems are closed according to domain standards and that all change management requirements are strictly followed. Mitigate risk by following established procedures and monitoring controls, spotting key errors, and demonstrating strong ethical behavior. Participate in the team on-call coverage rotation, which includes tactical systems administration, and provide weekend support. Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately. Participate in disaster recovery tests (on weekends). Actively engage in strategic goals for mainframe engineering, the department, and the organization. Provide input and follow-through for continuous improvement to mainframe systems engineering processes and procedures. Perform level 1 network troubleshooting for mainframe applications.

Education: Bachelor's degree or equivalent experience.

Talents Needed for Success: Accountability: demonstrates reliability by taking the actions necessary to continuously meet required deadlines and goals. Global Collaboration: applies a global perspective when working within a team, staying aware of one's own style and ensuring all relevant parties are involved in key team tasks and decisions. Communication: articulates information clearly and presents information effectively and confidently when working with others. Influencing: convinces others by making a strong case and bringing others along to their viewpoint; maintains strong, trusting relationships while remaining comfortable challenging ideas. Innovation and Creativity.

Additional Qualifications: A minimum of 6+ years of systems programming experience in an IBM z/OS environment. REXX programming experience preferred. HTML, XML, Java, and JavaScript programming experience is preferred. Experience with mainframe system automation (BMC AMI Ops) is a plus. Understanding of VTAM and TCP/IP is a plus. Knowledge of Ansible, Splunk, Snowflake, ZOWE, and SAS is a plus. Knowledge of Bitbucket, Jira, and DevOps orchestration tools is a plus. Excellent written and verbal skills. The ability to multitask and work in a team environment is a must. Excellent customer service skills to develop mutually beneficial relationships with a diverse set of customers. Knowledge of Infrastructure as Code (IaC) standards is a plus. Experience in a 24x7 global environment with knowledge of system high availability (HA) design and industry-standard disaster recovery practices.
Posted 3 weeks ago
1.0 - 5.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Responsibilities: Proven experience in test automation, ideally in a BI environment. Further develop and optimize the existing test framework (based on Azure DevOps for automation, and Python) for the BI platform. Implement automated testing for data integration, data processing, and reporting processes by extending and optimizing the existing CI/CD pipelines.

Requirements: Hands-on experience with key technologies such as Azure DevOps, Snowflake, Databricks, and SonarQube. In-depth knowledge of programming languages such as Python and SQL. Strong experience with testing frameworks (e.g., Selenium, JUnit, pytest). Experience with non-functional tests such as code quality tests and open-source license compliance checks. Familiarity with agile development methodologies and DevOps practices. Ability to communicate and collaborate effectively in a team. Knowledge of databases (Snowflake) and data modeling (data vault approach) is an advantage. Excellent knowledge of using Azure DevOps for test automation.
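A flavor of the kind of automated check such a framework contains: a small pytest sketch against Snowflake, with placeholder connection details and an invented fact table.

```python
# Sketch: pytest data-quality checks against Snowflake. Connection
# details and the fact table are placeholders, not a real environment.
import pytest
import snowflake.connector

@pytest.fixture(scope="session")
def cursor():
    conn = snowflake.connector.connect(
        account="my_account", user="ci_user", password="...",
        warehouse="CI_WH", database="BI", schema="REPORTING",
    )
    yield conn.cursor()
    conn.close()

def test_primary_key_not_null(cursor):
    cursor.execute("SELECT COUNT(*) FROM fact_sales WHERE sale_id IS NULL")
    assert cursor.fetchone()[0] == 0

def test_no_duplicate_keys(cursor):
    cursor.execute(
        "SELECT sale_id FROM fact_sales GROUP BY sale_id HAVING COUNT(*) > 1"
    )
    assert cursor.fetchall() == []
```

Tests like these slot directly into an Azure DevOps pipeline stage, which is how the posting's CI/CD emphasis and its testing-framework requirement meet.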
Posted 3 weeks ago
7.0 - 12.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Hi All, we are looking for an Oracle PL/SQL Developer.

Skill: PL/SQL Developer. Location: Pune/Hyderabad/Bangalore/Chennai. Work Mode: Hybrid.

Job Description: Strongly proficient in Oracle PL/SQL, with around 7+ years of hands-on experience. Good analytical skills and advanced knowledge of PL/SQL, exception handling, and optimization. Enterprise-level technical exposure to Snowflake applications. Create, test, and implement enterprise-level apps with Snowflake. Provide technical design solutions. Good debugging and troubleshooting skills and domain knowledge. Good communication skills. Knowledge of the insurance domain is preferred. Must have experience working in a batch automation environment using tools such as Control-M or AutoSys. Must be a self-starter able to begin work on tasks with very little guidance. Must have excellent communication skills. Must be able to perform customer-facing activities in a fast-paced environment with short timeframes. Must be able to uncover business challenges and develop solutions to solve them. Must be able to learn new technologies and techniques. Provide technical inputs to the project.

If you are interested, share your details along with an updated CV to the email ID below: naveen@datawavetechnolgies.com
Experience / relevant experience in PL/SQL:
Notice period (official; if serving, mention last working day):
CTC:
Expected CTC:
Any offers:
Current location:
Preferred location:
Posted 3 weeks ago
6.0 - 10.0 years
8 - 12 Lacs
Mumbai
Work from Office
#JobOpening Data Engineer (Contract | 6 Months). Location: Hyderabad | Chennai | Remote flexibility possible. Type: Contract | Duration: 6 Months.

We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

#KeyResponsibilities Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory. Monitor and support production ETL jobs. Develop and maintain data lineage documentation for all systems. Design data mapping and documentation to aid QA/UAT testing. Evaluate and recommend modern data integration tools. Optimize shared data workflows and batch schedules. Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows. Participate in performance tuning and improvement recommendations. Support BI/MDM initiatives, including Data Vault and Data Lakes.

#RequiredSkills 7+ years of experience in data engineering roles. Strong command of SQL, with 5+ years of hands-on development. Deep experience with Snowflake, Azure Data Factory, and dbt. Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.). Bachelor's in CS, Engineering, Math, or a related field. Experience in the healthcare domain (working with PHI/PII data). Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments). Excellent communication and documentation skills. Experience with BI tools like Power BI, Cognos, etc. Organized self-starter with strong time-management and critical-thinking abilities.

#NiceToHave Experience with Data Lakes and Data Vaults. QA & UAT alignment with clear development documentation. Multi-cloud experience (especially Azure and AWS).

#ContractDetails Role: Data Engineer. Contract Duration: 6 Months. Location Options: Hyderabad / Chennai (remote flexibility available).
Posted 3 weeks ago
5.0 - 10.0 years
0 - 1 Lacs
Ahmedabad, Chennai, Bengaluru
Hybrid
Job Summary: We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics. --- Key Responsibilities: * Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake. * Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools. * Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching. * Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives. * Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions. * Manage Snowflake environments including security (roles, users, privileges), performance tuning, and resource monitoring. * Integrate data from multiple sources including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data. * Ensure data quality, reliability, and governance through testing and validation strategies. * Document data flows, definitions, processes, and architecture. --- Required Skills and Qualifications: * 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems. * 2+ years of hands-on experience with Snowflake including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel. * Strong experience in SQL and performance tuning for complex queries and large datasets. * Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts. * Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.). * Experience with cloud platforms (AWS, Azure, or GCP), particularly using services like S3, Redshift, Lambda, Azure Data Factory, etc. * Familiarity with Python or Java or Scala for data manipulation and pipeline development. * Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps. * Knowledge of data governance, data quality, and data security best practices. * Bachelor's degree in Computer Science, Information Systems, or a related field. --- Preferred Qualifications: * Snowflake SnowPro Core Certification or Advanced Architect Certification. * Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake. * Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.). * Knowledge of Data Vault 2.0 or other advanced data modeling methodologies. * Experience with data cataloging and metadata management tools (e.g., Alation, Collibra). * Exposure to machine learning pipelines and data science workflows is a plus.
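To illustrate the clustering best practice this posting lists under query-performance optimization, here is a hedged sketch using the Python connector; the table and clustering keys are hypothetical.

```python
# Sketch: clustering a large Snowflake table and inspecting clustering
# quality. Table and column names are illustrative only.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
)
cur = conn.cursor()

# Cluster on the columns most queries filter by.
cur.execute(
    "ALTER TABLE analytics.events CLUSTER BY (event_date, customer_id)"
)

# Inspect how well-clustered the table is for those columns.
cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION("
    "'analytics.events', '(event_date, customer_id)')"
)
print(cur.fetchone()[0])  # JSON with depth/overlap statistics
```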
Posted 3 weeks ago
5.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans. Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and compliant with industry best practices. Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security. Data Integration: Define and implement data integration strategies to facilitate the seamless flow of information across systems.

Responsibilities: Experience in data architecture and engineering. Proven expertise with the Snowflake data platform. Strong understanding of ETL/ELT processes and data integration. Experience with data modeling and data warehousing concepts. Familiarity with performance tuning and optimization techniques. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Cloud & Data Architecture: AWS, Snowflake. ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions. Big Data & Analytics: Athena, Presto, Hadoop. Database & Storage: SQL, SnowSQL. Security & Compliance: IAM, KMS, Data Masking.

Preferred technical and professional experience: Cloud Data Warehousing: Snowflake (data modeling, query optimization). Data Transformation: DBT (Data Build Tool) for ELT pipeline management. Metadata & Data Governance: Alation (data catalog, lineage, governance).
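The Data Masking item under Security & Compliance can be illustrated with a Snowflake masking policy; the sketch below assumes snowflake-connector-python, and the role and column names are invented for the example.

```python
# Sketch: column-level data masking in Snowflake. Only a privileged role
# sees raw values; everyone else gets a redacted string. Names are
# placeholders, not a real security model.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="...",
)
cur = conn.cursor()

cur.execute("""
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
           ELSE '***MASKED***' END
""")
cur.execute(
    "ALTER TABLE crm.customers MODIFY COLUMN email "
    "SET MASKING POLICY email_mask"
)
```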
Posted 3 weeks ago
2.0 - 5.0 years
4 - 8 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include: Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Good hands-on experience with DBT is required. ETL with DataStage and Snowflake is preferred. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 3 weeks ago
2.0 - 5.0 years
4 - 8 Lacs
Kochi
Work from Office
Responsible for developing triggers, functions, and stored procedures to support this effort. Assist with impact analysis of changing upstream processes on Data Warehouse and Reporting systems. Assist with design, testing, support, and debugging of new and existing ETL and reporting processes. Perform data profiling and analysis using a variety of tools. Troubleshoot and support production processes. Create and maintain documentation.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: BE / B.Tech in any stream, or M.Sc. (Computer Science/IT) / M.C.A., with a minimum of 5+ years of experience. Must have: Snowflake, AWS, complex SQL. Experience with software architecture in cloud-based infrastructures. Experience in ETL processes and data modelling techniques. Experience in designing and managing large-scale data warehouses.

Preferred technical and professional experience: Good to have in addition to the must-haves: DBT, Tableau, Python, JavaScript. Develop complex SQL queries for data analytics and business intelligence. Background working with data analytics, business intelligence, or related fields.
Posted 3 weeks ago
2.0 - 5.0 years
4 - 8 Lacs
Pune
Work from Office
Responsible for developing triggers, functions, and stored procedures to support this effort. Assist with impact analysis of changing upstream processes on Data Warehouse and Reporting systems. Assist with design, testing, support, and debugging of new and existing ETL and reporting processes. Perform data profiling and analysis using a variety of tools. Troubleshoot and support production processes. Create and maintain documentation.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: BE / B.Tech in any stream, or M.Sc. (Computer Science/IT) / M.C.A., with a minimum of 5+ years of experience. Must have: Snowflake, AWS, complex SQL. Experience with software architecture in cloud-based infrastructures. Experience in ETL processes and data modelling techniques. Experience in designing and managing large-scale data warehouses.

Preferred technical and professional experience: Good to have in addition to the must-haves: DBT, Tableau, Python, JavaScript. Develop complex SQL queries for data analytics and business intelligence. Background working with data analytics, business intelligence, or related fields.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 11 Lacs
Bengaluru
Work from Office
We're looking for people who put their innovation to work to advance our success and their own. Join an organization that ensures a more secure world through connecting and protecting our customers with inventive electrical solutions.

Key Responsibilities: Manage and maintain SQL Server databases to ensure optimal performance, security, and reliability. Manage and utilize Informatica PowerCenter for data integration and ETL processes, ensuring seamless data flow across systems. Implement and manage Informatica MDM solutions to ensure data consistency and accuracy across the organization. Data cataloging: use Informatica Data Catalog to maintain a comprehensive data inventory and ensure data accessibility and governance. Monitor and optimize database performance, including tuning SQL queries, indexing, and managing database parameters. Implement and enforce database security policies to protect sensitive data and ensure compliance with regulations. Backup and recovery: develop and execute database backup and recovery strategies to prevent data loss and ensure business continuity. Collaborate closely with data analysts, developers, and other stakeholders to support data-driven initiatives and projects. Create and maintain detailed documentation of database configurations, processes, and procedures.

You Have: A Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of experience as a DBA or Data Engineer with a focus on SQL Server and Informatica tools. Experience with cloud data platforms like Snowflake and AWS. Proficiency in Informatica PowerCenter, Informatica MDM, and Informatica Data Catalog. Programming expertise in SQL and Python.

Skills: Ability to write, debug, and optimize SQL queries. Strong analytical skills and the ability to combine data from different sources. Excellent communication and presentation skills, with the ability to explain complex data to non-technical audiences. User-facing written and verbal communication skills and experience.

We Have: A dynamic global reach with diverse operations around the world that will stretch your abilities, provide plentiful career opportunities, and allow you to make an impact every day. nVent is a leading global provider of electrical connection and protection solutions. We believe our inventive electrical solutions enable safer systems and ensure a more secure world. We design, manufacture, market, install, and service high-performance products and solutions that connect and protect some of the world's most sensitive equipment, buildings, and critical processes. We offer a comprehensive range of systems protection and electrical connection solutions across industry-leading brands that are recognized globally for quality, reliability, and innovation.
Posted 3 weeks ago
6.0 - 8.0 years
15 - 25 Lacs
Noida, Greater Noida, Delhi / NCR
Hybrid
Role & responsibilities: SQL, dbt, Python, Snowflake, data quality, and data modelling. Good to have: Snowpipe, Fivetran.

Mandatory Skills: Data Engineering, Data Modeling, SQL, dbt, Python, Data Quality, Snowflake. Desirable Skills: Data Engineering, Data Modeling, SQL, dbt, Python, Data Quality, Snowflake, Snowpipe, Fivetran.

Roles & Responsibilities:
1. Ensure reliable and scalable data pipelines to support healthcare operations.
2. Maintain data availability with proactive exception handling and recovery mechanisms.
3. Perform data quality checks to ensure accuracy, completeness, and consistency.
4. Detect and handle alerts early to prevent data discrepancies and processing failures.
5. Develop and optimize data models for efficient storage, retrieval, and analytics.
6. Prepare and structure data for reporting, compliance, and decision-making.
7. Work with Snowflake to manage data warehousing and performance tuning.
8. Implement and optimize DBT workflows for data transformation and governance.
9. Leverage SQL and Python for data processing, automation, and validation.
10. Experience with Snowpipe and Fivetran is a plus for automating data ingestion (see the sketch after this listing).

Skills to be evaluated on: Data Engineering, Data Modeling, Python, Data Quality, SQL, dbt, Snowflake. Years of Experience: 6 to 10 years. Education/Qualification: B.Tech.
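Since responsibility 10 mentions Snowpipe for automated ingestion, a minimal sketch follows, assuming snowflake-connector-python and an already-configured external stage; pipe, stage, and table names are placeholders.

```python
# Sketch: an auto-ingest Snowpipe that copies staged files into a raw
# table, plus a status check. All object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
)
cur = conn.cursor()

cur.execute("""
    CREATE OR REPLACE PIPE raw.claims_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw.claims
    FROM @raw.claims_stage
    FILE_FORMAT = (TYPE = 'JSON')
""")

# Check pipe health, in the spirit of the data-availability checks above.
cur.execute("SELECT SYSTEM$PIPE_STATUS('raw.claims_pipe')")
print(cur.fetchone()[0])
```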
Posted 3 weeks ago
6.0 - 11.0 years
17 - 30 Lacs
Bengaluru
Remote
Job Summary: We are seeking an experienced .NET Core (7+) and Blazor developer to support the development of new UI screens, APIs, and reports for the MSRI Client Portal project. The role demands deep expertise in full-stack development, UI/UX, and Azure services.

Work Mode: Remote work. Laptop collection from the office (travel/accommodation not provided). In-office presence required once every 6 months for 5 days (based on project need).

Mandatory Skill Set: .NET Core (7+), Blazor with C#. RESTful APIs / Web API development. HTML5, CSS, Bootstrap (responsive UI design). Azure services: App Service, Functions, Communication Service, Blob Storage. Database expertise: Azure SQL, Snowflake, MSSQL. Source control & CI/CD: Git, Azure DevOps. Agile methodology experience (ADO/Jira). Strong skills in code quality, unit testing (xUnit/NUnit), and performance optimization. Excellent communication and documentation skills.

Key Responsibilities: Design and develop high-quality UI with Blazor and .NET Core. Develop APIs and integrate backend services. Conduct unit testing and code reviews. Prepare technical design documents (HLD/LLD). Participate in technical discussions with stakeholders. Integrate code quality tools and support the CI/CD pipeline. Analyze backend data and contribute to reporting requirements.

Good to Have: Knowledge of Okta authentication. Experience with grid development using native or third-party tools.
Posted 3 weeks ago
5.0 - 8.0 years
17 - 20 Lacs
Kolkata
Work from Office
Key Responsibilities: Architect and implement scalable data solutions using GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.) and Snowflake. Lead the end-to-end data architecture, including ingestion, transformation, storage, governance, and consumption layers. Collaborate with business stakeholders, data scientists, and engineering teams to define and deliver the enterprise data strategy. Design robust data pipelines (batch and real-time) ensuring high data quality, security, and availability. Define and enforce data governance, data cataloging, and metadata management best practices. Evaluate and select appropriate tools and technologies to optimize data architecture and cost efficiency. Mentor junior architects and data engineers, guiding them on design best practices and technology standards. Collaborate with DevOps teams to ensure smooth CI/CD pipelines and infrastructure automation for data workloads.

Skills & Qualifications: 3+ years of experience in data architecture, data engineering, or enterprise data platform roles. 3+ years of hands-on experience in Google Cloud Platform (especially BigQuery, Dataflow, Cloud Composer, and Data Catalog). 3+ years of experience designing and implementing Snowflake-based data solutions. Deep understanding of modern data architecture principles (Data Lakehouse, ELT/ETL, Data Mesh, etc.). Proficient in Python, SQL, and orchestration tools like Airflow / Cloud Composer. Experience in data modeling (3NF, star, and snowflake schemas) and designing data marts and warehouses. Strong understanding of data privacy, compliance (GDPR, HIPAA), and security principles in cloud environments. Familiarity with tools like dbt, Apache Beam, Looker, Tableau, or Power BI is a plus. Excellent communication and stakeholder management skills. GCP or Snowflake certification preferred (e.g., GCP Professional Data Engineer, SnowPro).

Preferred Qualifications: Experience working with hybrid or multi-cloud data strategies. Exposure to ML/AI pipelines and support for data science workflows. Prior experience leading architecture reviews, PoCs, and technology roadmaps.
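Given that Airflow / Cloud Composer orchestration is called out, here is a hedged sketch of a two-step DAG, assuming Airflow 2.4+ with the apache-airflow-providers-snowflake package and a configured snowflake_default connection; all task and object names are illustrative.

```python
# Sketch: a small Airflow DAG that stages data into Snowflake and then
# rebuilds a mart table. Names and SQL are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_snowflake_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_stage = SnowflakeOperator(
        task_id="load_stage",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO staging.events FROM @staging.events_stage",
    )
    build_mart = SnowflakeOperator(
        task_id="build_mart",
        snowflake_conn_id="snowflake_default",
        sql=(
            "CREATE OR REPLACE TABLE mart.daily_events AS "
            "SELECT * FROM staging.events"
        ),
    )
    load_stage >> build_mart  # run the mart build after staging succeeds
```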
Posted 3 weeks ago
5.0 - 11.0 years
15 - 20 Lacs
Gurugram
Work from Office
A Snapshot of Your Day: Step into a role where you champion digital transformation and data excellence across the Middle East, Asia Pacific, and China. Join a collaborative team as you drive data-driven initiatives, ensure data accuracy, and lead digitalization projects that shape the future of Compression Services. You'll analyze business metrics, prepare insightful reports, and translate complex data into strategic actions. Each day, you'll work with regional and central teams, manage powerful data tools, and support the implementation of innovative solutions. If you're ready to lead digital change, solve challenges, and make a measurable impact, this is your opportunity to shine.

How You'll Make an Impact: Collaborate with the Regional Services Sales team, Strategy Managers, Repair Center Heads, Speedboat Owners, and local digitalization teams. Quickly identify issues, analyze root causes, and implement effective solutions while tracking and analyzing performance via the various data points available. Apply a strong understanding of business metrics and KPIs, translating data into strategic insights that drive business decisions and align digital strategies with business objectives. Prepare presentations for management review meetings. Coordinate with central strategy and functional teams for data review and alignment. Ensure data accuracy and consistency across all Compression service platforms. Prepare required reports and dashboards (Salesforce, Tableau, Snowflake, etc.). Manage data tools and coordinate with the central team. Support and supervise data-driven and digitalization initiatives. Lead and manage digital transformation projects to improve service efficiency within the region. Provide support to staff on data management tools and standard methodologies. Drive the implementation of OnePM across all repair shops in the region. Maintain and update Salesforce/MIS systems with updated information and interactions. Develop and maintain Salesforce/Tableau dashboards, reports, and presentations to communicate findings to customers. Support Strategy Managers with collecting, compiling, and analyzing data from various sources to identify trends, opportunities, and risks. Support the MEP exercise from the CP Central team. Responsible for data cleaning and quality and integrity checks in Salesforce; ensure data accuracy and integrity in all tools. Track and analyze key performance indicators (KPIs) for operational excellence and financial metrics across the region, and measure the effectiveness of implemented strategies.

What You Bring: You have 10 years of proven post-MBA experience in strategy development, digitalization, or data management within compression services. You hold a bachelor's degree in Business, Economics, Statistics, or a related field (preferred). You are highly proficient in data analytics tools such as Salesforce, Tableau, Excel, and Power BI. You excel at collecting, organizing, and analyzing large data sets with precision and attention to detail. You bring strong project management, communication, and problem-solving skills, working both independently and in cross-functional teams. You are a quick learner, eager to develop new skills, and experienced in creating impactful presentations for senior leadership.

About The Team: You'll join a dynamic, supportive team that partners closely with regional strategy, sales, and digitalization leaders. Together, you drive digital transformation and data management initiatives that support business growth and operational excellence. The team values open communication, analytical thinking, and collaborative problem-solving. You'll have the opportunity to network with colleagues across the Compression business and have visibility within the organization.

Who is Siemens Energy? At Siemens Energy, we are more than just an energy technology company. With ~100,000 dedicated employees in more than 90 countries, we develop the energy systems of the future, ensuring that the growing energy demand of the global community is met reliably and sustainably. The technologies created in our research departments and factories drive the energy transition and provide the base for one sixth of the world's electricity generation. Our distributed team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation. Find out how you can make a difference at Siemens Energy: https://siemens-energy.com/employeevideo

Our Commitment to Diversity: Lucky for us, we are not all the same. Through diversity, we generate power. We run on inclusion, and our combined creative energy is motivated by over 130 nationalities. Siemens Energy celebrates character, no matter what ethnic background, gender, age, religion, identity, or disability. We energize society, all of society, and we do not discriminate based on our differences.

Rewards/Benefits: Opportunities to work with a distributed team. Opportunities to work on and lead a variety of innovative projects.
Posted 3 weeks ago
Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.
Major tech hubs such as Bengaluru, Hyderabad, Pune, and Chennai, the cities that appear most often in the listings above, are known for their thriving tech industries and have a high demand for Snowflake professionals.
The average salary range for Snowflake professionals in India varies based on experience levels:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
A typical career path in Snowflake may include roles such as:
- Junior Snowflake Developer
- Snowflake Developer
- Senior Snowflake Developer
- Snowflake Architect
- Snowflake Consultant
- Snowflake Administrator
In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:
- SQL
- Data warehousing concepts
- ETL tools
- Cloud platforms (AWS, Azure, GCP)
- Database management
As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!