Jobs
Interviews

4 ETL/ELT Development Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 5.0 years

0 Lacs

Pune, Maharashtra

On-site

HMH is a learning technology company committed to delivering connected solutions that engage learners, empower educators, and improve student outcomes. As a leading provider of K-12 core curriculum, supplemental and intervention solutions, and professional learning services, HMH partners with educators and school districts to uncover solutions that unlock students' potential and extend teachers' capabilities. HMH serves more than 50 million students and 4 million educators in 150 countries. HMH Technology India Pvt. Ltd. is our technology and innovation arm in India, focused on developing novel products and solutions using cutting-edge technology to better serve our clients globally. HMH aims to help employees grow as people, not just as professionals. For more information, visit www.hmhco.com.

The Data Integration Engineer will play a key role in designing, building, and maintaining data integrations between core business systems such as Salesforce and SAP and our enterprise data warehouse on Snowflake. This position is ideal for an early-career professional (1 to 4 years of experience) eager to contribute to transformative data integration initiatives and learn in a collaborative, fast-paced environment.

Duties & Responsibilities:
- Collaborate with cross-functional teams to understand business requirements and translate them into data integration solutions.
- Develop and maintain ETL/ELT pipelines using modern tools like Informatica IDMC to connect source systems to Snowflake.
- Ensure data accuracy, consistency, and security in all integration workflows.
- Monitor, troubleshoot, and optimize data integration processes to meet performance and scalability goals.
- Support ongoing integration projects, including Salesforce and SAP data pipelines, while adhering to best practices in data governance.
- Document integration designs, workflows, and operational processes for effective knowledge sharing.
- Assist in implementing and improving data quality controls at the start of processes to ensure reliable outcomes.
- Stay informed about the latest developments in integration technologies and contribute to team learning and improvement.

Required Skills and Experience:
- 5+ years of hands-on experience in data integration, ETL/ELT development, or data engineering.
- Proficiency in SQL and experience working with relational databases such as Snowflake, PostgreSQL, or SQL Server.
- Familiarity with data integration tools such as Fivetran, Informatica Intelligent Data Management Cloud (IDMC), or similar platforms.
- Basic understanding of cloud platforms like AWS, Azure, or GCP.
- Experience working with structured and unstructured data in varying formats (e.g., JSON, XML, CSV).
- Strong problem-solving skills and the ability to troubleshoot data integration issues effectively.
- Excellent verbal and written communication skills, with the ability to document technical solutions clearly.

Preferred Skills and Experience:
- Exposure to integrating business systems such as Salesforce or SAP into data platforms.
- Knowledge of data warehousing concepts and hands-on experience with Snowflake.
- Familiarity with APIs, event-driven pipelines, and automation workflows.
- Understanding of data governance principles and data quality best practices.

Education:
- Bachelor's degree in Computer Science, Data Engineering, or a related field, or equivalent practical experience.

What We Offer:
- A collaborative and mission-driven work environment at the forefront of EdTech innovation.
- Opportunities for growth, learning, and professional development.
- Competitive salary and benefits package, including support for certifications like Snowflake SnowPro Core and Informatica Cloud certifications.
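The "data quality controls at the start of processes" item above usually means validating and quarantining records before they reach the warehouse load step. A minimal sketch of that pattern, assuming a hypothetical Salesforce-account schema (the field names and types are invented for illustration, not HMH's actual pipeline):

```python
from typing import Any

# Hypothetical required schema for incoming account records (illustrative only).
REQUIRED_FIELDS = {"account_id": str, "name": str, "annual_revenue": float}

def validate_record(record: dict[str, Any]) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record or record[field] is None:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"bad type for {field}: expected {expected_type.__name__}")
    return errors

def partition_batch(records):
    """Split a batch into loadable rows and quarantined (row, errors) pairs
    before the load step, so bad data never reaches the warehouse."""
    clean, quarantined = [], []
    for record in records:
        errs = validate_record(record)
        if errs:
            quarantined.append((record, errs))
        else:
            clean.append(record)
    return clean, quarantined
```

In a real pipeline the quarantined rows would be written to an error table or dead-letter queue for review rather than silently dropped.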

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

We are seeking an experienced and detail-oriented Snowflake Developer to join our Data Engineering team. As a Snowflake Developer, you will be responsible for developing, optimizing, and maintaining data pipelines and integration processes using the Snowflake cloud data warehouse, ensuring efficient data flow across our systems.

RWS Technology Services India provides end-to-end business technology solutions, offering a wide portfolio of services around digital technologies and technology operations to help organizations enhance efficiency and reduce costs. Our team specializes in state-of-the-art technology solutions across the product lifecycle management process, from consulting and design to development and optimization.

Key Responsibilities:
- Design, develop, and implement scalable data pipelines and ETL/ELT solutions using Snowflake.
- Build and manage Snowflake objects such as tables, views, stages, file formats, streams, tasks, and procedures.
- Develop SQL queries, stored procedures, and UDFs in Snowflake to meet business and reporting needs.
- Integrate Snowflake with third-party tools like dbt, Fivetran, Talend, or Informatica.
- Collaborate with data analysts, data scientists, and business stakeholders to deliver accurate data solutions.
- Optimize Snowflake performance by tuning SQL queries, managing resource usage, and applying best practices.
- Ensure data quality, security, and governance across the Snowflake environment.
- Monitor and troubleshoot data pipeline issues for timely resolution.
- Document architecture, designs, and processes.

Skills & Experience:
- 3+ years of hands-on experience with the Snowflake Cloud Data Platform.
- Strong expertise in SQL, data warehousing concepts, and ETL/ELT development.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Familiarity with data modeling techniques such as dimensional modeling and star/snowflake schemas.
- Proficiency in working with structured and semi-structured data (JSON, Parquet, Avro).
- Experience with version control (Git) and CI/CD pipelines is a plus.
- Knowledge of data security, access control, and data governance principles.
- Experience with dbt (Data Build Tool).
- Working knowledge of Python, Scala, or another scripting language.
- Snowflake certification (e.g., SnowPro Core or Advanced).
- Experience with BI tools like Tableau, Power BI, or Looker.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Benefits:
- Competitive salary
- Health insurance and wellness programs
- Flexible working hours / remote work options
- Opportunities for learning, development, and Snowflake certifications

If you are passionate about working with smart individuals to grow the value of ideas, data, and content while ensuring organizations are understood, you will enjoy life at RWS. Our values of partnership, innovation, progress, and delivery unite us to unlock global understanding and celebrate diversity. We encourage individual growth and excellence in careers while delivering exceptional outcomes and building trust with colleagues and clients.
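The "semi-structured data" skill above often comes down to flattening nested JSON into relational column names, which is what Snowflake's LATERAL FLATTEN does server-side over a VARIANT column. A pure-Python sketch of the same idea, using a made-up document (illustrative only, not Snowflake code):

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested JSON into dot-separated column paths,
    similar in spirit to Snowflake's LATERAL FLATTEN over a VARIANT column."""
    rows = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            rows.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for index, value in enumerate(obj):
            rows.update(flatten(value, f"{prefix}{index}."))
    else:
        # Leaf value: strip the trailing dot from the accumulated path.
        rows[prefix[:-1]] = obj
    return rows

doc = json.loads('{"order": {"id": 7, "items": [{"sku": "X"}]}}')
flat = flatten(doc)
# flat == {"order.id": 7, "order.items.0.sku": "X"}
```

In Snowflake itself the equivalent would be a `SELECT ... FROM table, LATERAL FLATTEN(input => variant_col)` query rather than application-side code.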

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Engineer (MS Fabric) at our Chennai-Excelencia office, you will leverage your 4+ years of experience to design, build, and optimize data pipelines using Microsoft Fabric, Azure Data Factory, and Synapse Analytics. Your primary responsibilities will include developing and maintaining Lakehouses, Notebooks, and data flows within the Microsoft Fabric ecosystem, ensuring efficient data integration, quality, and governance across OneLake and other Fabric components, and implementing real-time analytics pipelines for high-throughput data processing.

To excel in this role, you must be proficient in Microsoft Fabric, Azure Data Factory (ADF), Azure Synapse Analytics, Delta Lake, OneLake, Lakehouses, Python, PySpark, Spark SQL, T-SQL, and ETL/ELT development. Your work will involve collaborating with cross-functional teams to define and deliver end-to-end data engineering solutions, participating in Agile ceremonies, and using tools like JIRA for project tracking and delivery. You will also perform complex data transformations across various data formats and handle large-scale data warehousing and analytics workloads.

Preferred skills for this position include a strong understanding of distributed computing and cloud-native data architecture, experience with DataOps practices and data quality frameworks, familiarity with CI/CD for data pipelines, and proficiency in monitoring tools and job scheduling frameworks to ensure the reliability and performance of data systems. Strong problem-solving and analytical thinking, excellent communication and collaboration skills, and a self-motivated, proactive approach with a continuous-learning mindset are the essential soft skills for success in this role.
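The "real-time analytics pipelines for high-throughput data processing" mentioned above typically rest on windowed aggregation over event streams. A minimal pure-Python stand-in (not Fabric, Spark, or any vendor API) sketching a tumbling-window count over hypothetical timestamped events:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed-size tumbling windows and
    count occurrences per (window_start, key) -- the basic aggregation
    pattern behind high-throughput real-time analytics pipelines."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical events: (epoch seconds, event type).
events = [(5, "click"), (30, "click"), (65, "view"), (70, "click")]
result = tumbling_window_counts(events)
# result == {(0, "click"): 2, (60, "view"): 1, (60, "click"): 1}
```

A streaming engine adds watermarking, late-data handling, and incremental state, but the windowing arithmetic is the same.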

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Data Engineer specializing in Snowflake, you will leverage your 10+ years of experience to design, build, optimize, and maintain robust and scalable data solutions on the Snowflake platform. Your expertise in cloud data warehouse principles will be used to collaborate with stakeholders and translate business requirements into efficient data pipelines and models. Passionate about unlocking data-driven insights, you will work with a team to drive business value through Snowflake's capabilities.

Key Skills:
- Proficiency in Snowflake architecture and features such as virtual warehouses, storage, data sharing, and data governance.
- Advanced SQL knowledge for complex queries, stored procedures, and performance optimization in Snowflake.
- Experience in ETL/ELT development using Snowflake tools, third-party ETL tools, and scripting languages.
- Skill in data modelling methodologies and performance tuning specific to Snowflake.
- Deep understanding of Snowflake security features and data governance frameworks.
- Proficiency in scripting languages like Python for automation and integration.
- Familiarity with cloud platforms like Azure and with data analysis tools for visualization.
- Experience with version control using Git and working in Agile methodologies.

Responsibilities:
- Collaborate with the Data and ETL team to review, improve, and maintain data pipelines and models on Snowflake.
- Optimize SQL queries for data extraction, transformation, and loading within Snowflake.
- Ensure data quality, integrity, and security in the Snowflake environment.
- Participate in code reviews and contribute to development standards.

Education:
- Bachelor's degree in Computer Science, Data Science, Information Technology, or equivalent.
- Relevant Snowflake certifications (e.g., Snowflake Certified Pro / Architecture / Advanced) are a plus.

If you are a proactive Senior Data Engineer with a strong background in Snowflake, eager to drive business value through data-driven insights, this full-time opportunity in Pune awaits you. Join us at Arthur Grand Technologies Inc and be part of our dynamic team.
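The data-modelling skills listed above centre on dimensional (star/snowflake) schemas, where fact rows reference dimension rows through surrogate keys rather than natural business keys. A toy sketch of the surrogate-key lookup that happens during a fact load (all names are invented for illustration):

```python
class DimensionTable:
    """Toy dimension table: assigns a stable integer surrogate key to each
    natural business key, a core step when loading fact tables in a
    star/snowflake schema."""

    def __init__(self):
        self._keys = {}

    def surrogate_key(self, natural_key):
        # Reuse the existing key, or mint the next one for a new member.
        if natural_key not in self._keys:
            self._keys[natural_key] = len(self._keys) + 1
        return self._keys[natural_key]

# Hypothetical fact load: replace the customer natural key with a surrogate key.
dim_customer = DimensionTable()
source_rows = [("cust-42", 100.0), ("cust-7", 55.0), ("cust-42", 20.0)]
fact_rows = [(dim_customer.surrogate_key(c), amount) for c, amount in source_rows]
# fact_rows == [(1, 100.0), (2, 55.0), (1, 20.0)]
```

In Snowflake this lookup is usually a `MERGE` into the dimension followed by a join during the fact insert; slowly changing dimensions add versioning on top of the same idea.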

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies