Gurugram, Haryana, India
Not disclosed
Remote
Full Time
Job Title: Junior Data Engineer (Immediate Joiners Only)
Location: Remote
Working Hours: 5 PM – 2 AM IST shift
Joining Timeline: Must be ready to join within 15 days (maximum notice period)
Communication: Must have good communication skills
Experience Required: 1–2 years maximum (candidates with more experience will not be considered)

Pacific Data Integrators (PDI) is currently hiring a Junior ETL / Data Engineer (Gurugram) for its enterprise client projects globally. The opportunity invites entry-level candidates with an excellent academic record who want to build a great career in the Data Engineering field.

Do you love turning raw data into actionable insights? Are you passionate about building data pipelines that fuel innovation? At PDI, we empower businesses to unlock the full potential of their data. As a Junior Data Engineer, you'll play a crucial role in this mission by transforming raw data into valuable information that drives our two core pillars: Insight & Prediction, and Diagnostic & Description.

What you'll do:
Build and maintain data pipelines: You'll design, develop, and optimize data pipelines that extract, transform, and load data from various client systems into our data lake and data marts.
Ensure data quality and reliability: You'll implement processes and tools to guarantee the accuracy, completeness, and consistency of our data.
Collaborate with a dynamic team: You'll work closely with data scientists, architects, and client tech teams to deliver impactful data solutions.
Contribute to cutting-edge projects: You'll be involved in building and maintaining data infrastructure that supports predictive modeling, descriptive analytics, and other data-driven initiatives.
Stay ahead of the curve: You'll continuously learn and explore new technologies in the ever-evolving world of data engineering.

What you'll bring:
Foundation in data engineering: You have a good understanding of data warehousing principles, ETL processes, and data modeling techniques.
Programming proficiency: You're comfortable coding in Java and Python, and you have hands-on experience with SQL database design.
Cloud experience: You've worked with at least one major cloud platform (AWS, GCP, or Azure).
Collaborative spirit: You thrive in a team environment and enjoy working with diverse stakeholders.
Problem-solving mindset: You're a creative thinker who can identify and solve complex data challenges.
Eagerness to learn: You're passionate about data and eager to expand your knowledge and skills.

Bonus points:
Experience with ETL tools / technologies (e.g., Informatica, Talend, Snowflake)
Familiarity with data visualization tools

Why PDI?
Make a real impact: Your work will directly contribute to the success of our clients and help them achieve their business goals.
Work with cutting-edge technology: You'll have the opportunity to use the latest tools and technologies in the data engineering field.
Grow your career: We offer a supportive and collaborative environment where you can learn and develop your skills.
Gurugram, Haryana, India
Not disclosed
Remote
Full Time
Job Title: Junior Data Engineer (Immediate Joiners Only)
Location: Remote
Working Hours: 5 PM – 2 AM IST (US shift)
Joining Timeline: Must be able to join within 15 days
Experience Required: 0–2 years (candidates with more than 2 years of experience will not be considered)
Communication: Must have good English communication skills
Preferred Education: Fresh graduates from IIT/NIT or other top-tier engineering colleges

About the Company:
At Pacific Data Integrators, we help clients find and implement the best Data Management, Analytics, and Cloud solutions to meet their business challenges. We are a service consulting provider for Informatica, AWS, Azure, Snowflake, and GCP for Big Data, Cloud, Data Integration, Data Quality, and Data Security solutions. We are an Informatica Platinum Partner and implement data management solutions that grow our clients' business.

About the Role:
Pacific Data Integrators (PDI) is hiring Junior Data Engineers for global enterprise projects. This is an excellent opportunity for entry-level candidates who are passionate about working with data and want to build a strong career in Data Engineering.

What You'll Do:
Help build and manage data pipelines (ETL processes); a short illustrative sketch of this kind of work appears after this posting
Write simple, clean code in Python, Java, and SQL
Support data integration and transformation from different systems
Ensure data is accurate, reliable, and complete
Work closely with senior engineers and data scientists
Learn modern tools and technologies on real-world projects

What We're Looking For:
0–2 years of experience in Data Engineering or related areas
Good understanding of SQL, Python, and Java
Basic knowledge of ETL and data warehouse concepts
Exposure to any cloud platform (AWS, GCP, or Azure)
Strong problem-solving attitude and eagerness to learn
Good communication and teamwork skills

Bonus (Not Mandatory):
Exposure to tools like Informatica, Talend, or Snowflake
Familiarity with data visualization tools

Why Join PDI?
Work on real projects with global enterprise clients
Learn and grow in a collaborative and supportive environment
Get trained on cutting-edge tools and platforms
Make an impact early in your career
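For candidates wondering what the day-to-day pipeline work described above might look like, here is a minimal sketch of an extract-transform-load step in plain Python (standard library only). It is purely illustrative and does not represent PDI's actual codebase or tooling; the file name daily_sales.csv, the sales table, and the column names are hypothetical.

```python
# Illustrative only: a toy extract-transform-load step using the Python
# standard library. File, table, and column names are hypothetical.
import csv
import sqlite3


def extract(path):
    """Read raw rows from a CSV export (a stand-in for a client source system)."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))


def transform(rows):
    """Basic cleansing: drop incomplete rows and normalise fields."""
    cleaned = []
    for row in rows:
        if not row.get("customer_id") or not row.get("amount"):
            continue  # completeness check
        cleaned.append({
            "customer_id": row["customer_id"].strip(),
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned


def load(rows, db_path="warehouse.db"):
    """Load cleaned rows into a local SQLite table (a stand-in for a data mart)."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS sales (customer_id TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (:customer_id, :amount)", rows)
    con.commit()
    con.close()


if __name__ == "__main__":
    load(transform(extract("daily_sales.csv")))  # "daily_sales.csv" is a placeholder path
```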
Gurugram, Haryana, India
Not disclosed
Remote
Full Time
Job Title: Senior Data Engineer (IDMC-CDI)
Primary Skills: SQL, IDMC (CDI), ETL concepts
Base Location: Gurugram (Delhi/NCR)
Mode of Work: Hybrid (office); a work-from-home option is available for ideal candidates
Experience: 5–8 years

About Us:
We are PDI (Pacific Data Integrators), a leading provider of Data Management, Analytics, and Cloud solutions. As an Informatica Platinum Partner and a reseller of various market-leading products, we help enterprises implement scalable and secure data solutions using tools like Informatica, AWS, Azure, Snowflake, and GCP. We continue to grow as a leader in data integration, data quality, and data security across industries, supporting customers in their digital transformation journeys.

Job Overview:
We are seeking a highly skilled and experienced Informatica IDMC ETL Developer (lead level) with a deep understanding of ETL design, data pipeline development, and Informatica Cloud Data Integration (CDI). The ideal candidate will have a strong background in developing, documenting, testing, and maintaining ETL applications and will take ownership of delivering scalable solutions that meet business and technical requirements.

Key Responsibilities:
Lead the design, development, and deployment of complex ETL solutions using Informatica IDMC.
Collaborate with business stakeholders, analysts, and data architects to define technical requirements and translate them into efficient ETL workflows.
Develop, document, unit test, and maintain high-quality data pipelines that support both batch and incremental (CDC-based) data processing.
Configure and manage mappings, mapping tasks, and taskflows, as well as Secure Agents, runtime environments, and connection parameters.
Design and implement robust data transformation logic, including cleansing, enrichment, joins, lookups, and aggregations; handling hierarchical (JSON, XML) and semi-structured data (REST APIs, Kafka); and use of parameterized mappings and reusable components (see the illustrative sketch at the end of this posting).
Build and manage taskflows with conditional logic, error handling, branching, and scheduling mechanisms.
Optimize mapping performance through pushdown optimization, partitioning, and data minimization techniques.
Provide leadership and guidance to junior developers, conduct code reviews, and ensure adherence to best practices.
Troubleshoot and resolve complex data integration issues promptly.
Stay updated on new features and industry trends in data integration and cloud technologies.

Required Skills:
5+ years of IT experience in data integration and management roles.
4+ years of hands-on experience with Informatica products.
2+ years specifically working with Informatica IDMC/CDI.
Experience in leading large projects and mentoring junior developers.
Proficiency with Secure Agents, runtime configuration, parameterization, and environment management.
Experience integrating external applications with IDMC via batch, APIs, and message queues.
Strong experience with relational databases such as Oracle, SQL Server, or MySQL.
Solid understanding of data warehousing concepts (e.g., Kimball, Inmon).
Excellent problem-solving skills and ability to handle complex data integration scenarios.
Strong verbal and written communication skills.
Experience working in Agile/Scrum environments.

Preferred/Bonus Skills:
Experience with cloud platforms: AWS, Azure, or GCP.
Exposure to data quality tools and governance frameworks.
Familiarity with scripting languages like Python or Shell for automation.

Why Join?
Work with cutting-edge data integration tools as part of a fast-growing, elite consulting firm.
Collaborate with top-tier clients and deliver impactful data management solutions.
Access ongoing training and opportunities to work with the latest in cloud, AI, and big data technologies.
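As a rough illustration of the transformation logic mentioned in the responsibilities (flattening hierarchical JSON and lookup-style enrichment): in practice this is built in the IDMC/CDI mapping designer rather than hand-coded, so the plain-Python sketch below only conveys the concept. The sample payload, field names, and lookup table are hypothetical.

```python
# Illustrative only: shows the kind of logic a CDI mapping would implement
# (flatten a nested JSON order into rows, enrich via a lookup, derive a field).
# All names and the sample payload are hypothetical.
import json

COUNTRY_LOOKUP = {"IN": "India", "US": "United States"}  # stand-in lookup table


def flatten_order(raw: str) -> list[dict]:
    """Turn one nested order document into flat rows, one per line item."""
    order = json.loads(raw)
    rows = []
    for item in order.get("items", []):
        rows.append({
            "order_id": order["order_id"],
            "country": COUNTRY_LOOKUP.get(order["country_code"], "Unknown"),  # enrichment
            "sku": item["sku"],
            "amount": round(item["qty"] * item["unit_price"], 2),  # derived field
        })
    return rows


if __name__ == "__main__":
    sample = '{"order_id": 1, "country_code": "IN", "items": [{"sku": "A1", "qty": 2, "unit_price": 9.5}]}'
    print(flatten_order(sample))
```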