
1515 Talend Jobs - Page 8

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: ETL Developer
Location: Hyderabad (5 days WFO)
Experience Required: 4+ years as an ETL Developer

We are looking for a talented Talend Developer with hands-on experience in Talend Management Console on Cloud and Snowflake to join our growing team. The ideal candidate will play a key role in building and optimizing ETL/ELT data pipelines, integrating complex data systems, and ensuring high performance across cloud environments. While experience with Informatica is a plus, it is not mandatory for this role.

As a Talend Developer, you will be responsible for designing, developing, and maintaining data integration solutions to meet the organization's growing data needs. You will collaborate with business stakeholders, data architects, and other data professionals to ensure the seamless and secure movement of data across platforms, ensuring scalability and performance.

Key Responsibilities:
- Develop and maintain ETL/ELT data pipelines using Talend Management Console on Cloud to integrate data from various on-premises and cloud-based sources.
- Design, implement, and optimize data flows for data ingestion, processing, and transformation in Snowflake to support analytical and reporting needs.
- Utilize Talend Management Console on Cloud to manage, deploy, and monitor data integration jobs, ensuring robust pipeline management and process automation.
- Collaborate with data architects to ensure that data integration solutions align with business requirements and follow best practices.
- Ensure data quality, performance, and scalability of Talend-based data solutions.
- Troubleshoot, debug, and optimize existing ETL processes to ensure smooth and efficient data integration.
- Document data integration processes, including design specifications, mappings, workflows, and performance optimizations.
- Collaborate with the Snowflake team to implement best practices for data warehousing and data transformation.
- Implement error-handling and data validation processes to ensure high levels of accuracy and data integrity.
- Provide ongoing support for Talend jobs, including post-deployment monitoring, troubleshooting, and optimization.
- Participate in code reviews and collaborate in an agile development environment.

Required Qualifications:
- 2+ years of experience in Talend development, with a focus on using the Talend Management Console on Cloud for managing and deploying jobs.
- Strong hands-on experience with the Snowflake data warehouse, including data integration and transformation.
- Expertise in developing ETL/ELT workflows for data ingestion, processing, and transformation.
- Experience with SQL and working with relational databases to extract and manipulate data.
- Experience working in cloud environments (e.g., AWS, Azure, or GCP) with integration of cloud-based data platforms.
- Strong knowledge of data integration, data quality, and performance optimization in Talend.
- Ability to troubleshoot and resolve issues in data integration jobs and processes.
- Solid understanding of data modeling concepts and best practices for building scalable data pipelines.

Preferred Qualifications:
- Experience with Informatica (a plus, not mandatory).
- Experience with scripting languages such as Python or shell scripting for automation.
- Familiarity with CI/CD pipelines and working in DevOps environments for continuous integration of Talend jobs.
- Knowledge of data governance and data security practices in cloud environments.
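Talend implements validation and error handling as graphical components rather than hand-written code, so purely as an illustration of the "error-handling and data validation" responsibility above, a minimal sketch in Python (the record fields `id`, `amount`, `country` and the rules are invented for the example):

```python
# Minimal sketch of row-level validation before loading to a warehouse.
# Field names and rules are hypothetical, not from any real schema.

def validate_row(row):
    """Return a list of validation errors for one record (empty = valid)."""
    errors = []
    if row.get("id") is None:
        errors.append("missing id")
    if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
        errors.append("amount must be a non-negative number")
    if row.get("country") not in {"IN", "US", "GB"}:
        errors.append("unknown country code")
    return errors

def split_valid_invalid(rows):
    """Route clean rows to the load step and rejects to an error table."""
    valid, rejects = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejects.append((row, errs))
        else:
            valid.append(row)
    return valid, rejects

rows = [
    {"id": 1, "amount": 250.0, "country": "IN"},
    {"id": None, "amount": -5, "country": "XX"},
]
clean, bad = split_valid_invalid(rows)
print(len(clean), len(bad))  # 1 clean row, 1 rejected (with 3 errors)
```

A real job would write the reject tuples to an error table with a reason column, which is what makes post-deployment troubleshooting tractable.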

Posted 1 week ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Job Information:
Date Opened: 07/23/2025
Job Type: Permanent
RSD NO: 10371
Industry: IT Services
Min Experience: 15+
Max Experience: 15+
City: Chennai
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 600018

Job Description
Job Summary: We are seeking a Data Architect to design and implement scalable, secure, and efficient data solutions that support Convey Health Solutions' business objectives. This role will focus on data modeling, cloud data platforms, ETL processes, and analytics solutions, ensuring compliance with healthcare regulations (HIPAA, CMS guidelines). The ideal candidate will collaborate with data engineers, BI analysts, and business stakeholders to drive data-driven decision-making.

Key Responsibilities:
- Enterprise Data Architecture: Design and maintain the overall data architecture to support Convey Health Solutions' data-driven initiatives.
- Cloud & Data Warehousing: Architect cloud-based data solutions (AWS, Azure, Snowflake, BigQuery) to optimize scalability, security, and performance.
- Data Modeling: Develop logical and physical data models for structured and unstructured data, supporting analytics, reporting, and operational processes.
- ETL & Data Integration: Define strategies for data ingestion, transformation, and integration, leveraging ETL tools like Informatica, Talend, dbt, or Apache Airflow.
- Data Governance & Compliance: Ensure data quality, security, and compliance with HIPAA, CMS, and SOC 2 standards.
- Performance Optimization: Optimize database performance, indexing strategies, and query performance for real-time analytics.
- Collaboration: Partner with data engineers, software developers, and business teams to align data architecture with business objectives.
- Technology Innovation: Stay up to date with emerging data technologies, AI/ML applications, and industry trends in healthcare data analytics.

Required Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- Experience: 7+ years of experience in data architecture, data engineering, or related roles.
- Technical Skills:
  - Strong expertise in SQL, NoSQL, and data modeling techniques
  - Hands-on experience with cloud data platforms (AWS Redshift, Snowflake, Google BigQuery, Azure Synapse)
  - Experience with ETL frameworks (Informatica, Talend, dbt, Apache Airflow, etc.)
  - Knowledge of big data technologies (Spark, Hadoop, Databricks)
  - Strong understanding of data security and compliance (HIPAA, CMS, SOC 2, GDPR)
- Soft Skills: Strong analytical, problem-solving, and communication skills. Ability to work in a collaborative, agile environment.

Preferred Qualifications:
- Experience in healthcare data management, claims processing, risk adjustment, or pharmacy benefit management (PBM).
- Familiarity with AI/ML applications in healthcare analytics.
- Certifications in cloud data platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.).

At Indium, diversity, equity, and inclusion (DEI) are the cornerstones of our values. We champion DEI through a dedicated council, expert sessions, and tailored training programs, ensuring an inclusive workplace for all. Our initiatives, including the WE@IN women empowerment program and our DEI calendar, foster a culture of respect and belonging. Recognized with the Human Capital Award, we are committed to creating an environment where every individual thrives. Join us in building a workplace that values diversity and drives innovation.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Location: Bangalore - Karnataka, India - EOIZ Industrial Area
Job Family: Artificial Intelligence & Machine Learning
Worker Type Reference: Regular - Permanent
Pay Rate Type: Salary
Career Level: T4(A)
Job ID: R-46721-2025

Description & Requirements

Introduction: A Career at HARMAN Technology Services (HTS)
We're a global, multi-disciplinary team that's putting the innovative power of technology to work and transforming tomorrow. At HARMAN HTS, you solve challenges by creating innovative solutions.
- Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity's needs
- Work at the convergence of cross-channel UX, cloud, insightful data, IoT, and mobility
- Empower companies to create new digital business models, enter new markets, and improve customer experiences

Role: Data Architect with Microsoft Azure + Fabric + Purview
Experience Required: 10+ years

Key Responsibilities of the role include:

Data Engineering
- Develop and implement data engineering projects, including data lakehouse or big data platforms
- Knowledge of Azure Purview is a must
- Knowledge of Azure Data Fabric
- Ability to define a reference data architecture
- Cloud-native data platform experience in the Microsoft data stack, including Azure Data Factory and Databricks on Azure
- Knowledge of the latest data trends, including data fabric and data mesh
- Robust knowledge of ETL, data transformation, and data standardization approaches
- Key contributor to the growth of the COE, influencing client revenues through data and analytics solutions
- Lead the selection, deployment, and management of data tools, platforms, and infrastructure
- Ability to technically guide a team of data engineers
- Oversee the design, development, and deployment of data solutions
- Define, differentiate, and strategize new data services/offerings and create reference architecture assets
- Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc.
- Guide and inspire the organization about the business potential and opportunities around data
- Network with domain experts
- Collaborate with client teams to understand their business challenges and needs
- Develop and propose data solutions tailored to client-specific requirements
- Influence client revenues through innovative solutions and thought leadership
- Lead client engagements from project initiation to deployment
- Build and maintain strong relationships with key clients and stakeholders

Build Re-usable Methodologies, Pipelines & Models
- Create data pipelines for more efficient and repeatable data science projects
- Design and implement data architecture solutions that support business requirements and meet organizational needs
- Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams
- Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems
- Develop and implement data governance policies and procedures to ensure that data is managed securely and efficiently
- Develop and maintain a deep understanding of data platforms, technologies, and tools, and evaluate new technologies and solutions to improve data management processes
- Ensure compliance with regulatory and industry standards for data management and security
- Develop and maintain data models, data warehouses, data lakes, and data marts to support data analysis and reporting
- Ensure data quality, accuracy, and consistency across all data sources
- Knowledge of ETL and data integration tools such as Informatica, Qlik, Talend, and Apache NiFi
- Experience with data modeling and design tools such as ERwin, PowerDesigner, or ER/Studio
- Knowledge of data governance, data quality, and data security best practices
- Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform
- Familiarity with programming languages such as Python, Java, or Scala
- Experience with data visualization tools such as Tableau, Power BI, or QlikView
- Understanding of analytics and machine learning concepts and tools
- Knowledge of project management methodologies and tools to manage and deliver complex data projects
- Skilled in relational database technologies such as MySQL, PostgreSQL, and Oracle, as well as NoSQL databases such as MongoDB and Cassandra
- Strong expertise in cloud-based databases such as Azure Data Lake, Synapse, Azure Data Factory, AWS Glue, AWS Redshift, and Azure SQL
- Knowledge of big data technologies such as Hadoop, Spark, Snowflake, Databricks, and Kafka to process and analyze large volumes of data
- Proficient in data integration techniques to combine data from various sources into a centralized location
- Strong data modeling, data warehousing, and data integration skills

People & Interpersonal Skills
- Build and manage a high-performing team of data engineers and other specialists
- Foster a culture of innovation and collaboration within the data team and across the organization
- Demonstrate the ability to work in diverse, cross-functional teams in a dynamic business environment
- Candidates should be confident, energetic self-starters with strong communication skills
- Candidates should exhibit superior presentation skills and the ability to present compelling solutions which guide and inspire
- Provide technical guidance and mentorship to the data team
- Collaborate with other stakeholders across the company to align the vision and goals
- Communicate and present the data capabilities and achievements to clients and partners
- Stay updated on the latest trends and developments in the data domain

What is required for the role?
- 10+ years of experience in the information technology industry, with a strong focus on data engineering and architecture, preferably as an Azure data engineering lead
- 8+ years of data engineering or data architecture experience in successfully launching, planning, and executing advanced data projects
- Data governance experience is mandatory
- MS Fabric certification
- Experience in working on RFPs/proposals, presales activities, business development, and overseeing delivery of data projects is highly desired
- Educational qualification: a master's or bachelor's degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics
- Demonstrated ability to manage data projects and diverse teams
- Experience in creating data and analytics solutions
- Experience in building data solutions in one or more domains: Industrial, Healthcare, Retail, Communication
- Problem-solving, communication, and collaboration skills
- Good knowledge of data visualization and reporting tools
- Ability to normalize and standardize data as per key KPIs and metrics

Benefits:
- Opportunities for professional growth and development.
- Collaborative and supportive work environment.

What We Offer
- Access to employee discounts on world-class HARMAN/Samsung products (JBL, Harman Kardon, AKG, etc.)
- Professional development opportunities through HARMAN University's business and leadership academies.
- An inclusive and diverse work environment that fosters and encourages professional and personal development.

You Belong Here
HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you – all within a support-minded culture that celebrates what makes each of us unique.
We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.

About HARMAN: Where Innovation Unleashes Next-Level Technology
Ever since the 1920s, we've been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today's most sought-after performers, while our digital transformation solutions serve humanity by addressing the world's ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners and each other. If you're ready to innovate and do work that makes a lasting impact, join our talent community today!

Important Notice: Recruitment Scams
Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, personal financial information or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to us through: harmancareers@harman.com.

HARMAN is proud to be an Equal Opportunity employer. HARMAN strives to hire the best qualified candidates and is committed to building a workforce representative of the diverse marketplaces and communities of our global colleagues and customers.
All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. HARMAN attracts, hires, and develops employees based on merit, qualifications, and job-related performance. (www.harman.com)

Posted 1 week ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description

Job Title: Salesforce Architect – Data Cloud & Marketing Cloud
Hiring Locations: Bangalore, Pune, Trivandrum, Kochi, Hyderabad, Chennai

Experience Range
- Total IT Experience: 8+ years
- Salesforce Marketing Cloud Experience: minimum 5 years (hands-on)
- Salesforce Data Cloud (CDP) Experience: minimum 2 years
- Leadership Experience: experience in leading cross-functional teams and mentoring junior architects

Must Have Skills

Platform Expertise
- Strong hands-on experience with Salesforce Data Cloud (formerly CDP): data unification, identity resolution, calculated insights, segmentation, data streams, harmonization rules
- Deep hands-on expertise in Salesforce Marketing Cloud: Journey Builder, Email Studio, Mobile Studio, Automation Studio, Contact Builder
- Development using AMPscript, SSJS, SQL, HTML/CSS, JavaScript
- Integration experience using REST/SOAP APIs
- Data model design and audience segmentation for large-scale, multi-channel campaigns
- Design of real-time and batch-based data ingestion and activation flows
- Proven ability to translate complex business requirements into scalable Salesforce architecture
- Strong experience integrating Salesforce Marketing Cloud with Sales Cloud, Service Cloud, and third-party platforms
- Experience in delivering projects in Agile environments, including sprint planning and estimation
- Experience with ETL tools like MuleSoft, Informatica, or Talend
- Ability to create architecture diagrams, reusable frameworks, and technical documentation
- Awareness of data privacy laws (e.g., GDPR, CAN-SPAM) and compliance standards

Good To Have Skills
- Experience with: Marketing Cloud Personalization (Interaction Studio), Datorama, Pardot, Social Studio, AWS/GCP for data storage or event processing
- Familiarity with: Salesforce Administrator and Platform Developer I capabilities, Salesforce Marketing Cloud Personalization
- Experience developing POCs and custom demos for client presentations
- Experience working with enterprise architecture frameworks
- Exposure to data governance, security models, and compliance audits

Certifications

Required:
- Salesforce Marketing Cloud Consultant
- Salesforce Marketing Cloud Developer
- Salesforce Data Cloud Consultant

Nice To Have:
- Salesforce Administrator (ADM201)
- Platform Developer I
- Marketing Cloud Personalization Specialist

Key Responsibilities
- Architect and implement unified customer data strategies using Data Cloud
- Lead technical discussions and requirement-gathering with business and technical teams
- Design scalable multi-channel SFMC solutions for campaign execution
- Manage integrations with Salesforce core clouds and external systems
- Mentor developers, review code/designs, and ensure delivery
- Create documentation, standards, and best practices
- Ensure governance, compliance, and high delivery quality across engagements

Skills: Salesforce, AMPscript, JavaScript

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

We are looking for a skilled ETL Tester with hands-on experience in SQL and Python to join our Quality Engineering team. The ideal candidate will be responsible for validating data pipelines, ensuring data quality, and supporting the end-to-end ETL testing lifecycle in a fast-paced environment.

Responsibilities:
- Design, develop, and execute test cases for ETL workflows and data pipelines.
- Perform data validation and reconciliation using advanced SQL queries.
- Use Python for automation of test scripts, data comparison, and validation tasks.
- Work closely with Data Engineers and Business Analysts to understand data transformations and business logic.
- Perform root cause analysis of data discrepancies and report defects in a timely manner.
- Validate data across source systems, staging, and target data stores (e.g., data lakes, data warehouses).
- Participate in Agile ceremonies, including sprint planning and daily stand-ups.
- Maintain test documentation, including test plans, test cases, and test results.

Required qualifications to be successful in this role:
- 5+ years of experience in ETL/data warehouse testing.
- Strong proficiency in SQL (joins, aggregations, window functions, etc.).
- Experience in Python scripting for test automation and data validation.
- Hands-on experience with tools like Informatica, Talend, Apache NiFi, or similar ETL tools.
- Understanding of data models, data marts, and star/snowflake schemas.
- Familiarity with test management and bug tracking tools (e.g., JIRA, HP ALM).
- Strong analytical, debugging, and problem-solving skills.

Good to Have:
- Exposure to big data technologies (e.g., Hadoop, Hive, Spark).
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and related data services.
- Knowledge of CI/CD tools and automated data testing frameworks.
- Experience working in Agile/Scrum teams.

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because:
- You are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction.
- Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.
- You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.

Come join our team, one of the largest IT and business consulting services firms in the world.
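The source-to-target reconciliation described above often reduces to comparing row counts and column totals between staging and target. A minimal sketch using in-memory SQLite (table and column names are invented for illustration; real pipelines would point the same queries at the actual source and warehouse):

```python
import sqlite3

# Hypothetical staging (source) and warehouse (target) tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 200.0), (3, 300.0);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 200.0);  -- row 3 dropped in load
""")

def reconcile(conn, src, tgt):
    """Compare row counts and amount totals between source and target."""
    q = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {}"
    src_cnt, src_sum = conn.execute(q.format(src)).fetchone()
    tgt_cnt, tgt_sum = conn.execute(q.format(tgt)).fetchone()
    return {"count_diff": src_cnt - tgt_cnt, "sum_diff": src_sum - tgt_sum}

print(reconcile(conn, "src_orders", "tgt_orders"))
# One missing target row: count_diff is 1, sum_diff is 300.0
```

Non-zero diffs become defects to triage; in practice a tester would extend this with per-key comparisons to pinpoint which rows diverged.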

Posted 1 week ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Knowledge, Skills, And Abilities
- Ability to translate a logical data model into a relational or non-relational solution
- Expert in one or more of the following ETL tools: SSIS, Azure Data Factory, AWS Glue, Matillion, Talend, Informatica, Fivetran
- Hands-on experience in setting up end-to-end cloud-based data lakes
- Hands-on experience in database development using views, SQL scripts, and transformations
- Ability to translate complex business problems into data-driven solutions
- Working knowledge of reporting tools like Power BI, Tableau, etc.
- Ability to identify data quality issues that could affect business outcomes
- Flexibility in working across different database technologies and propensity to learn new platforms on the fly
- Strong interpersonal skills
- Team player prepared to lead or support depending on the situation

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Create data pipelines to extract, transform, and load data across systems.
- Implement ETL processes to migrate and deploy data across systems.
- Ensure data quality and integrity throughout the data lifecycle.

Professional & Technical Skills:
- Required skill: expert proficiency in Talend Big Data.
- Strong understanding of data engineering principles and best practices.
- Experience with data integration and data warehousing concepts.
- Experience with data migration and deployment.
- Proficiency in SQL and database management.
- Knowledge of data modeling and optimization techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Talend ETL.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
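Talend builds pipelines from graphical components (roughly tInput → tMap → tOutput) rather than code, but the extract-transform-load pattern the role centers on can be sketched in plain Python for illustration; the sample records and field names are invented:

```python
# Illustrative extract -> transform -> load pipeline. Talend assembles the
# same pattern from graphical components; data here is made up.

def extract():
    """Pull raw records from a source system (here, a hard-coded list)."""
    return [{"name": " Alice ", "city": "hyderabad"},
            {"name": "Bob", "city": "CHENNAI"}]

def transform(records):
    """Standardize fields, the way a mapping component would."""
    return [{"name": r["name"].strip(), "city": r["city"].title()}
            for r in records]

def load(records, target):
    """Append transformed rows to the target store; return rows loaded."""
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0])  # 2 {'name': 'Alice', 'city': 'Hyderabad'}
```

Keeping the three stages as separate functions mirrors how Talend jobs stay testable: each component can be validated against sample rows before the full job is deployed.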

Posted 1 week ago

Apply

6.0 - 10.0 years

13 - 17 Lacs

Chennai

Work from Office

Capgemini Invent
Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

Your role
- You act as a contact person for our customers and advise them on data-driven projects.
- You are responsible for architecture topics and solution scenarios in the areas of cloud data analytics platforms, data engineering, analytics, and reporting.
- Experience in cloud and big data architecture.
- Responsibility for designing viable architectures based on Microsoft Azure, AWS, Snowflake, Google (or similar) and implementing analytics.
- Experience in DevOps, Infrastructure as Code, DataOps, MLOps.
- Experience in business development (as well as your support in the proposal process).
- Data warehousing, data modelling, and data integration for enterprise data environments.
- Experience in the design of large-scale ETL solutions integrating multiple/heterogeneous systems.
- Experience in data analysis, modelling (logical and physical data models), and design specific to a data warehouse/business intelligence environment (normalized and multi-dimensional modelling).
- Experience with ETL tools, primarily Talend and/or other data integration tools (open source or proprietary); extensive experience with SQL and SQL scripting (PL/SQL and SQL query tuning and optimization) for relational databases such as PostgreSQL, Oracle, Microsoft SQL Server, and MySQL, and with NoSQL databases like MongoDB and/or document-based databases.
- Must be detail-oriented, highly motivated, and able to work independently with minimal direction.
- Excellent written, oral, and interpersonal communication skills, with the ability to communicate design solutions to both technical and non-technical audiences.
- Ideally, experience in agile methods such as SAFe, Scrum, etc.
- Ideally, experience with programming languages like Python, JavaScript, Java/Scala, etc.

Your profile
- Provides data services for enterprise information strategy solutions; works with business solutions leaders and teams to collect and translate information requirements into data to develop data-centric solutions.
- Design and develop modern enterprise data-centric solutions (e.g. DWH, Data Lake, Data Lakehouse).
- Responsible for the design of data governance solutions.

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 week ago

Apply

9.0 - 12.0 years

35 - 45 Lacs

Bengaluru

Work from Office

Role & Responsibilities

Key Responsibilities:

Leadership & Strategy
- Lead and mentor a team of MDM Specialists, ensuring accountability, performance, and continuous development.
- Own the execution of MDM strategy in India, aligned with global standards and business goals.
- Serve as the subject matter expert for master data across the organization, guiding stakeholders and ensuring alignment.

Master Data Operations & Oversight
- Supervise day-to-day MDM activities, including data creation, enrichment, and lifecycle management across multiple domains.
- Monitor and ensure adherence to SLAs, data quality KPIs, and change request turnaround times.
- Oversee the execution and enforcement of data governance policies and master data standards.

Cross-Functional Collaboration
- Act as a liaison between global business users (supply chain, finance, sales, IT) and the MDM team to resolve complex data issues.
- Collaborate with system owners and project managers on ERP (Oracle Fusion) rollouts, enhancements, and MDM initiatives.
- Partner with compliance, audit, and risk management functions to ensure regulatory readiness and reporting accuracy.

System Management & Technical Expertise
- Manage and optimize MDM platforms and tools (e.g., Oracle Fusion, Informatica, Looker, SQL, Power BI).
- Drive integration efforts and support ETL processes to ensure seamless flow of high-quality master data across systems.
- Lead testing, validation, and implementation of new MDM workflows or platform enhancements.

Data Governance & Quality Assurance
- Implement proactive data quality checks, root cause analysis, and remediation plans for recurring issues.
- Support audits, reconciliations, and control testing to ensure data integrity and compliance.
- Promote and maintain documentation of MDM business rules, SOPs, workflows, and user training materials.

Continuous Improvement
- Identify and implement opportunities to streamline MDM processes through automation, templates, and AI-assisted tools.
- Recommend process and system enhancements based on performance metrics and stakeholder feedback.
- Champion a data-first culture through training, workshops, and cross-functional knowledge sharing.

Required Qualifications:
- Bachelor's degree in Information Systems, Business Administration, Data Management, or a related discipline.
- 5+ years of experience in master data management, with at least 1-2 years in a leadership or team lead role.
- Deep understanding of ERP systems, especially Oracle Fusion, and business domain structures (finance, supply chain, customer data).
- Strong command of MDM principles, data governance best practices, and compliance frameworks.
- Advanced proficiency with SQL, Excel, Power BI, and other data analysis/reporting tools.
- Excellent problem-solving, stakeholder management, and cross-functional communication skills.

Preferred candidate profile:
- Experience with enterprise MDM platforms (e.g., Informatica MDM, SAP MDG, IBM InfoSphere).
- Familiarity with global data privacy and compliance regulations (e.g., GDPR, HIPAA).
- Exposure to cloud data architectures and integration platforms (e.g., SSIS, Talend, data lakes).
- Professional certifications in MDM or data governance (e.g., CDMP, CIMP).
- Experience working in a shared services or global capability center environment.

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

JD Azure
- Minimum experience of working on projects as a Senior Azure Data Engineer.
- B.Tech/B.E. degree in Computer Science or Information Technology.
- Experience with enterprise integration and ETL (extract, transform, load) tools such as Databricks, Azure Data Factory, and Talend/Informatica.
- Experience analyzing data using Python, Spark Streaming, and SSIS/Informatica batch ETL, and using database tools like SQL and MongoDB to process data from different sources.
- Experience with platform automation tools (DB management, Azure, Jenkins, GitHub) will be an added advantage.
- Design, operate, and integrate different systems to enable efficiencies in key areas of the business.
- Understand business requirements by interacting with business users and/or reverse-engineering existing data products.
- Good understanding and working knowledge of distributed databases and pipelines.
- Ability to analyze and identify the root cause of technical issues.
- Proven ability to use relevant data analytics approaches and tools to problem-solve and troubleshoot.
- Excellent documentation and communication skills.

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site

The Consulting Data Engineer role requires experience in both traditional warehousing technologies (e.g., Teradata, Oracle, SQL Server) and modern database/data warehouse technologies (e.g., AWS Redshift, Azure Synapse, Google BigQuery, Snowflake), as well as expertise in ETL tools and frameworks (e.g., SSIS, Azure Data Factory, AWS Glue, Matillion, Talend), with a focus on how these technologies affect business outcomes. This person should have experience with both on-premises and cloud deployments of these technologies, and in transforming data to adhere to logical and physical data models and data architectures and engineering a dataflow to meet business needs. This role will support engagements such as data lake design, data management, migration of data warehouses to the cloud, and database security models, and ideally should have large-enterprise experience in these areas.

Responsibilities:
- Develops high-performance distributed data warehouses, distributed analytic systems, and cloud architectures.
- Participates in developing relational and non-relational data models designed for optimal storage and retrieval.
- Develops, tests, and debugs batch and streaming data pipelines (ETL/ELT) to populate databases and object stores from multiple data sources using a variety of scripting languages; provides recommendations to improve data reliability, efficiency, and quality.
- Works alongside data scientists, supporting the development of high-performance algorithms, models, and prototypes.
- Implements data quality metrics, standards, and guidelines; automates data quality checks/routines as part of data processing frameworks; validates the flow of information.
- Ensures that data warehousing and big data systems meet business requirements and industry practices, including but not limited to automation of system builds, security requirements, performance requirements, and logging/monitoring requirements.

Knowledge, Skills, and Abilities:
- Ability to translate a logical data model into a relational or non-relational solution.
- Expert in one or more of the following ETL tools: SSIS, Azure Data Factory, AWS Glue, Matillion, Talend, Informatica, Fivetran.
- Hands-on experience setting up end-to-end cloud-based data lakes.
- Hands-on experience in database development using views, SQL scripts, and transformations.
- Ability to translate complex business problems into data-driven solutions.
- Working knowledge of reporting tools like Power BI, Tableau, etc.
- Ability to identify data quality issues that could affect business outcomes.
- Flexibility in working across different database technologies and propensity to learn new platforms on the fly.
- Strong interpersonal skills; a team player prepared to lead or support depending on the situation.
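The automated data-quality routines described above (row counts, key uniqueness, null-rate thresholds) can be sketched as a post-load check step. This is a minimal, framework-free illustration using an in-memory SQLite table; the table name, columns, and 1% null threshold are illustrative assumptions, not part of any specific engagement.

```python
# Minimal sketch of automated data-quality checks run after an ETL load.
# Table, columns, and thresholds are illustrative assumptions.
import sqlite3

def run_quality_checks(conn, table, key_col, not_null_cols, max_null_rate=0.01):
    """Return a list of (check_name, passed) results for a loaded table."""
    results = []
    cur = conn.cursor()
    total = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    # 1. Non-empty load
    results.append(("row_count_positive", total > 0))
    # 2. Primary-key uniqueness
    distinct = cur.execute(
        f"SELECT COUNT(DISTINCT {key_col}) FROM {table}").fetchone()[0]
    results.append(("key_unique", distinct == total))
    # 3. Null-rate threshold on each required column
    for col in not_null_cols:
        nulls = cur.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL").fetchone()[0]
        rate = nulls / total if total else 1.0
        results.append((f"null_rate_{col}", rate <= max_null_rate))
    return results

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "acme", 10.0), (2, "globex", 20.0), (3, "initech", 30.0)])
checks = run_quality_checks(conn, "orders", "order_id", ["customer", "amount"])
print(checks)
```

In a production framework the same checks would run against the warehouse (Redshift, Synapse, BigQuery, Snowflake) and feed a monitoring/alerting layer rather than printing results.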

Posted 1 week ago

Apply

5.0 - 10.0 years

14 - 16 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

We are looking for candidates with expert skills in both Talend and PySpark.

Engineer skillset:
1. Talend Studio
2. PySpark
3. AWS ecosystem (S3, Glue, CloudWatch, SSM, IAM, etc.)
4. Redshift
5. Aurora
6. Teradata

Talend skillset:
1. Good hands-on experience with Talend Studio and Talend Management Console (TMC).
2. In-depth understanding of Joblets, PreJobs, PostJobs, and SubJobs, with the ability to work through complex designs and understand Talend's data flow and control flow logic.
3. Proficiency with Talend components for S3, Redshift, tDBInput, tMap, etc., and various Java-based components.

PySpark skillset:
1. Solid knowledge of PySpark, with the ability to compare, analyze, and validate migrated PySpark code against Talend job definitions for accurate code migration.
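One common way to validate a Talend-to-PySpark migration is to compare the two pipelines' outputs: equal row counts plus an order-independent checksum per column. The sketch below shows the idea with plain Python dictionaries standing in for the two extracts; in practice both outputs would be read as DataFrames, and the sample records here are invented for illustration.

```python
# Framework-free sketch of validating migrated pipeline output against the
# original Talend job's output. Sample records are illustrative only.
import hashlib

def column_checksums(rows, columns):
    """Order-independent checksum per column: XOR of per-value hashes."""
    sums = {c: 0 for c in columns}
    for row in rows:
        for c in columns:
            digest = hashlib.sha256(repr(row[c]).encode()).digest()
            sums[c] ^= int.from_bytes(digest[:8], "big")
    return sums

def outputs_match(talend_rows, migrated_rows, columns):
    """True if both outputs have the same row count and column checksums."""
    if len(talend_rows) != len(migrated_rows):
        return False
    return (column_checksums(talend_rows, columns)
            == column_checksums(migrated_rows, columns))

talend_out = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 20.0}]
migrated_out = [{"id": 2, "amt": 20.0}, {"id": 1, "amt": 10.0}]  # same data, reordered
print(outputs_match(talend_out, migrated_out, ["id", "amt"]))
```

XOR-ing per-value hashes makes the comparison insensitive to row ordering, which matters because Talend subjobs and Spark partitions rarely emit rows in the same order.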

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application design and functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in Talend ETL.
- Good-to-have skills: Experience with data integration tools and methodologies.
- Strong understanding of data warehousing concepts and practices.
- Familiarity with SQL and database management systems.
- Experience in application testing and debugging techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Talend ETL.
- This position is based at our Chennai office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

5.0 - 9.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Data Engineer role at Konex FinTech Consulting and Services Pvt Ltd

🔹 We're Hiring: Data Engineer
📍 Location: PAN India (client locations)
🕒 Experience: 6 to 8 years
💼 Mode of Work: Hybrid (3 days work from office)
📣 Notice Period: Immediate joiners preferred
💰 Budget: As per market standards

Konex FinTech Consulting and Services Pvt Ltd is looking for a highly skilled and motivated Data Engineer to join our client's dynamic IT team. If you have a strong command of data pipelines and cloud technologies (AWS a must) and strong communication skills, we want to hear from you!

🔧 Key Responsibilities:
- Design and manage scalable real-time/batch data pipelines
- Develop ETL processes and ensure high-quality data flows
- Build and maintain data lakes, warehouses, and related infrastructure
- Optimize systems for scalability and performance
- Collaborate with cross-functional teams to understand data requirements
- Implement data governance, privacy, and security best practices
- Troubleshoot data issues, ensuring high system availability
- Document data architecture and system designs

✅ Required Skills:
- Bachelor's or Master's in Computer Science or a related field
- Strong SQL and hands-on experience with Python/Java/Scala
- ETL tools and orchestration (AWS Glue, Airflow, Talend, etc.)
- Cloud expertise with AWS (S3, Redshift); GCP or Azure a bonus
- Experience with Hadoop, Spark, Kafka
- Strong understanding of data warehousing and modeling
- Familiarity with Git, CI/CD
- Excellent communication and problem-solving abilities

📩 Interested candidates, please share the following details along with your resume to hr@konexcs.com by 5 pm, 22nd July 2025:
• Current CTC
• Expected CTC
• Notice period (if immediate, mention LWD)
• Current location
• UAN details

📞 For more information, feel free to contact us at +91 9059132299.

Join us and be a part of delivering impactful data-driven solutions!
#hiring #dataengineer #AWS #ETL #datajobs #Konex #analytics #cloudengineering #immediatejoiners #ITjobs #panindia #hybridwork

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Services ETL Developer at Zeta Global, you will play a crucial role in managing and supporting the Delivery Operations Team through the implementation and maintenance of ETL and automation procedures. Your responsibilities will include processing data conversions on multiple platforms and performing tasks such as address standardization, merge/purge, database updates, client mailings, and postal presort. You will be expected to automate scripts for transferring and manipulating data feeds both internally and externally. The ability to multitask and manage multiple jobs simultaneously to ensure timely client deliverability is essential. Collaborating with technical staff to maintain and support an ETL environment, and working closely with database/CRM teams, modelers, analysts, and application programmers to deliver results for clients, will also be part of your role.

To excel in this position, you must possess experience in database marketing and be proficient in transforming and manipulating data. Your expertise with Oracle and SQL will be crucial for automating scripts that process and manipulate marketing data. Familiarity with tools such as DMExpress, Talend, Snowflake, the SAP DQM suite, Excel, and SQL Server is required for data exports and imports, running SQL Server Agent jobs, and SSIS packages. Experience with editors like Notepad++, UltraEdit, or similar tools, as well as knowledge of SFTP and PGP for ensuring data security and client data protection, will be beneficial. Working with large-scale customer databases in a relational database environment and demonstrating the ability to handle multiple tasks concurrently are key skills for this role. Effective communication and teamwork skills are essential for collaborating with colleagues to ensure tasks are completed promptly.

The ideal candidate should hold a Bachelor's degree or equivalent, have at least 5 years of experience in database marketing, and possess strong oral and written communication skills. If you are looking to be part of a dynamic and innovative company that leverages data-powered marketing technology to drive business growth, Zeta Global offers a rewarding opportunity to contribute to the success of leading brands through personalized marketing experiences.
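The merge/purge task mentioned above is essentially deduplication: collapse records that refer to the same person by matching on normalized name and address, then apply a survivorship rule to the merged record. The sketch below is a minimal, self-contained illustration; the field names, normalization rules, and sample records are assumptions, and production merge/purge tools use far richer matching (phonetic keys, postal standardization).

```python
# Hedged sketch of a merge/purge step: collapse duplicate customer records
# keyed on normalized name + address. Fields and rules are illustrative.
import re

def normalize(value):
    """Lowercase, strip punctuation, collapse whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", value.lower())).strip()

def merge_purge(records):
    """Keep one record per normalized (name, address) key, merging emails."""
    merged = {}
    for rec in records:
        key = (normalize(rec["name"]), normalize(rec["address"]))
        if key in merged:
            # Survivorship rule: keep the first record, union contact points.
            merged[key]["emails"] |= set(rec.get("emails", []))
        else:
            merged[key] = {**rec, "emails": set(rec.get("emails", []))}
    return list(merged.values())

records = [
    {"name": "Jane Doe", "address": "12 Main St.", "emails": ["j@x.com"]},
    {"name": "JANE DOE", "address": "12 main st",  "emails": ["jane@y.com"]},
    {"name": "John Roe", "address": "9 Oak Ave",   "emails": []},
]
deduped = merge_purge(records)
print(len(deduped))  # the two Jane Doe records collapse into one
```

The same normalize-then-key pattern is what address standardization feeds into: standardizing "12 Main St." and "12 main st" to one canonical form is what lets the merge key actually match.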

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Bharuch

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. The role is responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Responsibilities also include Alteryx workflow automation. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Surendranagar

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. The role is responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Responsibilities also include Alteryx workflow automation. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Mehsana

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. The role is responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Responsibilities also include Alteryx workflow automation. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Vadodara

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. The role is responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Responsibilities also include Alteryx workflow automation. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Surat

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. The role is responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Responsibilities also include Alteryx workflow automation. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Rajkot

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. The role is responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Responsibilities also include Alteryx workflow automation. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Gandhinagar

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. The role is responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Responsibilities also include Alteryx workflow automation. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Bhavnagar

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. The role is responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Responsibilities also include Alteryx workflow automation. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Jamnagar

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. The role is responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Responsibilities also include Alteryx workflow automation. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 1 week ago

Apply