
836 Talend Jobs - Page 26

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8 - 12 years

10 - 14 Lacs

Pune

Work from Office


Who we are? Searce means a fine sieve and signifies refining, analysing, and improving: solving for better every time. Searcians are passionate improvers and solvers who love to question the status quo. Our shared purpose at Searce is to drive intelligent, impactful, and futuristic business outcomes using new-age technology, driven passionately by happier people who aim to become better every day.

What are we looking for? Searce is looking for a Data Architect who can work with business leads, analysts, data scientists, and fellow engineers to build data products that empower better decision making; someone who is passionate about the data quality of our business metrics and ready to provide flexible solutions that can scale up to answer broader business questions.

What you'll do as a Data Architect with us:
- Propose architectural solutions to move and improve data from on-premise to cloud, and effectively strategize the migration of client data using GCP or AWS.
- Provide advisory and thought leadership on analytics environments leveraging cloud-based platforms and big data technologies, including integration with existing data and analytics platforms and tools.
- Design and implement scalable data architectures leveraging BigQuery, Hadoop, NoSQL, and emerging technologies, covering on-premise and cloud-based deployment patterns.
- Compose and implement data access patterns for multiple analytical and operational workloads across cloud-based platforms.
- Create information solutions covering data security, data privacy, data governance, metadata management, multi-tenancy, and mixed workload management across Spark and NoSQL platforms.
- Deliver customer cloud strategies aligned to the client's business objectives, with a focus on cloud migrations that respect global and local regulations, security, risk, and compliance.
- Produce conceptual, logical, and physical data models, including canonical data models (business logical models).
- Implement modern data warehouses for digital natives, ISVs, and fintechs using technologies like Redshift and BigQuery.
- Analyze requirements and migrate customers from systems such as Teradata, SQL Server, and Redshift to cloud data warehouses like Snowflake and BigQuery.
- Provide consulting and solution support to customers during their data warehouse or data lake modernization, especially in the design and implementation of data-centric architectures.
- Proactively identify niche opportunity areas within the data framework and drive client discussions, delivering presentations, demos, and proofs of concept to showcase capabilities and transformation solutions.
- Manage the team and handle delivery of 3-4 projects.

What are the must-haves to join us? Is education overrated? Yes, we believe so, but there is no way to locate you otherwise, so we might look for:
- At least a Bachelor's degree in Computer Science.
- Over 8 years of experience building and designing data pipelines or data ingestion for both batch and streaming data, from different sources to a data warehouse or data lake.
- Experience leading and delivering data warehousing and analytics projects, including cloud technologies such as EMR, Lambda, Cloud Storage, and BigQuery.
- Experience working on cloud platforms (AWS/GCP) and their architectures for data solutions.
- Hands-on experience with related and complementary open-source software platforms and languages (e.g., Java, Apache projects, Python, Scala).
- Hands-on experience with ETL or ELT tools (Informatica, Talend, Pentaho, Business Objects Data Services, Hevo).
- Experience working closely with data scientists and analysts as well as business users to design and deliver complex data engineering solutions.
- Outstanding analytical and consulting skills, and the ability to lead customer discussions independently.
- Prior experience in recruitment, training, and grooming of geeks.

Great to have:
- Certifications: GCP and/or AWS, professional level.
- Contributions to the community: tech blogs, Stack Overflow, etc.
- Strong communication and presentation skills across a diverse audience with varying levels of business and technical expertise.
- Ability to work independently and as a team leader, establishing strategic objectives, project plans, and milestones.

Posted 2 months ago

Apply

3 - 5 years

5 - 8 Lacs

Hyderabad

Work from Office


As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in choosing the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Hands-on development and architecture design experience with Talend, including Talend Data Integration with ESB.
- Hands-on experience in core Java and RESTful services.
- Hands-on experience designing and creating jobs and routes, and orchestrating Talend jobs.
- Experience implementing SOAP/REST web services in Talend.
- Hands-on experience in design, development, and CI/CD releases.
- Experience developing transformations for different file formats: positional, JSON, XML, CSV, and flat file.
- Experience working with database components in Talend (Oracle, MySQL, etc.).
- Knowledge of normalization techniques, joins, and indexes.
- Knowledge of source code repositories such as SVN and Git.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.

Posted 2 months ago

Apply

5 - 10 years

7 - 12 Lacs

Hyderabad

Work from Office


Data Analyst: Manage and support the Delivery Operations Team by implementing and supporting ETL and automation procedures. Schedule and perform delivery operations functions to complete tasks and ensure client satisfaction.

ESSENTIAL FUNCTIONS:
- Process data conversions on multiple platforms.
- Perform address standardization, merge/purge, database updates, client mailings, and postal presort.
- Automate scripts to transfer and manipulate internal and external data feeds.
- Manage multiple jobs concurrently to ensure timely client deliverability.
- Work with technical staff to maintain and support an ETL environment.
- Work in a team environment with database/CRM specialists, modelers, analysts, and application programmers to deliver results for clients.

REQUIRED SKILLS:
- Experience in database marketing with the ability to transform and manipulate data.
- Experience with Oracle and SQL to automate scripts that process and manipulate marketing data.
- Experience with tools such as DMExpress, Talend, Snowflake, the SAP DQM suite, and Excel.
- Experience with SQL Server: data exports and imports, and the ability to run SQL Server Agent jobs and SSIS packages.
- Experience with editors such as Notepad++ or UltraEdit.
- Experience with SFTP and PGP to ensure data security and protection of client data.
- Experience working with large-scale customer databases in a relational database environment.
- Proven ability to work on multiple tasks at a given time.
- Ability to communicate and work in a team environment to ensure tasks are completed in a timely manner.

MINIMUM QUALIFICATIONS:
- Bachelor's degree or equivalent.
- 5+ years of experience in database marketing.
- Excellent oral and written communication skills.

Posted 2 months ago

Apply

10 - 19 years

22 - 37 Lacs

Pune

Hybrid


Lead and manage a global team of data migration experts, providing strategic direction and professional development. Develop and maintain comprehensive data migration methodologies. Design and implement robust data migration strategies and collaborate with solution architects.

Required candidate profile: experience with data migration tools (e.g., Informatica, Talend, Microsoft SSIS); experience with customer information system (CIS)/billing system migrations; data governance frameworks and utility industry data models.

Posted 2 months ago

Apply

5 - 9 years

10 - 15 Lacs

Hyderabad

Work from Office


We are looking for a Talend Developer with strong ETL experience for data transformation and automation tasks. You will develop and maintain ETL pipelines using Talend Data Integration, ensuring efficient data extraction, transformation, and loading into PostgreSQL or other databases. You should be proficient in optimizing ETL workflows, handling large datasets, and troubleshooting performance issues. A basic understanding of Python or other scripting languages is an added advantage.

Posted 2 months ago

Apply

2 years

0 Lacs

Chennai

Work from Office


Job Description: Data Engineer

Position Details: Position Title: Data Engineer | Department: Data Engineering | Location: Chennai | Employment Type: Full-Time

About the Role: We are seeking a highly skilled and motivated Data Engineer with expertise in Snowflake to join our dynamic team. In this role, you will design, build, and optimize scalable data pipelines and cloud-based data infrastructure to ensure efficient data flow across systems. You will collaborate closely with data scientists, analysts, and business stakeholders to provide clean, accessible, high-quality data for analytics and decision-making. The ideal candidate is passionate about cloud data platforms, data modeling, and performance optimization, with hands-on experience in Snowflake and modern data engineering tools.

Key Responsibilities:
1. Data Pipeline Development & Optimization: Design, develop, and maintain scalable ETL/ELT data pipelines using Snowflake, dbt, and Apache Airflow. Optimize Snowflake query performance, warehouse sizing, and cost efficiency. Automate data workflows to ensure seamless integration between structured and unstructured data sources.
2. Data Architecture & Integration: Design and implement data models and schemas optimized for analytics and operational workloads. Manage Snowflake multi-cluster warehouses, role-based access control (RBAC), and security best practices. Integrate data from multiple sources, including APIs, relational databases, NoSQL databases, and third-party services.
3. Infrastructure & Performance Management: Monitor and optimize Snowflake storage, query execution plans, and resource utilization. Implement data governance, security policies, and compliance within Snowflake. Troubleshoot and resolve performance bottlenecks in data pipelines and cloud storage solutions.
4. Collaboration & Continuous Improvement: Work with cross-functional teams to define data requirements and ensure scalable solutions. Document technical designs, architecture, and processes for data pipelines and Snowflake implementations. Stay updated on the latest advancements in cloud data engineering and Snowflake best practices.

Qualifications:
- Education & Experience: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field; 2+ years of experience in data engineering, with a strong focus on cloud-based data platforms; proven expertise in Snowflake, including performance tuning, cost management, and data sharing capabilities; experience with cloud platforms (AWS, GCP, or Azure) and distributed computing frameworks (Spark, Hadoop, etc.).
- Technical Skills: Strong SQL skills for query optimization and data modeling in Snowflake; experience with ETL tools such as Apache Airflow, dbt, Talend, Informatica, or Matillion; proficiency in Python, Scala, or Java for data processing and automation; familiarity with Kafka, Kinesis, or other streaming data solutions; understanding of data warehousing concepts, partitioning, and indexing strategies.
- Preferred Qualifications: SnowPro Certification or an equivalent cloud data engineering certification; experience with containerization (Docker, Kubernetes) and CI/CD for data workflows; knowledge of machine learning pipelines and MLOps.

Benefits: Competitive salary and performance-based bonuses; health insurance; flexible working hours and remote work options; professional development opportunities, including Snowflake training, certifications, and conferences; a collaborative and inclusive work environment.

How to Apply:
1. Submit your resume/CV, updated to highlight your relevant skills, experience, and achievements in data engineering.
2. Optionally (but recommended), include a cover letter covering why you are interested in the role, your relevant experience and achievements in data engineering, and how your skills align with the job requirements.
3. Optionally, provide supporting documents: links to GitHub repositories, research papers, or portfolio projects showcasing your work in data engineering; if you don't have links, attach files (e.g., PDFs).
4. Email your resume, cover letter, and supporting documents to: krishnamoorthi.somasundaram@nulogic.io

Posted 2 months ago

Apply

6 - 8 years

5 - 9 Lacs

Mumbai

Work from Office


Minimum of 6 years of experience with Talend and Snowflake. Capacity to work with onsite resources and in a one-team mode (part of a team with clients working remotely in a multinational environment). Ability to work on multiple simultaneous projects and to communicate clearly with all stakeholders. Good experience with the Talend suite and Snowflake stored procedures. Very good experience in SQL (Snowflake dialect desired). Good knowledge of data modelling and data transformation (business layer).

Posted 2 months ago

Apply

5 - 9 years

2 - 5 Lacs

Pune, Bengaluru, Hyderabad

Work from Office


Snowflake developer with dbt experience, data modeling, and Python. Strong in implementation; an individual contributor with effective communication skills.

Posted 2 months ago

Apply

10 - 14 years

35 - 40 Lacs

Hyderabad

Work from Office


Responsibilities:
1. Integration Strategy & Architecture: Define the enterprise integration strategy, aligning with business goals and IT roadmaps. Design scalable, resilient, and secure integration architectures using industry best practices. Develop API-first and event-driven integration strategies. Establish governance frameworks, integration patterns, and best practices.
2. Technology Selection & Implementation: Evaluate and recommend the right integration technologies, such as middleware and ESB (TIBCO, MuleSoft, WSO2, IBM Integration Bus); event streaming and messaging (Apache Kafka, RabbitMQ, IBM MQ); API management (Apigee, Kong, AWS API Gateway, MuleSoft); ETL and data integration (Informatica, Talend, Apache NiFi); iPaaS/cloud integration (Dell Boomi, Azure Logic Apps, Workato). Lead the implementation and configuration of these platforms.
3. API & Microservices Architecture: Design and oversee API-led integration strategies. Implement RESTful APIs, GraphQL, and gRPC for real-time and batch integrations. Define API security standards (OAuth, JWT, OpenID Connect, API gateways). Establish API versioning, governance, and lifecycle management.
4. Enterprise Messaging & Event-Driven Architecture (EDA): Design real-time, event-driven architectures using Apache Kafka for streaming and pub/sub messaging; RabbitMQ, IBM MQ, or TIBCO EMS for message queuing; and event-driven microservices using Kafka Streams, Flink, or Spark Streaming. Ensure event sourcing, CQRS, and eventual consistency in distributed systems.
5. Cloud & Hybrid Integration: Develop hybrid integration strategies across on-premises, cloud, and SaaS applications. Utilize cloud-native integration tools like AWS Step Functions, Azure Event Grid, and Google Cloud Pub/Sub. Integrate enterprise applications (ERP, CRM, HRMS) across SAP, Oracle, Salesforce, and Workday.
6. Security & Compliance: Ensure secure integration practices, including encryption, authentication, and authorization. Implement zero-trust security models for APIs and data flows. Maintain compliance with industry regulations (GDPR, HIPAA, SOC 2).
7. Governance, Monitoring & Optimization: Establish enterprise integration governance frameworks. Use observability tools for real-time monitoring (Datadog, Splunk, New Relic). Optimize integration performance and troubleshoot bottlenecks.
8. Leadership & Collaboration: Collaborate with business and IT stakeholders to understand integration requirements. Work with DevOps and cloud teams to ensure CI/CD pipelines for integration. Provide technical guidance to developers, architects, and integration engineers.

Qualifications:
- Technical skills: 10+ years of experience; expertise in integration platforms (Informatica, TIBCO, MuleSoft, WSO2, Dell Boomi); strong understanding of API management and microservices; experience with enterprise messaging and streaming (Kafka, RabbitMQ, IBM MQ, Azure Event Hub); knowledge of ETL and data pipelines (Informatica, Talend, Apache NiFi, AWS Glue); experience in cloud and hybrid integration (AWS, Azure, GCP, OCI); hands-on with security and compliance (OAuth2, JWT, SAML, API security, zero trust).
- Soft skills: strategic thinking and architecture design; problem-solving and troubleshooting; collaboration and stakeholder management; agility in digital transformation and cloud migration.
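
For illustration only (not part of the posting): the Kafka pub/sub pattern named above, as a minimal Java sketch using the Apache Kafka client library. The topic name, key, payload, and broker address are hypothetical.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address; in an EDA this is the cluster entry point.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish a domain event; keying by order ID keeps all events for the
            // same aggregate in one partition, preserving their order (relevant
            // to the event-sourcing/CQRS concerns mentioned above).
            producer.send(new ProducerRecord<>("orders.events", "order-42",
                    "{\"type\":\"OrderCreated\",\"orderId\":\"order-42\"}"));
        }
    }
}
```

Downstream consumers subscribe to the topic independently, which is what decouples producers from consumers in an event-driven integration.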

Posted 2 months ago

Apply

7 - 9 years

10 - 15 Lacs

Mumbai

Work from Office


We are seeking a highly skilled Senior Snowflake Developer with expertise in Python, SQL, and ETL tools to join our dynamic team. The ideal candidate will have a proven track record of designing and implementing robust data solutions on the Snowflake platform, along with strong programming skills and experience with ETL processes.

Key Responsibilities:
- Design and develop scalable data solutions on the Snowflake platform to support business needs and analytics requirements.
- Lead the end-to-end development lifecycle of data pipelines, including data ingestion, transformation, and loading processes.
- Write efficient SQL queries and stored procedures to perform complex data manipulations and transformations within Snowflake.
- Implement automation scripts and tools using Python to streamline data workflows and improve efficiency.
- Collaborate with cross-functional teams to gather requirements, design data models, and deliver high-quality solutions.
- Tune and optimize Snowflake databases and queries to ensure optimal performance and scalability.
- Implement best practices for data governance, security, and compliance within Snowflake environments.
- Mentor junior team members and provide technical guidance and support as needed.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience working with Snowflake data warehouses.
- Strong proficiency in SQL, with the ability to write complex queries and optimize performance.
- Extensive experience developing data pipelines and ETL processes using Python and ETL tools such as Apache Airflow, Informatica, or Talend.
- Strong Python coding experience (minimum 2 years).
- Solid understanding of data warehousing concepts, data modeling, and schema design.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving and analytical skills with keen attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Relevant certifications in Snowflake or related technologies are a plus.

Posted 2 months ago

Apply

2 - 5 years

6 - 10 Lacs

Hyderabad

Work from Office


As an entry-level Package Consultant at IBM, you will assist clients in the selection, implementation, and production support of packaged application solutions, such as SAP, Oracle, Salesforce, Microsoft Dynamics, Workday, or the SharePoint solution suite, to meet client needs. Leveraging a growth mindset, you're ready and willing to deliver business value wherever needed.

In your role, you may be responsible for:
- Assisting clients in the selection, implementation, and support of packages.
- Making strategic recommendations and leveraging business knowledge to drive solutions for clients and their management.
- Running or supporting workshops, meetings, and stakeholder interviews.
- Developing process maps to understand As-Is and To-Be scenarios.
- Using IBM's Design Thinking to help solve clients' challenges.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.

Preferred technical and professional experience:
- Hands-on experience with dimensional and relational data modeling techniques to support analytics and reporting requirements.
- Understanding of Snowflake workload optimization, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.

Posted 2 months ago

Apply

2 - 5 years

6 - 10 Lacs

Hyderabad

Work from Office


As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in choosing the most suitable data management systems and identifying the crucial data required for insightful analysis. You'll also tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
- Designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Applying dimensional and relational data modelling techniques to support analytics and reporting requirements.
- Optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
- Implementing robust data validation, cleansing, and governance frameworks within ETL processes.
- Using SQL and/or shell scripting for custom transformations and automation tasks.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Experience in the optimization of Ataccama data management solutions.
- Collaborating with stakeholders to gather requirements.
- Designing data quality, data governance, and master data management solutions using Ataccama.
- Developing and maintaining data quality rules, data profiling, and data cleansing processes within Ataccama Data Quality Center.

Posted 2 months ago

Apply

2 - 5 years

6 - 10 Lacs

Hyderabad

Work from Office


As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in choosing the most suitable data management systems and identifying the crucial data required for insightful analysis. You'll also tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.

Preferred technical and professional experience:
- Understanding of Snowflake workload optimization, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
- Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.

Posted 2 months ago

Apply

2 - 6 years

12 - 16 Lacs

Bengaluru

Work from Office


Responsibilities:
- Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive.
- Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning.
- Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems.
- Utilize AWS services including S3, EC2, and EMR to build and manage scalable, secure, and reliable cloud-based solutions.
- Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting.
- Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows.
- Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance.
- Collaborate closely with DevOps teams using SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle.
- Communicate effectively with both technical and non-technical stakeholders for handover, incident management, reporting, etc.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Demonstrated expertise in big data technologies, specifically Apache Spark (focus on Spark SQL) and Hive.
- Extensive experience with AWS services, including S3, EC2, and EMR.
- Strong expertise in data warehousing and SQL, with experience in performance optimization.
- Experience with ETL/ELT implementation (such as Talend).
- Proficiency in Linux, with a strong background in shell scripting.

Preferred technical and professional experience:
- Familiarity with scheduling tools like Airflow or Control-M.
- Experience with metadata-driven frameworks.
- Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence.
- Excellent communication skills and a willing attitude toward learning.
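
For illustration only (not part of the posting): a minimal Java sketch of the Spark SQL pattern this role centres on, reading from and writing to S3. The application name, bucket paths, table, and column names are hypothetical.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DailySalesJob {
    public static void main(String[] args) {
        // On EMR this would typically be launched via spark-submit.
        SparkSession spark = SparkSession.builder()
                .appName("daily-sales-aggregation")
                .getOrCreate();

        // Extract: Spark reads the S3 data as a distributed DataFrame.
        Dataset<Row> orders = spark.read().parquet("s3://example-bucket/raw/orders/");
        orders.createOrReplaceTempView("orders");

        // Transform: the aggregation is expressed directly in Spark SQL.
        Dataset<Row> daily = spark.sql(
                "SELECT order_date, SUM(amount) AS total_amount " +
                "FROM orders GROUP BY order_date");

        // Load: write the result back to the curated zone.
        daily.write().mode("overwrite").parquet("s3://example-bucket/curated/daily_sales/");
        spark.stop();
    }
}
```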

Posted 2 months ago

Apply

2 - 6 years

12 - 16 Lacs

Bengaluru

Work from Office


Responsibilities:
- Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive.
- Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning.
- Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems.
- Utilize AWS services including S3, EC2, and EMR to build and manage scalable, secure, and reliable cloud-based solutions.
- Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting.
- Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows.
- Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance.
- Collaborate closely with DevOps teams using SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle.
- Communicate effectively with both technical and non-technical stakeholders for handover, incident management, reporting, etc.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Demonstrated expertise in big data technologies, specifically Apache Spark (focus on Spark SQL) and Hive.
- Extensive experience with AWS services, including S3, EC2, and EMR.
- Strong expertise in data warehousing and SQL, with experience in performance optimization.
- Experience with ETL/ELT implementation (such as Talend).
- Proficiency in Linux, with a strong background in shell scripting.

Preferred technical and professional experience:
- Familiarity with scheduling tools like Airflow or Control-M.
- Experience with metadata-driven frameworks.
- Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence.
- Excellent communication skills and a willing attitude toward learning.

Posted 2 months ago

Apply

2 - 5 years

6 - 10 Lacs

Hyderabad

Work from Office


As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in choosing the most suitable data management systems and identifying the crucial data required for insightful analysis. You'll also tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Hands-on experience with dimensional and relational data modeling techniques to support analytics and reporting requirements.

Preferred technical and professional experience:
- Understanding of Snowflake workload optimization, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
- Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.

Posted 2 months ago

Apply

2 - 5 years

6 - 10 Lacs

Hyderabad

Work from Office


As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in choosing the most suitable data management systems and identifying the crucial data required for insightful analysis. You'll also tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
- Hands-on experience with dimensional and relational data modeling techniques to support analytics and reporting requirements.

Preferred technical and professional experience:
- Understanding of Snowflake workload optimization, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
- Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.

Posted 2 months ago

Apply

2 - 5 years

6 - 10 Lacs

Hyderabad

Work from Office


As an entry-level Package Consultant at IBM, you will assist clients in the selection, implementation, and production support of packaged application solutions, such as SAP, Oracle, Salesforce, Microsoft Dynamics, Workday, or the SharePoint solution suite, to meet client needs. Leveraging a growth mindset, you're ready and willing to deliver business value wherever needed.

In your role, you may be responsible for:
- Assisting clients in the selection, implementation, and support of packages.
- Making strategic recommendations and leveraging business knowledge to drive solutions for clients and their management.
- Running or supporting workshops, meetings, and stakeholder interviews.
- Developing process maps to understand As-Is and To-Be scenarios.
- Using IBM's Design Thinking to help solve clients' challenges.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.

Preferred technical and professional experience:
- Hands-on experience with dimensional and relational data modeling techniques to support analytics and reporting requirements.
- Understanding of Snowflake workload optimization, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.

Posted 2 months ago

Apply

3 - 6 years

10 - 14 Lacs

Pune

Work from Office


As an entry-level Package Consultant at IBM, you will assist clients in the selection, implementation, and production support of packaged application solutions, such as SAP, Oracle, Salesforce, Microsoft Dynamics, Workday, or the SharePoint solution suite, to meet client needs. Leveraging a growth mindset, you're ready and willing to deliver business value wherever needed.

In your role, you may be responsible for:
- Assisting clients in the selection, implementation, and support of packages.
- Making strategic recommendations and leveraging business knowledge to drive solutions for clients and their management.
- Running or supporting workshops, meetings, and stakeholder interviews.
- Developing process maps to understand As-Is and To-Be scenarios.
- Using IBM's Design Thinking to help solve clients' challenges.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
- Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
- Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.

Preferred technical and professional experience:
- Hands-on experience with dimensional and relational data modeling techniques to support analytics and reporting requirements.
- Understanding of Snowflake workload optimization, including clustering keys, caching strategies, and query profiling.
- Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.

Posted 2 months ago

Apply

5 - 10 years

10 - 20 Lacs

Hyderabad

Hybrid


Job Summary: We are seeking a skilled Qlik Sense & Talend Developer to manage end-to-end data integration, transformation, and visualization. The ideal candidate will have expertise in Qlik Sense for BI and data visualization and in Talend for ETL and data integration, ensuring seamless data flow and insightful analytics for business decision-making.

Key Responsibilities:
- Qlik Sense development: design and develop interactive dashboards and reports; create and optimize data models and scripting; implement set analysis and advanced expressions; connect and integrate data from multiple sources (SQL, APIs, cloud, etc.); manage Qlik Sense Server and QMC administration.
- Talend development (ETL and data integration): design, develop, and maintain ETL pipelines using Talend; perform data extraction, transformation, and loading from various sources; optimize job scheduling, performance tuning, and error handling; ensure data quality, governance, and security.
- General responsibilities: collaborate with business analysts, data engineers, and stakeholders; optimize and improve data workflows and BI solutions; troubleshoot performance issues and data discrepancies; maintain documentation of data processes and solutions.

Required Skills & Experience:
- 3-8 years of experience with Qlik Sense and Talend.
- Strong expertise in Qlik Sense BI, dashboarding, and scripting.
- Hands-on experience with Talend ETL tools (Open Studio, Data Fabric, etc.).
- Proficiency in SQL, APIs, and data warehousing concepts.
- Knowledge of cloud platforms (AWS, Azure, or GCP) is a plus.
- Good understanding of data governance and quality standards.

Posted 2 months ago

Apply

10 - 14 years

30 - 35 Lacs

Hyderabad

Work from Office


Overview: We are PepsiCo. PepsiCo is one of the world's leading food and beverage companies, with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers, and history makers, located around the world and united by a shared set of values and goals.

Responsibilities:
1. Integration Strategy & Architecture: Define the enterprise integration strategy, aligning with business goals and IT roadmaps. Design scalable, resilient, and secure integration architectures using industry best practices. Develop API-first and event-driven integration strategies. Establish governance frameworks, integration patterns, and best practices.
2. Technology Selection & Implementation: Evaluate and recommend the right integration technologies, such as middleware and ESB (TIBCO, MuleSoft, WSO2, IBM Integration Bus); event streaming and messaging (Apache Kafka, RabbitMQ, IBM MQ); API management (Apigee, Kong, AWS API Gateway, MuleSoft); ETL and data integration (Informatica, Talend, Apache NiFi); iPaaS/cloud integration (Dell Boomi, Azure Logic Apps, Workato). Lead the implementation and configuration of these platforms.
3. API & Microservices Architecture: Design and oversee API-led integration strategies. Implement RESTful APIs, GraphQL, and gRPC for real-time and batch integrations. Define API security standards (OAuth, JWT, OpenID Connect, API gateways). Establish API versioning, governance, and lifecycle management.
4. Enterprise Messaging & Event-Driven Architecture (EDA): Design real-time, event-driven architectures using Apache Kafka for streaming and pub/sub messaging; RabbitMQ, IBM MQ, or TIBCO EMS for message queuing; and event-driven microservices using Kafka Streams, Flink, or Spark Streaming. Ensure event sourcing, CQRS, and eventual consistency in distributed systems.
5. Cloud & Hybrid Integration: Develop hybrid integration strategies across on-premises, cloud, and SaaS applications. Utilize cloud-native integration tools like AWS Step Functions, Azure Event Grid, and Google Cloud Pub/Sub. Integrate enterprise applications (ERP, CRM, HRMS) across SAP, Oracle, Salesforce, and Workday.
6. Security & Compliance: Ensure secure integration practices, including encryption, authentication, and authorization. Implement zero-trust security models for APIs and data flows. Maintain compliance with industry regulations (GDPR, HIPAA, SOC 2).
7. Governance, Monitoring & Optimization: Establish enterprise integration governance frameworks. Use observability tools for real-time monitoring (Datadog, Splunk, New Relic). Optimize integration performance and troubleshoot bottlenecks.
8. Leadership & Collaboration: Collaborate with business and IT stakeholders to understand integration requirements. Work with DevOps and cloud teams to ensure CI/CD pipelines for integration. Provide technical guidance to developers, architects, and integration engineers.

Qualifications:
- Technical skills: 10+ years of experience; expertise in integration platforms (Informatica, TIBCO, MuleSoft, WSO2, Dell Boomi); strong understanding of API management and microservices; experience with enterprise messaging and streaming (Kafka, RabbitMQ, IBM MQ, Azure Event Hub); knowledge of ETL and data pipelines (Informatica, Talend, Apache NiFi, AWS Glue); experience in cloud and hybrid integration (AWS, Azure, GCP, OCI); hands-on with security and compliance (OAuth2, JWT, SAML, API security, zero trust).
- Soft skills: strategic thinking and architecture design; problem-solving and troubleshooting; collaboration and stakeholder management; agility in digital transformation and cloud migration.

Posted 2 months ago

Apply

5 - 9 years

5 - 9 Lacs

Maharashtra

Work from Office


Description: Demand for an experienced Integration Engineer with expertise in WSO2 Choreo and Talend. Primary skills: WSO2 API Manager (Choreo), Talend Data Integration. Secondary skill: Azure. This role will involve the design, development, and maintenance of integration solutions that leverage the capabilities of both WSO2 Choreo (a cloud-native integration platform) and Talend (an open-source data integration tool). The ideal candidate will be responsible for building seamless integration workflows, managing data transformations, and ensuring robust data flows between various systems.

Key Responsibilities:
- Integration solution design: design and develop integration workflows and APIs using WSO2 Choreo and Talend to integrate with diverse systems, databases, and services.
- Data management: implement and manage ETL (extract, transform, load) processes using Talend, ensuring high-quality data movement between different environments.
- API development and management: leverage WSO2 Choreo to create, manage, and monitor APIs, ensuring secure and scalable integration solutions.
- Cloud and hybrid integration: architect cloud-based and hybrid integration solutions that utilize both WSO2 Choreo's cloud-native capabilities and Talend's cloud data integration features.
- Troubleshooting and optimization: troubleshoot and resolve integration issues related to data flow, API execution, and performance bottlenecks.
- Documentation and best practices: document integration processes, best practices, and procedures to ensure consistency across all integration projects.
- Collaboration: work with cross-functional teams.
- Continuous improvement: stay current with new features and functionalities of WSO2 Choreo and Talend, continuously improving the integration solutions and recommending enhancements.

Required Skills & Qualifications:
- Technical expertise: hands-on experience with WSO2 Choreo, including API creation, deployment, and monitoring; proficiency in Talend (Studio, Cloud, Data Integration), including designing, developing, and maintaining ETL processes; experience with cloud platforms such as AWS, Azure, or GCP; strong knowledge of API-based architectures and RESTful services; experience with microservices and containers (Docker, Kubernetes) is a plus.
- Programming languages: proficient in languages like Java, JavaScript, SQL, and Python for integration and transformation tasks.
- Data management: strong understanding of data integration, data pipelines, data warehousing, and transformation; knowledge of relational and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB).
- Integration tools and frameworks: experience with tools like MuleSoft is a plus; knowledge of CI/CD pipelines for integration processes.
- Cloud-native development: hands-on experience with cloud-based integration platforms and services.
- Problem-solving and analytical skills: ability to troubleshoot complex integration issues across different platforms and tools.
- Communication: strong written and verbal communication skills to create documentation and collaborate with teams.

Preferred Qualifications:
- Experience with WSO2 API Manager and Talend Data Integration.
- Knowledge of DevOps practices and experience with version control systems like Git.
- Experience in Agile methodologies and working in Agile teams.
- Certification in WSO2 or Talend technologies is a plus.

Additional details: Remote work possibility: Yes. Local skills: WSO2 API Manager; Talend. Language required: English.

Posted 2 months ago

Apply

4 - 9 years

7 - 17 Lacs

Hyderabad

Work from Office


Hiring Talend Engineers! Experience: 4-10 years. Location: Hyderabad (permanent, hybrid mode).

Role & responsibilities: We are seeking a Data Engineer responsible for developing and implementing the IT technology strategy for the Mortgage Servicing team. This role involves collaborating with business and product teams to design, build, and deliver processes that drive our business and technology objectives within the mortgage industry. The Data Engineer will focus on executing key initiatives and consolidating research and technologies into a unified technical roadmap to create innovative solutions for both the organization and its customers.

Skills required:
- 4+ years of experience in IT.
- Talend engineering experience, preferably with Snowflake.
- Experience in ETL processes and data modelling.
- Understanding of IT concepts and methodologies.
- Communication: strong communication skills to collaborate with cross-functional teams.
- Problem-solving: excellent problem-solving skills and attention to detail.

Posted 2 months ago

Apply

1 - 6 years

9 - 13 Lacs

Kolkata

Work from Office


Skill required: Data Management - Talend. Designation: Data Engineering, Management & Governance Senior Analyst. Qualifications: Any graduation. Years of experience: 2. Language ability: English (international), expert.

What would you do? In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or managed service solutions across the service lifecycle: plan, deliver, and recover. In this role, you will partner with business development and act as a business subject matter expert (SME) to help build resilient solutions that enhance our clients' supply chains and customer experience. As a Data Quality Senior Analyst, you are analytical, with a passion for leveraging data to solve complex problems. You will help support data-gathering requirements, analyze data quality issues, and implement solutions to enhance the overall quality of data for reporting and analytics purposes.

What are we looking for?
- A bachelor's degree (Information Technology or Data Management preferred) or equivalent experience.
- Minimum of 2 years of experience in data quality or data management.
- Minimum of 1 year of experience cleaning and standardizing data to ensure accuracy.
- Minimum of 1 year of experience with data profiling tools like Talend, Informatica, or Trifacta.
- Experience with data quality tools (e.g., Informatica Data Quality (IDQ), Collibra, IBM InfoSphere, Talend, Qlik, Precisely Data Quality 360).
- Certification in data quality management or data governance is a plus.
- Critical thinking skills to analyze data and identify patterns, anomalies, and potential issues.
- Basic knowledge of statistical concepts to interpret data accurately.
- Proficiency in data profiling and cleansing techniques.
- Familiarity with data governance and compliance standards.
- Attention to detail and strong organizational skills.
- Demonstrated experience implementing data quality processes.

Roles and Responsibilities:
- Conduct data quality assessments and audits.
- Implement data cleansing strategies to rectify identified data quality issues and discrepancies.
- Profile and assess data quality using advanced tools and techniques, analyzing characteristics such as completeness, accuracy, consistency, and timeliness.
- Collaborate with data owners to establish data quality standards.
- Work with IT and business teams to address underlying issues and implement corrective actions.
- Prepare and present reports on data quality metrics, trends, and issues to management and stakeholders.

Posted 2 months ago

Apply

5 - 7 years

5 - 9 Lacs

Bengaluru

Work from Office


Job Title: LIMS

Responsibilities: At least 4-8 years of experience in LIMS (LV/LW) implementation, configuration, and customization using Java and JavaScript, with integration to lab applications; should have implemented at least 2-3 projects involving development on the LabVantage platform and Jasper/iReport/Java reporting tools. Interface with key stakeholders and apply your technical proficiency across different stages of the software development life cycle, including requirements elicitation and translation into functional and/or design documentation for the LabVantage LIMS solution, application architecture definition and design, development, validation, and release.

Technical and Professional Requirements:
- Primary skills: Technology->Life Sciences->LIMS.
- Experience in developing instrument drivers using SDMS/Talend/Java is good to have.
- At least 5 years of experience in the software development life cycle.
- At least 5 years of experience in project life cycle activities on development and maintenance projects.
- At least 5 years of experience in design and architecture review.
- Good understanding of the sample management domain and exposure to life sciences projects.
- Ability to work in a team in a diverse, multi-stakeholder environment.
- Analytical skills and very good communication skills.

Preferred Skills: Technology->Life Sciences->LIMS.

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase business profitability.
- Good knowledge of software configuration management systems.
- Awareness of the latest technologies and industry trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Understanding of the financial processes for various types of projects and the available pricing models.
- Ability to assess current processes, identify improvement areas, and suggest technology solutions.
- Knowledge of one or two industry domains.
- Client interfacing skills; project and team management.

Educational Requirements: Master of Engineering, Master of Pharmacy, MCA, MTech, Bachelor of Pharmacy, Bachelor of Engineering, BSc, BTech. Service Line: Engineering Services. * Location of posting is subject to business requirements.

Posted 2 months ago

Apply

Exploring Talend Jobs in India

Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

These cities have a high concentration of IT companies and organizations that frequently hire for Talend roles.

Average Salary Range

The average salary range for Talend professionals in India varies by experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-20 lakhs per annum

Career Path

A typical career progression in the field of Talend may follow this path:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.

Related Skills

In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:

  • Data Warehousing
  • ETL (Extract, Transform, Load) processes
  • SQL
  • Big Data technologies (e.g., Hadoop, Spark)
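
To make the ETL concept concrete, here is a minimal, illustrative Java sketch of an extract-transform-load step using plain JDBC; the connection URLs, credentials, table, and column names are hypothetical. Talend jobs compile down to generated Java code that follows this same extract-transform-load shape, just produced from a graphical design rather than written by hand.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class MiniEtl {
    public static void main(String[] args) throws Exception {
        // Hypothetical source and target databases.
        try (Connection src = DriverManager.getConnection(
                     "jdbc:postgresql://src-host/sales", "user", "pass");
             Connection dst = DriverManager.getConnection(
                     "jdbc:postgresql://dwh-host/warehouse", "user", "pass")) {

            dst.setAutoCommit(false); // load the batch in a single transaction

            try (Statement extract = src.createStatement();
                 ResultSet rows = extract.executeQuery(
                         "SELECT customer_id, email FROM customers");
                 PreparedStatement load = dst.prepareStatement(
                         "INSERT INTO dim_customer (customer_id, email) VALUES (?, ?)")) {

                while (rows.next()) {
                    // Transform: normalise the email before loading.
                    String email = rows.getString("email");
                    load.setInt(1, rows.getInt("customer_id"));
                    load.setString(2, email == null ? null : email.trim().toLowerCase());
                    load.addBatch();
                }
                load.executeBatch();
                dst.commit();
            }
        }
    }
}
```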

Interview Questions

  • What is Talend and how does it differ from traditional ETL tools? (basic)
  • Can you explain the difference between tMap and tJoin components in Talend? (medium)
  • How do you handle errors in Talend jobs? (medium)
  • What is the purpose of a context variable in Talend? (basic)
  • Explain the difference between incremental and full loading in Talend. (medium)
  • How do you optimize Talend jobs for better performance? (advanced)
  • What are the different deployment options available in Talend? (medium)
  • How do you schedule Talend jobs to run at specific times? (basic)
  • Can you explain the use of the tFilterRow component in Talend? (basic)
  • What is metadata in Talend and how is it used? (medium)
  • How do you handle complex transformations in Talend? (advanced)
  • Explain the concept of schema in Talend. (basic)
  • How do you handle duplicate records in Talend? (medium)
  • What is the purpose of the tLogRow component in Talend? (basic)
  • How do you integrate Talend with other systems or applications? (medium)
  • Explain the use of the tNormalize component in Talend. (medium)
  • How do you handle null values in Talend transformations? (basic)
  • What is the role of the tRunJob component in Talend? (medium)
  • How do you monitor and troubleshoot Talend jobs? (medium)
  • Can you explain the difference between tMap and tMapLookup components in Talend? (medium)
  • How do you handle changing business requirements in Talend jobs? (advanced)
  • What are the best practices for version controlling Talend jobs? (advanced)
  • How do you handle large volumes of data in Talend? (medium)
  • Explain the purpose of the tAggregateRow component in Talend. (basic)
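
Several of the basic questions above (context variables, null handling, custom transformations) come down to small pieces of Java, because Talend jobs compile to Java and user-defined routines are plain static methods that components like tMap can call. A minimal, illustrative routine follows; the class and method names are our own, not a standard Talend API.

```java
// Illustrative Talend-style routine: a plain Java class with static methods
// that a tMap expression could call, e.g. DataUtils.safeTrim(row1.customer_name).
public class DataUtils {

    // Null-safe trim: tMap expressions throw a NullPointerException on null
    // inputs unless guarded, so centralising the check in a routine is one
    // common answer to the "how do you handle null values?" question.
    public static String safeTrim(String value) {
        return value == null ? "" : value.trim();
    }
}
```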

Closing Remark

As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!
