Jobs
Interviews

1489 Talend Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 years

5 - 18 Lacs

India

Remote

Role: Senior Appian Developer (Hybrid)
Position Type: Full-Time Contract (40 hrs/week)
Contract Duration: Long Term
Work Schedule: 8 hours/day (Mon-Fri)
Location: Hyderabad, India - Hybrid (3 days/week on site)

What You'll Do:
- Troubleshoot and resolve technical issues in Appian applications, ensuring minimal downtime and optimal performance.
- Diagnose and fix problems in Talend workflows, focusing on data extraction, transformation, and loading processes.
- Manage and troubleshoot SQL Server databases, ensuring data integrity, performance, and security.
- Maintain and troubleshoot automations built on Power Automate.
- Handle Autosys job scheduling and automation, ensuring smooth execution of batch jobs and workflows.
- Collaborate with cross-functional teams to gather requirements, design solutions, and implement troubleshooting strategies.
- Document and track issues, resolutions, and best practices to improve the overall troubleshooting process.
- Provide technical support during production releases and maintenance windows, working closely with the Operations team.
- Stay up to date with the latest industry trends and best practices in troubleshooting and technical support.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.

Talents Needed for Success:
- Minimum of 7 years of experience in technical troubleshooting and support.
- Proven experience troubleshooting Appian applications, with a strong understanding of Appian architecture and integration patterns.
- Expertise in Talend, including designing and troubleshooting ETL processes.
- Proficiency in SQL Server, including database design, optimization, and performance tuning.
- Experience with Autosys job scheduling and automation, including setting up and managing jobs.
- Strong analytical and problem-solving skills, with the ability to work independently and as part of a team.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels.

Additional Skills:
- Experience with version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence).
- Knowledge of scripting languages such as Python and Shell/Batch programming is a plus.
- Understanding of Agile processes and methodologies, with experience working in an Agile framework using Scrum.

Job Types: Full-time, Contractual / Temporary
Contract length: 12 months
Pay: ₹543,352.07 - ₹1,855,655.80 per year

Application Questions:
- How many years of experience do you have troubleshooting Appian applications?
- How much experience do you have in Talend, including designing and troubleshooting ETL processes?
- How much experience do you have with Autosys job scheduling and automation?
- Are you comfortable working 3 days onsite and 2 days remote each week?
- How soon can you join us?

License/Certification: Appian L2 certification (Required)
Location: Hyderabad Jubilee Ho, Hyderabad, Telangana (Required)

Posted 18 hours ago

Apply

0 years

0 Lacs

Andhra Pradesh

On-site

- Talend: designing, developing, and building technical architecture, data pipelines, and performance scaling, using integration tooling to integrate Talend data and ensure data quality in a big data environment.
- Very strong PL/SQL: queries, procedures, JOINs.
- Snowflake SQL: writing SQL queries against Snowflake.
- Developing Unix, Python, etc. scripts to Extract, Load, and Transform data.
- Talend knowledge and hands-on experience is good to have; candidates with PROD support experience are preferred.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
- Perform data analysis, troubleshoot data issues, and provide technical support to end users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Complex problem-solving capability and a continuous-improvement approach.
- Talend / Snowflake certification is desirable.
- Excellent SQL coding skills; excellent communication and documentation skills.
- Familiar with the Agile delivery process.
- Must be analytical, creative, and self-motivated; able to work effectively within a global team environment.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth and seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
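The "scripts to Extract, Load, and Transform data" requirement above can be illustrated with a minimal Python sketch. This is a hypothetical example, not from the posting: SQLite stands in for a real Snowflake warehouse, and the feed, table, and column names are made up.

```python
import csv
import io
import sqlite3

# Extract: read a CSV feed (an in-memory string stands in for a real file/stage).
feed = io.StringIO("id,amount\n1,100\n2,250\n")
rows = list(csv.DictReader(feed))

# Load: land the raw records in the warehouse (SQLite here, for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (:id, :amount)",
    [{"id": int(r["id"]), "amount": float(r["amount"])} for r in rows],
)

# Transform: do the aggregation inside the warehouse with SQL (the ELT pattern).
total = conn.execute("SELECT SUM(amount) FROM raw_orders").fetchone()[0]
```

In a real Snowflake setup the load step would typically be a `COPY INTO` from a stage or a Snowpipe ingestion rather than `executemany`.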
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 18 hours ago

Apply

2.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Key Responsibilities:
- Design, build, and maintain scalable data pipelines on Snowflake.
- Experience or knowledge of Snowpipe, Time Travel, and Fail-safe.
- Write and optimize SQL queries for data extraction and transformation.
- Develop ETL processes to integrate various data sources into Snowflake.
- Monitor and troubleshoot data warehouse performance issues.
- Implement security measures and data governance practices.
- Sound knowledge of Snowflake architecture.
- Knowledge of Fivetran is an added advantage.
- Collaborate with cross-functional teams to support analytical and reporting needs.

Experience: 2 to 8 Years

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience with Snowflake and data warehousing concepts.
- Proficiency in SQL and ETL tools (e.g., Talend, Informatica).

Company Details: One of the top-ranked IT companies in Ahmedabad, Gujarat. We are an ISO 9001:2015 & ISO 27001:2013 certified leading global technology solution provider. Globally present, with a core focus on the USA, Middle East, and Canada for services. Constantly enhancing our span of services around custom software development, Enterprise Mobility Solutions, and the Internet of Things. A family of multicultural, multi-talented, passionate, and well-experienced resources who consistently work to set new standards for customer satisfaction by implementing industry best practices.

Why Stridely?
· You will have opportunities to work on international, enterprise-level projects of big-ticket size
· Interaction and coordination with US customers
· Employee-first approach along with a customer-first approach
· Continuous learning, training, and knowledge-enhancement opportunities
· Self-development, career, and growth counseling within the organization
· Democratic and metro culture
· Strong and solid leadership
· Based on your potential, you will get overseas visits, transfers, and exposure
URL: www.stridelysolutions.com
Employee strength: 500+
Working Days: 5 days a week
Location: Ahmedabad / Pune / Vadodara

Posted 19 hours ago

Apply

2.0 - 6.0 years

0 - 0 Lacs

chennai, pune

On-site

Key Responsibilities:
- Perform end-to-end ETL testing, including data extraction, transformation, and loading validation
- Design and implement automation scripts for ETL testing using tools like Python, Selenium, or equivalent frameworks
- Validate data flow, transformation logic, and data integrity across systems
- Develop test cases, test plans, and test scripts based on business requirements
- Collaborate with data engineers, developers, and business analysts to ensure high-quality data delivery
- Analyze and troubleshoot data discrepancies and report defects with detailed documentation

Skills Required:
- 2-3 years of experience in ETL testing with automation capabilities
- Strong knowledge of SQL and database validation
- Experience with ETL tools (e.g., Informatica, Talend, SSIS)
- Hands-on experience in test automation using Python, Selenium, or similar frameworks
- Understanding of data warehousing concepts and data modeling
- Excellent analytical and debugging skills

To Apply, Walk-in / Contact: White Horse Manpower, #12, Office 156, 3rd Floor, Jumma Masjid Golden Complex, Jumma Masjid Road, Bangalore 560051. Contact: 7996827671
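The Python-based ETL test automation this role describes often starts with a simple row-count reconciliation between source and target. A minimal sketch, with in-memory SQLite databases standing in for the real systems; the `orders` table and its columns are hypothetical:

```python
import sqlite3

def reconcile_counts(src_conn, tgt_conn, table):
    """Compare row counts for one table between a source and a target connection."""
    src = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return {"table": table, "source": src, "target": tgt, "match": src == tgt}

# Demo: two in-memory databases stand in for the real source and target systems.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 12.0)])
tgt.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 12.0)])

result = reconcile_counts(src, tgt, "orders")
```

In a real suite this check would be one assertion among many (checksums, column-level comparisons, transformation-rule checks), wired into a framework such as pytest.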

Posted 1 day ago

Apply

2.0 - 6.0 years

0 - 0 Lacs

bangalore, pune

On-site

Key Responsibilities:
- Design, develop, and execute ETL test plans, test cases, and test scripts
- Perform data validation and transformation logic checks
- Validate data movement across source, staging, and target systems
- Use Python scripts for automation and data testing
- Collaborate with development and business teams to ensure quality delivery
- Log, track, and verify resolution of issues and bugs

Skills Required:
- 2-3 years of experience in ETL testing and data warehouse concepts
- Strong knowledge of SQL for data validation and querying
- Hands-on experience with Python scripting for test automation
- Good understanding of ETL tools (e.g., Informatica, Talend)
- Ability to handle large volumes of data and complex data mapping scenarios
- Good communication and problem-solving skills

To Apply, Walk-in / Contact: White Horse Manpower, #12, Office 156, 3rd Floor, Jumma Masjid Golden Complex, Jumma Masjid Road, Bangalore 560051. Contact: 9632024646

Posted 1 day ago

Apply

4.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

JOB RESPONSIBILITIES:
The job entails working with our clients and partners to design, define, implement, roll out, and improve Data Quality solutions that leverage various tools available in the market, for example Informatica IDQ, SAP DQ, SAP MDG, Collibra DQ, Talend DQ, a custom DQ solution, and/or other leading platforms, for the client's business benefit. The ideal candidate will be responsible for ensuring the accuracy, completeness, consistency, and reliability of data across systems. You will work closely with data engineers, analysts, and business stakeholders to define and implement data quality frameworks and tools. As part of your role and responsibilities, you will be involved in the entire business development life cycle:
- Meet with business individuals to gather information and analyze existing business processes, determine and document gaps and areas for improvement, and prepare requirements documents, functional design documents, etc. In summary, work with project stakeholders to identify business needs and gather requirements for Data Quality and/or Data Governance or Master Data.
- Follow up on the implementation by conducting training sessions and planning and executing the technical and functional transition to the support team.
- Grasp business and technical concepts and transform them into creative, lean, and smart data management solutions.
- Develop and implement Data Quality solutions on any of the above leading platform-based Enterprise Data Management solutions:
o Assess and improve data quality across multiple systems and domains.
o Define and implement data quality rules, metrics, and dashboards.
o Perform data profiling, cleansing, and validation using industry-standard tools.
o Collaborate with data stewards and business units to resolve data issues.
o Develop and maintain data quality documentation and standards.
o Support data governance initiatives and master data management (MDM).
o Recommend and implement data quality tools and automation strategies.
o Conduct root cause analysis of data quality issues and propose remediation plans.
o Implement and take advantage of AI to improve and automate the Data Quality solution.
o Leveraging SAP MDG/ECC experience, deep-dive into root cause analysis for assigned use cases; able to work with Azure Data Lake (via Databricks) using SQL/Python.
o Identify and build a data model (conceptual and physical) that provides an automated mechanism to monitor ongoing DQ issues. Multiple workshops may be needed to work through the options and identify the one that is most efficient and effective.
o Work with the business (Data Owners/Data Stewards) to profile data, exposing patterns that indicate data quality issues; identify the impact on specific CDEs deemed important to each individual business.
o Identify the financial impact of data quality issues, as well as the business benefit (quantitative/qualitative) of remediation, while managing implementation timelines.
o Schedule regular working groups with businesses that have identified DQ issues and ensure progress on RCA/remediation or presentation in DGFs.
o Identify business DQ rules on which KPIs/measures are stood up that feed into dashboards/workflows for BAU monitoring; red flags are raised and investigated.
o An understanding of the Data Quality value chain is needed, starting with Critical Data Element concepts, Data Quality issues, and Data Quality KPIs/measures. Experience owning and executing Data Quality issue assessments to aid improvements to operational processes and BAU initiatives.
o Highlight risks and hidden DQ issues to the Lead/Manager for further guidance/escalation.
o Communication skills are important in this outward-facing role; the focus has to be on clearly articulated messages.
o Support the design, build, and deployment of data quality dashboards via Power BI.
o Determine escalation paths and construct workflows and alerts that notify process and data owners of unresolved data quality issues.
o Collaborate with IT & analytics teams to drive innovation (AI, ML, cognitive science, etc.).
o Work with business functions and projects to create data quality improvement plans.
o Set targets for data improvements/maturity; monitor and intervene when sufficient progress is not being made.
o Support initiatives driving data clean-up of the existing data landscape.

JOB REQUIREMENTS:
i. Education or Certifications:
- Bachelor's/Master's degree in engineering/technology or other related degrees.
- Relevant professional-level certifications from Informatica, SAP, Collibra, Talend, or any other leading platform/tool.
- Relevant certifications from DAMA, EDM Council, and CMMI-DMM will be a bonus.
ii. Work Experience:
- 4-10 years of relevant experience in the Data & Analytics area, with major experience in data management areas: ideally in Data Quality (DQ) and/or Data Governance or Master Data using relevant tools.
- In-depth knowledge of Data Quality and Data Governance concepts, approaches, methodologies, and tools.
- Client-facing consulting experience will be considered a plus.
iii. Technical and Functional Skills:
- Hands-on experience with any of the above DQ tools in enterprise Data Management, preferably in complex and diverse systems environments.
- Exposure to data quality concepts: data lifecycle, data profiling, data quality remediation (cleansing, parsing, standardization, enrichment using 3rd-party plugins, etc.).
- Strong understanding of data quality best practices, concepts, data quality management frameworks, and data quality dimensions/KPIs.
- Deep knowledge of SQL and stored procedures.
- Strong knowledge of Master Data, Data Governance, and Data Security.
- Domain knowledge of SAP Finance modules is preferred.
- Hands-on experience with AI use cases in Data Quality or Data Management areas is good to have.
- Concepts and hands-on experience of master data management preferred: matching, merging, and creation of golden records for master data entities.
- Strong soft skills: interpersonal, team, and communication skills (both verbal and written).
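One of the core activities listed above, profiling data against quality KPIs for critical data elements, can be sketched in a few lines of Python. The records, field names, and 90% completeness threshold below are illustrative only, not taken from any specific tool:

```python
def profile_completeness(records, fields):
    """Completeness KPI per critical data element: share of non-null, non-empty values."""
    total = len(records)
    return {
        f: sum(1 for r in records if r.get(f) not in (None, "")) / total
        for f in fields
    }

# Hypothetical customer records with some missing values.
customers = [
    {"id": 1, "email": "a@x.com", "country": "IN"},
    {"id": 2, "email": "",        "country": "IN"},
    {"id": 3, "email": "c@x.com", "country": None},
]

kpis = profile_completeness(customers, ["email", "country"])
# Flag any CDE that falls below an (assumed) 90% completeness threshold.
breaches = [f for f, score in kpis.items() if score < 0.9]
```

A platform like Informatica IDQ or Collibra DQ expresses the same idea as configurable DQ rules feeding dashboards; the sketch only shows the underlying rule-then-threshold pattern.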

Posted 1 day ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

🚨 Urgent Hiring: Python Developer - Full Time 💼 Immediate Joiners Only (0-15 Days NP) 🚨 📍 Location: Chennai & Pune (Preferred) 💼 Experience: 6+ Yrs 💰 CTC: Up to ₹21 LPA 🕒 Join Within: Next 5 Days We're looking for Python Developers with strong experience in data warehousing applications, ideally with: ✅ 3-4 yrs Python (Pandas, Polars, etc.) ✅ 1-2 yrs Talend ETL tool (flexible) ✅ 1-2 yrs SQL/PLSQL development ⚡ Immediate joiners or candidates serving notice period (0-15 days) only! 📩 DM me or share profiles ASAP: rajesh@reveilletechnologies.com

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As a Talend Developer with expertise in SQL (SSQL), you will be an integral part of our banking client's data integration team. Your primary responsibilities will include designing, developing, and optimizing ETL workflows using Talend, as well as crafting complex SQL queries for data transformation and reporting purposes. To excel in this role, it is crucial to have experience working with financial data systems and the ability to fine-tune performance for optimal results. Additionally, strong analytical capabilities and effective communication skills are essential for successful collaboration within the team and with stakeholders. Please note that the selection process involves a single in-person interview round, which will be conducted in Hyderabad.

Posted 1 day ago

Apply

4.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

You have experience in ETL testing and are familiar with Agile methodology. With a minimum of 4-6 years of testing experience in test planning and execution, you possess working knowledge of database testing. Prior experience in the auditing domain would be advantageous. Your strong application analysis, troubleshooting, and behavioral skills, along with extensive experience in manual testing, will be valuable. While experience in automation scripting is not mandatory, it would be beneficial. You are adept at leading discussions with business, development, and vendor teams for testing activities such as defect coordination and test scenario reviews. Your excellent verbal and written communication skills enable you to communicate effectively with various stakeholders. You are capable of working independently and collaboratively with onshore and offshore teams. The role requires an experienced ETL developer with proficiency in Big Data technologies like Hadoop.

Key Skills Required:
- Hadoop (Hortonworks), HDFS
- Hive, Pig, Knox, Ambari, Ranger, Oozie
- Talend, SSIS
- MySQL, MS SQL Server, Oracle
- Windows, Linux

Openness to working 2nd shifts (1pm - 10pm) is essential for this role. Your excellent English communication skills will be crucial for effective collaboration. If you are interested, please share your profile on mytestingcareer.com. When responding, kindly include your current CTC, expected CTC, notice period, current location, and contact number.

Posted 1 day ago

Apply

5.0 - 15.0 years

0 Lacs

India

Remote

Job Title: Snowflake Developer
Job Duration: 5 months
Job Location: Remote (India)

Job Summary: We are seeking a skilled and detail-oriented Snowflake Developer to design, develop, and maintain scalable data solutions using the Snowflake platform. The ideal candidate will have experience in data warehousing, ETL/ELT processes, and cloud-based data architecture.

Key Responsibilities:
• Design and implement data pipelines using Snowflake, SQL, and ETL tools.
• Develop and optimize complex SQL queries for data extraction and transformation.
• Create and manage Snowflake objects such as databases, schemas, tables, views, and stored procedures.
• Integrate Snowflake with various data sources and third-party tools.
• Monitor and troubleshoot performance issues in Snowflake environments.
• Collaborate with data engineers, analysts, and business stakeholders to understand data requirements.
• Ensure data quality, security, and governance standards are met.
• Automate data workflows and implement best practices for data management.

Required Skills and Qualifications:
• Proficiency in Snowflake SQL and Snowflake architecture.
• Experience with ETL/ELT tools (e.g., Informatica, Talend, dbt, Matillion).
• Strong knowledge of cloud platforms (AWS, Azure, or GCP).
• Familiarity with data modeling and data warehousing concepts.
• Experience with Python, Java, or Shell scripting is a plus.
• Understanding of data security, role-based access control, and data sharing in Snowflake.
• Excellent problem-solving and communication skills.

Preferred Qualifications:
• Snowflake certification (e.g., SnowPro Core).
• Experience with CI/CD pipelines and DevOps practices.
• Knowledge of BI tools like Tableau, Power BI, or Looker.
• 5-15 years of experience is preferred.
• Experience with Agile-based development.
• Problem-solving skills.
• Proficiency in writing performant SQL queries/scripts to generate business insights and drive better organizational decision-making.
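The incremental-load pipelines described above often center on Snowflake MERGE statements. As a hedged illustration, here is a small Python helper that generates one; the table and column names are hypothetical, and a real pipeline would execute the generated statement through a Snowflake connection rather than just building the string:

```python
def build_merge(target, staging, key, cols):
    """Generate a Snowflake-style MERGE for an incremental load from a staging table."""
    on = f"t.{key} = s.{key}"
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    insert_cols = ", ".join([key] + cols)
    insert_vals = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

# Hypothetical target and staging tables for a customer dimension.
sql = build_merge("analytics.dim_customer", "staging.customers",
                  "customer_id", ["name", "tier"])
```

In practice, tools like dbt generate equivalent upsert SQL from declarative model configs, so the helper only shows what such tooling produces under the hood.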

Posted 1 day ago

Apply

5.0 - 8.0 years

0 Lacs

Delhi, India

On-site

Job Summary We are looking for a skilled Data Modeler / Architect with 5-8 years of experience in designing, implementing, and optimizing robust data architectures in the financial payments industry. The ideal candidate will have deep expertise in SQL, data modeling, ETL/ELT pipeline development, and cloud-based data platforms such as Databricks or Snowflake. You will play a key role in designing scalable data models, orchestrating reliable data workflows, and ensuring the integrity and performance of mission-critical financial datasets. This is a highly collaborative role interfacing with engineering, analytics, product, and compliance teams. Key Responsibilities Design, implement, and maintain logical and physical data models to support transactional, analytical, and reporting systems. Develop and manage scalable ETL/ELT pipelines for processing large volumes of financial transaction data. Tune and optimize SQL queries, stored procedures, and data transformations for maximum performance. Build and manage data orchestration workflows using tools like Airflow, Dagster, or Luigi. Architect data lakes and warehouses using platforms like Databricks, Snowflake, BigQuery, or Redshift. Enforce and uphold data governance, security, and compliance standards (e.g., PCI-DSS, GDPR). Collaborate closely with data engineers, analysts, and business stakeholders to understand data needs and deliver solutions. Conduct data profiling, validation, and quality assurance to ensure clean and consistent data. Maintain clear and comprehensive documentation for data models, pipelines, and architecture. Required Skills & Qualifications 5-8 years of experience as a Data Modeler, Data Architect, or Senior Data Engineer in the financial/payments domain. Advanced SQL expertise, including query tuning, indexing, and performance optimization. Proficiency in developing ETL/ELT workflows using tools such as Spark, dbt, Talend, or Informatica.
Experience with data orchestration frameworks: Airflow, Dagster, Luigi, etc. Strong hands-on experience with cloud-based data platforms like Databricks, Snowflake, or equivalents. Deep understanding of data warehousing principles: star/snowflake schema, slowly changing dimensions, etc. Familiarity with financial data structures, such as payment transactions, reconciliation, fraud patterns, and audit trails. Working knowledge of cloud services (AWS, GCP, or Azure) and data security best practices. Strong analytical thinking and problem-solving capabilities in high-scale environments. Preferred Qualifications Experience with real-time data pipelines (e.g., Kafka, Spark Streaming). Exposure to data mesh or data fabric architecture paradigms. Certifications in Snowflake, Databricks, or relevant cloud platforms. Knowledge of Python or Scala for data engineering tasks
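The "slowly changing dimensions" concept called out above can be illustrated with a small Python sketch of SCD Type 2 versioning: when a tracked attribute changes, the current row is closed out and a new dated version is inserted. The dimension layout and field names below are hypothetical:

```python
from datetime import date

def scd2_apply(dimension, updates, key, tracked, today):
    """Apply SCD Type 2: close changed current rows and insert new versions."""
    current = {r[key]: r for r in dimension if r["end_date"] is None}
    out = list(dimension)
    for u in updates:
        row = current.get(u[key])
        if row and any(row[c] != u[c] for c in tracked):
            row["end_date"] = today  # close the superseded version
        if row is None or row["end_date"] == today:
            # brand-new entity, or a changed one: open a new current version
            out.append({**u, "start_date": today, "end_date": None})
    return out

# Hypothetical customer dimension: one current row, then a tier change arrives.
dim = [{"customer_id": 1, "tier": "gold",
        "start_date": date(2024, 1, 1), "end_date": None}]
dim = scd2_apply(dim, [{"customer_id": 1, "tier": "platinum"}],
                 "customer_id", ["tier"], date(2025, 1, 1))
```

Warehouses like Snowflake or Databricks implement the same pattern with MERGE statements; the sketch just makes the open/close bookkeeping visible.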

Posted 1 day ago

Apply

10.0 years

0 Lacs

Greater Kolkata Area

Remote

Java Back End Engineer with AWS Location : Remote Experience : 10+ Years Employment Type : Full-Time Job Overview We are looking for a highly skilled Java Back End Engineer with strong AWS cloud experience to design and implement scalable backend systems and APIs. You will work closely with cross-functional teams to develop robust microservices, optimize database performance, and contribute across the tech stack, including infrastructure automation. Core Responsibilities Design, develop, and deploy scalable microservices using Java, J2EE, Spring, and Spring Boot. Build and maintain secure, high-performance APIs and backend services on AWS or GCP. Use JUnit and Mockito to ensure test-driven development and maintain code quality. Develop and manage ETL workflows using tools like Pentaho, Talend, or Apache NiFi. Create High-Level Design (HLD) and architecture documentation for system components. Collaborate with cross-functional teams (DevOps, Frontend, QA) as a full-stack contributor when needed. Tune SQL queries and manage performance on MySQL and Amazon Redshift. Troubleshoot and optimize microservices for performance and scalability. Use Git for source control and participate in code reviews and architectural discussions. Automate infrastructure provisioning and CI/CD processes using Terraform, Bash, and pipelines. 
Primary Skills Languages & Frameworks : Java (v8/17/21), Spring Boot, J2EE, Servlets, JSP, JDBC, Struts Architecture : Microservices, REST APIs Cloud Platforms : AWS (EC2, S3, Lambda, RDS, CloudFormation, SQS, SNS) or GCP Databases : MySQL, Redshift Secondary Skills (Good To Have) Infrastructure as Code (IaC) : Terraform Additional Languages : Python, Node.js Frontend Frameworks : React, Angular, JavaScript ETL Tools : Pentaho, Talend, Apache NiFi (or equivalent) CI/CD & Containers : Jenkins, GitHub Actions, Docker, Kubernetes Monitoring/Logging : AWS CloudWatch, DataDog Scripting : Bash, Shell scripting Nice To Have Familiarity with agile software development practices Experience in a cross-functional engineering environment Exposure to DevOps culture and tools

Posted 1 day ago

Apply

9.0 - 13.0 years

0 Lacs

chennai, tamil nadu

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. As a T24 BA_Data Migration - Senior Manager, you will lead the end-to-end analysis and execution of data migration initiatives across complex enterprise systems. This role demands deep expertise in data migration strategies, strong analytical capabilities, and a proven ability to work with cross-functional teams, including IT, business stakeholders, and data architects. You will be responsible for defining migration requirements, leading data mapping and reconciliation efforts, ensuring data integrity, and supporting transformation programs from legacy systems to modern platforms. As a senior leader, you will also play a critical role in stakeholder engagement, risk mitigation, and aligning data migration efforts with broader business objectives. The ideal candidate should be well versed in Technical aspects of the product and experienced in Data Migration activities. They should have a good understanding of the T24 architecture, administration, configuration, and data structure. Additionally, the candidate should have design and development experience in Infobasic, Core Java, EJB, and J2EE Enterprise, as well as working experience and/or knowledge of INFORMATICA. In-depth experience in End-to-End Migration tasks, right from Migration strategy, ETL process, and data reconciliation is required. Experience in relational or hierarchical databases including Oracle, DB2, Postgres, MySQL, and MSSQL is a must. Other mandatory requirements include the willingness to work out of the client location in Chennai for 5 days a week. 
The candidate should possess an MBA/MCA/BE/B.Tech or equivalent, with sound industry experience of 9 to 12 years. Your client responsibilities will involve working as a team lead in one or more T24 projects, interfacing and communicating with onsite coordinators, completing assigned tasks on time, regularly reporting status to the lead and manager, and interfacing with customer representatives as needed. You should be ready to travel to customer locations on a need basis. Your people responsibilities will include building a quality culture, managing performance management for direct reportees, fostering teamwork, leading by example, training and mentoring project resources, and participating in organization-wide people initiatives. Preferred skills include database administration, performance tuning, and prior client-facing experience. EY exists to build a better working world, helping to create long-term value for clients, people, and society, and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You will be a seasoned Senior ETL/DB Tester with expertise in data validation and database testing across modern data platforms. Your role will involve designing, developing, and executing comprehensive test plans for ETL and database validation processes. You will validate data transformations and integrity across multiple stages and systems such as Talend, ADF, Snowflake, and Power BI. Your responsibilities will include performing manual testing and defect tracking using tools like Zephyr or Tosca. You will analyze business and data requirements to ensure full test coverage, and write and execute complex SQL queries for data reconciliation. Identifying data-related issues and conducting root cause analysis in collaboration with developers will be crucial aspects of your role. You will track and manage bugs and enhancements through appropriate tools, and optimize testing strategies for performance, scalability, and accuracy in ETL processes. Your skills should include proficiency in ETL tools such as Talend and ADF, work on data platforms like Snowflake, and experience in reporting/analytics tools like Power BI and VPI. Additionally, you should have expertise in testing tools like Zephyr or Tosca, manual testing, and strong SQL skills for validating complex data. Moreover, exposure to API testing and familiarity with advanced features of Power BI, such as dashboards, DAX, and data modeling, will be beneficial for this role.
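The "complex SQL queries for data reconciliation" mentioned above often boil down to set-difference checks between source and target tables. A minimal sketch with SQLite standing in for the real platforms; the table names and data are illustrative:

```python
import sqlite3

# One connection holds both tables here purely for demonstration; in practice
# the source and target usually live in different systems and are compared
# via extracts or a federation layer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")

# EXCEPT yields rows present in the source but missing from the target:
missing = conn.execute(
    "SELECT id, amount FROM src EXCEPT SELECT id, amount FROM tgt"
).fetchall()
```

Snowflake supports the same `EXCEPT`/`MINUS` set operators, so the identical query shape works there against staged and loaded tables.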

Posted 1 day ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

The incumbent will be part of the Global-BI/Data Engineering team, responsible for developing and maintaining solutions on Talend and working closely with a team of developers, QAs, and support analysts. This individual, functionally reporting to the Team Leader, Global-BI, is an active partner and a business process visionary who shapes technology demand among the customer-facing employee community across global businesses. The role requires a broad range of technical and functional knowledge, with a good blend of problem-solving and communication skills, and the ability to align regional requirements with global templates and deliver solutions across multiple projects. This role will also provide consultative leadership to both technical and functional resources across the organization, from strategic decision making to project planning to overall governance and oversight. Engage key business and technology stakeholders across the enterprise; be highly collaborative, drive communication to business, delivery, and support teams, and apply analytical and problem-solving skills to drive solutions with industry best practices. Experience: 4+ years of relevant technical experience, with 3+ years in Java. Experience with Talend Data Integration: designing and developing Talend ETL scripts, and creating and deploying end-to-end Talend Data Integration solutions. Experience in Talend Studio and Talend Cloud. Good knowledge of RDBMS and SQL scripting. Good knowledge of Snowflake, Google Cloud Platform (GCP) services, Git, and Python. Automation, orchestration, and performance tuning of ETL processes, along with implementing best practices. Experience with development and production support. Experience in designing, developing, validating, and deploying Talend ETL pipelines. Non-Technical Requirements: Proven success in contributing to a team-oriented environment. Proven ability to work creatively and analytically in a problem-solving environment. Excellent communication (written and oral) and interpersonal skills. Note: the work schedule is Thursday to Monday, 11:00 AM to 8:00 PM IST, with Tuesday and Wednesday as off days.
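The automation and orchestration duties above often reduce to wrapping individual ETL steps with logging and bounded retries before handing them to a scheduler. A minimal Python sketch (the step names and pipeline are invented, not a Talend API):

```python
import time
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def run_with_retry(step, name, max_attempts=3, backoff_seconds=0):
    """Run one ETL step, retrying on failure with a simple linear backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            result = step()
            log.info("step %s succeeded on attempt %d", name, attempt)
            return result
        except Exception as exc:
            log.warning("step %s failed on attempt %d: %s", name, attempt, exc)
            if attempt == max_attempts:
                raise
            time.sleep(backoff_seconds * attempt)

# Hypothetical pipeline: extract -> transform
def extract():
    return [1, 2, 3]

def transform(rows):
    return [r * 10 for r in rows]

rows = run_with_retry(extract, "extract")
loaded = run_with_retry(lambda: transform(rows), "transform")
print(loaded)  # [10, 20, 30]
```

A real Talend deployment would get retries and alerting from the orchestrator (Talend Management Console, Autosys, etc.); the sketch only shows the underlying pattern.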

Posted 1 day ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Responsibilities: Develop and maintain data pipelines using Snowflake and Talend. Collaborate with data architects, data engineers, and business analysts to understand data requirements and translate them into technical solutions. Design and implement data transformation logic using SQL, Python, or other programming languages. Optimize data pipelines for performance and scalability. Implement data security and governance best practices. Troubleshoot and resolve issues related to data pipelines and processing. Document data pipelines, processes, and best practices. Qualifications: Strong experience with Snowflake’s cloud data platform, including data modeling, performance tuning, and security. Experience with SQL and other programming languages for data manipulation and analysis. Familiarity with cloud computing concepts and services, particularly Azure. Strong problem-solving and analytical skills. Excellent communication and collaboration skills. Knowledge of Talend will be an added advantage. Experience: Up to 5 years of experience in data engineering, ETL development, or a related field, with at least 2-3 years of experience in Snowflake. Experience working with data warehousing and data integration projects. Experience in Talend, or a Snowflake or Azure certification, will be an added advantage.
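The "data transformation logic using SQL, Python, or other programming languages" mentioned above typically means standardizing raw records before load: trimming text, parsing dates, and casting amounts. A small illustrative Python sketch (field names and formats are hypothetical):

```python
from datetime import datetime

def transform_record(raw):
    """Standardize one raw record before loading into the warehouse."""
    return {
        "customer": raw["customer"].strip().upper(),                  # trim + normalize case
        "order_date": datetime.strptime(raw["order_date"], "%d/%m/%Y")
                              .date().isoformat(),                    # DD/MM/YYYY -> ISO 8601
        "amount": round(float(raw["amount"]), 2),                     # string -> 2-dp float
    }

raw_rows = [
    {"customer": "  acme ", "order_date": "05/08/2025", "amount": "1234.567"},
    {"customer": "globex", "order_date": "01/01/2025", "amount": "10"},
]
clean = [transform_record(r) for r in raw_rows]
print(clean[0])
```

In Snowflake the same logic would usually live in SQL (TRIM, UPPER, TO_DATE, ROUND) inside the pipeline; the Python version is just the portable equivalent.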

Posted 1 day ago

Apply

3.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – SSIS – Senior. We’re looking for Informatica or SSIS Engineers with a cloud background (AWS, Azure). Primary skills: Has played key roles in multiple large global transformation programs on business process management. Experience in database querying using SQL. Should have experience building/integrating data into a data warehouse. Experience in data profiling and reconciliation. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Developed expertise in complex data management or application integration solutions and their deployment in areas of data migration, data integration, application integration, or data quality. Experience in data processing, orchestration, parallelization, transformations, and ETL fundamentals. Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality, and efficiency (optional). Experience with cloud data tools (Microsoft Azure, Amazon S3, or data lakes). Knowledge of cloud infrastructure and of Talend Cloud is an added advantage. Knowledge of data modelling principles. Knowledge of Autosys scheduling. Good experience in database technologies. Good knowledge of Unix systems. Responsibilities: Work as a team member contributing to various technical streams of data integration projects.
Provide product- and design-level technical best practices. Interface and communicate with the onsite coordinators. Complete assigned tasks on time and report status regularly to the lead. Build a quality culture. Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, with experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Qualification: BE/BTech/MCA (must) with industry experience of 3 to 7 years. Experience in Talend jobs, joblets, and custom components. Should have knowledge of error handling and performance tuning in Talend. Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark, etc. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. At least 3-4 clients on short-duration projects of 6-8+ months, OR at least 2 clients on projects lasting 1-2 years or longer. People with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 day ago

Apply

10.0 years

0 Lacs

Telangana

On-site

About Chubb: Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com. About Chubb India: At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb, where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning. Reporting to the VP, COG ECM Enterprise Forms Portfolio Delivery Manager, this role will be responsible for managing and supporting the implementation of a new document solution for identified applications within the CCM landscape in APAC. OpenText xPression and Duckcreek have been the corporate document generation tools of choice within Chubb.
However, xPression is going end of life and will be unsupported from 2025. A new Customer Communications Management (CCM) platform – Quadient Inspire – has been selected by a global working group to replace xPression, and this new tool will be implemented (including migration of existing forms/templates from xPression where applicable). Apart from migrating from xPression, there are multiple existing applications to be replaced with Quadient Inspire. The role is based in Hyderabad, India, with some travel to other Chubb offices. Although there are no direct line management responsibilities within this role, the successful applicant will be responsible for task management of Business Analysts and an onshore/offshore development team. The role will require the ability to manage multiple project/enhancement streams with a variety of levels of technical/functional scope and across a number of different technologies. In this role, you will: Lead the design and development of comprehensive data engineering frameworks and patterns. Establish engineering design standards and guidelines for the creation, usage, and maintenance of data across COG (Chubb Overseas General). Drive innovation and build highly scalable real-time data pipelines and data platforms to support the business needs. Act as mentor and lead for a data engineering organization that is business-focused, proactive, and resilient. Promote data governance and master/reference data management as a strategic discipline. Implement strategies to monitor the effectiveness of data management. Be an engineering leader, coach data engineers, and be an active member of the data leadership team. Evaluate emerging data technologies and determine their business benefits and impact on the future-state data platform.
Develop and promote a strong data management framework, emphasizing data quality, governance, and compliance with regulatory requirements. Collaborate with Data Modelers to create data models (conceptual, logical, and physical). Architect metadata management processes to ensure data lineage, data definitions, and ownership are well documented and understood. Collaborate closely with business leaders, IT teams, and external partners to understand data requirements and ensure alignment with strategic goals. Act as a primary point of contact for data engineering discussions and inquiries from various stakeholders. Lead the implementation of data architectures on cloud platforms (AWS, Azure, Google Cloud) to improve efficiency and scalability. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field; Master’s degree preferred. Minimum of 10 years’ experience in data architecture or data engineering roles, with a significant focus on P&C insurance domains preferred. Proven track record of successful implementation of data architecture within large-scale transformation programs or projects. Comprehensive knowledge of data modelling techniques and methodologies, including data normalization and denormalization practices. Hands-on expertise across a wide variety of database (Azure SQL, MongoDB, Cosmos), data transformation (Informatica IICS, Databricks), and change data capture and data streaming (Apache Kafka, Apache Flink) technologies. Proven expertise with data warehousing concepts, ETL processes, and data integration tools (e.g., Informatica, Databricks, Talend, Apache NiFi). Experience with cloud-based data architectures and platforms (e.g., AWS Redshift, Google BigQuery, Snowflake, Azure SQL Database). Expertise in ensuring data security patterns (e.g.
tokenization, encryption, obfuscation) Knowledge of insurance policy operations, regulations, and compliance frameworks specific to Consumer lines Familiarity with Agile methodologies and experience working in Agile project environments Understanding of advanced analytics, AI, and machine learning concepts as they pertain to data architecture Why Chubb? Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results. Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence A Great Place to work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026 Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results Start-Up Culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment Employee Benefits Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. 
Our benefits include: Savings and Investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits and Car Lease that help employees optimally plan their finances Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling like Education Reimbursement Programs, Certification programs and access to global learning programs. Health and Welfare Benefits: We care about our employees’ well-being in and out of work and have benefits like Employee Assistance Program (EAP), Yearly Free Health campaigns and comprehensive Insurance benefits. Application Process Our recruitment process is designed to be transparent, and inclusive. Step 1: Submit your application via the Chubb Careers Portal. Step 2: Engage with our recruitment team for an initial discussion. Step 3: Participate in HackerRank assessments/technical/functional interviews and assessments (if applicable). Step 4: Final interaction with Chubb leadership. Join Us With you Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey. Apply Now: Chubb External Careers
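The data security patterns this listing names (tokenization, encryption, obfuscation) can be sketched briefly in Python: a deterministic keyed token plus partial masking. The field name and key are invented for illustration; a real deployment would use a vaulted tokenization service and a secrets manager, not a hard-coded key:

```python
import hashlib
import hmac

SECRET_KEY = b"demo-only-secret"  # assumption: in practice, fetched from a secrets manager

def tokenize(value):
    """Deterministic keyed token for a sensitive value (HMAC-SHA256, not reversible)."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def mask(value, visible=4):
    """Obfuscate all but the last `visible` characters for display."""
    return "*" * max(len(value) - visible, 0) + value[-visible:]

policy_number = "POL1234567890"   # hypothetical identifier
print(mask(policy_number))        # *********7890
token = tokenize(policy_number)
print(len(token))                 # 16
```

Determinism matters here: the same input always yields the same token, so tokenized columns can still be joined across systems without exposing the raw value.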

Posted 1 day ago

Apply


10.0 years

5 - 6 Lacs

Hyderābād

On-site

Job Information: Date Opened: 07/29/2025. Job Type: Full time. Industry: IT Services. City: Hyderabad. State/Province: Telangana. Country: India. Zip/Postal Code: 500081. About Us: About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin, with offices in Dublin, OH, and Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership. Job Description: Job Title: Lead Cloud Data Engineer / Technical Architect. Experience: 10+ Years. Location: Hyderabad. Job Summary: We are seeking a highly skilled and experienced Cloud Data Engineer with a strong foundation in AWS, data warehousing, and application migration. The ideal candidate will be responsible for designing and maintaining cloud-based data solutions, leading teams, collaborating with clients, and ensuring smooth migration of on-premises applications to the cloud. Key Responsibilities: Engage directly with clients to understand requirements, provide solution design, and drive successful project delivery. Lead cloud migration initiatives, specifically moving on-premise applications and databases to AWS cloud platforms. Design, develop, and maintain scalable, reliable, and secure data applications in a cloud environment. Lead and mentor a team of engineers; oversee task distribution, progress tracking, and issue resolution. Develop, optimize, and troubleshoot complex SQL queries and stored procedures. Design and implement robust ETL pipelines using tools such as Talend, Informatica, or DataStage. Ensure optimal usage and performance of Amazon Redshift and implement performance tuning strategies. Collaborate across teams to implement best practices in cloud architecture and data management.
Requirements: Required Skills and Qualifications: Strong hands-on experience with the AWS ecosystem, including services related to storage, compute, and data analytics. In-depth knowledge of data warehouse architecture and best practices. Proven experience in on-prem to cloud migration projects. Expertise in at least one ETL tool: Talend, Informatica, or DataStage. Strong command of SQL and stored procedures. Practical knowledge and usage of Amazon Redshift. Demonstrated experience in leading teams and managing project deliverables. Strong understanding of performance tuning for data pipelines and databases. Good to Have: Working knowledge of or hands-on experience with Snowflake. Educational Qualification: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field. Benefits: As per company standards.

Posted 1 day ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Hyderābād

On-site

Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com. This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please. Job Title: Senior Software Engineer - Data Analyst. Position: Senior Software Engineer - Data Analyst. Experience: 3 to 6 Years. Main location: India, Telangana, Hyderabad. Position ID: J0525-1616. Shift: General Shift (5 Days WFO for initial 8 weeks). Employment Type: Full Time. Your future duties and responsibilities: Design, develop, and optimize complex SQL queries for data extraction, transformation, and loading (ETL). Work with Teradata databases to perform high-volume data analysis and support enterprise-level reporting needs.
Understand business and technical requirements to create and manage Source to Target Mapping (STM) documentation. Collaborate with business analysts and domain SMEs to map banking-specific data such as transactions, accounts, customers, products, and regulatory data. Analyze large data sets to identify trends, data quality issues, and actionable insights. Participate in data migration, data lineage, and reconciliation processes. Ensure data governance, quality, and security protocols are followed. Support testing and validation efforts during system upgrades or new feature implementations. Required qualifications to be successful in this role Advanced SQL – Joins, subqueries, window functions, performance tuning. Teradata – Query optimization, utilities (e.g., BTEQ, FastLoad, MultiLoad), DDL/DML. Experience with ETL tools (e.g., Informatica, Talend, or custom SQL-based ETL pipelines). Hands-on in preparing STM (Source to Target Mapping) documents. Familiarity with data modeling and data warehouse concepts (star/snowflake schema). Proficient in Excel and/or BI tools (Power BI, Tableau, etc.) for data visualization and analysis. Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. 
You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics D&A – SSIS – Senior We’re looking for Informatica or SSIS engineers with a cloud background (AWS, Azure). Primary skills: Has played key roles in multiple large global transformation programs on business process management. Experience in database querying using SQL. Should have experience building/integrating data into a data warehouse. Experience in data profiling and reconciliation. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Developed expertise in complex data management or application integration solutions and deployment in areas of data migration, data integration, application integration, or data quality. Experience in data processing, orchestration, parallelization, transformations, and ETL fundamentals. Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality, and efficiency (optional). Experience with cloud data tools (Microsoft Azure, Amazon S3, or data lakes). Knowledge of cloud infrastructure and Talend Cloud is an added advantage. Knowledge of data modelling principles. Knowledge of Autosys scheduling. Good experience in database technologies. Good knowledge of Unix systems. Responsibilities: Work as a team member contributing to various technical streams of data integration projects. 
Provide product- and design-level technical best practices. Interface and communicate with the onsite coordinators. Complete assigned tasks on time and report status regularly to the lead. Build a quality culture. Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Qualification: BE/BTech/MCA (must) with industry experience of 3-7 years. Experience with Talend jobs, joblets, and custom components. Should have knowledge of error handling and performance tuning in Talend. Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark, etc. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Experience with at least 3-4 clients on short-duration projects (6-8+ months each), or with at least 2 clients on projects lasting 1-2 years or more. People with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 day ago

Apply

6.0 years

15 - 18 Lacs

Indore

On-site

Location: Indore Experience: 6+ Years Work Type: Hybrid Notice Period: 0-30 days joiners We are hiring for a digital transformation consulting firm that specializes in the advisory and implementation of AI, automation, and analytics strategies for healthcare providers. The company is headquartered in NJ, USA, and its India office is in Indore, MP. Job Description: We are seeking a highly skilled Tech Lead with expertise in database management, data warehousing, and ETL pipelines to drive data initiatives in the company. The ideal candidate will lead a team of developers, architects, and data engineers to design, develop, and optimize data solutions. This role requires hands-on experience in database technologies, data modeling, ETL processes, and cloud-based data platforms. Key Responsibilities: Lead the design, development, and maintenance of scalable database, data warehouse, and ETL solutions. Define best practices for data architecture, modeling, and governance. Oversee data integration, transformation, and migration strategies. Ensure high availability, performance tuning, and optimization of databases and ETL pipelines. Implement data security, compliance, and backup strategies. Required Skills & Qualifications: 6+ years of experience in database and data engineering roles. Strong expertise in SQL, NoSQL, and relational database management systems (RDBMS). Hands-on experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery). Deep understanding of ETL tools and frameworks (e.g., Apache Airflow, Talend, Informatica). Experience with cloud data platforms (AWS, Azure, GCP). Proficiency in programming/scripting languages (Python, SQL, shell scripting). Strong problem-solving, leadership, and communication skills. Preferred Skills (Good to Have): Experience with big data technologies (Hadoop, Spark, Kafka). Knowledge of real-time data processing. 
Exposure to AI/ML technologies and working with ML algorithms Job Types: Full-time, Permanent Pay: ₹1,500,000.00 - ₹1,800,000.00 per year Schedule: Day shift Application Question(s): We must fill this position urgently. Can you start immediately? Have you held a lead role in the past? Experience: Extract, Transform, Load (ETL): 6 years (Required) Python: 5 years (Required) big data technologies (Hadoop, Spark, Kafka): 6 years (Required) Snowflake: 6 years (Required) Data warehouse: 6 years (Required) Location: Indore, Madhya Pradesh (Required) Work Location: In person

Posted 1 day ago

Apply

0 years

0 Lacs

Ghaziabad, Uttar Pradesh, India

Remote

Job Description Position: Data Engineer Intern Location: Remote Duration: 2-6 months Company: Collegepur Type: Unpaid Internship About the Internship: We are seeking a skilled Data Engineer to join our team, with a focus on cloud data storage, ETL processes, and database/data warehouse management. If you are passionate about building robust data solutions and enabling data-driven decision-making, we want to hear from you! Key Responsibilities: 1. Design, develop, and maintain scalable data pipelines to process large datasets from multiple sources, both structured and unstructured. 2. Implement and optimize ETL (Extract, Transform, Load) processes to integrate, clean, and transform data for analytical use. 3. Manage and enhance cloud-based data storage solutions, including data lakes and data warehouses, using platforms such as AWS, Azure, or Google Cloud. 4. Ensure data security, privacy, and compliance with relevant standards and regulations. 5. Collaborate with data scientists, analysts, and software engineers to support data-driven projects and business processes. 6. Monitor and troubleshoot data pipelines to ensure efficient, real-time, and batch data processing. 7. Maintain comprehensive documentation and data mapping across multiple systems. Requirements: 1. Proven experience with cloud platforms (AWS, Azure, or Google Cloud). 2. Strong knowledge of database systems, data warehousing, and data modeling. 3. Proficiency in programming languages such as Python, Java, or Scala. 4. Experience with ETL tools and frameworks (e.g., Airflow, Informatica, Talend). 5. Familiarity with data security, compliance, and governance practices. 6. Excellent analytical, problem-solving, and communication skills. 7. Bachelor’s degree in Computer Science, Information Technology, or related field.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics D&A – SSIS – Senior We’re looking for Informatica or SSIS engineers with a cloud background (AWS, Azure). Primary skills: Has played key roles in multiple large global transformation programs on business process management. Experience in database querying using SQL. Should have experience building/integrating data into a data warehouse. Experience in data profiling and reconciliation. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Developed expertise in complex data management or application integration solutions and deployment in areas of data migration, data integration, application integration, or data quality. Experience in data processing, orchestration, parallelization, transformations, and ETL fundamentals. Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality, and efficiency (optional). Experience with cloud data tools (Microsoft Azure, Amazon S3, or data lakes). Knowledge of cloud infrastructure and Talend Cloud is an added advantage. Knowledge of data modelling principles. Knowledge of Autosys scheduling. Good experience in database technologies. Good knowledge of Unix systems. Responsibilities: Work as a team member contributing to various technical streams of data integration projects. 
Provide product- and design-level technical best practices. Interface and communicate with the onsite coordinators. Complete assigned tasks on time and report status regularly to the lead. Build a quality culture. Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Qualification: BE/BTech/MCA (must) with industry experience of 3-7 years. Experience with Talend jobs, joblets, and custom components. Should have knowledge of error handling and performance tuning in Talend. Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark, etc. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Experience with at least 3-4 clients on short-duration projects (6-8+ months each), or with at least 2 clients on projects lasting 1-2 years or more. People with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 day ago

Apply

Exploring Talend Jobs in India

Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

These cities have a high concentration of IT companies and organizations that frequently hire for Talend roles.

Average Salary Range

The average salary range for Talend professionals in India varies by experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-20 lakhs per annum

Career Path

A typical career progression in the field of Talend may follow this path:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.

Related Skills

In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:

  • Data Warehousing
  • ETL (Extract, Transform, Load) processes
  • SQL
  • Big Data technologies (e.g., Hadoop, Spark)
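To make the ETL concept above concrete, here is a minimal, tool-agnostic sketch in Python of the extract-transform-load pattern that Talend jobs automate graphically. The table and field names (`dim_customer`, `customer_id`) are invented for the example; a real Talend job would express the same steps as components rather than code.

```python
import sqlite3

def extract(rows):
    # Extract: yield raw records from the source (a list of dicts here;
    # in practice this would read from a file, API, or database).
    yield from rows

def transform(records):
    # Transform: reject rows missing the business key and clean the name field.
    for r in records:
        if r.get("customer_id") is None:
            continue  # rows without a key go to a reject flow in a real job
        yield (r["customer_id"], (r.get("name") or "").strip().upper())

def load(conn, rows):
    # Load: upsert the cleaned rows into the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS dim_customer "
        "(customer_id INTEGER PRIMARY KEY, name TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO dim_customer VALUES (?, ?)", rows)
    conn.commit()

source = [
    {"customer_id": 1, "name": " alice "},
    {"customer_id": None, "name": "rejected row"},
    {"customer_id": 2, "name": "bob"},
]
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(source)))
print(conn.execute("SELECT * FROM dim_customer ORDER BY customer_id").fetchall())
# → [(1, 'ALICE'), (2, 'BOB')]
```

The generator pipeline mirrors how row-based ETL tools stream records between components instead of materializing the full dataset in memory.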

Interview Questions

  • What is Talend and how does it differ from traditional ETL tools? (basic)
  • Can you explain the difference between tMap and tJoin components in Talend? (medium)
  • How do you handle errors in Talend jobs? (medium)
  • What is the purpose of a context variable in Talend? (basic)
  • Explain the difference between incremental and full loading in Talend. (medium)
  • How do you optimize Talend jobs for better performance? (advanced)
  • What are the different deployment options available in Talend? (medium)
  • How do you schedule Talend jobs to run at specific times? (basic)
  • Can you explain the use of tFilterRow component in Talend? (basic)
  • What is metadata in Talend and how is it used? (medium)
  • How do you handle complex transformations in Talend? (advanced)
  • Explain the concept of schema in Talend. (basic)
  • How do you handle duplicate records in Talend? (medium)
  • What is the purpose of the tLogRow component in Talend? (basic)
  • How do you integrate Talend with other systems or applications? (medium)
  • Explain the use of tNormalize component in Talend. (medium)
  • How do you handle null values in Talend transformations? (basic)
  • What is the role of the tRunJob component in Talend? (medium)
  • How do you monitor and troubleshoot Talend jobs? (medium)
  • Can you explain the lookup loading models available in tMap (e.g., Load once vs. Reload at each row)? (medium)
  • How do you handle changing business requirements in Talend jobs? (advanced)
  • What are the best practices for version controlling Talend jobs? (advanced)
  • How do you handle large volumes of data in Talend? (medium)
  • Explain the purpose of the tAggregateRow component in Talend. (basic)
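One of the questions above, the difference between incremental and full loading, can be sketched in a few lines of plain Python. This is an illustrative example, not Talend code: the field names (`id`, `updated_at`) and the high-water-mark approach are one common way to implement incremental loads, typically done in Talend with context variables and a filter on a timestamp column.

```python
def full_load(source):
    # Full load: rebuild the entire target from the source.
    return {r["id"]: dict(r) for r in source}

def incremental_load(target, source, last_mark):
    # Incremental load: apply only rows newer than the stored
    # high-water mark, then advance the mark for the next run.
    new_mark = last_mark
    for r in source:
        if r["updated_at"] > last_mark:
            target[r["id"]] = dict(r)
            new_mark = max(new_mark, r["updated_at"])
    return target, new_mark

source = [
    {"id": 1, "updated_at": 10, "value": "a"},
    {"id": 2, "updated_at": 25, "value": "b"},
]
target = full_load(source)  # initial full load; high-water mark is now 25

# Later: row 1 is updated and a new row 3 arrives.
source[0] = {"id": 1, "updated_at": 35, "value": "a2"}
source.append({"id": 3, "updated_at": 30, "value": "c"})

target, mark = incremental_load(target, source, last_mark=25)
print(sorted(target), mark)
# → [1, 2, 3] 35
```

Only the two changed rows are touched on the second run, which is why incremental loading scales better for large tables, at the cost of having to persist the mark between runs.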

Closing Remark

As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!
