
1529 Talend Jobs - Page 25

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will require you to balance technical oversight with team management, fostering an environment of innovation and collaboration.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement adjustments as necessary to meet deadlines.

Professional & Technical Skills:
- Must-have skills: Proficiency in Talend ETL.
- Good-to-have skills: Experience with data integration tools and methodologies.
- Strong understanding of data warehousing concepts and practices.
- Experience in performance tuning and optimization of ETL processes.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 4 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Tagetik Planning Budgeting and Forecasting
Good-to-have skills: NA
Educational Qualification: 15 years of full-time education is required

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications function seamlessly to support organizational goals. You will also participate in testing and troubleshooting to enhance application performance and user experience, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in continuous learning to stay updated with industry trends and technologies.

Professional & Technical Skills:
- Must-have skills: Proficiency in Tagetik Planning Budgeting and Forecasting.
- Strong analytical skills to interpret data and provide actionable insights.
- Experience in application development methodologies and best practices.
- Ability to work collaboratively in a team-oriented environment.
- Familiarity with project management tools and techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Tagetik Planning Budgeting and Forecasting.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 4 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will require you to balance technical oversight with team management, fostering an environment of collaboration and innovation.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have skills: Proficiency in Talend ETL.
- Strong understanding of data integration processes and methodologies.
- Experience with data warehousing concepts and practices.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 4 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate knowledge sharing and best practices among team members.
- Monitor project progress and ensure timely delivery of application components.

Professional & Technical Skills:
- Must-have skills: Proficiency in Talend ETL.
- Strong understanding of data integration processes and methodologies.
- Experience with data warehousing concepts and practices.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Responsibilities:
- Design and execute test plans for ETL processes, ensuring data accuracy, completeness, and integrity.
- Develop automated test scripts using Python or R for data validation and reconciliation.
- Perform source-to-target data verification, transformation logic testing, and regression testing.
- Collaborate with data engineers and analysts to understand business requirements and data flows.
- Identify data anomalies and work with development teams to resolve issues.
- Maintain test documentation, including test cases, test results, and defect logs.
- Participate in performance testing and optimization of data pipelines.

Required Skills & Qualifications:
- Strong experience in ETL testing across various data sources and targets.
- Proficiency in Python or R for scripting and automation.
- Solid understanding of SQL and relational databases.
- Familiarity with data warehousing concepts and tools (e.g., Power BI, QlikView, Informatica, Talend, SSIS).
- Experience with test management tools (e.g., JIRA, TestRail).
- Knowledge of data profiling, data quality frameworks, and validation techniques.
- Excellent analytical and communication skills.
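The source-to-target verification and reconciliation described above can be sketched in plain Python. This is a minimal, stdlib-only illustration; the table contents, column names, and the rounding rule are assumptions for the example, not taken from the posting.

```python
# Minimal source-to-target reconciliation sketch: compare row counts,
# check a transformation rule per row, and compute an order-independent
# column checksum. All data below is illustrative.

import hashlib

source_rows = [
    {"id": 1, "amount": "100.50"},
    {"id": 2, "amount": "75.00"},
]
# Target is expected to carry amounts as floats rounded to 2 decimals.
target_rows = [
    {"id": 1, "amount": 100.5},
    {"id": 2, "amount": 75.0},
]

def row_checksum(rows, key):
    """Order-independent checksum over one column, for quick drift detection."""
    digest = hashlib.sha256()
    for value in sorted(str(r[key]) for r in rows):
        digest.update(value.encode())
    return digest.hexdigest()

def reconcile(src, tgt):
    """Return a list of discrepancies (empty list means the load matches)."""
    issues = []
    if len(src) != len(tgt):
        issues.append(f"row count mismatch: {len(src)} vs {len(tgt)}")
    tgt_by_id = {r["id"]: r for r in tgt}
    for r in src:
        t = tgt_by_id.get(r["id"])
        if t is None:
            issues.append(f"id {r['id']} missing in target")
        elif round(float(r["amount"]), 2) != t["amount"]:
            issues.append(f"id {r['id']}: transformation mismatch")
    return issues

print(reconcile(source_rows, target_rows))                        # → []
print(row_checksum(source_rows, "id") == row_checksum(target_rows, "id"))  # → True
```

In a real ETL test suite the two row lists would come from database queries against the source and target systems, and the checks would run per table as part of regression testing.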

Posted 4 weeks ago

Apply

7.0 - 10.0 years

15 - 17 Lacs

Navi Mumbai, Mumbai (All Areas)

Work from Office

Greetings! This is regarding a job opportunity for an ETL Developer with Datamatics Global Services Ltd.

Position: ETL Developer
Website: https://www.datamatics.com/
Job Location: Mumbai
Contract: 3 months

Job Description:
- 5 years of experience overall
- Minimum 3 years of experience in Talend & DataStage development
- Expertise in designing and implementing Talend & DataStage ETL jobs
- Strong analytical and problem-solving skills
- Design, develop, and maintain Talend integration solutions
- Collaborate with business stakeholders and IT teams to gather requirements and recommend solutions
- Create and maintain technical documentation
- Perform unit testing and troubleshoot issues

Posted 4 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SnapLogic
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using SnapLogic. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain SnapLogic integrations and workflows to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet them.
- Develop and maintain technical documentation for SnapLogic integrations and workflows.
- Troubleshoot and resolve issues with SnapLogic integrations and workflows.

Professional & Technical Skills:
- Must-have skills: Strong experience in SnapLogic.
- Good-to-have skills: Experience in other ETL tools like Informatica, Talend, or DataStage.
- Experience in designing, developing, and maintaining integrations and workflows using SnapLogic.
- Experience in analyzing business requirements and developing solutions to meet those requirements.
- Experience in troubleshooting and resolving issues with SnapLogic integrations and workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions using SnapLogic.
- This position is based at our Pune office.

Posted 4 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Role/Job Title: Senior Data Analyst - Data Governance
Business: Data & Analytics
Function/Department: Data & Analytics
Place of Work: Mumbai

Job Purpose: The Senior Data Analyst (DG) will work within the Data & Analytics Office to implement a data governance framework, with a focus on improving data quality, standards, metrics, and processes; aligning data management practices with regulatory requirements; and understanding lineage: how data is produced, managed, and consumed within the bank's business processes and systems.

Roles & Responsibilities:
- Demonstrate a strong understanding of data governance, data quality, data lineage, and metadata management concepts.
- Participate in data quality governance framework design and optimization, including processes, standards, and rules.
- Design and implement data quality rules and monitoring mechanisms.
- Analyze data quality issues and collaborate with business stakeholders on their resolution; build recovery models across the enterprise.
- Knowledge of DG technologies for data quality and metadata management (OvalEdge, Talend, Collibra, etc.).
- Support the development of centralized metadata repositories (business glossary, technical metadata, etc.), capture business and data quality rules, and design DQ reports and dashboards.
- Improve data literacy among stakeholders.
- Minimum 6 to 12 years of experience in data governance; banking domain preferable.

Key Success Metrics:
- Successful implementation of the DQ framework across business lines.
- Successful management of the metadata management program.

Posted 4 weeks ago

Apply

5.0 - 10.0 years

12 - 18 Lacs

Chennai

Work from Office

Sr. ETL Developer | 5-10 yrs | Client: US (1-10 pm) | Chennai/Madurai (Hybrid) | Third-party payroll: Smiligence
Skills: Talend, Informatica, SSIS, PostgreSQL, AWS (S3, Glue, RDS, Redshift), Linux, Shell/Python, Airflow, Git, Quilt, NiFi, Databricks

Posted 4 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru

On-site

Requisition ID: 8157
Location: Bangalore, India

Enphase Energy is a global energy technology company and leading provider of solar, battery, and electric vehicle charging products. Founded in 2006, Enphase transformed the solar industry with our revolutionary microinverter technology, which turns sunlight into a safe, reliable, resilient, and scalable source of energy to power our lives. Today, the Enphase Energy System helps people make, use, save, and sell their own power. Enphase is also one of the fastest growing and most innovative clean energy companies in the world, with approximately 68 million products installed across more than 145 countries. We are building teams that are designing, developing, and manufacturing next-generation energy technologies, and our work environment is fast-paced, fun, and full of exciting new projects. If you are passionate about advancing a more sustainable future, this is the perfect time to join Enphase!

About the role: The Enphase ‘Analyst – Procurement’ will be involved in the claims process, component capacity and inventory analysis, supplier risk assessments, and other procurement-related analytics. The role is to understand existing processes in detail, implement an RPA model wherever applicable, perform market research on the latest processes and procedures available for the procurement function, and automate and digitize processes. This is a highly challenging role in which you will interact with many stakeholders to solve operational issues. You will be part of the Global Sourcing & Procurement team, reporting to a Lead Analyst.

What you will do:
- Perform detailed analysis of component inventory against demand, on-hand, and open-order quantities: use advanced data analytics tools like Power BI or Tableau to visualize inventory data, and implement predictive analytics to forecast demand more accurately.
- Automate input data consolidation from different contract manufacturers: use ETL (Extract, Transform, Load) tools like Alteryx or Talend to automate consolidation, and implement APIs to pull data directly from manufacturers' systems.
- Prepare and submit a monthly STD cost file to finance per the corporate calendar timelines: create a standardized template, automate data entry using Excel macros or Python scripts, and set up reminders and workflows in a project management tool to ensure timely submission.
- Work as a program manager driving the component qualification process with cross-functional teams, completing qualifications on time to achieve planned cost savings: use project management software like Jira to track progress and deadlines, and hold regular cross-functional team meetings to ensure alignment and address roadblocks.
- Finalize the quarterly CBOM (Costed Bill of Materials) and quote files from all contract manufacturers, following the CBOM calendar timelines: implement a centralized database to store and manage CBOM data, and use version control to track changes and ensure accuracy.
- Manage the claims process with contract manufacturers and suppliers: develop a standardized claims validation process, effectively track and manage claims, and regularly review and update the process to improve efficiency.
- Research new processes and best practices in procurement and assess how they can be leveraged in existing processes.
- Perform and maintain detailed supplier risk assessments with the help of third-party vendors: regularly review and update risk assessment criteria based on changing market conditions.
- Compile and perform supplier pricing trend analysis to support Commodity Managers in their QBRs: create dashboards in BI tools to visualize pricing trends and support decision-making.
- Work closely with Commodity Managers to identify potential or NPI suppliers to be evaluated for risk assessment: maintain a database of potential suppliers and their risk assessment results.
- Maintain and manage the item master pricing list, refreshing the data at regular intervals without errors: use data validation techniques and automated scripts to ensure accuracy, and implement a regular review process to update and verify pricing data.

Who you are and what you bring:
- Bachelor's degree, preferably in Engineering, with a minimum of 5+ years of experience in supply chain analytics.
- Very good analytical and problem-solving skills.
- Hands-on experience with Excel-based automation using MS Power Query, Excel VBA, and Gen AI.
- Open-minded, with a strong sense of ownership.
- Strong verbal communication and presentation skills.
- Strong professional relationship management with internal and external interfaces.
- Strong interpersonal skills, with a proven ability to communicate effectively, both verbally and in writing, with internal customers and suppliers.
- Ability to perform effectively and independently in a virtual environment.
- Ability to manage job responsibilities with minimal supervision.
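The "automate input data consolidation from different contract manufacturers" task above is essentially a small ETL job. A stdlib-only sketch of that idea follows; the file contents, header spellings, and the header mapping are illustrative assumptions, not details from the posting.

```python
# Sketch of consolidating per-contract-manufacturer inputs into one
# normalized dataset, the way an ETL tool would. All data is illustrative.

import csv
import io

# Two manufacturers deliver the same data with different headers.
cm_a = "part,qty_on_hand\nX100,40\nX200,15\n"
cm_b = "Part Number,OnHand\nX100,25\nX300,60\n"

# Map each manufacturer-specific header onto a canonical column name.
HEADER_MAP = {"part": "part", "Part Number": "part",
              "qty_on_hand": "on_hand", "OnHand": "on_hand"}

def load(raw, source):
    """Parse one manufacturer's CSV and yield normalized rows."""
    reader = csv.DictReader(io.StringIO(raw))
    for row in reader:
        norm = {HEADER_MAP[k]: v for k, v in row.items()}
        norm["source"] = source          # keep provenance for auditing
        norm["on_hand"] = int(norm["on_hand"])
        yield norm

consolidated = list(load(cm_a, "CM-A")) + list(load(cm_b, "CM-B"))

# Aggregate on-hand quantity per part across all manufacturers.
totals = {}
for row in consolidated:
    totals[row["part"]] = totals.get(row["part"], 0) + row["on_hand"]

print(totals)  # → {'X100': 65, 'X200': 15, 'X300': 60}
```

In practice the raw strings would be files fetched from each manufacturer (or pulled via API), and the consolidated output would feed the inventory-versus-demand analysis described above.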

Posted 4 weeks ago

Apply

0 years

3 - 9 Lacs

Bengaluru

On-site

Requisition ID: 22562
Job Category: Engineering & Technology
Career level: Specialist
Contract type: Permanent
Location: Bengaluru, IN

SKF has been around for more than a century, and today we are one of the world’s largest global suppliers of bearings and supporting solutions for rotating equipment. Our products can be found literally everywhere in society. This means that we are an important part of the everyday lives of people and companies around the world. In September of 2024, SKF announced the separation of its Automotive business, with the objective of building two world-leading businesses. The role you are applying for will be part of the automotive business. This means you will have the opportunity to be a part of shaping a new company aimed at meeting the needs of the transforming global automotive market. Would you like to join us in shaping the future of motion?

We are now looking for a Data Engineer, India – Automobile Business, to design, build, and maintain the data infrastructure and systems that support SKF VA data needs. By leveraging skills in data modeling, data integration, data processing, data storage, data retrieval, and performance optimization, this role helps VA manage and utilize its data more effectively.

Key responsibilities (what you can expect in the role):
- Build a VA data warehouse that is scalable, secure, and compliant using Snowflake technologies, including designing and developing Snowflake data models.
- Work with central data warehouses like SDW, MDW, and OIDW to extract data and enrich it with VA-specific customer groupings, program details, etc.
- Data integration: integrate data from ERPs, BPC, and other systems into Snowflake and SKF standard DWs, ensuring that data is accurate, complete, and consistent.
- Performance optimization: optimize the performance of Snowflake queries and data loading processes, which involves optimizing SQL queries, creating indexes, and tuning data loading processes.
- Security and access management: manage the security and access controls of the Snowflake environment, including configuring user roles and permissions, managing encryption keys, and monitoring access logs.
- Maintain existing databases and warehouse solutions, addressing support needs, enhancements, troubleshooting, etc.

Metrics:
- Technical metrics: data quality for the whole VA BU, data processing time, data storage capacity, and systems availability.
- Business metrics: data-driven decision making, data security and compliance, and cross-functional collaboration.

Competencies:
- Good understanding of data modeling concepts; familiarity with Snowflake's data modeling tools and techniques.
- SQL: expert in SQL; able to write complex SQL queries and optimize SQL performance in Snowflake.
- Pipeline management & ETL: able to design and manage data pipelines on Snowflake and Azure using ETL/ELT tools (e.g., DBT, Alteryx, Talend, Informatica).
- Good understanding of cloud computing concepts; familiarity with the cloud infrastructure on which Snowflake operates.
- Good understanding of data warehousing concepts; familiarity with Snowflake's data warehousing tools and techniques.
- Familiarity with data governance and security concepts.
- Ability to identify and troubleshoot issues with Snowflake and SKF’s data infrastructure.
- Experience with Agile solution development.
- Good to have: knowledge of SKF ERP systems (XA, SAP, PIM, etc.) and data related to sales, supply chain, and manufacturing.

Candidate Profile: Bachelor’s degree in computer science, information technology, or a related field.

SKF is committed to creating a diverse environment, and we firmly believe that a diverse workforce is essential for our continued success. Therefore, we only focus on your experience, skills, and potential. Come as you are – just be yourself. #weareSKF

Some additional information: This position will be located in Bangalore. For questions regarding the recruitment process, please contact Anuradha Seereddy, Recruitment Specialist, via email at anuradha.seereddy@skf.com.

Posted 4 weeks ago

Apply

0 years

7 Lacs

Ahmedabad

On-site

Job Summary: We are seeking a detail-oriented and analytical Data Analyst to join our team. The ideal candidate will be responsible for interpreting data, analyzing results using statistical techniques, and providing ongoing reports to support business decisions. This role requires a strong foundation in data analysis, statistical modeling, and visualization, along with a passion for turning data into actionable insights.

Key Responsibilities:
- Collect, process, and clean data from various sources to ensure data quality and integrity.
- Perform data analysis and statistical evaluations to identify trends, patterns, and insights.
- Create compelling data visualizations and dashboards using tools such as Power BI, Tableau, or similar.
- Write clean and efficient code to manipulate large datasets (using Python, R, SQL, etc.).
- Collaborate with cross-functional teams to define and track key performance indicators (KPIs).
- Document data analysis processes and communicate findings to stakeholders.
- Work closely with Data Engineering teams to support data pipeline development and optimization.

Must-Have Skills:
- Strong proficiency in data analysis and statistical methods
- Hands-on experience with data visualization tools (e.g., Power BI, Tableau, Matplotlib, Seaborn)
- Proficiency in programming languages such as Python, R, or SQL
- Understanding of data engineering basics and working with structured/unstructured data
- Ability to interpret data and present it clearly to both technical and non-technical audiences

Good-to-Have Skills:
- Experience with advanced programming concepts and scripting
- Familiarity with ETL processes and tools like Apache NiFi, Talend, or Informatica
- Knowledge of machine learning fundamentals and predictive analytics
- Exposure to advanced statistical modeling techniques
- Experience with cloud platforms (AWS, GCP, Azure) for data handling

Educational Qualifications: Bachelor’s or Master’s degree in Statistics, Mathematics, Computer Science, Data Science, or a related field

Job Type: Full-time
Pay: From ₹700,000.00 per year
Benefits: Health insurance
Schedule: Day shift

Application Questions:
- How many years of experience do you have as a Data Analyst?
- How many years of experience do you have with Power BI?
- What is your current salary?

Work Location: In person
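The "collect, process, and clean data" responsibility above typically starts with routine record cleanup. A minimal stdlib-only sketch of such a step follows; the field names, records, and cleaning rules are illustrative assumptions for the example.

```python
# Stdlib-only sketch of a data-cleaning step: trim whitespace, normalize
# casing, coerce a numeric field, and drop duplicates by business key.
# All records and rules below are illustrative.

raw_records = [
    {"name": " Asha ", "city": "ahmedabad", "revenue": "1,200"},
    {"name": "Ravi",   "city": "AHMEDABAD", "revenue": "950"},
    {"name": " Asha ", "city": "ahmedabad", "revenue": "1,200"},  # duplicate
    {"name": "Meera",  "city": "surat",     "revenue": ""},       # missing value
]

def clean(records):
    seen, out = set(), []
    for r in records:
        row = {
            "name": r["name"].strip(),
            "city": r["city"].strip().title(),
            # Strip thousands separators; treat an empty revenue as 0.
            "revenue": int(r["revenue"].replace(",", "") or 0),
        }
        key = (row["name"], row["city"])
        if key not in seen:          # drop duplicates by (name, city)
            seen.add(key)
            out.append(row)
    return out

cleaned = clean(raw_records)
print(len(cleaned))  # → 3
```

A real pipeline would usually do this with pandas against files or database extracts, but the logic (standardize, coerce types, handle missing values, deduplicate) is the same.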

Posted 4 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Product Manager – Healthcare Data & Analytics

About the Role: As Product Manager, you will lead the strategy, execution, and commercialization of innovative data and analytics products for the U.S. healthcare market. This is a highly collaborative role in which you'll work cross-functionally with Engineering, Sales, Design, and Delivery teams to build scalable, interoperable solutions that address core challenges across payers and providers. You’ll be responsible for partnering with the solution offering manager to deliver on the product vision and roadmap, conducting customer discovery, tracking success metrics, and ensuring timely delivery of high-impact features. This role carries revenue responsibilities and is key to EXL Health’s broader growth agenda.

Core Responsibilities

Product Strategy & Leadership:
- Develop and own the quarterly roadmap for the healthcare data and analytics solution.
- Manage the product backlog and ensure alignment with evolving client needs, compliance mandates (e.g., CMS, FHIR), and company objectives.
- Translate customer pain points and regulatory changes into innovative data-driven products and services.
- Champion a customer-first approach while ensuring technical feasibility and commercial viability.
- Stay ahead of technology and market trends, especially in AI, value-based care, and care management.
- Collaborate closely with Engineering and Design teams to define and prioritize product requirements.

Client Engagement & Sales Support:
- Meet directly with clients to shape strategy, gather feedback, and build trusted relationships.
- Serve as the bridge between client expectations and solution capabilities, ensuring alignment and delivery excellence.

Qualifications

Experience:
- Minimum 5–8 years of experience in analytics, data platforms, or product management, preferably within the U.S. healthcare ecosystem.
- At least 3 years in a leadership or client-facing product role, including experience managing end-to-end product development and revenue accountability.
- Proven success in bringing data or analytics products to market, from ideation through launch and iteration.

Healthcare Domain Expertise:
- Deep familiarity with U.S. payer or provider environments, including claims, payments, risk adjustment, population health, or care management.
- Working knowledge of regulatory and interoperability standards (e.g., CMS 0057, FHIR, TEFCA).
- Hands-on understanding of how data management, analytics, and AI/ML drive value in clinical or operational workflows.

Technical Skills (practical experience with or exposure to):
- Cloud platforms: Snowflake, AWS, Azure, GCP
- BI & visualization tools: Tableau, Power BI, Qlik
- ETL/data integration: Informatica, Talend, SSIS, Erwin
- Data science/AI/ML: experience collaborating with data science teams on AI initiatives
- Agile/tools: Jira, Confluence, Asana, Agile/Scrum methodologies

Personal Attributes:
- Strategic thinker who can dive deep into execution.
- Exceptional written and verbal communication, with the ability to translate technical concepts for non-technical audiences.
- Strong organizational, problem-solving, and analytical skills.
- Passion for innovation, continuous improvement, and cross-functional collaboration.
- A team-first leader with high emotional intelligence and the ability to mentor others.

Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, Statistics, Business, or a related field from a top-tier institution.

Posted 4 weeks ago

Apply

8.0 years

0 Lacs

Sahibzada Ajit Singh Nagar, Punjab, India

On-site

Job Description: We are looking for an experienced and results-driven ETL & DWH Engineer/Data Analyst with over 8 years of experience in data integration, warehousing, and analytics. The ideal candidate will have deep technical expertise in ETL tools, strong data modeling knowledge, and the ability to lead complex data engineering projects from design to deployment.

Key Skills:
- 4+ years of hands-on experience with ETL tools like SSIS, Informatica, DataStage, or Talend.
- Proficient in relational databases such as SQL Server and MySQL.
- Strong understanding of Data Mart/EDW methodologies.
- Experience in designing star schemas, snowflake schemas, and fact and dimension tables.
- Experience with Snowflake or BigQuery.
- Knowledge of reporting and analytics tools like Tableau and Power BI.
- Scripting and programming proficiency using Python.
- Familiarity with cloud platforms such as AWS or Azure.
- Ability to lead recruitment, estimation, and project execution.
- Exposure to Sales and Marketing data domains.
- Experience with cross-functional and geographically distributed teams.
- Ability to translate complex data problems into actionable insights.
- Strong communication and client management skills.
- Self-starter with a collaborative attitude and problem-solving mindset.

Roles & Responsibilities:
- Deliver high-level and low-level design documents for middleware and ETL architecture.
- Design and review data integration components, ensuring adherence to standards and best practices.
- Own delivery quality and timeliness across one or more complex projects.
- Provide functional and non-functional assessments for global data implementations.
- Offer technical problem-solving guidance and support to junior team members.
- Drive QA for deliverables and validate progress against project timelines.
- Lead issue escalation, status tracking, and continuous improvement initiatives.
- Support planning, estimation, and resourcing across data engineering efforts.

Posted 4 weeks ago

Apply

10.0 - 15.0 years

15 - 20 Lacs

Pune

Work from Office

Notice Period: Immediate

About the role: We are hiring a Senior Snowflake Data Engineer with 10+ years of experience in cloud data warehousing and deep expertise in the Snowflake platform. The ideal candidate should have strong skills in SQL, ETL/ELT, data modeling, and performance tuning, along with a solid understanding of Snowflake architecture, security, and cost optimization.

Roles & Responsibilities:
- Collaborate with data engineers, product owners, and QA teams to translate business needs into efficient Snowflake-based data models and pipelines.
- Design, build, and optimize data solutions leveraging Snowflake features such as virtual warehouses, data sharing, cloning, and time travel.
- Develop and maintain robust ETL/ELT pipelines using tools like Talend, Snowpipe, Streams, Tasks, and Python.
- Ensure optimal performance of SQL queries, warehouse sizing, and cost-efficient design strategies.
- Implement best practices for data quality, security, and governance, including RBAC, network policies, and masking.
- Contribute to code reviews and development standards to ensure high-quality deliverables.
- Support analytics and BI teams with data exploration and visualization using tools like Tableau or Power BI.
- Maintain version control using Git and follow Agile development practices.

Required Skills:
- Snowflake expertise: deep knowledge of Snowflake architecture and core features.
- SQL development: advanced proficiency in writing and optimizing complex SQL queries.
- ETL/ELT: hands-on experience with ETL/ELT design using Snowflake tools and scripting (Python).
- Data modeling: proficient in dimensional modeling, data vault, and best practices within Snowflake.
- Automation & scripting: Python or a similar scripting language for data workflows.
- Cloud integration: familiarity with Azure and its services integrated with Snowflake.
- BI & visualization: exposure to Tableau, Power BI, or other similar platforms.
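The ETL/ELT pipelines mentioned above (Snowpipe, Streams, Tasks) typically implement incremental loading: move only rows that arrived since the last run, idempotently. A stdlib-only sketch of that watermark pattern follows, using an in-memory sqlite3 database as a stand-in for the warehouse; all table and column names are illustrative assumptions.

```python
# Watermark-based incremental load sketch, the pattern that Snowflake
# Streams/Tasks automate. sqlite3 stands in for the warehouse here;
# table and column names are illustrative.

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging (id INTEGER, loaded_at INTEGER)")
con.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, loaded_at INTEGER)")
con.execute("CREATE TABLE watermark (last_ts INTEGER)")
con.execute("INSERT INTO watermark VALUES (0)")

def incremental_load(con):
    """Copy only rows newer than the stored watermark, then advance it."""
    (last_ts,) = con.execute("SELECT last_ts FROM watermark").fetchone()
    rows = con.execute(
        "SELECT id, loaded_at FROM staging WHERE loaded_at > ?", (last_ts,)
    ).fetchall()
    # INSERT OR REPLACE makes reruns idempotent if a row is picked up twice.
    con.executemany("INSERT OR REPLACE INTO target VALUES (?, ?)", rows)
    if rows:
        con.execute("UPDATE watermark SET last_ts = ?",
                    (max(ts for _, ts in rows),))
    return len(rows)

con.executemany("INSERT INTO staging VALUES (?, ?)", [(1, 10), (2, 20)])
print(incremental_load(con))  # → 2 (first batch)
con.execute("INSERT INTO staging VALUES (3, 30)")
print(incremental_load(con))  # → 1 (only the newly arrived row)
```

On Snowflake the same effect is usually achieved by reading a Stream on the staging table from a scheduled Task, which tracks the change offset for you instead of a hand-managed watermark table.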

Posted 4 weeks ago

Apply

0 years

0 Lacs

India

Remote

Data Integration Lead – Remote Working
Data Integration Lead required to join a large harmonization program for a leading end-user client. Lead a team of 3 across various data integration tools: Talend, Informatica, ADF. Remote working, offshore location, 6-month rolling contract, excellent rates.

Key experience required:
- Proven experience leading a team in Data & Analytics roles
- Experience with multiple data integration tools: Talend, Informatica, Azure Data Factory
- Experience with ETL/ELT architecture and cloud (Azure, AWS)
- SQL, APIs, data modelling
- Experience with BI tools

Posted 4 weeks ago

Apply

4.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: ETL Test Engineer
Experience range: 4-10 years
Location: Hyderabad/Bangalore/Chennai

Job description (note: relevant experience in ETL testing and SQL is mandatory):
1. Minimum 4 to 6 years of experience in ETL testing.
2. SQL: expert-level knowledge of core SQL concepts and query writing.
3. ETL automation: experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
4. Experience in query optimization, stored procedures/views, and functions.
5. Strong familiarity with data warehouse projects and data modeling.
6. Understanding of BI concepts (OLAP vs. OLTP) and deploying applications on cloud servers.
7. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools such as SSIS, SSMS, and Power BI/Cognos/Informatica.
8. Azure DevOps/JIRA: hands-on experience with a test management tool, preferably ADO or JIRA.
9. Agile concepts: good understanding of agile methodology (Scrum, Lean, etc.).
10. Communication: good communication skills to understand and collaborate with all stakeholders within the project.
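To make the SQL side of this role concrete: the core of most ETL test suites is a small set of reconciliation checks between source and target. The sketch below builds row-count and key-completeness queries in Python; the table and column names are illustrative, not from any specific project.

```python
# Minimal sketch of the reconciliation checks an ETL tester automates.
# Table/column names (stg.orders, dw.fact_orders, order_id) are examples.

def reconciliation_queries(source: str, target: str, key: str) -> dict[str, str]:
    """Build the SQL used to compare a source table with its loaded target."""
    return {
        # Row counts should match after a full load
        "count_source": f"SELECT COUNT(*) FROM {source};",
        "count_target": f"SELECT COUNT(*) FROM {target};",
        # Keys present in source but missing from target (completeness check)
        "missing_keys": (
            f"SELECT s.{key} FROM {source} s "
            f"LEFT JOIN {target} t ON s.{key} = t.{key} "
            f"WHERE t.{key} IS NULL;"
        ),
    }

def counts_match(source_count: int, target_count: int) -> bool:
    """Pass/fail verdict for the row-count check."""
    return source_count == target_count

queries = reconciliation_queries("stg.orders", "dw.fact_orders", "order_id")
print(queries["missing_keys"])
print(counts_match(1000, 1000))
```

In a real pipeline these queries would run against both databases and the results would be asserted inside a pytest case, which is how they plug into the CI setups the listing describes.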

Posted 4 weeks ago

Apply

1.5 - 3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Data Engineer – Production Support
Location: Chennai
Experience: 1.5 to 3 Years
Job Type: Full-time | Work from Office

Job Summary:
We are seeking a motivated and detail-oriented Data Engineer (Production Support) to join our team in Chennai. The ideal candidate will have 1.5 to 3 years of experience in managing and supporting data pipelines, troubleshooting production issues, and ensuring seamless data flow across systems.

Key Responsibilities:
- Monitor, support, and maintain data pipelines and workflows in production.
- Troubleshoot and resolve data issues, failures, and job errors proactively.
- Perform root cause analysis for production issues and recommend long-term solutions.
- Collaborate with data engineering and business teams for efficient data operations.
- Optimize performance of ETL jobs and improve reliability and scalability.
- Perform daily health checks and validation of data pipelines.
- Ensure timely availability and accuracy of business-critical data.

Required Skills:
- 1.5 to 3 years of experience in Data Engineering production support roles.
- Strong understanding of SQL and relational databases (MySQL, PostgreSQL, etc.).
- Hands-on experience with ETL tools (Informatica, Talend, Apache NiFi, or similar).
- Experience in Python or shell scripting for automation.
- Good understanding of data warehouse concepts and data flows.
- Exposure to cloud platforms (AWS, GCP, or Azure) is a plus.
- Familiarity with job schedulers (Airflow, Control-M, AutoSys, etc.) is an advantage.
- Strong analytical and problem-solving skills.

Nice to Have:
- Exposure to big data tools (Hadoop, Spark, Hive).
- Experience with monitoring tools (Grafana, Prometheus, etc.).
- Basic knowledge of DevOps tools (Git, Jenkins, etc.).

Please share your updated resume with recruiting@talentwavesystems.com
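The "daily health checks" responsibility in this role usually boils down to scanning recent job runs for failures and SLA breaches. The sketch below shows that logic with a made-up run-record shape; it is not tied to any particular scheduler's API.

```python
# Sketch of a daily pipeline health check: flag failed runs and jobs
# whose last success is older than the SLA. The run-record dict shape
# and job names are assumptions for illustration.
from datetime import datetime, timedelta

def health_check(runs: list[dict], sla_hours: int = 24) -> list[str]:
    """Return alert messages for failed or stale pipeline runs."""
    now = datetime(2024, 1, 2, 9, 0)  # fixed "now" for a reproducible demo
    alerts = []
    for run in runs:
        if run["status"] != "success":
            alerts.append(f"{run['job']}: last run FAILED")
        elif now - run["finished_at"] > timedelta(hours=sla_hours):
            alerts.append(f"{run['job']}: no successful run in {sla_hours}h")
    return alerts

runs = [
    {"job": "load_orders", "status": "success",
     "finished_at": datetime(2024, 1, 2, 3, 0)},   # fresh, healthy
    {"job": "load_customers", "status": "failed",
     "finished_at": datetime(2024, 1, 1, 3, 0)},   # failed run
    {"job": "load_products", "status": "success",
     "finished_at": datetime(2023, 12, 30, 3, 0)}, # stale, breaches SLA
]
for alert in health_check(runs):
    print(alert)
```

In production the run records would come from the scheduler (Airflow, Control-M, etc.) and the alerts would feed a channel or monitoring tool rather than stdout.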

Posted 4 weeks ago

Apply

0.0 - 8.0 years

0 Lacs

Mohali, Punjab

On-site

Job Description:
We are looking for an experienced and results-driven ETL & DWH Engineer/Data Analyst with over 8 years of experience in data integration, warehousing, and analytics. The ideal candidate will have deep technical expertise in ETL tools, strong data modeling knowledge, and the ability to lead complex data engineering projects from design to deployment.

Key Skills:
- 4+ years of hands-on experience with ETL tools like SSIS, Informatica, DataStage, or Talend.
- Proficient in relational databases such as SQL Server and MySQL.
- Strong understanding of Data Mart/EDW methodologies.
- Experience in designing star schemas, snowflake schemas, and fact and dimension tables.
- Experience with Snowflake or BigQuery.
- Knowledge of reporting and analytics tools like Tableau and Power BI.
- Scripting and programming proficiency using Python.
- Familiarity with cloud platforms such as AWS or Azure.
- Ability to lead recruitment, estimation, and project execution.
- Exposure to Sales and Marketing data domains.
- Experience with cross-functional and geographically distributed teams.
- Ability to translate complex data problems into actionable insights.
- Strong communication and client management skills.
- Self-starter with a collaborative attitude and problem-solving mindset.

Roles & Responsibilities:
- Deliver high-level and low-level design documents for middleware and ETL architecture.
- Design and review data integration components, ensuring adherence to standards and best practices.
- Own delivery quality and timeliness across one or more complex projects.
- Provide functional and non-functional assessments for global data implementations.
- Offer technical problem-solving guidance and support to junior team members.
- Drive QA for deliverables and validate progress against project timelines.
- Lead issue escalation, status tracking, and continuous improvement initiatives.
- Support planning, estimation, and resourcing across data engineering efforts.

Contacts:
Email: careers@grazitti.com
Address: Grazitti Interactive LLP (SEZ Unit), 2nd floor, Quark City SEZ, A-40A, Phase VIII Extn., Mohali, SAS Nagar, Punjab, 160059, Mohali, Punjab, India

Posted 4 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderābād

On-site

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description

Role Purpose:
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at the client end, and to ensure they meet 100% quality assurance parameters.

Do:
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes.
- Investigate problem areas, following the software development life cycle.
- Facilitate root cause analysis of system issues and the problem statement.
- Identify ideas to improve system performance and impact availability.
- Analyze client requirements and convert requirements into feasible designs.
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements.
- Confer with project managers to obtain information on software capabilities.

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software.
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases.
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
- Analyze information to recommend and plan the installation of new systems or modifications of existing systems.
- Ensure that code is error-free, with no bugs or test failures.
- Prepare reports on programming project specifications, activities, and status.
- Ensure all issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns.
- Compile timely, comprehensive, and accurate documentation and reports as requested.
- Coordinate with the team on daily project status and progress, and document it.
- Provide feedback on usability and serviceability, trace results to quality risk, and report them to the concerned stakeholders.

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work.
- Take feedback on a regular basis to ensure smooth and on-time delivery.
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements.
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
- Document all necessary details and reports formally for a proper understanding of the software, from client proposal to implementation.
- Ensure good quality of interaction with the customer with respect to e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally.

Deliver (performance parameters and measures):
1. Continuous Integration, Deployment & Monitoring of Software: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan.
2. Quality & CSAT: on-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation.
3. MIS & Reporting: 100% on-time MIS and report generation.

Mandatory Skills: Talend DI.
Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 4 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Designation: Data Architect
Location: Pune
Experience: 10-15 years

Job Description

Role & Responsibilities:
- Architect large-scale analytics solutions using native services such as Azure Synapse, Data Lake, Data Factory, HDInsight, Databricks, Azure Cognitive Services, Azure ML, and Azure Event Hub.
- Assist with the creation of a robust, sustainable architecture that supports requirements and provides for expansion with secured access.
- Build and run large data environments for BFSI clients.
- Work with customers, end users, technical architects, and application designers to define the data requirements and data structure for BI/analytics solutions.
- Design conceptual and logical models for the data lake, data warehouse, data mart, and semantic layer (data structure, storage, and integration).
- Lead the database analysis, design, and build effort.
- Communicate physical database designs to the lead data architect/database administrator.
- Evolve data models to meet new and changing business requirements.
- Work with business analysts to identify and understand requirements and source data systems.

Skills Required:
- Big Data Technologies: expert in big data technologies on Azure/GCP.
- ETL Platforms: experience with ETL platforms like ADF, Glue, Ab Initio, Informatica, Talend, and Airflow.
- Data Visualization: experience with data visualization tools like Tableau, Power BI, etc.
- Data Engineering & Management: experience in a data engineering, metadata management, database modeling, and development role.
- Streaming Data Handling: strong experience in handling streaming data with Kafka.
- Data APIs: understanding of data APIs and web services.
- Data Security: experience in data security, data archiving/backup, and encryption, and in defining the standard processes for the same.
- DataOps/MLOps: experience in setting up DataOps and MLOps.
- Integration: work with other architects to ensure that all components work together to meet objectives and performance goals as defined in the requirements.
- Data Science Coordination: coordinate with data science teams to identify future data needs and requirements, and create pipelines for them.

Soft Skills:
- Communication, leading the team, and taking ownership of and accountability for successful engagements.
- Participate in quality management reviews.
- Manage customer expectations and business user interactions.
- Deliver key research (MVP, POC) with an efficient turnaround time to help make strong product decisions.
- Demonstrate key understanding of and expertise in modern technologies, architecture, and design.
- Mentor the team to deliver modular, scalable, and high-performance code.
- Innovation: be a change agent on key innovation and research to keep the product and team at the cutting edge of technical and product innovation.

(ref:hirist.tech)

Posted 1 month ago

Apply

5.0 years

10 - 30 Lacs

Chennai, Tamil Nadu, India

On-site

Industry & Sector:
A fast-growing IT services and data analytics consultancy serving Fortune 500 clients across banking, retail, and healthcare relies on high-quality data pipelines to fuel BI and AI initiatives. We are hiring an on-site ETL Test Engineer to ensure the reliability, accuracy, and performance of these mission-critical workloads.

Role & Responsibilities:
- Develop and maintain end-to-end test plans for data extraction, transformation, and loading processes across multiple databases and cloud platforms.
- Create reusable SQL queries and automation scripts to validate data completeness, integrity, and historical load accuracy at scale.
- Set up data-comparison, reconciliation, and performance tests within Azure DevOps or Jenkins CI pipelines for nightly builds.
- Collaborate with data engineers to debug mapping issues, optimise jobs, and drive defect resolution through Jira.
- Document test artefacts, traceability matrices, and sign-off reports to support regulatory and audit requirements.
- Champion best practices for ETL quality, mentoring junior testers on automation frameworks and agile rituals.

Skills & Qualifications

Must-Have:
- 5 years specialised in ETL testing within enterprise data warehouse or lakehouse environments.
- Hands-on proficiency in advanced SQL, joins, window functions, and data profiling.
- Experience testing Informatica PowerCenter, SSIS, Talend, or similar ETL tools.
- Exposure to big data ecosystems such as Hadoop or Spark, and cloud warehouses like Snowflake or Redshift.
- Automation skills using Python or shell, with Selenium or pytest frameworks integrated into CI/CD.
- Strong defect management and stakeholder communication skills.

Preferred:
- Knowledge of BI visualisation layer testing with Tableau or Power BI.
- Performance benchmarking for batch loads and CDC streams.
- ISTQB or equivalent testing certification.

Benefits & Culture Highlights:
- Work on high-impact data programmes for global brands using modern cloud technologies.
- Clear career ladder with sponsored certifications and internal hackathons.
- Collaborative, merit-driven culture that values innovation and continuous learning.

Location: On-site, India. Candidates must be willing to work from client premises and collaborate closely with cross-functional teams.

Skills: sql, shell, informatica powercenter, defect tracking, stakeholder communication, agile, pytest, redshift, ssis, defect management, snowflake, selenium, talend, python, etl testing, test automation, spark, hadoop, data warehousing

Posted 1 month ago

Apply

5.0 years

10 - 30 Lacs

Bengaluru, Karnataka, India

On-site

Industry & Sector:
A fast-growing IT services and data analytics consultancy serving Fortune 500 clients across banking, retail, and healthcare relies on high-quality data pipelines to fuel BI and AI initiatives. We are hiring an on-site ETL Test Engineer to ensure the reliability, accuracy, and performance of these mission-critical workloads.

Role & Responsibilities:
- Develop and maintain end-to-end test plans for data extraction, transformation, and loading processes across multiple databases and cloud platforms.
- Create reusable SQL queries and automation scripts to validate data completeness, integrity, and historical load accuracy at scale.
- Set up data-comparison, reconciliation, and performance tests within Azure DevOps or Jenkins CI pipelines for nightly builds.
- Collaborate with data engineers to debug mapping issues, optimise jobs, and drive defect resolution through Jira.
- Document test artefacts, traceability matrices, and sign-off reports to support regulatory and audit requirements.
- Champion best practices for ETL quality, mentoring junior testers on automation frameworks and agile rituals.

Skills & Qualifications

Must-Have:
- 5 years specialised in ETL testing within enterprise data warehouse or lakehouse environments.
- Hands-on proficiency in advanced SQL, joins, window functions, and data profiling.
- Experience testing Informatica PowerCenter, SSIS, Talend, or similar ETL tools.
- Exposure to big data ecosystems such as Hadoop or Spark, and cloud warehouses like Snowflake or Redshift.
- Automation skills using Python or shell, with Selenium or pytest frameworks integrated into CI/CD.
- Strong defect management and stakeholder communication skills.

Preferred:
- Knowledge of BI visualisation layer testing with Tableau or Power BI.
- Performance benchmarking for batch loads and CDC streams.
- ISTQB or equivalent testing certification.

Benefits & Culture Highlights:
- Work on high-impact data programmes for global brands using modern cloud technologies.
- Clear career ladder with sponsored certifications and internal hackathons.
- Collaborative, merit-driven culture that values innovation and continuous learning.

Location: On-site, India. Candidates must be willing to work from client premises and collaborate closely with cross-functional teams.

Skills: sql, shell, informatica powercenter, defect tracking, stakeholder communication, agile, pytest, redshift, ssis, defect management, snowflake, selenium, talend, python, etl testing, test automation, spark, hadoop, data warehousing

Posted 1 month ago

Apply

5.0 years

10 - 30 Lacs

Hyderabad, Telangana, India

On-site

Industry & Sector:
A fast-growing IT services and data analytics consultancy serving Fortune 500 clients across banking, retail, and healthcare relies on high-quality data pipelines to fuel BI and AI initiatives. We are hiring an on-site ETL Test Engineer to ensure the reliability, accuracy, and performance of these mission-critical workloads.

Role & Responsibilities:
- Develop and maintain end-to-end test plans for data extraction, transformation, and loading processes across multiple databases and cloud platforms.
- Create reusable SQL queries and automation scripts to validate data completeness, integrity, and historical load accuracy at scale.
- Set up data-comparison, reconciliation, and performance tests within Azure DevOps or Jenkins CI pipelines for nightly builds.
- Collaborate with data engineers to debug mapping issues, optimise jobs, and drive defect resolution through Jira.
- Document test artefacts, traceability matrices, and sign-off reports to support regulatory and audit requirements.
- Champion best practices for ETL quality, mentoring junior testers on automation frameworks and agile rituals.

Skills & Qualifications

Must-Have:
- 5 years specialised in ETL testing within enterprise data warehouse or lakehouse environments.
- Hands-on proficiency in advanced SQL, joins, window functions, and data profiling.
- Experience testing Informatica PowerCenter, SSIS, Talend, or similar ETL tools.
- Exposure to big data ecosystems such as Hadoop or Spark, and cloud warehouses like Snowflake or Redshift.
- Automation skills using Python or shell, with Selenium or pytest frameworks integrated into CI/CD.
- Strong defect management and stakeholder communication skills.

Preferred:
- Knowledge of BI visualisation layer testing with Tableau or Power BI.
- Performance benchmarking for batch loads and CDC streams.
- ISTQB or equivalent testing certification.

Benefits & Culture Highlights:
- Work on high-impact data programmes for global brands using modern cloud technologies.
- Clear career ladder with sponsored certifications and internal hackathons.
- Collaborative, merit-driven culture that values innovation and continuous learning.

Location: On-site, India. Candidates must be willing to work from client premises and collaborate closely with cross-functional teams.

Skills: sql, shell, informatica powercenter, defect tracking, stakeholder communication, agile, pytest, redshift, ssis, defect management, snowflake, selenium, talend, python, etl testing, test automation, spark, hadoop, data warehousing

Posted 1 month ago

Apply

5.0 years

10 - 30 Lacs

Mumbai Metropolitan Region

On-site

Industry & Sector:
A fast-growing IT services and data analytics consultancy serving Fortune 500 clients across banking, retail, and healthcare relies on high-quality data pipelines to fuel BI and AI initiatives. We are hiring an on-site ETL Test Engineer to ensure the reliability, accuracy, and performance of these mission-critical workloads.

Role & Responsibilities:
- Develop and maintain end-to-end test plans for data extraction, transformation, and loading processes across multiple databases and cloud platforms.
- Create reusable SQL queries and automation scripts to validate data completeness, integrity, and historical load accuracy at scale.
- Set up data-comparison, reconciliation, and performance tests within Azure DevOps or Jenkins CI pipelines for nightly builds.
- Collaborate with data engineers to debug mapping issues, optimise jobs, and drive defect resolution through Jira.
- Document test artefacts, traceability matrices, and sign-off reports to support regulatory and audit requirements.
- Champion best practices for ETL quality, mentoring junior testers on automation frameworks and agile rituals.

Skills & Qualifications

Must-Have:
- 5 years specialised in ETL testing within enterprise data warehouse or lakehouse environments.
- Hands-on proficiency in advanced SQL, joins, window functions, and data profiling.
- Experience testing Informatica PowerCenter, SSIS, Talend, or similar ETL tools.
- Exposure to big data ecosystems such as Hadoop or Spark, and cloud warehouses like Snowflake or Redshift.
- Automation skills using Python or shell, with Selenium or pytest frameworks integrated into CI/CD.
- Strong defect management and stakeholder communication skills.

Preferred:
- Knowledge of BI visualisation layer testing with Tableau or Power BI.
- Performance benchmarking for batch loads and CDC streams.
- ISTQB or equivalent testing certification.

Benefits & Culture Highlights:
- Work on high-impact data programmes for global brands using modern cloud technologies.
- Clear career ladder with sponsored certifications and internal hackathons.
- Collaborative, merit-driven culture that values innovation and continuous learning.

Location: On-site, India. Candidates must be willing to work from client premises and collaborate closely with cross-functional teams.

Skills: sql, shell, informatica powercenter, defect tracking, stakeholder communication, agile, pytest, redshift, ssis, defect management, snowflake, selenium, talend, python, etl testing, test automation, spark, hadoop, data warehousing

Posted 1 month ago

Apply