
1529 Talend Jobs - Page 35

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

7.0 - 12.0 years

18 - 22 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: GCP Dataflow
Good to have skills: NA
Minimum experience: 7.5 years
Educational Qualification: BTech

Summary: As a Data Platform Architect, you will architect the data platform blueprint and implement the design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your day will involve designing and implementing data platform components, ensuring seamless integration across systems and data models.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead data platform architecture design and implementation
- Ensure seamless integration between data platform components
- Provide guidance and support to Integration Architects and Data Architects

Professional & Technical Skills:
- Must-have skills: Proficiency in GCP Dataflow (a minimal pipeline sketch follows this listing)
- Strong understanding of cloud data architecture
- Experience with data modeling and data integration
- Hands-on experience with data platform implementation
- Knowledge of data governance and security practices

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in GCP Dataflow
- This position is based at our Bengaluru office
- A BTech degree is required
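For context: GCP Dataflow runs Apache Beam pipelines. Below is a minimal Beam sketch of the kind of pipeline such a role designs; the input path, record shape, and output prefix are hypothetical, and it runs locally with the DirectRunner unless Dataflow options are supplied.

```python
# A minimal Apache Beam pipeline sketch (GCP Dataflow executes Beam pipelines).
# The file names and record layout are hypothetical stand-ins.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv(line):
    # Split a raw "user_id,amount" CSV line into a simple record dict.
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

def run():
    # Add --runner=DataflowRunner --project=... --region=... to target GCP.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("input.csv")          # hypothetical source
            | "Parse" >> beam.Map(parse_csv)
            | "KeyByUser" >> beam.Map(lambda r: (r["user_id"], r["amount"]))
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda uid, total: f"{uid},{total}")
            | "Write" >> beam.io.WriteToText("totals")             # hypothetical sink
        )

if __name__ == "__main__":
    run()
```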

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurgaon

On-site

Job Summary:
- Excellent authoring skills and ability to independently build resources
- Ability to solve complex business problems and deliver client delight
- Strong analytical and writing skills to build viewpoints on industry trends
- Excellent communication, interpersonal and presentation skills
- Cross-cultural competence with an ability to thrive in a dynamic environment

Roles & Responsibilities: As a part of our Supply Chain and Operations practice, you will help organizations reimagine and transform their supply chains for tomorrow, with a positive impact on the business, society and the planet. Together, let's innovate, build competitive advantage, and improve business and societal outcomes in an ever-changing, ever-challenging world. Help us make supply chains work better, faster, and more resilient, with the following initiatives:
- Support clients and teams in the design, development and implementation of new and improved business processes, enabling technology in supply chain related projects.
- Participate in supply chain planning process and requirement discussions with the client to configure the data structure or data model accordingly.
- Work with the client on the design, development and testing of supply chain implementation projects.
- Design apt solutions by considering the built-in as well as configurable capabilities of Kinaxis RapidResponse.
- Work with the client team to understand the system landscape.
- Run workshops with the single points of contact for each legacy system being integrated with Kinaxis RapidResponse.
- Provide data specification documents based on the Kinaxis RapidResponse configuration.
- Create namespaces or tables based on the client's current data flow.
- Create transformation workbooks and design test scripts for configuration testing, and train the integration team on the client business solution.
- Ensure RapidResponse is integrated with the client's systems.

Professional & Technical Skills:
- MBA from a Tier-1 B-school
- 5+ years of experience working as an Integration Architect on Kinaxis RapidResponse
- End-to-end implementation experience as an Integration Architect
- Experience with Data Integration server or the Talend tool
- Experience across industries such as Life Sciences, Auto, and Consumer Packaged Goods preferred
- Knowledge of the different scheduled tasks and scripts required to set up a consistent flow of data
- Good understanding of Extraction, Transformation, Load (ETL) concepts to proactively troubleshoot integration issues
- Experience managing Kinaxis RapidResponse Administrator implementations and coordinating key stakeholders within the same project
- Must be a certified RapidResponse Administrator, Level 2

Posted 1 month ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Kanerika
Who we are: Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI. We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.

Locations: We are located in Hyderabad, Indore and Ahmedabad (India).

What You Will Do: As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance.

Key Responsibilities:
1. Governance Strategy & Stakeholder Alignment
- Develop and maintain enterprise data governance strategies, policies, and standards.
- Align governance with business goals: compliance, analytics, and decision-making.
- Collaborate across business, IT, legal, and compliance teams for role alignment.
- Drive governance training, awareness, and change management programs.
2. Microsoft Purview Administration & Implementation
- Manage Microsoft Purview accounts, collections, and RBAC aligned to org structure.
- Optimize Purview setup for large-scale environments (50TB+).
- Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, Snowflake.
- Schedule scans, set classification jobs, and maintain collection hierarchies.
3. Metadata & Lineage Management
- Design metadata repositories and maintain business glossaries and data dictionaries.
- Implement ingestion workflows via ADF, REST APIs, PowerShell, Azure Functions.
- Ensure lineage mapping (ADF → Synapse → Power BI) and impact analysis.
4. Data Classification & Security Governance
- Define classification rules and sensitivity labels (PII, PCI, PHI); a simple rule sketch follows this listing.
- Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager.
- Enforce records management, lifecycle policies, and information barriers.
5. Data Quality & Policy Management
- Define KPIs and dashboards to monitor data quality across domains.
- Collaborate on rule design, remediation workflows, and exception handling.
- Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management.
6. Business Glossary & Stewardship
- Maintain the business glossary with domain owners and stewards in Purview.
- Enforce approval workflows, standard naming, and steward responsibilities.
- Conduct metadata audits for glossary and asset documentation quality.
7. Automation & Integration
- Automate governance processes using PowerShell, Azure Functions, Logic Apps.
- Create pipelines for ingestion, lineage, glossary updates, tagging.
- Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc.
8. Monitoring, Auditing & Compliance
- Set up dashboards for audit logs, compliance reporting, metadata coverage.
- Oversee data lifecycle management across its phases.
- Support internal and external audit readiness with proper documentation.

Tools & Technologies:
- Microsoft Purview, Collibra, Atlan, Informatica Axon, IBM IG Catalog
- Microsoft Purview capabilities:
  1. Label creation & policy setup
  2. Auto-labeling & DLP
  3. Compliance Manager, Insider Risk, Records & Lifecycle Management
  4. Unified Catalog, eDiscovery, Data Map, Audit, Compliance alerts, DSPM

Required Qualifications:
- 7+ years of experience in data governance and data management.
- Proficient in Microsoft Purview and Informatica data governance tools.
- Strong in metadata management, lineage mapping, classification, and security.
- Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools.
- Knowledge of GDPR, CCPA, HIPAA, SOX and related compliance needs.
- Skilled in bridging technical governance with business and compliance goals.
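For illustration, a minimal sketch of rule-based sensitivity classification of the kind such a role defines. This is not Purview's actual rule engine; the patterns and labels are simplified assumptions a lead might prototype before encoding rules in a governance tool.

```python
# Illustrative rule-based PII classification; patterns and labels are
# simplified assumptions, not the rules of any specific governance product.
import re

CLASSIFICATION_RULES = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CREDIT_CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify_value(value: str) -> list[str]:
    """Return the sensitivity labels whose patterns match the value."""
    return [label for label, pattern in CLASSIFICATION_RULES.items()
            if pattern.search(value)]

if __name__ == "__main__":
    sample = "Contact jane.doe@example.com, SSN 123-45-6789"
    print(classify_value(sample))  # ['EMAIL', 'US_SSN']
```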

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Hiring: ETL Tester - eNoah iSolution India Pvt Ltd
Location: Chennai
Experience: 5 to 8 years

Key Responsibilities:
- Review ETL design documents and understand data flows, mapping documents, and business requirements
- Develop comprehensive test plans, test cases, and test scripts for validating ETL processes
- Perform data validation and data quality testing at various stages of the ETL cycle
- Write and execute SQL queries to verify data transformation logic, source-to-target data mapping, and business rules (see the reconciliation sketch after this listing)
- Identify, troubleshoot, and document data anomalies, discrepancies, and system defects
- Work closely with development teams to replicate, debug, and resolve issues
- Participate in daily stand-ups, sprint planning, and defect triage meetings
- Communicate clearly with stakeholders and provide timely updates on test status and results
- Contribute to the development and maintenance of automated ETL testing solutions (optional, based on project)
- Ensure compliance with testing standards and best practices across data projects

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or related field
- 5+ years of hands-on experience in ETL testing or data validation roles
- Strong knowledge of SQL and ability to write complex queries for data verification
- Familiarity with ETL tools (e.g., Informatica, Talend, DataStage, SSIS)
- Experience working with large datasets and relational databases (Oracle, SQL Server, PostgreSQL, etc.)
- Excellent problem-solving skills with a keen eye for identifying data quality issues
- Strong analytical and critical thinking skills
- Clear and concise verbal and written communication skills for cross-functional collaboration
- Ability to work in agile/scrum environments with fast-changing priorities

Nice to Have:
- Experience with test automation for ETL pipelines using tools like Selenium, PyTest, or Apache Airflow validation scripts
- Familiarity with cloud platforms such as AWS, Azure, or GCP
- Exposure to BI tools like Power BI, Tableau, or Looker
- Understanding of data warehousing and data lake concepts
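For illustration, a runnable sketch of the source-to-target reconciliation this role describes, using an in-memory SQLite database; the table and column names are hypothetical stand-ins for real source/target systems.

```python
# Source-to-target reconciliation sketch: row counts plus a missing-row check.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);  -- row 3 missing
""")

# Row-count reconciliation between source and target.
src_count, tgt_count = cur.execute(
    "SELECT (SELECT COUNT(*) FROM src_orders),"
    " (SELECT COUNT(*) FROM tgt_orders)"
).fetchone()
print(f"source={src_count} target={tgt_count} match={src_count == tgt_count}")

# Identify rows present in source but missing from target.
missing = cur.execute("""
    SELECT s.order_id FROM src_orders s
    LEFT JOIN tgt_orders t ON t.order_id = s.order_id
    WHERE t.order_id IS NULL
""").fetchall()
print("missing from target:", missing)  # [(3,)]
```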

Posted 1 month ago

Apply

0.0 - 3.0 years

0 Lacs

Mumbai, Maharashtra

On-site

Job Description
Role Objective: The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. Requires a sound understanding of databases to store structured and unstructured data with optimized modelling techniques, and good exposure to the data catalog and data quality modules of a leading product (preferably Talend).

Location: Mumbai
Years of Experience: 3-5 yrs

Roles & Responsibilities:
- Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions.
- Arch/Design Documentation: Develop comprehensive architecture and design documentation for the data landscape.
- Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution. Provide solutions to identified issues and continuously improve application performance.
- Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices.
- Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of the Talend-based ETL solution.

Technical Skills:
- Core tool exposure: Talend Data Integrator, Talend Data Catalog, Talend Data Quality, relational databases (PostgreSQL, SQL Server, etc.)
- Core concepts: ETL, data load strategy, data modelling, data governance and management, query optimization and performance enhancement
- Cloud exposure: Experience working with one of the cloud service providers (AWS, Azure, GCP, OCI, etc.)
- SQL skills: Extensive knowledge and hands-on experience with SQL, query tuning, optimization, and best practices
- Soft skills: Very good communication and presentation skills; able to articulate ideas and convince key stakeholders; able to guide and upskill team members

Good to Have:
- Programming languages: Knowledge and hands-on experience with languages like Python and R
- Relevant certifications related to the role

Job Types: Full-time, Contractual / Temporary
Contract length: 6 months
Pay: Up to ₹130,000.00 per month
Benefits: Flexible schedule
Schedule: Fixed shift
Application Question(s): This is a contract onsite opportunity, are you okay with it?
Experience: ETL: 3 years (Required); Talend: 3 years (Required); Cloud Service: 3 years (Required); SQL: 3 years (Required)
Location: Mumbai, Maharashtra (Required)
Work Location: In person

Posted 1 month ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Quality Engineer
Experience: 4+ Years
Location: Pune

Job Summary: We are seeking experienced ETL Testers ready to work as Data Quality Engineers (DQE), possessing strong programming experience in advanced SQL and Python (or Java). Candidates should have hands-on testing experience, especially in data validation and functional testing, and be comfortable working in CI/CD environments. Exposure to BDD frameworks (like Selenium, Cucumber) and modern software testing practices is expected.

Key Responsibilities:
- Develop and execute complex SQL queries for data validation, data profiling, and test case automation.
- Perform ETL testing on large-scale datasets to ensure data integrity across systems.
- Use Python or Java to build automation scripts for data quality checks.
- Write and maintain test scripts using BDD frameworks (Selenium, Cucumber) where applicable.
- Collaborate with Data Engineers and Developers for continuous integration and deployment (CI/CD) pipeline validations.
- Implement testing strategies for structured and unstructured data sources.
- Contribute to the identification and resolution of data quality issues and anomalies.
- Work with cross-functional teams to define test plans and data scenarios and ensure full test coverage.
- Participate in code reviews and optimize SQL queries for performance improvements.

Required Technical Skills (Must Have):
- Advanced SQL, with strong proficiency in:
  - Window functions (e.g., RANK, ROW_NUMBER, LEAD/LAG, PARTITION BY); a window-function validation sketch follows this listing
  - CTEs (recursive and multiple)
  - Subqueries (scalar, correlated, EXISTS/NOT EXISTS)
  - Analytical functions (CUME_DIST, NTILE, PERCENT_RANK)
  - Full-text search, hierarchical queries, and query optimization
- Programming skills: proficiency in Python or Java
- Strong functional testing background (manual + automation)
- Working knowledge of CI/CD pipelines

Good to Have:
- Exposure to Selenium, Cucumber, or any BDD framework
- Experience working with ETL pipelines, data lakes, or big data systems
- Knowledge of data governance or data profiling tools (e.g., Informatica DQ, Talend, etc.)

Soft Skills & Competencies:
- Strong analytical and problem-solving skills
- Ability to communicate technical concepts clearly and effectively
- Self-driven and able to work independently as well as part of a distributed team
- Quick learner with adaptability to new technologies and frameworks

Other Details:
- Candidates should be open to working in a hybrid model with 3 days in office.
- Immediate to 30-day joiners preferred.
- Flexibility to coordinate with distributed teams across locations.
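For illustration, a runnable sketch of the advanced-SQL validation this role calls for: a CTE plus a ROW_NUMBER() window function flagging duplicate records. Table and column names are hypothetical; SQLite (3.25+) supports window functions, so this runs as-is.

```python
# Duplicate detection via CTE + ROW_NUMBER(), a common data-validation pattern.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (customer_id INTEGER, email TEXT, loaded_at TEXT);
    INSERT INTO customers VALUES
        (1, 'a@x.com', '2024-01-01'),
        (2, 'b@x.com', '2024-01-02'),
        (3, 'a@x.com', '2024-01-03');  -- duplicate email
""")

duplicates = cur.execute("""
    WITH ranked AS (
        SELECT customer_id, email,
               ROW_NUMBER() OVER (PARTITION BY email ORDER BY loaded_at) AS rn
        FROM customers
    )
    SELECT customer_id, email FROM ranked WHERE rn > 1
""").fetchall()
print("duplicate rows:", duplicates)  # [(3, 'a@x.com')]
```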

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are looking for a detail-oriented and technically strong ETL Quality Engineer to join our data engineering and QA team. The ideal candidate will be responsible for ensuring the accuracy, integrity, and reliability of data pipelines and ETL processes. You will work closely with data engineers, business analysts, and developers to validate and verify end-to-end data flows and transformations.

Key Responsibilities:
- Review and analyze data requirements, source-to-target mappings, and business rules.
- Design, develop, and execute comprehensive ETL test plans, test cases, and test scripts.
- Perform data validation, transformation testing, and reconciliation across source and target systems.
- Identify and document defects, inconsistencies, and data quality issues.
- Validate performance of ETL jobs and data pipelines under various workloads.
- Participate in code reviews, defect triage meetings, and QA strategy planning.
- Use SQL to query, validate, and compare large datasets across environments.
- Maintain and enhance test automation frameworks for data pipeline validation (see the sketch after this listing).

Required Technical Skills:
- Strong experience with ETL testing tools such as Informatica, Talend, SSIS, DataStage, or equivalent.
- Proficiency in SQL for complex queries, joins, aggregations, and data validation.
- Experience working with data warehouses, data lakes, or cloud-based data platforms (e.g., Snowflake, Redshift, BigQuery, Azure Synapse).
- Hands-on experience with test automation tools and frameworks related to data testing (e.g., Python, PyTest, DBT, Great Expectations).
- Knowledge of data profiling, data cleansing, and data governance practices.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines (e.g., Jenkins, Azure DevOps).
- Exposure to API testing for data integrations and ingestion pipelines (Postman, SoapUI, REST/SOAP APIs).

Candidate Profile:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4-8 years of experience in data quality engineering or ETL QA roles.
- Excellent analytical and problem-solving skills.
- Strong communication and documentation abilities.
- Experience working in Agile/Scrum teams.

Preferred Qualifications:
- Experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with Big Data ecosystems (e.g., Hadoop, Spark, Hive).
- DataOps or DevOps experience is a plus.
- Certification in data or QA-related domains (ISTQB, Microsoft, AWS Data Analytics, etc.)

Why Join Us?
- Work with modern data platforms and contribute to enterprise data quality initiatives.
- Be a key player in ensuring trust and confidence in business-critical data.
- Collaborate with cross-functional data, engineering, and analytics teams.
- Enjoy a culture that promotes growth, learning, and innovation.
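For illustration, a minimal sketch of automated data-quality checks in the PyTest style the posting mentions; the DataFrame here is a hypothetical stand-in for an extracted target table, and the rules are illustrative.

```python
# PyTest-style data-quality checks; run with `pytest this_file.py`.
import pandas as pd

def load_target_table() -> pd.DataFrame:
    # Hypothetical stand-in for reading the ETL target (e.g., via SQL).
    return pd.DataFrame({
        "order_id": [1, 2, 3],
        "amount": [10.0, 20.0, 30.0],
        "currency": ["USD", "USD", "EUR"],
    })

def test_no_null_keys():
    df = load_target_table()
    assert df["order_id"].notna().all(), "order_id must be non-null"

def test_amounts_positive():
    df = load_target_table()
    assert (df["amount"] > 0).all(), "amounts must be positive"

def test_currency_in_reference_set():
    df = load_target_table()
    assert df["currency"].isin({"USD", "EUR", "GBP"}).all()
```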

Posted 1 month ago

Apply

0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Attention all data folks! We are hiring a Sr/Lead Data Engineer in Indore, MP (hybrid model). Below is the JD for reference.

Sr/Lead Data Engineer - Indore, MP (Hybrid) - Full Time

Key Responsibilities:
- Gather and assemble large, complex sets of data that meet non-functional and functional business requirements. Skills: SQL, Python, R, Data Modeling, Data Warehousing, AWS (S3, Athena)
- Create new data pipelines or enhance existing pipelines to accommodate non-standard data formats from customers. Skills: ETL tools (e.g., Apache NiFi, Talend), Python (Pandas, PySpark), AWS Glue, JSON, XML, YAML
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes. Skills: Apache Airflow, Terraform, Kubernetes, AWS Lambda, CI/CD pipelines, Docker
- Build and maintain required infrastructure for optimal extraction, transformation, and loading (ETL) of data from various data sources using AWS and SQL technologies. Skills: SQL, AWS Redshift, AWS RDS, EMR (Elastic MapReduce), Snowflake
- Use existing methods or develop new tools/methods to analyze the data and perform required data sanity validations to ensure completeness and accuracy as per technical and functional requirements (a sanity-check sketch follows this listing). Skills: Python (NumPy, Pandas), data validation tools, Tableau, Power BI
- Work with stakeholders including Customer Onboarding, Delivery, Product, and other functional teams, assisting them with any data-related technical or infrastructure-related issues. Skills: stakeholder communication, JIRA, Agile methodologies
- Provide actionable insights into key data metrics (volumes, trends, outliers, etc.), highlight any challenges/improvements, and provide recommendations and solutions to relevant stakeholders. Skills: data analysis, data visualization tools (Tableau, Looker), advanced Excel
- Coordinate with the Technical Program Manager (TPM) to prioritize discovered issues in the Data Sanity Report and own utility communications. Skills: project management tools, reporting tools, clear documentation practices

About Ccube
Ccube: Pioneering Data-Driven Solutions in the Cloud. Ccube is a specialized firm that delivers measurable results across a wide range of industries by focusing exclusively on Data and Artificial Intelligence within Cloud environments. We leverage cutting-edge technologies and innovative strategies to help our clients harness the power of data and achieve their business objectives.

Core Competencies:
- Strategic planning and design of data systems: We collaborate with our clients to develop comprehensive data strategies and design robust data systems that align with their business goals. Our team of experts provides guidance on data architecture, data governance, and data management best practices.
- Development and unification of data frameworks: We build and integrate data frameworks that enable seamless data flow and analysis. Our solutions facilitate data ingestion, data transformation, and data storage, ensuring data is readily available for business intelligence and decision-making.
- Advanced data analysis and artificial intelligence applications: We employ sophisticated data analysis techniques and artificial intelligence algorithms to extract valuable insights from data. Our solutions include predictive modeling, machine learning, and natural language processing, enabling our clients to make data-driven decisions and optimize their operations.
- Cloud computing, data operations, and machine learning operations: We leverage the scalability and flexibility of cloud computing to deliver efficient and cost-effective data solutions. Our team of experts manages data operations and machine learning operations, ensuring seamless integration and optimal performance.

Organizational Principles at Ccube:
- Efficiency: We strive to maximize efficiency by optimizing resource utilization and streamlining processes.
- Client Satisfaction: We are committed to providing exceptional service and exceeding our clients' expectations.
- Innovation: We embrace innovation and continuously explore new technologies and approaches to deliver cutting-edge solutions.
- Humility: We maintain professional modesty and recognize that there is always room for improvement.

Employee Advantages:
- Dynamic startup environment: a fast-paced, entrepreneurial environment where employees can learn, grow, and make a significant impact.
- Career growth opportunities: ample opportunities for career advancement and professional development.
- Performance-based incentives: competitive compensation and performance-based bonuses for high-performing employees.
- Equity participation: equity options for eligible employees, providing ownership and a stake in the company's success.
- Professional development reimbursement: we encourage continuous learning and reimburse employees for eligible professional development expenses.

Join Ccube and be part of a team that is shaping the future of data and AI in the cloud.
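For illustration, a minimal sketch of the data sanity validations this role describes, using Pandas; the columns and the 10x-median outlier rule are illustrative assumptions.

```python
# Completeness/duplication summary plus a simple outlier screen with Pandas.
import pandas as pd

def sanity_report(df: pd.DataFrame) -> dict:
    """Summarize row count, duplicate rows, and per-column null fraction."""
    return {
        "row_count": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "null_pct": df.isna().mean().round(3).to_dict(),
    }

if __name__ == "__main__":
    df = pd.DataFrame({"volume": [100, 102, 98, 5000],
                       "price": [1.0, None, 1.1, 0.9]})
    print(sanity_report(df))

    # Simple outlier screen: flag values far above the column median
    # (the 10x factor is an arbitrary illustrative threshold).
    suspects = df[df["volume"] > 10 * df["volume"].median()]
    print("suspect rows:\n", suspects)
```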

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Summary:
- Excellent authoring skills and ability to independently build resources
- Ability to solve complex business problems and deliver client delight
- Strong analytical and writing skills to build viewpoints on industry trends
- Excellent communication, interpersonal and presentation skills
- Cross-cultural competence with an ability to thrive in a dynamic environment

Roles & Responsibilities: As a part of our Supply Chain and Operations practice, you will help organizations reimagine and transform their supply chains for tomorrow, with a positive impact on the business, society and the planet. Together, let's innovate, build competitive advantage, and improve business and societal outcomes in an ever-changing, ever-challenging world. Help us make supply chains work better, faster, and more resilient, with the following initiatives:
- Support clients and teams in the design, development and implementation of new and improved business processes, enabling technology in supply chain related projects.
- Participate in supply chain planning process and requirement discussions with the client to configure the data structure or data model accordingly.
- Work with the client on the design, development and testing of supply chain implementation projects.
- Design apt solutions by considering the built-in as well as configurable capabilities of Kinaxis RapidResponse.
- Work with the client team to understand the system landscape.
- Run workshops with the single points of contact for each legacy system being integrated with Kinaxis RapidResponse.
- Provide data specification documents based on the Kinaxis RapidResponse configuration.
- Create namespaces or tables based on the client's current data flow.
- Create transformation workbooks and design test scripts for configuration testing, and train the integration team on the client business solution.
- Ensure RapidResponse is integrated with the client's systems.

Professional & Technical Skills:
- MBA from a Tier-1 B-school
- 5+ years of experience working as an Integration Architect on Kinaxis RapidResponse
- End-to-end implementation experience as an Integration Architect
- Experience with Data Integration server or the Talend tool
- Experience across industries such as Life Sciences, Auto, and Consumer Packaged Goods preferred
- Knowledge of the different scheduled tasks and scripts required to set up a consistent flow of data
- Good understanding of Extraction, Transformation, Load (ETL) concepts to proactively troubleshoot integration issues
- Experience managing Kinaxis RapidResponse Administrator implementations and coordinating key stakeholders within the same project
- Must be a certified RapidResponse Administrator, Level 2

Posted 1 month ago

Apply

2.0 years

0 Lacs

Chandigarh, India

On-site

Company Profile
Since 2003, Oceaneering's India Center has been an integral part of operations for Oceaneering's robust product and service offerings across the globe. This center caters to diverse business needs, from oil and gas field infrastructure and subsea robotics to automated material handling & logistics. Our multidisciplinary team offers a wide spectrum of solutions, encompassing Subsea Engineering, Robotics, Automation, Control Systems, Software Development, Asset Integrity Management, Inspection, ROV operations, Field Network Management, Graphics Design & Animation, and more. In addition to these technical functions, Oceaneering India Center plays host to several crucial business functions, including Finance, Supply Chain Management (SCM), Information Technology (IT), Human Resources (HR), and Health, Safety & Environment (HSE). Our world-class infrastructure in India includes modern offices, industry-leading tools and software, equipped labs, and beautiful campuses aligned with the future way of work. Oceaneering in India, as well as globally, has a great work culture that is flexible, transparent, and collaborative, with great team synergy. At Oceaneering India Center, we take pride in "Solving the Unsolvable" by leveraging the diverse expertise within our team. Join us in shaping the future of technology and engineering solutions on a global scale.

Position Summary
Responsible for identifying data quality issues, analyzing data sets, and working with various teams to improve data quality across the organization.

Duties And Responsibilities
- Perform data profiling and analysis to assess the quality of data across different systems and sources.
- Identify and report data quality issues, including missing, duplicate, or inconsistent data, and recommend corrective actions.
- Monitor data quality KPIs (e.g., completeness, accuracy, timeliness, consistency) and track improvements over time.
- Implement data quality checks and validation rules to ensure that data meets the organization's standards.
- Collaborate with data stewards, business analysts, and other teams to perform data cleansing activities, including data correction, enrichment, and de-duplication.
- Support the development and implementation of data standardization practices across the organization to ensure consistency in data entry and processing.
- Conduct root cause analysis of data quality issues and work closely with technical teams to identify and resolve the underlying problems.
- Track recurring data quality issues and develop long-term strategies to prevent them from reoccurring.
- Work with data governance and infrastructure teams to implement automated processes to improve data quality.
- Support data governance initiatives by helping define and enforce data quality standards, policies, and procedures.
- Document data quality processes and contribute to data governance documentation, including data dictionaries and metadata management.
- Collaborate with data engineering, data management, business intelligence, and IT teams to implement data quality best practices.
- Work with business units to understand their data needs and ensure data quality processes align with business objectives.

Qualifications
- Bachelor's degree in Data Management, Information Systems, Computer Science, Statistics, or a related field.
- Certification in Data Quality, Data Governance, or a similar area is a plus.
- Experience: 2+ years of experience in data quality analysis, data management, or data governance.
- Experience with data profiling, cleansing, and validation tools (e.g., Informatica Data Quality, Talend, Microsoft Purview, Trillium).
- Strong proficiency in SQL for querying and analyzing large datasets.

Knowledge, Skills, Abilities, And Other Characteristics
- Strong understanding of data quality dimensions (accuracy, completeness, consistency, uniqueness, and timeliness).
- Experience with data profiling and analysis techniques to identify data anomalies and issues.
- Ability to perform data validation, root cause analysis, and data cleansing tasks.
- Proficiency in data visualization and reporting tools like Tableau, Power BI, or Excel.
- Strong analytical skills with the ability to problem-solve and make data-driven recommendations.
- Excellent attention to detail and ability to handle complex data sets.

Preferred Qualifications
- Experience with data governance tools or data catalog systems (e.g., Collibra, Alation).
- Familiarity with cloud-based data platforms (e.g., AWS, Azure, Google Cloud).
- Knowledge of data privacy and compliance regulations (GDPR, CCPA) and how they impact data quality practices.

How To Apply
Oceaneering's policy is to provide equal employment opportunity to all applicants. Regular full-time employees who apply will be considered along with external candidates. Employees with less than six months in their current position are not eligible to apply for job postings. Please discuss your interest in the position with your current manager/supervisor prior to submitting your completed application. It is highly recommended to apply through the PeopleSoft or Oceanet portals.

Learning And Development
In addition, we make a priority of providing learning and development opportunities to enable employees to achieve their potential and take charge of their future. As well as developing employees in a specific role, we are committed to lifelong learning and ongoing education, including developing people skills and identifying future supervisors and managers. Every month, hundreds of employees are provided training, including HSE awareness, apprenticeships, entry and advanced level technical courses, management development seminars, and leadership and supervisory training. We have a strong ethos of internal promotion. We can offer long-term employment and career advancement across countries and continents. Working at Oceaneering means that if you have the ability, drive, and ambition to take charge of your future, you will be supported to do so and the possibilities are endless.

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: PySpark, Apache Spark, Talend ETL
Minimum experience: 5 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams, make team decisions, and provide solutions to problems. Engaging with multiple teams, you will contribute to key decisions and provide solutions for your immediate team and across multiple teams. In this role, you will have the opportunity to showcase your creativity and technical expertise in designing and building applications.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Design, build, and configure applications to meet business process and application requirements
- Collaborate with cross-functional teams to gather and define application requirements
- Develop and implement software solutions using the Databricks Unified Data Analytics Platform
- Perform code reviews and ensure adherence to coding standards
- Troubleshoot and debug applications to identify and resolve issues
- Optimize application performance and ensure scalability
- Document technical specifications and user manuals for applications
- Stay updated with emerging technologies and industry trends
- Train and mentor junior developers to enhance their technical skills

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform
- Good-to-have skills: Experience with PySpark, Apache Spark, Talend ETL
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity (a short munging sketch follows this listing)

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- A 15 years full time education is required
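For illustration, a minimal sketch of the data munging steps the posting lists: cleaning, transformation, and min-max normalization, shown with Pandas (the column names and values are hypothetical).

```python
# Cleaning, transformation, and min-max normalization with Pandas.
import pandas as pd

df = pd.DataFrame({
    "city": [" Bengaluru", "bengaluru", None, "Pune "],
    "revenue": [120.0, 80.0, 100.0, None],
})

# Cleaning: trim and standardize case, drop rows missing the key field.
df["city"] = df["city"].str.strip().str.title()
df = df.dropna(subset=["city"])

# Transformation: fill missing revenue with the column median.
df["revenue"] = df["revenue"].fillna(df["revenue"].median())

# Normalization: rescale revenue to [0, 1] (min-max).
rng = df["revenue"].max() - df["revenue"].min()
df["revenue_norm"] = (df["revenue"] - df["revenue"].min()) / rng if rng else 0.0

print(df)
```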

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: PySpark
Good to have skills: Databricks Unified Data Analytics Platform, Apache Spark, Talend ETL
Minimum experience: 5 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with the team to develop and implement solutions, ensuring the applications are aligned with the business needs. You will also engage with multiple teams, contribute to key decisions, and provide problem-solving solutions for your team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Design, build, and configure applications based on business process and application requirements
- Collaborate with the team to develop and implement solutions
- Ensure the applications are aligned with the business needs

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark
- Good-to-have skills: Experience with Talend ETL, Apache Spark, Databricks Unified Data Analytics Platform
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark
- This position is based in Mumbai
- A 15 years full-time education is required

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Role: Data Warehouse / BI Testing
Location: Chennai / Hyderabad / Bangalore / Mumbai / Ahmedabad / Indore
Experience: 4 to 10 years

- 4+ years of experience in software testing with a strong focus on Data Warehousing, ETL testing, and BI testing.
- Solid understanding of data warehouse concepts (star schema, snowflake schema, OLAP, and dimensional modeling); a star-schema validation sketch follows this listing.
- Proficiency in writing complex SQL queries for data validation, reconciliation, and transformation testing.
- Hands-on experience testing ETL tools such as Informatica, Talend, Apache Airflow, or dbt.
- Expertise in testing and validating reports and dashboards built on BI tools: Tableau, Power BI, ThoughtSpot.
- Familiarity with cloud-based DWH platforms like Snowflake, Databricks, AWS Redshift, or Azure Synapse.
- Experience with defect management tools such as Jira, TestRail, or Azure DevOps.
- Strong analytical skills with the ability to troubleshoot data quality and performance issues.
- Experience with performance testing and optimization for BI dashboards and large-scale datasets.
- Excellent communication, leadership, and stakeholder management skills.
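For illustration, a runnable sketch of a common dimensional-model validation in this kind of role: checking a star-schema fact table for orphaned dimension keys. The schema and data are hypothetical, built in in-memory SQLite.

```python
# Orphaned-key check: fact rows whose dimension key has no match.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER, product_key INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO fact_sales VALUES (10, 1, 9.99), (11, 2, 4.50), (12, 99, 7.25);
""")

orphans = cur.execute("""
    SELECT f.sale_id, f.product_key
    FROM fact_sales f
    LEFT JOIN dim_product d ON d.product_key = f.product_key
    WHERE d.product_key IS NULL
""").fetchall()
print("fact rows with no matching dimension:", orphans)  # [(12, 99)]
```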

Posted 1 month ago

Apply

6.0 years

0 Lacs

Andhra Pradesh, India

On-site

Senior Developer with 6 to 8 years of experience and special emphasis on PySpark, Python and SQL, along with ETL tools (Talend / Ab Initio / Informatica / similar). Must also have good exposure to ETL tools in order to understand existing flows, rewrite them into Python and PySpark, and execute the test plans.
- 3+ years of sound knowledge of PySpark to implement ETL logic.
- Proficiency in data modeling and design, including PL/SQL development.
- Creating test plans to understand the current ETL flow and rewriting it in PySpark (a short sketch follows this listing).
- Providing ongoing support and maintenance for ETL applications, including troubleshooting and resolving issues.
- Expertise in practices like Agile, peer reviews, and continuous integration.
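For illustration, a minimal sketch of rewriting a simple ETL rule in PySpark, as this role describes; the schema, filter rule, and tax factor are hypothetical, and a local PySpark installation is assumed.

```python
# A simple ETL rule (filter, default missing values, derive a column) in PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl_rewrite_sketch").getOrCreate()

orders = spark.createDataFrame(
    [(1, "IN", 120.0), (2, "US", 80.0), (3, "IN", None)],
    ["order_id", "country", "amount"],
)

transformed = (
    orders
    .filter(F.col("country") == "IN")           # keep one market
    .fillna({"amount": 0.0})                     # default missing amounts
    .withColumn("amount_with_tax",               # derived column (18% is illustrative)
                F.round(F.col("amount") * 1.18, 2))
)

transformed.show()
spark.stop()
```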

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Ahmedabad

Work from Office

Travel Designer Group
Founded in 1999, Travel Designer Group has consistently achieved remarkable milestones in a relatively short span of time. While we embody the agility, growth mindset, and entrepreneurial energy typical of start-ups, we bring with us over 24 years of deep-rooted expertise in the travel trade industry. As a leading global travel wholesaler, we serve as a vital bridge connecting hotels, travel service providers, and an expansive network of travel agents worldwide. Our core strength lies in sourcing, curating, and distributing high-quality travel inventory through our award-winning B2B reservation platform, RezLive.com. This enables travel trade professionals to access real-time availability and competitive pricing to meet the diverse needs of travelers globally. Our expanding portfolio includes innovative products such as:
* Rez.Tez
* Affiliate.Travel
* Designer Voyages
* Designer Indya
* RezRewards
* RezVault
With a presence in 32+ countries and a growing team of 300+ professionals, we continue to redefine travel distribution through technology, innovation, and a partner-first approach.
Website: https://www.traveldesignergroup.com/

Profile: ETL Developer

ETL Tools (any 1): Talend / Apache NiFi / Pentaho / AWS Glue / Azure Data Factory / Google Dataflow
Workflow & Orchestration (any 1, good to have, not mandatory): Apache Airflow / dbt (Data Build Tool) / Luigi / Dagster / Prefect / Control-M (a minimal Airflow sketch follows this listing)
Programming & Scripting: SQL (advanced); Python (mandatory); Bash/Shell (mandatory); Java or Scala (optional, for Spark)
Databases & Data Warehousing: MySQL / PostgreSQL / SQL Server / Oracle (mandatory); Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse Analytics, MongoDB / Cassandra (good to have)
Cloud & Data Storage (any 1-2): AWS S3 / Azure Blob Storage / Google Cloud Storage (mandatory); Kafka / Kinesis / Pub/Sub

Interested candidates can also share their resume at shivani.p@rezlive.com
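For illustration, a minimal sketch of the kind of orchestration the posting lists, written as an Apache Airflow DAG; the task logic and schedule are illustrative assumptions.

```python
# A two-task Airflow DAG: extract then load, run daily.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling source data")   # placeholder for a real extract step

def load():
    print("loading into the warehouse")  # placeholder for a real load step

with DAG(
    dag_id="nightly_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```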

Posted 1 month ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Job Description
Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

Pay And Benefits
- Competitive compensation, including base pay and annual incentive
- Comprehensive health and life insurance and well-being benefits, based on location
- Pension / Retirement benefits
- Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being
- DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays and a third day unique to each team or employee)

The Impact You Will Have In This Role
We are seeking a skilled Talend Developer with expertise in Power BI development and SQL Server to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes using Talend and creating insightful data visualizations with Power BI, and must be an expert in writing stored procedures and queries on MS SQL Server databases (a short sketch follows this listing). This role requires coordination with multiple geographically dispersed groups (business partners, subject matter experts, senior technologists, analysts, as well as infrastructure teams) across the enterprise to implement solutions.

What You'll Do
- Design, develop, and maintain ETL processes using Talend to extract, transform, and load data from various sources.
- Create and maintain data visualizations and dashboards using Power BI to provide actionable insights to stakeholders.
- Write high-performance queries on SQL Server databases, ensuring data integrity, performance, and security.
- Collaborate with cross-functional teams to gather requirements, design solutions, and implement data integration and reporting solutions.
- Troubleshoot and resolve issues related to ETL processes, data visualizations, and database performance.
- Collaborate with other team members and analysts through the delivery cycle.
- Participate in an Agile delivery team that builds high quality and scalable work products.
- Assist in the evaluation of upcoming technologies and contribute to the overall solution design.
- Support production releases and maintenance windows, working with the Operations team.

Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field.

Talents Needed For Success
- 4+ years in writing ETL processes
- Proven experience as a Talend Developer, with a strong understanding of ETL processes and data integration
- Proficiency in Power BI development, including creating dashboards, reports, and data models
- Expertise in SQL Server, including database design, optimization, and performance tuning
- Strong understanding of agile processes (Kanban and Scrum) and a working knowledge of JIRA is required
- Strong analytical and problem-solving skills, with the ability to work independently and as part of a team
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels

Additional Skills For Success
- Talend expertise: Proficiency in using Talend Studio for data integration, data quality, and file manipulation. This includes designing and developing ETL processes, creating and managing Talend jobs, and using Talend components for data transformation and integration.
- Data integration knowledge in Talend: Understanding of data integration concepts and best practices. This includes experience with data extraction, transformation, and loading (ETL) processes, as well as knowledge of data warehousing and data modeling.
- Architectural experience: Experience in designing and implementing ETL architecture. This includes defining ETL workflows, designing data pipelines, and ensuring data quality and consistency. Knowledge of ETL performance optimization and troubleshooting is also essential.
- Database skills: Proficiency in working with various databases, including MS SQL and/or Oracle databases. This includes writing complex SQL queries, understanding database schemas, and performing data migrations.
- Integration knowledge: Experience with integrating Talend with various data sources and targets. This includes knowledge of APIs, web services, and other data integration methods. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud) is also beneficial.
- Version control and collaboration: Experience with version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence). This is important for managing code changes, collaborating with team members, and tracking project progress.
- Job scheduling and automation: Experience with job scheduling and automation tools. This includes setting up and managing Talend jobs using schedulers like Talend Administration Center (TAC), Autosys, or third-party tools to automate ETL workflows.
- Data visualization: Ability to create visually appealing and insightful reports and dashboards using Power BI. This involves selecting appropriate visualizations, designing layouts, and using custom visuals when necessary.
- Power Query: Expertise in using Power Query for data transformation and preparation. This involves cleaning, merging, and shaping data from various sources.
- Integration with other tools: Familiarity with integrating Power BI with other tools and platforms, such as Excel, SharePoint, and other on-prem systems.
- Expertise in scripting languages such as Python and Shell/Batch programming is a plus.

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
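For illustration, a minimal sketch of calling a SQL Server stored procedure from Python with pyodbc, of the kind this role's stored-procedure work implies. The connection string, procedure name, and parameter are hypothetical, and a reachable SQL Server with the ODBC driver installed is assumed.

```python
# Calling a (hypothetical) SQL Server stored procedure via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;"
    "DATABASE=reporting;Trusted_Connection=yes;"  # hypothetical connection details
)
cur = conn.cursor()

# Execute the procedure with one parameter; name and argument are illustrative.
cur.execute("EXEC dbo.usp_daily_positions @as_of_date = ?", "2024-06-30")
for row in cur.fetchall():
    print(row)

conn.close()
```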

Posted 1 month ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200+ needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose (people), then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you.

Job Summary
The Analytics Consultant I is a business-intelligence-focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions and UKG Datahub. The candidate is also responsible for interacting with other business and technical project stakeholders to gather business requirements and ensure successful delivery. The candidate should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Analytics Consultant I will also be responsible for developing custom analytics solutions and reports to the specifications provided and supporting the solutions delivered. The candidate must be able to effectively communicate ideas both verbally and in writing at all levels in the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations.

Key Responsibilities
- Interact with other business and technical project stakeholders to gather business requirements
- Deploy and configure the UKG Analytics and Data Hub products based on the Design Documents
- Develop and deliver best-practice visualizations and dashboards using a BI tool such as Cognos, BIRT or Power BI
- Put together a test plan, validate the solution deployed and document the results
- Provide support during production cutover, and after go-live act as the first level of support for any requests that come through from the customer or other Consultants
- Analyze the customer’s data to spot trends and issues and present the results back to the customer

Required Qualifications
- 1-3 years’ experience designing and delivering Analytical/Business Intelligence solutions required
- Cognos, BIRT, Power BI or other business intelligence toolset experience required
- ETL experience using Talend or other industry-standard ETL tools strongly preferred
- Advanced SQL proficiency is a plus
- Knowledge of Google Cloud Platform or Azure or something similar is desired, but not required
- Knowledge of Python is desired, but not required
- Willingness to learn new technologies and adapt quickly is required
- Strong interpersonal and problem-solving skills
- Flexibility to support customers in different time zones is required

Where we’re going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!

UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.

Disability Accommodation
For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com

Posted 1 month ago

Apply

1.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200+ needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose (people), then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you.

Job Summary
The Analytics Consultant I is a business-intelligence-focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions and UKG Datahub. The candidate is also responsible for interacting with other business and technical project stakeholders to gather business requirements and ensure successful delivery. The candidate should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Analytics Consultant I will also be responsible for developing custom analytics solutions and reports to the specifications provided and supporting the solutions delivered. The candidate must be able to effectively communicate ideas both verbally and in writing at all levels in the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations.

Key Responsibilities
- Interact with other business and technical project stakeholders to gather business requirements
- Deploy and configure the UKG Analytics and Data Hub products based on the Design Documents
- Develop and deliver best-practice visualizations and dashboards using a BI tool such as Cognos, BIRT or Power BI
- Put together a test plan, validate the solution deployed and document the results
- Provide support during production cutover, and after go-live act as the first level of support for any requests that come through from the customer or other Consultants
- Analyze the customer’s data to spot trends and issues and present the results back to the customer

Required Qualifications
- 1-3 years’ experience designing and delivering Analytical/Business Intelligence solutions required
- Cognos, BIRT, Power BI or other business intelligence toolset experience required
- ETL experience using Talend or other industry-standard ETL tools strongly preferred
- Advanced SQL proficiency is a plus
- Knowledge of Google Cloud Platform or Azure or something similar is desired, but not required
- Knowledge of Python is desired, but not required
- Willingness to learn new technologies and adapt quickly is required
- Strong interpersonal and problem-solving skills
- Flexibility to support customers in different time zones is required

Where we’re going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!

UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.

Disability Accommodation
For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com

Posted 1 month ago

Apply

3.0 years

4 - 10 Lacs

Coimbatore

Remote

Industry: IT
Qualification: Any Degree
Required Skills: Python, Pandas, SQL
Working Shift: 2PM to 11PM IST
City: Coimbatore
Country: India
Name of the position: Data Engineer
Location: Remote
No. of resources needed: 01
Mode: Contract (2 months with possible extension)
Years of experience: 3+ years
Shift: UK shift

Job Summary:
We are looking for a highly motivated and detail-oriented Data Engineer with a strong background in data cleansing, Python scripting, and SQL to join our team. The ideal candidate will play a critical role in ensuring data quality, transforming raw datasets into actionable insights, and supporting data-driven decision-making across the organization.

Key Responsibilities:
Design and implement efficient data cleansing routines to remove duplicates, correct anomalies, and validate data integrity.
Write robust Python scripts to automate data processing, transformation, and integration tasks.
Develop and optimize SQL queries for data extraction, aggregation, and reporting.
Work closely with data analysts, business stakeholders, and engineering teams to understand data requirements and deliver clean, structured datasets.
Build and maintain data pipelines that support large-scale data processing.
Monitor data workflows and troubleshoot issues to ensure accuracy and reliability.
Contribute to documentation of data sources, transformations, and cleansing logic.

Requirements:
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
3+ years of hands-on experience in data engineering, with a focus on data quality and cleansing.
Strong proficiency in Python, including libraries like Pandas and NumPy.
Expert-level knowledge of SQL and working with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
Familiarity with data profiling tools and techniques.
Excellent problem-solving skills and attention to detail.
Good communication and documentation skills.

Preferred Qualifications:
Experience with cloud platforms (AWS, Azure, GCP) and data services (e.g., S3, BigQuery, Redshift).
Knowledge of ETL tools like Apache Airflow, Talend, or similar.
Exposure to data governance and data cataloging practices.
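As a rough illustration of the cleansing routines this role centers on, a minimal Pandas sketch might look like the following; the file, column names, and validation rules are illustrative assumptions, not the employer's pipeline.

```python
import pandas as pd

df = pd.read_csv("customers_raw.csv")

# Normalize obvious formatting noise before deduplication
df["email"] = df["email"].str.strip().str.lower()
df["name"] = df["name"].str.strip().str.title()

# Drop duplicates, keeping the most recently updated record per email
df = df.sort_values("updated_at").drop_duplicates(subset=["email"], keep="last")

# Validate integrity: flag rows that break simple business rules
invalid = df[(df["age"] < 0) | (df["age"] > 120) | df["email"].isna()]
clean = df.drop(invalid.index)

clean.to_csv("customers_clean.csv", index=False)
print(f"kept {len(clean)} rows, quarantined {len(invalid)} for review")
```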

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru

On-site

Job Applicant Privacy Notice
APPLICATION DEVELOPER
Publication Date: Jun 20, 2025
Ref. No: 532101
Location: Bangalore, IN

We are looking for a Data Engineer to join our team and bring the analytics practice to the next level. We are looking for a motivated person who thrives in a dynamic and challenging environment, who loves working with leading-edge tools, who has no problem switching between multiple programming languages, and who is able to find out-of-the-box solutions to complex problems. In this role, you will be at the heart of the definition and implementation of world-class analytics solutions, and you will help establish data as a strategic advantage for BRP.

Responsibilities:
Design, develop, implement, and support robust ETL/ELT/data pipelining solutions
Coordinate with multiple development teams to achieve delivery objectives
Provide support in requirements definition, estimation, development, and delivery of robust and scalable solutions
Develop and support real-time data ingestion processes from various data sources
Develop and support data models optimized for business intelligence usage
Build integrations with APIs from external providers such as Google, Facebook, Salesforce, SAP, and others
Adhere to industry standards and laws such as GDPR and SOX
Be a leader in best-practices definition and creative thinking

Required Skills:
Master's degree in Business Intelligence or equivalent
Solid, demonstrated experience with the following technologies: Snowflake, dbt, Talend and/or Azure Data Factory, Microsoft SQL Server, Power BI
Fluency in various programming languages such as T-SQL, Python, JavaScript / Node.js, and Java
Understands and puts into practice data modeling and the design and development of solid ETL/ELT data pipelines
Fluent in writing, executing, and optimizing complex SQL queries
Experience implementing API service architectures (REST)
Develops clean and maintainable code in a CI/CD environment
Experience using cloud BI technologies such as Azure or similar
Experience translating business requirements into advanced data models able to fulfill analyst and data scientist requirements
Experience in data profiling
Experience working within an agile team to build big data / analytics solutions
Strong interpersonal skills; motivated and enjoys working on multiple challenging projects
Strong communication skills, both spoken and written, in both French and English
Open-minded and able to adapt to new ways of working (data vault, event-driven architecture, unstructured data, self-service analytics, etc.)
Well organized and able to self-prioritize, sometimes against conflicting deadlines
Continuously seeks improvements and is eager to get hands-on with new technologies
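For a sense of the API-to-warehouse ingestion pattern this role describes, here is a minimal sketch; the endpoint URL, token, connection string, and table name are placeholders for the example (the Snowflake DSN assumes the snowflake-sqlalchemy dialect), not BRP systems.

```python
import requests
import pandas as pd
from sqlalchemy import create_engine

API_URL = "https://api.example.com/v1/orders"  # hypothetical provider endpoint

def fetch_all(url: str, token: str) -> list[dict]:
    """Page through a REST API until the server stops returning results."""
    rows, page = [], 1
    while True:
        resp = requests.get(
            url,
            params={"page": page},
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            return rows
        rows.extend(batch)
        page += 1

def load(rows: list[dict]) -> None:
    # Land the raw payload; downstream models (e.g. dbt) handle transformation
    engine = create_engine("snowflake://user:pass@account/db/schema")  # placeholder DSN
    pd.DataFrame(rows).to_sql("raw_orders", engine, if_exists="append", index=False)

load(fetch_all(API_URL, token="..."))
```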

Posted 1 month ago

Apply

1.0 - 9.0 years

5 - 8 Lacs

Bengaluru

On-site

Job requisition ID :: 84728
Date: Jun 22, 2025
Location: Bengaluru
Designation: Consultant
Entity: Technology & Transformation - EAD: ETL Testing - Analyst/Consultant/Senior Consultant

Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond.
At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Your work profile:
As an Analyst/Consultant/Senior Consultant in our T&T team you'll build and nurture positive working relationships with teams and clients with the intention of exceeding client expectations:
Develop and execute automated test cases for ETL processes.
Validate data transformation, extraction, and loading accuracy.
Collaborate with data engineers and QA teams to understand ETL workflows.
Identify and document defects and inconsistencies.
Maintain test documentation and support manual testing efforts.
Design and implement automated ETL test scripts and frameworks.
Validate end-to-end data flows and transformation logic.
Collaborate with data architects, developers, and QA teams.
Integrate ETL testing into CI/CD pipelines where applicable.
Analyze test results and troubleshoot data issues.
Lead the architecture and development of advanced ETL automation frameworks.
Drive best practices in ETL testing and data quality assurance.
Mentor and guide junior consultants and analysts.
Collaborate with stakeholders to align testing strategies with business goals.
Integrate ETL testing within DevOps and CI/CD pipelines.

Desired Qualifications:
1 to 9 years' experience in ETL testing and automation.
Knowledge of ETL tools such as Informatica, Talend, or DataStage.
Experience with SQL and database querying.
Basic scripting or programming skills (Python, Shell, etc.).
Good analytical and communication skills.
Strong SQL skills and experience with ETL tools like Informatica, Talend, or DataStage.
Proficiency in scripting languages for automation (Python, Shell, etc.).
Knowledge of data warehousing concepts and best practices.
Strong problem-solving and communication skills.
Expert knowledge of ETL tools and strong SQL proficiency.
Experience with automation scripting and data validation techniques.
Strong leadership, communication, and stakeholder management skills.
Familiarity with big data technologies and cloud platforms is a plus.

Location and way of working:
Base location: Bangalore
This profile involves occasional travel to client locations.
Hybrid is our default way of working. Each domain has customized the hybrid approach to its unique needs.
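To make the automated ETL testing responsibility concrete, a minimal reconciliation test (runnable under pytest) might look like the sketch below; the connection strings, table names, and queries are assumptions for illustration, not Deloitte tooling.

```python
import sqlalchemy as sa

SOURCE = sa.create_engine("postgresql://user:pass@source-db/sales")  # placeholder
TARGET = sa.create_engine("postgresql://user:pass@warehouse/sales")  # placeholder

def scalar(engine, query):
    """Run a single-value SQL query and return the result."""
    with engine.connect() as conn:
        return conn.execute(sa.text(query)).scalar()

def test_row_counts_match():
    # Extraction/load completeness: every source row should land in the target
    assert scalar(SOURCE, "SELECT COUNT(*) FROM orders") == \
           scalar(TARGET, "SELECT COUNT(*) FROM stg_orders")

def test_amount_totals_match():
    # Transformation accuracy: aggregates should survive the pipeline unchanged
    assert scalar(SOURCE, "SELECT SUM(amount) FROM orders") == \
           scalar(TARGET, "SELECT SUM(amount) FROM stg_orders")
```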
How you'll grow

Connect for impact: Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.

Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.

Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.

Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone's welcome… entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.

Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.

Posted 1 month ago

Apply

1.0 years

6 - 9 Lacs

Noida

On-site

Job Description
Job ID: ANALY014395
Employment Type: Regular
Work Style: On-site
Location: Noida, UP, India
Role: Analytics Consultant I

Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.
At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose, people, then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you.

Job Summary
The Analytics Consultant I is a business intelligence-focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions, and UKG Datahub. The candidate is also responsible for interacting with business and technical project stakeholders to gather business requirements and ensure successful delivery. The candidate should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Analytics Consultant I will also be responsible for developing custom analytics solutions and reports to specifications provided, and for supporting the solutions delivered. The candidate must be able to effectively communicate ideas both verbally and in writing at all levels in the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations.

Key Responsibilities:
Interact with business and technical project stakeholders to gather business requirements
Deploy and configure the UKG Analytics and Data Hub products based on the design documents
Develop and deliver best-practice visualizations and dashboards using BI tools such as Cognos, BIRT, or Power BI
Put together a test plan, validate the deployed solution, and document the results
Provide support during production cutover and, after go-live, act as the first level of support for any requests that come through from the customer or other consultants
Analyze the customer's data to spot trends and issues and present the results back to the customer

Required Qualifications:
1-3 years' experience designing and delivering analytical/business intelligence solutions required
Cognos, BIRT, Power BI, or other business intelligence toolset experience required
ETL experience using Talend or other industry-standard ETL tools strongly preferred
Advanced SQL proficiency is a plus
Knowledge of Google Cloud Platform, Azure, or similar is desired, but not required
Knowledge of Python is desired, but not required
Willingness to learn new technologies and adapt quickly is required
Strong interpersonal and problem-solving skills
Flexibility to support customers in different time zones is required

Where we're going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!
UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.

Disability Accommodation
For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role Description
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns, and reusing proven solutions.
Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
Document and communicate milestones/stages for end-to-end delivery.
Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
Validate results with user representatives, integrating the overall solution seamlessly.
Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
Adherence to engineering processes and standards
Adherence to schedule/timelines
Adherence to SLAs where applicable
Number of defects post delivery
Number of non-compliance issues
Reduction of recurrence of known defects
Quick turnaround of production bugs
Completion of applicable technical/domain certifications
Completion of all mandatory training requirements
Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
Average time to detect, respond to, and resolve pipeline failures or data issues
Number of data security incidents or compliance breaches

Outputs Expected:
Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
Project Management: Manage the delivery of modules effectively.
Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release Management: Execute and monitor the release process to ensure smooth transitions.
Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
Proficiency in SQL, Python, or other programming languages used for data manipulation.
Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Experience in performance tuning of data processes.
Expertise in designing and optimizing data warehouses for cost efficiency.
Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
Capacity to clearly explain and communicate design and development aspects to customers.
Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples:
Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF.
Proficiency in SQL for analytics, including windowing functions.
Understanding of data schemas and models relevant to various business contexts.
Familiarity with domain-related data and its implications.
Expertise in data warehousing optimization techniques.
Knowledge of data security concepts and best practices.
Familiarity with design patterns and frameworks in data engineering.

Additional Comments:
Tech skills:
Proficient in Python (including popular Python packages, e.g., Pandas, NumPy) and SQL
Strong background in distributed data processing and storage (e.g., Apache Spark, Hadoop)
Large-scale (TBs of data) data engineering skills: model data and create production-ready ETL pipelines
Development experience with at least one cloud (Azure highly preferred; AWS, GCP)
Knowledge of data lake and data lakehouse patterns
Knowledge of ETL performance tuning and cost optimization
Knowledge of data structures and algorithms and good software engineering practices
Soft skills:
Strong communication skills to articulate complex situations concisely
Comfortable picking up new technologies independently
Eye for detail, good data intuition, and a passion for data quality
Comfortable working in a rapidly changing environment with ambiguous requirements
Skills: Python, SQL, AWS, Azure
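As a self-contained illustration of the SQL windowing functions called out above, the following sketch uses SQLite (which supports window functions from version 3.25) so it runs without external services; the table and data are made up for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue INTEGER);
    INSERT INTO sales VALUES
        ('east', '2024-01', 100), ('east', '2024-02', 130),
        ('west', '2024-01', 90),  ('west', '2024-02', 70);
""")

# Running total per region: a typical analytics windowing query
for row in conn.execute("""
    SELECT region, month, revenue,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
"""):
    print(row)
```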

Posted 1 month ago

Apply

3.0 - 8.0 years

16 - 20 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office

We're Hiring: Data Governance Developer (Microsoft Purview)
Locations: Hyderabad / Indore / Ahmedabad (Work from Office)
Experience: 4-6 years
Budget: depending on experience & skills
Apply by sharing your resume with: current CTC, expected CTC, notice period, preferred location
Email your profile to: navaneetha@suzva.com
Contact: +91 90329 56160

Role Overview:
As a Data Governance Developer at Kanerika, you will be responsible for developing and managing robust metadata, lineage, and compliance frameworks using Microsoft Purview and other leading tools. You'll work closely with engineering and business teams to ensure data integrity, regulatory compliance, and operational transparency.

Key Responsibilities:
Set up and manage Microsoft Purview: accounts, collections, RBAC, and policies.
Integrate Purview with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake.
Schedule and monitor metadata scanning, classification, and lineage tracking jobs.
Build ingestion workflows for technical, business, and operational metadata.
Tag, enrich, and organize assets with glossary terms and metadata.
Automate lineage, glossary, and scanning processes via REST APIs, PowerShell, ADF, and Logic Apps.
Design and enforce classification rules for PII, PCI, and PHI.
Collaborate with domain owners on glossary and metadata quality governance.
Generate compliance dashboards and lineage maps in Power BI.

Tools & Technologies:
Governance platforms: Microsoft Purview, Collibra, Atlan, Informatica Axon, IBM IG Catalog
Integration tools: Azure Data Factory, dbt, Talend
Automation & scripting: PowerShell, Azure Functions, Logic Apps, REST APIs
Compliance areas in Purview: Sensitivity Labels, Policy Management, Auto-labeling, Data Loss Prevention (DLP), Insider Risk Management, Records Management, Compliance Manager, Lifecycle Management, eDiscovery, Audit, DSPM, Information Barriers, Unified Catalog

Required Qualifications:
4-6 years of experience in data governance / data management.
Hands-on with Microsoft Purview, especially lineage and classification workflows.
Strong understanding of metadata management, glossary governance, and data classification.
Familiarity with Azure Data Factory, dbt, and Talend.
Working knowledge of data compliance regulations: GDPR, CCPA, SOX, HIPAA.
Strong communication skills to collaborate across technical and non-technical teams.
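To give a flavor of the classification-rule design this role involves, here is a minimal rule-based PII detection sketch of the sort one might prototype before encoding rules in a governance platform; the regex patterns and sample data are illustrative assumptions, not Purview's built-in classifiers.

```python
import re

# Hypothetical classification rules mapping tags to detection patterns
CLASSIFICATION_RULES = {
    "EMAIL":       re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "US_SSN":      re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CREDIT_CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # rough, for illustration
}

def classify(value: str) -> list[str]:
    """Return the classification tags whose patterns match the value."""
    return [tag for tag, pattern in CLASSIFICATION_RULES.items() if pattern.search(value)]

sample = {"contact": "jane@example.com", "note": "SSN 123-45-6789 on file"}
for column, value in sample.items():
    print(column, "->", classify(value) or ["UNCLASSIFIED"])
```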

Posted 1 month ago

Apply

1.0 - 5.0 years

4 - 8 Lacs

Mumbai

Work from Office

Piscis Networks is looking for a TAC Support Engineer to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
Responding to customer inquiries and resolving technical issues via phone, email, or chat
Conducting diagnostic tests to identify the root cause of customer issues
Providing technical guidance to customers and walking them through solutions to resolve their problems
Collaborating with development teams to escalate and resolve complex technical issues
Maintaining accurate records of customer interactions and issue resolutions in a CRM system
Participating in the development and delivery of customer training and support materials
Communicating with customers and internal stakeholders to provide status updates on issue resolution

Requirements:
Strong technical background and understanding of hardware and software systems
Excellent communication and interpersonal skills
Experience with CRM and ticketing systems

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
