
585 Talend Jobs - Page 18

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3.0 years

0 Lacs

Kozhikode, Kerala, India

On-site

Job Description
The role involves analyzing large and complex datasets to uncover trends, patterns, and actionable insights that support data-driven decision-making across the organization. Responsibilities include designing and building dashboards to visualize key retail KPIs and operational metrics for business stakeholders. The position also requires developing ad-hoc reports and performing deep-dive analyses to support cross-functional teams. Close collaboration with product managers and stakeholders is essential to gather reporting requirements and deliver customized data solutions.

Requirements
BSc/BCA/MSc/MCA/BTech/MTech
3+ years of experience in Data Engineering
Strong SQL skills and experience working with relational databases (e.g., MySQL, PostgreSQL)
Proficiency in Python for data pipelines and scripting
Proven ability to build reporting dashboards and visualizations for retail KPIs using tools such as Apache Superset, Metabase, Looker, Tableau, or Power BI
Strong data analysis skills with the ability to extract insights from complex datasets
Hands-on experience with ETL/ELT tools (e.g., Airflow, dbt, Talend, or custom pipelines)
Familiarity with cloud platforms (AWS, GCP, or Azure) and data services like S3, Redshift, BigQuery, or Snowflake
Familiarity with batch and streaming data pipelines (Kafka, Spark, etc.)
Strong understanding of data modeling, warehousing, and performance optimization
Version control with Git and experience working in a CI/CD environment
Ability to write clean, modular, and well-documented code

Soft Skills
Strong communication and collaboration skills
Ability to work independently and drive tasks to completion
Attention to detail and a problem-solving mindset

Benefits
Competitive Pay
Performance Bonus
Longevity Bonus
Monthly Fun & Entertainment Programs
Office Pantry filled with Tea & Snacks
Paid Time Off
Parental Leave Policy
Medical Coverage - Insurance for Employee and Family
PF / ESI
Education Allowances
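To make the core of this stack concrete: the listing combines Python pipelines, SQL sources, and Airflow-style orchestration. Below is a minimal sketch of a daily retail-KPI pipeline using Apache Airflow's TaskFlow API; the DAG id, sample data, and metric are hypothetical illustrations, not details from the posting.

```python
# Minimal sketch of a daily retail-KPI pipeline with Airflow's TaskFlow API
# (the `schedule` argument assumes Airflow 2.4+). All names are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def retail_kpi_pipeline():
    @task
    def extract_orders() -> list[dict]:
        # A real task would query MySQL/PostgreSQL through an Airflow connection.
        return [{"order_id": 1, "amount": 250.0}, {"order_id": 2, "amount": 120.5}]

    @task
    def compute_daily_revenue(orders: list[dict]) -> float:
        return sum(o["amount"] for o in orders)

    @task
    def load_kpi(revenue: float) -> None:
        # A real task would upsert into the warehouse table behind the dashboard.
        print(f"daily_revenue={revenue}")

    load_kpi(compute_daily_revenue(extract_orders()))


retail_kpi_pipeline()
```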

Posted 3 weeks ago

Apply

9.0 - 14.0 years

4 - 7 Lacs

Bengaluru

Work from Office

This role involves the development and application of engineering practice and knowledge in designing, managing, and improving the processes for industrial operations, including procurement, supply chain, and facilities engineering and maintenance. Project and change management of industrial transformations are also included in this role.

Grade Specific: Focus on Industrial Operations Engineering. Fully competent in own area. Acts as a key contributor in a more complex/critical environment. Proactively acts to understand and anticipate client needs. Manages costs and profitability for a work area. Manages own agenda to meet agreed targets. Develops plans for projects in own area. Looks beyond the immediate problem to the wider implications. Acts as a facilitator and coach and moves teams forward.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

10 - 20 Lacs

Hyderabad

Remote

Job Title: Senior Data Engineer
Location: Remote
Job Type: Full-time
Experience Level: 7+ years

About the Role: We are seeking a highly skilled Senior Data Engineer to join our team in building a modern data platform on AWS. You will play a key role in transitioning from legacy systems to a scalable, cloud-native architecture using technologies like Apache Iceberg, AWS Glue, Redshift, and Atlan for governance. This role requires hands-on experience across both legacy (e.g., Siebel, Talend, Informatica) and modern data stacks.

Responsibilities:
Design, develop, and optimize data pipelines and ETL/ELT workflows on AWS.
Migrate legacy data solutions (Siebel, Talend, Informatica) to modern AWS-native services.
Implement and manage a data lake architecture using Apache Iceberg and AWS Glue.
Work with Redshift for data warehousing solutions, including performance tuning and modeling.
Apply data quality and observability practices using Soda or similar tools.
Ensure data governance and metadata management using Atlan (or other tools like Collibra or Alation).
Collaborate with data architects, analysts, and business stakeholders to deliver robust data solutions.
Build scalable, secure, and high-performing data platforms supporting both batch and real-time use cases.
Participate in defining and enforcing data engineering best practices.

Required Qualifications:
7+ years of experience in data engineering and data pipeline development.
Strong expertise with AWS services, especially Redshift, Glue, S3, and Athena.
Proven experience with Apache Iceberg or similar open table formats (such as Delta Lake or Hudi).
Experience with legacy tools like Siebel, Talend, and Informatica.
Knowledge of data governance tools like Atlan, Collibra, or Alation.
Experience implementing data quality checks using Soda or equivalent.
Strong SQL and Python skills; familiarity with Spark is a plus.
Solid understanding of data modeling, data warehousing, and big data architectures.
Strong problem-solving skills and the ability to work in an Agile environment.
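For readers unfamiliar with the Iceberg-on-Glue pattern this role centers on, here is a minimal PySpark sketch of writing to an Iceberg table registered in the AWS Glue catalog. The catalog alias, S3 bucket, namespace, and table names are hypothetical, and the Iceberg Spark runtime JAR is assumed to be on the classpath.

```python
# Minimal sketch: PySpark writing to an Apache Iceberg table via the Glue
# catalog. Bucket/namespace/table names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-glue-demo")
    .config("spark.sql.catalog.glue", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.glue.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.glue.warehouse", "s3://example-bucket/warehouse")
    .getOrCreate()
)

df = spark.createDataFrame([(1, "2024-01-01"), (2, "2024-01-02")], ["id", "event_date"])

# Iceberg's table format provides ACID commits, schema evolution, and time
# travel on top of plain S3 objects, which is what makes a Glue-catalogued
# data lake behave like a warehouse table.
df.writeTo("glue.analytics.events").createOrReplace()
```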

Posted 3 weeks ago

Apply

3.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Wissen Technology is Hiring for Power BI Developer

About Wissen Technology: Wissen Technology is a globally recognized organization known for building solid technology teams, working with major financial institutions, and delivering high-quality solutions in IT services. With a strong presence in the financial industry, we provide cutting-edge solutions to address complex business challenges.

Role Overview: We are seeking a skilled Power BI Developer to design and develop business intelligence solutions that turn data into actionable insights. You will collaborate with cross-functional teams to understand data requirements and build interactive dashboards, reports, and data models that support strategic decision-making.

Experience: 3-6 Years
Location: Bengaluru

Key Responsibilities:
Design, develop, and deploy Power BI reports and dashboards
Connect Power BI to various data sources including SQL databases, Excel, APIs, and cloud platforms
Create data models, DAX formulas, and measures for performance-optimized reports
Understand business requirements and translate them into technical specs
Automate report refreshes, implement row-level security, and maintain data accuracy
Collaborate with stakeholders for UAT, feedback, and enhancements
Troubleshoot and resolve reporting/data issues in a timely manner

Required Skills:
3-6 years of hands-on experience in Power BI development
Strong knowledge of DAX, Power Query (M language), and data modeling
Proficiency in writing complex SQL queries and working with RDBMS (MS SQL Server, Oracle, etc.)
Experience working with Excel, CSV, and cloud-based data sources (Azure, AWS, etc.)
Familiarity with data visualization best practices
Strong communication and stakeholder management skills

Preferred Skills:
Knowledge of Power Platform (PowerApps, Power Automate)
Exposure to ETL tools (SSIS, Informatica, Talend)
Experience with Agile/Scrum methodology
Basic understanding of Python/R for data analysis is a plus
Working knowledge of Azure Data Lake, Synapse Analytics, or Data Factory

The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world-class products. We offer an array of services including Core Business Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud Adoption, Mobility, Digital Adoption, Agile & DevOps, and Quality Assurance & Test Automation. Over the years, Wissen Group has successfully delivered $1 billion worth of projects for more than 20 of the Fortune 500 companies. Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.

The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them with the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients. We have been certified as a Great Place to Work® company for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider. Great Place to Work® Certification is recognized the world over by employees and employers alike and is considered the ‘Gold Standard’. Wissen Technology has created a Great Place to Work by excelling in all dimensions - High-Trust, High-Performance Culture, Credibility, Respect, Fairness, Pride and Camaraderie.

Website: www.wissen.com
LinkedIn: https://www.linkedin.com/company/wissen-technology
Wissen Leadership: https://www.wissen.com/company/leadership-team/
Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/feedView=All
Wissen Thought Leadership: https://www.wissen.com/articles/

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

Remote

We are currently seeking an experienced Oracle Data Integrator (ODI) Developer to join our team. The successful candidate will be responsible for the design, development, and maintenance of robust ETL processes and data integration workflows using Oracle Data Integrator. This role requires strong technical expertise in data integration and a solid understanding of database systems, performance optimization, and ETL best practices.

Key Responsibilities:
Design, develop, and maintain ETL workflows using Oracle Data Integrator (ODI).
Create and optimize mappings between source systems and target data environments.
Write and tune complex SQL queries for efficient data extraction, transformation, and loading.
Design and implement data integration solutions that are scalable and high-performing.
Collaborate with cross-functional teams (data analysts, DBAs, application developers) to support data-driven initiatives.
Perform unit testing and provide support during system integration testing and user acceptance testing.
Troubleshoot and resolve issues in existing ODI integrations.
Ensure data quality, integrity, and compliance with internal and external standards.

Required Skills and Qualifications:
Hands-on experience with Oracle Data Integrator (ODI) 12c or higher.
Strong proficiency in writing and optimizing SQL (Oracle SQL preferred).
Solid understanding of ETL design principles and best practices.
Experience in designing, developing, and deploying ODI packages, scenarios, procedures, and load plans.
Knowledge of data warehousing concepts and dimensional data modeling.
Familiarity with Oracle databases, PL/SQL, and data migration strategies.

Preferred Qualifications (Nice to Have):
Experience with other data integration tools (Informatica, Talend, etc.).
Familiarity with cloud platforms (e.g., Oracle Cloud, AWS, Azure) for data integration.
Basic knowledge of data governance and security best practices.

Additional Information: This is a full-time, remote position, with occasional team collaboration hours aligned to CET. You’ll be working closely with our Data Engineering team, supporting strategic data initiatives across the organization.
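ODI itself is configured through its graphical studio, but the testing duties above often reduce to scripted checks around a load plan. The following is a minimal, hypothetical sketch of a source-vs-target reconciliation check in Python using the python-oracledb driver; connection details and table names are placeholders, not from the posting.

```python
# Minimal sketch of a staging-vs-warehouse row-count reconciliation of the
# kind used to validate an ODI load. All identifiers are hypothetical.
import oracledb


def row_count(conn, table: str) -> int:
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]


with oracledb.connect(user="etl", password="***", dsn="db-host/ORCLPDB1") as conn:
    src = row_count(conn, "STG_ORDERS")
    tgt = row_count(conn, "DW_ORDERS")
    # A mismatch would flag the ODI scenario for investigation.
    assert src == tgt, f"Row count mismatch: staging={src}, warehouse={tgt}"
```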

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

On-site

Flexera saves customers billions of dollars in wasted technology spend. A pioneer in Hybrid ITAM and FinOps, Flexera provides award-winning, data-oriented SaaS solutions for technology value optimization (TVO), enabling IT, finance, procurement and cloud teams to gain deep insights into cost optimization, compliance and risks for each business service. Flexera One solutions are built on a set of definitive customer, supplier and industry data, powered by our Technology Intelligence Platform, that enables organizations to visualize their Enterprise Technology Blueprint™ in hybrid environments, from on-premises to SaaS to containers to cloud. We’re transforming the software industry. We’re Flexera. With more than 50,000 customers across the world, we’re achieving that goal. But we know we can’t do any of that without our team. Ready to help us re-imagine the industry during a time of substantial growth and ambitious plans? Come and see why we’re consistently recognized by Gartner, Forrester and IDC as a category leader in the marketplace. Learn more at flexera.com

Job Summary: We are seeking a skilled and motivated Senior Data Engineer to join our Automation, AI/ML team. In this role, you will work on designing, building, and maintaining data pipelines and infrastructure to support AI/ML initiatives, while contributing to the automation of key processes. This position requires expertise in data engineering, cloud technologies, and database systems, with a strong emphasis on scalability, performance, and innovation.

Key Responsibilities:
Identify and automate manual processes to improve efficiency and reduce operational overhead.
Design, develop, and optimize scalable data pipelines to integrate data from multiple sources, including Oracle and SQL Server databases.
Collaborate with data scientists and AI/ML engineers to ensure efficient access to high-quality data for training and inference models.
Implement automation solutions for data ingestion, processing, and integration using modern tools and frameworks.
Monitor, troubleshoot, and enhance data workflows to ensure performance, reliability, and scalability.
Apply advanced data transformation techniques, including ETL/ELT processes, to prepare data for AI/ML use cases.
Develop solutions to optimize storage and compute costs while ensuring data security and compliance.

Required Skills and Qualifications:
Experience in identifying, streamlining, and automating repetitive or manual processes.
Proven experience as a Data Engineer, working with large-scale database systems (e.g., Oracle, SQL Server) and cloud platforms (AWS, Azure, Google Cloud).
Expertise in building and maintaining data pipelines using tools like Apache Airflow, Talend, or Azure Data Factory.
Strong programming skills in Python, Scala, or Java for data processing and automation tasks.
Experience with data warehousing technologies such as Snowflake, Redshift, or Azure Synapse.
Proficiency in SQL for data extraction, transformation, and analysis.
Familiarity with tools such as Databricks, MLflow, or H2O.ai for integrating data engineering with AI/ML workflows.
Experience with DevOps practices and tools, such as Jenkins, GitLab CI/CD, Docker, and Kubernetes.
Knowledge of AI/ML concepts and their integration into data workflows.
Strong problem-solving skills and attention to detail.

Preferred Qualifications:
Knowledge of security best practices, including data encryption and access control.
Familiarity with big data technologies like Hadoop, Spark, or Kafka.
Exposure to Databricks for data engineering and advanced analytics workflows.

Flexera is proud to be an equal opportunity employer. Qualified applicants will be considered for open roles regardless of age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by local/national laws, policies and/or regulations. Flexera understands the value that results from employing a diverse, equitable, and inclusive workforce. We recognize that equity necessitates acknowledging past exclusion and that inclusion requires intentional effort. Our DEI (Diversity, Equity, and Inclusion) council is the driving force behind our commitment to championing policies and practices that foster a welcoming environment for all. We encourage candidates requiring accommodations to please let us know by emailing careers@flexera.com.
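A common pattern behind the "automate manual pulls" responsibility in this listing is watermark-based incremental extraction: only rows changed since the last run are moved. Below is a minimal sketch against SQL Server using SQLAlchemy; the connection string, schema, and table are hypothetical, and pyodbc plus an ODBC driver are assumed to be installed.

```python
# Minimal sketch of watermark-based incremental extraction from SQL Server.
# Connection details and table names are hypothetical placeholders.
import sqlalchemy as sa

engine = sa.create_engine(
    "mssql+pyodbc://etl:***@dw-host/sales?driver=ODBC+Driver+18+for+SQL+Server"
)


def extract_since(last_watermark: str) -> list[tuple]:
    query = sa.text(
        "SELECT order_id, amount, updated_at FROM dbo.orders "
        "WHERE updated_at > :wm ORDER BY updated_at"
    )
    with engine.connect() as conn:
        return list(conn.execute(query, {"wm": last_watermark}))


# Only rows changed since the previous run are moved, keeping loads cheap;
# a scheduler (Airflow, ADF) would persist the watermark between runs.
changed = extract_since("2024-01-01T00:00:00")
```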

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Responsibility of / Expectations from the Role:
Design, develop, and deploy ETL workflows using Talend/Informatica to extract, transform, and load data from various sources.
Integrate data from different databases, APIs, flat files, and other data sources into the company’s data warehouse.
Work with data architects to design scalable data pipelines that support current and future data needs.
Perform data quality checks, cleansing, and validation to ensure data accuracy and integrity.
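The data quality checks mentioned above typically cover nulls, duplicates, and referential integrity before loading to the warehouse. Here is a minimal, hypothetical sketch of such checks in Python with pandas; the file names and columns are illustrative, not from the posting.

```python
# Minimal sketch of pre-load data quality checks: duplicates, nulls, and
# referential integrity. File and column names are hypothetical.
import pandas as pd

orders = pd.read_csv("orders.csv")        # one of the flat-file sources
customers = pd.read_csv("customers.csv")

issues = []
if orders["order_id"].duplicated().any():
    issues.append("duplicate order_id values")
if orders["amount"].isna().any():
    issues.append("null amounts")

# Referential integrity: every order must reference a known customer.
orphans = ~orders["customer_id"].isin(customers["customer_id"])
if orphans.any():
    issues.append(f"{int(orphans.sum())} orders with unknown customer_id")

if issues:
    raise ValueError("Data quality failed: " + "; ".join(issues))
```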

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description:

Essential Skills and Qualifications:
Bachelor’s degree in Computer Science, Data Science, or a related field (Master’s preferred).

Certifications (Preferred):
Microsoft Certified: Azure Data Engineer Associate
Databricks Certified Data Engineer Professional
Microsoft Certified: Power BI Data Analyst Associate

8+ years of experience in analytics, data integration, and reporting.
4+ years of hands-on experience with Databricks, including: proficiency in Databricks Notebooks for development and testing; advanced skills in Databricks SQL, Python, and/or Scala for data engineering; and expertise in cluster management, auto-scaling, and cost optimization.
4+ years of expertise with Power BI, including: advanced DAX for building measures and calculated fields; proficiency in Power Query for data transformation; and deep understanding of Power BI architecture, workspaces, and row-level security.
Strong knowledge of SQL for querying, aggregations, and optimization.
Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend.
Proficiency in Azure cloud platforms and their application to analytics solutions.
Strong analytical thinking with the ability to translate data into actionable insights.
Excellent communication skills to effectively collaborate with technical and non-technical stakeholders.
Ability to manage multiple priorities in a fast-paced environment with high customer expectations.

Posted 3 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Experience: 5 to 8 years
Work Modality: Full-time, work from office
Job Location: Bengaluru, Karnataka

Job Description:
Design, develop, and maintain scalable ETL/ELT pipelines.
Build and optimize data warehouses, data lakes, and real-time streaming solutions.
Collaborate with cross-functional teams to define data requirements and ensure data accuracy and consistency.
Develop and maintain database structures and schemas for efficient data storage and retrieval.
Optimize data workflows for performance, reliability, and scalability.
Implement data security, governance, and compliance best practices.
Monitor, troubleshoot, and improve data pipelines to ensure uptime and reliability.
Automate data-related processes to improve efficiency and reduce manual intervention.

Required Qualifications:
Excellent communication skills
5+ years of overall experience, with 3-4 years as a Data Engineer
Proficiency in SQL and database management (PostgreSQL, MySQL, SQL Server, etc.)
Experience with ETL tools such as Pentaho, Talend, CData, and SSIS
Exposure to Python, Java, or Scala for data processing is a plus
Experience with big data technologies such as Apache Spark, Hadoop, or Kafka
Familiarity with cloud services (AWS, Azure) and data storage solutions like S3, Redshift, Snowflake, or BigQuery
Knowledge of data modeling, warehousing concepts, and data architecture best practices
Experience working with APIs and data integration
Strong problem-solving skills and ability to work with large, complex datasets
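The "APIs and data integration" requirement above usually means pulling records from a REST endpoint and landing them idempotently in a warehouse staging table. The following is a minimal, hypothetical sketch using requests and psycopg2; the URL, schema, and credentials are placeholders.

```python
# Minimal sketch of an API-to-PostgreSQL integration with an idempotent
# upsert so reruns are safe. All identifiers are hypothetical.
import psycopg2
import requests

rows = requests.get("https://api.example.com/v1/orders", timeout=30).json()

conn = psycopg2.connect("dbname=warehouse user=etl password=*** host=db-host")
with conn, conn.cursor() as cur:  # `with conn` commits on success
    for r in rows:
        cur.execute(
            """
            INSERT INTO staging.orders (order_id, amount)
            VALUES (%s, %s)
            ON CONFLICT (order_id) DO UPDATE SET amount = EXCLUDED.amount
            """,
            (r["order_id"], r["amount"]),
        )
conn.close()
```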

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Decision Science Practitioner Analyst, S&C GN
Management Level: Senior Analyst
Location: Bangalore/Kolkata
Must-have skills: Collibra Data Quality (data profiling, anomaly detection, reconciliation, data validation), Python, SQL
Good-to-have skills: PySpark, Kubernetes, Docker, Git

Job Summary: We are seeking a highly skilled and motivated Data Science cum Data Engineer Senior Analyst to lead innovative projects and drive impactful solutions in domains such as Consumer Tech, Enterprise Tech, and Semiconductors. This role combines hands-on technical expertise and client delivery management to execute cutting-edge projects in data science and data engineering.

Key Responsibilities

Data Science and Engineering:
Implement and manage end-to-end Data Quality frameworks using Collibra Data Quality (CDQ), including requirement gathering from the client, code development in SQL, unit testing, client demos, user acceptance testing, and documentation.
Work extensively with business users, data analysts, and other stakeholders to understand data quality requirements and business use cases.
Develop data validation, profiling, anomaly detection, and reconciliation processes.
Write SQL queries for simple to complex data quality checks, and Python and PySpark scripts to support data transformation and data ingestion.
Deploy and manage solutions on Kubernetes workloads for scalable execution.
Maintain comprehensive technical documentation of Data Quality processes and implemented solutions.
Work in an Agile environment, leveraging Jira for sprint planning and task management.
Troubleshoot data quality issues and collaborate with engineering teams for resolution.
Provide insights for continuous improvement in data governance and quality processes.
Build and manage robust data pipelines using PySpark and Python to read from and write to databases such as Vertica and PostgreSQL.
Optimize and maintain existing pipelines for performance and reliability.
Build custom solutions using Python, including FastAPI applications and plugins for Collibra Data Quality.
Oversee the infrastructure of the Collibra application in a Kubernetes environment, perform upgrades when required, and troubleshoot and resolve any Kubernetes issues that may affect the application's operation.
Deploy and manage solutions and optimize resources for deployments in Kubernetes, including writing YAML files and managing configurations.
Build and deploy Docker images for various use cases, ensuring efficient and reusable solutions.

Collaboration and Training:
Communicate effectively with stakeholders to align technical implementations with business objectives.
Provide training and guidance to stakeholders on Collibra Data Quality usage and help them build and implement data quality rules.

Version Control and Documentation:
Use Git for version control to manage code and collaborate effectively.
Document all implementations, including data quality workflows, data pipelines, and deployment processes, ensuring easy reference and knowledge sharing.

Database and Data Model Optimization:
Design and optimize data models for efficient storage and retrieval.

Required Qualifications
Experience: 4+ years in data science
Education: BTech or MTech in Computer Science, Statistics, Applied Mathematics, or a related field
Industry Knowledge: Experience in Consumer Tech, Enterprise Tech, or Semiconductors is preferred but not mandatory

Technical Skills
Programming: Proficiency in Python and SQL for data analysis and transformation.
Tools: Hands-on experience with Collibra Data Quality (CDQ) or similar Data Quality tools (e.g., Informatica DQ, Talend, Great Expectations, Ataccama). Experience working with Kubernetes workloads. Experience with Agile methodologies and task tracking using Jira.

Preferred Skills
Strong analytical and problem-solving skills with a results-oriented mindset.
Good communication, stakeholder management, and requirement-gathering capabilities.

Additional Information: The ideal candidate will possess a strong educational background in a quantitative discipline and experience working with Hi-Tech clients. This position is based at our Bengaluru (preferred) and Kolkata offices.

About Our Company | Accenture
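As a concrete illustration of the PySpark-against-PostgreSQL pipeline work described above, here is a minimal sketch that reads a table over JDBC and writes back a simple profiling aggregate (per-column null rates) of the kind a data quality rule might codify. Hostnames, tables, and credentials are hypothetical, and the PostgreSQL JDBC driver JAR is assumed to be available to Spark.

```python
# Minimal sketch: PySpark JDBC read from PostgreSQL plus a null-rate profile
# written back for downstream anomaly detection. Identifiers are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-pipeline").getOrCreate()

jdbc_url = "jdbc:postgresql://db-host:5432/analytics"
props = {"user": "etl", "password": "***", "driver": "org.postgresql.Driver"}

df = spark.read.jdbc(jdbc_url, "public.transactions", properties=props)

# Null rate per column: count rows where the column is null, divide by total.
null_rates = df.select(
    [
        (F.count(F.when(F.col(c).isNull(), c)) / F.count(F.lit(1))).alias(c)
        for c in df.columns
    ]
)
null_rates.write.jdbc(jdbc_url, "dq.null_rates", mode="overwrite", properties=props)
```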

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Summary
We are looking for a strategic, results-driven, and operationally strong Senior Manager, Enterprise Data Platforms (Offshore – India) to lead and manage offshore teams across Architecture, Data Governance & Quality, Engineering, BI Development, Data Science, QA, and Production Support. This role will be responsible for directly managing offshore teams, ensuring seamless execution while maintaining dotted-line alignment with global functional leaders. The Senior Manager will oversee performance management, resource planning, Agile delivery, and team development, ensuring offshore teams contribute effectively to enterprise data initiatives. As part of a matrixed leadership structure, this role will collaborate closely with global data and technology executives to balance hands-on execution with strategic alignment across functions.

Key Responsibilities
Lead and manage offshore teams across Data Engineering, Architecture, Governance, BI, Data Science, QA, and Production Support, ensuring alignment with global functional leaders.
Drive performance management, optimize workload distribution, and ensure high-quality execution and resource efficiency.
Oversee workforce planning, hiring, and professional development, fostering a culture of innovation, collaboration, and accountability.
Ensure the design, development, and optimization of scalable data solutions, adhering to best practices in architecture, modeling, and performance tuning.
Lead modernization efforts, focusing on automation, observability, and scalability of data platforms.
Collaborate with global teams to establish standards, frameworks, and reusable components for data engineering.
Manage Agile delivery processes, ensuring offshore teams meet expectations and deliver high-impact solutions.
Facilitate efficient collaboration between offshore teams and onshore engineering and product teams for seamless execution.
Oversee production support, incident resolution, and platform maintenance, ensuring uptime and SLA compliance.
Coordinate disaster recovery and business continuity planning for offshore data engineering operations.
Act as a technical and operational bridge between offshore teams and global data, analytics, and business leaders.
Provide executive-level reporting on engineering team performance, delivery velocity, and platform stability.

This is a high-impact leadership role requiring strong operational expertise, strategic thinking, and deep technical acumen to drive enterprise data initiatives forward.

What We're Looking For
12+ years overall experience with 4+ years managing offshore data engineering, architecture, or analytics teams in a matrixed environment.
Strong people management, execution oversight, and performance accountability.
Ability to balance direct leadership with dotted-line functional reporting.
Expertise in modern data architecture, scalable pipelines, and cloud platforms (e.g., Snowflake, GCP/Azure, Airflow, dbt, Talend).
Experience optimizing data infrastructure, automation, and performance tuning.
Understanding of data governance, metadata, and data quality frameworks.
Accountability for delivery execution in Agile environments, working with Scrum Masters & Product Owners.
Ability to remove roadblocks, optimize team efficiency, and drive execution cadence.
Experience managing platform operations, production support, and incident resolution.
Strong understanding of SLAs, monitoring, and disaster recovery coordination.
Proven ability to engage with global data, technology, and business leaders.
Strong communication and executive reporting skills, ensuring transparency in delivery.

Why Aristocrat?
Aristocrat is a world leader in gaming content and technology, and a top-tier publisher of free-to-play mobile games. We deliver great performance for our B2B customers and bring joy to the lives of the millions of people who love to play our casino and mobile games. And while we focus on fun, we never forget our responsibilities. We strive to lead the way in responsible gameplay, and to lift the bar in company governance, employee wellbeing and sustainability. We’re a diverse business united by shared values and an inspiring mission to bring joy to life through the power of play. We aim to create an environment where individual differences are valued, and all employees have the opportunity to realize their potential. We welcome and encourage applications from all people regardless of age, gender, race, ethnicity, cultural background, disability status or LGBTQ+ identity. EEO M/F/D/V

World Leader in Gaming Entertainment
Robust benefits package
Global career opportunities

Our Values:
All about the Player
Talent Unleashed
Collective Brilliance
Good Business Good Citizen

Travel Expectations: None

Additional Information: Depending on the nature of your role, you may be required to register with the Nevada Gaming Control Board (NGCB) and/or other gaming jurisdictions in which we operate. At this time, we are unable to sponsor work visas for this position. Candidates must be authorized to work in the job posting location for this position on a full-time basis without the need for current or future visa sponsorship.

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description Summary
This position is responsible for designing highly complex modules, critical components, or a whole application/product in its entirety, with the vision to integrate it across multiple systems. This position works independently and is seen as a technical leader. The position is responsible for driving the design and development efforts related to architecture, scalability, availability, and performance in alignment with the product/application roadmap.

Roles and Responsibilities

In this role, you will:
Be responsible for providing technical leadership and defining, developing, and evolving software in a fast-paced and agile development environment using the latest software development methodologies and infrastructure.
Provide guidance to developers with planning, execution, and/or design architecture using agile methodologies such as Scrum.
Work with Product Line Leaders (PLLs) to understand product requirements and vision.
Drive increased efficiency across the teams, eliminating duplication and leveraging product and technology reuse.
Capture system-level requirements by brainstorming with the CTO, senior architects, data scientists, businesses, and product managers.
Lead impact assessments and decisions related to technology choices, design/architectural considerations, and implementation strategy.
Act as a subject matter expert in processes and methodologies, with the ability to adapt and improvise in various situations and to navigate through ambiguity while prioritizing conflicting asks.
Apply expert-level skills in design, architecture, and development, with the ability to take a deep dive into implementation aspects if the situation demands.
Lead the architecture and design efforts across the product and multiple product versions, acting as an expert in architecting custom solutions off the base product.
Apply expertise in core data structures and algorithms, with the ability to implement them in the language of choice when necessary.

Education Qualification (for roles outside the USA)
12+ years' experience relevant to software development, validation, and architecting in the industry.
Hands-on application software development in both monolithic and microservice architectures.
Basic knowledge of UI/UX tools and development processes.
Experience in the Grid or Energy software business (AEMS/ADMS/Energy Markets/SCADA/GIS).

Desired Characteristics

Technical Expertise:
Strong expertise in Java and Python, Maven, and the Spring Boot framework.
Strong experience with Kubernetes and microservices architectures.
Strong knowledge of Object-Oriented Analysis and Design, software design patterns, and Java coding principles.
Experience in architecting and designing scalable, distributed-systems software products.
Experience with different architectural styles such as SOA, microservices, and distributed systems architecture.
Experience with data streaming technologies such as Apache Kafka.
Experience in enterprise integration patterns with frameworks like Apache Camel, Talend, MuleSoft, etc.
Ability to define the technical roadmap for the product and the design and architectural goals for the team.
Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels.
Experience in managing and influencing senior stakeholders to drive strategic decision-making.
Demonstrated ability to develop and execute strategic plans.

Strategic Vision: Ability to see the bigger picture and align application portfolio strategies with the organization's long-term goals.
Innovation: Passion for exploring and adopting innovative technologies to enhance the application landscape.
Leadership Excellence: Proven ability to lead and inspire teams, fostering a culture of collaboration and continuous improvement.

Strong knowledge of end-to-end SDLC and infrastructure architecture. Flexible and adaptable; open to change and modification of innovation strategies in an agile manner. Facilitates and coaches software engineering team sessions on requirements estimation and alternative approaches to team sizing and estimation. Leads a community of practice around estimation to share best practices among teams. Knowledgeable about developments in UX in various contexts, businesses, and industries. Quantifies effectiveness of design choices by gathering data. Drives accountability and adoption. Publishes guidance and documentation to promote adoption of design. Proposes design solutions based on research and synthesis; creates general design principles that capture the vision and critical concerns for a program. Demonstrates mastery of the intricacies of interactions and dynamics in Agile teams. Demonstrates advanced understanding of Lean Six Sigma principles (e.g., Black Belt certified). Guides new teams to adopt Agile, troubleshoots adoption efforts, and guides continuous improvement. Provides training on Lean/Agile. Drives elimination of inefficiencies in the coding process. Teaches XP practices to others. Actively embraces new methods and practices that increase efficiency and effectiveness.

Business Acumen
Evaluates technology to drive features and roadmaps. Maps technology trends to internal vision. Differentiates buzzwords from value proposition. Embraces technology trends that drive excellence beyond traditional practices (e.g., test automation in lieu of traditional QA practices). Balances value propositions for competing stakeholders. Makes well-researched recommendations on buy vs. build decisions. Conveys the value proposition for the company by assessing financial risks and gains of decisions and return on investment (ROI). Manages the process of building and maintaining a successful alliance. Understands and successfully applies common analytical techniques, including ROI, SWOT, and gap analyses. Able to clearly articulate the business drivers relevant to a given initiative.

Leadership
Influences through others; builds direct and "behind the scenes" support for ideas. Pre-emptively sees downstream consequences and effectively tailors influencing strategy to support a positive outcome. Uses experts or other third parties to influence. Able to verbalize what is behind decisions and downstream implications. Continuously reflects on successes and failures to improve performance and decision-making. Understands when change is needed. Participates in technical strategy planning. Proactively identifies and removes project obstacles or barriers on behalf of the team. Able to navigate accountability in a matrixed organization. Communicates and demonstrates a shared sense of purpose. Learns from failure.

Personal Attributes
Able to effectively direct and mentor others in critical thinking skills. Proactively engages with cross-functional teams to resolve issues and design solutions using critical thinking and analysis skills and best practices. Finds important patterns in seemingly unrelated information. Influences and energizes others toward the common vision and goal. Maintains excitement for a process and drives to new directions of meeting the goal even when odds and setbacks render one path impassable. Innovates and integrates new processes and/or technology to significantly add value to GE. Identifies how the cost of change weighs against the benefits and advises accordingly. Proactively learns new solutions and processes to address seemingly unanswerable problems.

Note: To comply with US immigration and other legal requirements, it is necessary to specify the minimum number of years' experience required for any role based within the USA. For roles outside of the USA, to ensure compliance with applicable legislation, the JDs should focus on the substantive level of experience required for the role and a minimum number of years should NOT be used. This Job Description is intended to provide a high-level guide to the role. However, it is not intended to amend or otherwise restrict/expand the duties required from each individual employee as set out in their respective employment contract and/or as otherwise agreed between an employee and their manager.

Additional Information
Relocation Assistance Provided: Yes

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

As a Data Architect, you will design and implement scalable, cloud-native data solutions that handle petabyte-scale datasets. You will lead architecture discussions, build robust data pipelines, and work closely with cross-functional teams to deliver enterprise-grade data platforms. Your work will directly support analytics, AI/ML, and real-time data processing needs across global clients.

Key Responsibilities
Translate complex data and analytics requirements into scalable technical architectures.
Design and implement cloud-native architectures for real-time and batch data processing.
Build and maintain large-scale data pipelines and frameworks using modern orchestration tools (e.g., Airflow, Oozie).
Define strategies for data modeling, integration, metadata management, and governance.
Optimize data systems for cost-efficiency, performance, and scalability.
Leverage cloud services (AWS, Azure, GCP) including Azure Synapse, AWS Redshift, BigQuery, etc.
Implement data governance frameworks covering quality, lineage, cataloging, and access control.
Work with modern big data technologies (e.g., Spark, Kafka, Databricks, Snowflake, Hadoop).
Collaborate with data engineers, analysts, DevOps, and business stakeholders.
Evaluate and adopt emerging technologies to improve data architecture.
Provide architectural guidance in cloud migration and modernization projects.
Lead and mentor engineering teams and provide technical thought leadership.

Skills and Experience
Bachelor's or Master's in Computer Science, Engineering, or a related field.
10+ years of experience in data architecture, engineering, or platform roles.
5+ years of experience with cloud data platforms (Azure, AWS, or GCP).
Proven experience building scalable enterprise data platforms (data lakes/warehouses).
Strong expertise in distributed computing, data modeling, and pipeline optimization.
Proficiency in SQL and NoSQL databases (e.g., Snowflake, SQL Server, Cosmos DB, DynamoDB).
Experience with data integration tools like Azure Data Factory, Talend, or Informatica.
Hands-on experience with real-time streaming technologies (Kafka, Kinesis, Event Hub).
Expertise in scripting/programming languages such as Python, Spark, Java, or Scala.
Deep understanding of data governance, security, and regulatory compliance (GDPR, HIPAA, CCPA).
Strong communication, presentation, and stakeholder management skills.
Ability to lead multiple projects simultaneously in an agile environment.
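To ground the real-time side of this architecture, here is a minimal sketch of the Kafka ingestion layer such a platform typically includes, written with the kafka-python client. The topic, broker address, and consumer group are hypothetical placeholders.

```python
# Minimal sketch of a Kafka consumer feeding the real-time layer of a data
# platform. Topic, broker, and group names are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "clickstream-events",
    bootstrap_servers=["broker-1:9092"],
    group_id="lakehouse-ingest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # In production this would micro-batch to object storage (e.g., S3) or
    # hand off to Spark Structured Streaming instead of printing.
    print(event.get("user_id"), event.get("event_type"))
```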

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Department: SE DC DG

A Snapshot of Your Day
We are seeking a highly skilled and committed Manager to lead our data accountability, data security, data regulatory compliance, data access, and Artificial Intelligence (AI) governance workstreams. This position requires an initiative-taking leader with a robust background in data governance, data ownership, data security, and compliance, who through their team can drive the development and implementation of comprehensive strategies to support our data roles, secure our data assets, and uphold ethical AI practices, maintaining data integrity and fostering a culture of data accountability.

How You’ll Make An Impact
Data Accountability: Develop and implement policies and procedures to improve data accountability across the organization, including the Data Roles & Responsibilities Framework, and help in its operationalization in Siemens Energy. Lead enablement and training programs for the data roles. Support the building of data communities in the business areas and bring them closer to our formally established data roles.
Data Security: Lead the development and execution of comprehensive data security and data access management strategies, policies, and procedures, which are essential for protecting the organization's data assets, mitigating risks, ensuring compliance, and maintaining collaborator trust.
Data Retention: Lead development of a retention framework and secure disposal methods to retain data only as needed and dispose of it securely.
AI Governance: Lead the AI Governance team in establishing policies and processes to ensure ethical, responsible, and compliant use of AI. Work closely with multi-functional teams across Business, Data Domains, Cybersecurity, Legal and Compliance, Artificial Intelligence, Applications, and others.
Manage and mentor a team of data professionals, providing guidance, training, and support to achieve departmental goals. Develop and implement strategic goals for the team, aligned with organizational objectives.
Partner Engagement: Collaborate with collaborators across the organization to align strategies with business objectives.
Innovation: Stay abreast of industry trends and emerging technologies and incorporate innovative solutions to enhance data accountability, data security, and AI governance.

What You Bring
Bachelor’s degree in computer science, information technology, data science, or a related field; master’s degree preferred.
Minimum of 8 years of experience in data management, data governance, data ownership, and data security governance roles in a large-scale enterprise setup, with at least 3 years in a leadership capacity.
Demonstrable ability to develop and implement data governance frameworks.
Excellent leadership, communication, and interpersonal skills.
Strong analytical, problem-solving, and decision-making abilities.
Ability to work effectively in a fast-paced, dynamic environment and manage multiple priorities.
Solid understanding of data accountability or data ownership, data security, and compliance principles and practices.
Familiarity with data governance tools like Collibra, Informatica, Ataccama, or Talend.
Experience with data modeling, data architecture, and database management systems is preferred.

About The Team
Our Corporate and Global Functions are essential in driving the company's pivotal initiatives and ensuring operational excellence across various groups, business areas, and regions. These roles support our vision to become the most valued energy technology company in the world. As part of our team, you contribute to our vision by shaping the global energy transition, partnering with our internal and external collaborators, and conducting business responsibly and in compliance with legal requirements and regulations.

Who is Siemens Energy?
At Siemens Energy, we are more than just an energy technology company. With ~100,000 dedicated employees in more than 90 countries, we develop the energy systems of the future, ensuring that the growing energy demand of the global community is met reliably and sustainably. The technologies created in our research departments and factories drive the energy transition and provide the base for one sixth of the world's electricity generation. Our global team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation. Find out how you can make a difference at Siemens Energy: https://www.siemens-energy.com/employeevideo

Our Commitment to Diversity
Lucky for us, we are not all the same. Through diversity we generate power. We run on inclusion, and our combined creative energy is fueled by over 130 nationalities. Siemens Energy celebrates character – no matter what ethnic background, gender, age, religion, identity, or disability. We energize society, all of society, and we do not discriminate based on our differences.

Rewards/Benefits
Opportunities to work with a distributed team
Opportunities to work on and lead a variety of innovative projects
Medical benefits
Time off/paid holidays and parental leave
Continual learning through the Learn@Siemens-Energy platform

https://jobs.siemens-energy.com/jobs

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Manager, Business Analyst – C1
Employment Type: Permanent
Location: Chennai

Responsible Functions
Gen AI: Expertise in leveraging advanced AI technologies to analyze business processes, identify automation and optimization opportunities, and drive data-driven decision-making. Capability to collaborate with stakeholders to translate business needs into AI solutions, ensuring seamless integration and maximizing operational efficiency, productivity, and innovation.
Product Vision & Strategy: Perform market analysis to understand the market landscape, including competitor analysis, trends, and customer needs, to help define and communicate a product vision and strategy aligned with company objectives.
Stakeholder Engagement: Interact with diversified stakeholders to conduct JAD sessions and use a variety of techniques to elicit, elaborate, analyze, and validate client requirements. Interact with the business team to conduct product demonstrations and to evaluate, prioritize, and build new features and functions.
Requirements Management: Analyze and develop Business Requirements Documents (BRD) and Functional Specification Documents (FSD) for client/business reference. Translate business requirements to user stories, prioritize the backlog, and conduct Scrum ceremonies for development consumption.
Functional Solution Development: Responsible for end-to-end functional solutioning. Analyze the business problem and validate the key business requirements to create a complete picture of workflows and technical requirements fulfilled by existing and proposed software. Identify, define, and evaluate potential product solutions, including off-the-shelf and open-source components and system architecture, to ensure that they meet business requirements.
Communication & Collaboration: Be a strong interface between business and internal stakeholders. Collaborate with the development team (including architecture, coding, and testing teams) to produce and maintain additional product and project deliverables like technical design, testing and program specifications, additional test scenarios, and project plans. Proactively manage expectations regarding roadblocks in the critical path to help ensure successful delivery of the solution.
Business Value: Comprehend the business value of the fundamental solution being developed, and assess its fitment within the overall architecture, risks, and technical feasibility. Drive business metrics that will help optimize the business, and deep-dive into data for insights as required.
Team Management: Manage a small team of Business Analysts, define clear goals, and be accountable for the functional solution delivered by the team. Participate in recruitment and building a strong BA team.
RFP Support: Participate in Request for Information/Proposal handling and support with responses and solutions to questions or information requested.
Client/Business Training: Work with technical writers to create training material and handle product/platform training sessions with diversified stakeholders.

Essential Functions
Multi-disciplinary technologist who enjoys designing, executing, and selling healthcare solutions, and being on the front line of client communications and selling strategies.
Deep understanding of the US Healthcare value chain and key impact drivers (Payer and/or Provider).
Knowledgeable and cognizant of how data management and science are used to solve organizational problems in the healthcare context.
Hands-on experience in two or more areas of the data and analytics technical domains: enterprise cloud data warehousing, integration, preparation, and visualization, along with artificial intelligence, machine learning, data science, data modeling, data management, and data governance.
Strong problem-solving and analytical skills: ability to break down a vague business problem into structured data analysis approaches, and ability to work with incomplete information and take judgment-driven decisions based on experience.
Experience ramping up analytics programs with new clients, including integrating with the work of other teams to ensure the analytics approach is aligned with operations, as well as engaging in consultative selling.

Primary Internal Interactions
Review with the Product Manager & AVP for improvements in the product development lifecycle.
Assessment meetings with VP & above for additional product development features.
Manage a small team of Business Analysts to lead the requirements effort for product development.

Primary External Interactions
Communicate with onshore stakeholders and executive team members.
Help the Product Management Group set the product roadmap and help in identifying future sellable product features.
Client interactions to better understand expectations and streamline solutions. If required, should be a bridge between the client and the technology teams.

Technical Skills
Required Skills: SME in US Healthcare with deep knowledge of the claims and payments lifecycle, with at least 8 years of experience working with various US Healthcare Payer clients.

Skills (Must Have)
Excellent understanding of the Software Development Life Cycle and methodologies like Agile Scrum, Waterfall, etc.
Strong experience in requirements elicitation techniques, functional documentation, stakeholder management, business solutions validation, and user walkthroughs.
Strong documentation skills to create BRDs, FSDs, process flows, and user stories.
Strong presentation skills.
Good knowledge of SQL.
Knowledge of tools like Azure DevOps, Jira, Visio, Draw.io, etc.
Experience in AI or Gen AI projects.

Skills (Nice To Have)
Development experience of 2 or more years.
Experience with big data tools, not limited to: Python, Spark + Python, Hive, HBase, Sqoop, CouchDB, MongoDB, MS SQL, Cassandra, Kafka.
Knowledge of data analysis tools (online analytical processing (OLAP), ETL frameworks).
Knowledge of enterprise modeling tools and data integration platforms (Erwin, Embarcadero, Informatica, Talend, SSIS, DataStage, Pentaho).
Knowledge of enterprise business intelligence platforms (Tableau, Power BI, Business Objects, MicroStrategy, Cognos).
Knowledge of enterprise data warehousing platforms (Oracle, Microsoft, DB2, Snowflake, AWS, Azure, Google Cloud Platform).

Process Specific Skills
Delivery Domain: Software Development – SDLC & Agile certifications
Business Domain: US Healthcare & Payer Analytics, Payment Integrity, Fraud, Waste & Abuse, Claims Management

Soft Skills
Understanding of the healthcare business vertical and the business terms within.
Good analytical skills.
Strong communication skills, oral and written.
Ability to work with various stakeholders across different geographical locations.
Should be able to function as an individual contributor as well, if required.
Strong aptitude to learn and implement healthcare solutions.
Good leadership skills.

Working Hours
General shift: 12 PM to 9 PM; will be required to extend as per project release needs.

Education Requirements
Master’s or bachelor’s degree from top-tier colleges with good grades, preferably in a relevant field including Mathematics, Statistics, Computer Science, or equivalent experience.

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Senior Business Analyst/Lead Business Analyst – B1/B2
Employment Type: Permanent
Location: Chennai

Responsible Functions
Product Vision & Strategy: Help with inputs on product features through market analysis to understand the market landscape, including competitor solutions, trends, and customer needs.
Stakeholder Engagement: Interact with diversified stakeholders to conduct JAD sessions and use a variety of techniques to elicit, document, analyze, and validate client requirements. Interface with the business team to conduct product demonstrations and to evaluate, prioritize, and build new features and functions.
Requirements Management: Analyze and develop business requirement documents (BRD) for client/business reference. Translate business requirements to user stories; create and prioritize them across the backlog, sprints, DOD, and releases using Jira for development consumption. Perform requirements reviews with external and internal stakeholders, and resolve issues while suggesting corrective actions.
Functional Solution Development: Responsible for the end-to-end functional solution. Analyze the business problem and validate the key business requirements to create a complete picture of workflows and technical requirements fulfilled by existing and proposed software. Identify, define, and evaluate potential product solutions, including off-the-shelf and open-source components and system architecture, to ensure that they meet business requirements.
Communication & Collaboration: Act as a liaison between business users and technical solutions/support groups to ensure proper communication between diversified teams. Collaborate with the development team (including architecture, coding, and testing teams) to produce and maintain additional product and project deliverables in technical design, testing and program specifications, additional test scenarios, and project plans. Proactively manage expectations regarding roadblocks in the critical path to help ensure successful delivery of the solution.
Business Value: Comprehend the fundamental solution being developed/deployed, its business value, and a blueprint of how it fits with the overall architecture, risks, and more. Drive business metrics that will help optimize the business, and deep-dive into data for insights as required.
Team Mentoring: Train and mentor juniors in the team on a need basis.

Essential Functions
Technologist who enjoys executing and selling healthcare solutions and being on the front line of client communications.
Good understanding of the US Healthcare value chain and key impact drivers (Payer and/or Provider).
Knowledgeable and cognizant of how data management and science are used to solve organizational problems in the healthcare context.
Hands-on experience in at least one area of the data and analytics technical domains: enterprise cloud data warehousing, integration, preparation, and visualization, along with artificial intelligence, machine learning, data science, data modeling, data management, and data governance.
Strong problem-solving and analytical skills: ability to break down a vague business problem into structured data analysis approaches, and ability to work with incomplete information and take judgment-driven decisions based on experience.

Primary Internal Interactions
Review with the Product Manager & AVP for improvements in the product development lifecycle.
Assessment meetings with VP & above for additional product development features.
Train and mentor juniors in the team on a need basis.

Primary External Interactions
Communicate with onshore stakeholders and executive team members.
Help the Product Management Group set the product roadmap and help in identifying future sellable product features.
Client interactions to better understand expectations and streamline solutions. If required, should be a bridge between the client and the technology teams.

Technical Skills
Required Skills: Good knowledge of US Healthcare with at least 3 years of experience working with various US Healthcare Payer clients.

Skills (Must Have)
Good understanding of the Software Development Life Cycle and methodologies like Agile Scrum, Waterfall, etc.
Strong experience in requirements elicitation techniques, functional documentation, stakeholder management, business solutions validation, and user walkthroughs.
Strong documentation skills to create BRDs, FSDs, process flows, and user stories.
Strong presentation skills.
Basic knowledge of SQL.
Knowledge of tools like Jira, Visio, Draw.io, etc.

Skills (Nice To Have)
Development experience of 1 or 2 years.
Experience with big data tools, not limited to: Python, Spark + Python, Hive, HBase, Sqoop, CouchDB, MongoDB, MS SQL, Cassandra, Kafka.
Knowledge of data analysis tools (online analytical processing (OLAP), ETL frameworks).
Knowledge of enterprise modeling tools and data integration platforms (Erwin, Embarcadero, Informatica, Talend, SSIS, DataStage, Pentaho).
Knowledge of enterprise business intelligence platforms (Tableau, Power BI, Business Objects, MicroStrategy, Cognos).
Knowledge of enterprise data warehousing platforms (Oracle, Microsoft, DB2, Snowflake, AWS, Azure, Google Cloud Platform).

Process Specific Skills
Delivery Domain: Software Development – SDLC & Agile certifications
Business Domain: US Healthcare Insurance & Payer Analytics, Fraud, Waste & Abuse, Payer Management, Code Classification Management

Soft Skills
Understanding of the healthcare business vertical and the business terms within.
Good analytical skills.
Strong communication skills, oral and written.
Ability to work with various stakeholders across different geographical locations.
Should be able to function as an individual contributor as well, if required.
Strong aptitude to learn and implement healthcare solutions.
Ability to work independently.

Working Hours
General shift: 12 PM to 9 PM; will be required to extend as per project release needs.

Education Requirements
Master’s or Bachelor’s degree from top-tier colleges with good grades, preferably in a relevant field including Mathematics, Statistics, Computer Science, or equivalent experience.

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Department: SE DC DG

A Snapshot of Your Day

We are looking for a highly skilled and experienced manager to oversee our enterprise Data Quality, Master Data Management and Reference Data Management workstreams. This role will develop and implement comprehensive strategies to ensure that Siemens Energy's data is well maintained, accurate, and reliable through effective data quality and data management. The ideal candidate will possess strong leadership skills, a deep understanding of data governance principles, and the ability to drive data quality initiatives across the organization. The position requires close collaboration with various groups to identify, address, and resolve data-related issues while establishing robust data quality and data management frameworks.

How You'll Make An Impact

Lead and mentor a team of data management professionals, providing guidance and fostering a collaborative environment.
Develop and implement strategic goals for the team, aligned with organizational objectives.
Ensure continuous professional development and performance management of team members.
Offer strategic guidance and assistance to different data domains in Siemens Energy and collaborate effectively to enhance the implementation of the data governance framework.
Oversee the establishment and enforcement of data quality standards and procedures.
Support the implementation of data quality metrics and reporting mechanisms to monitor and improve data accuracy and consistency across domains.
Collaborate with multi-functional teams to identify data quality issues and their impact on business processes.
Drive and support the design and delivery of projects to address data quality issues.
Provide thought leadership and industry-standard methodologies on the right architecture, tooling and governance model.
Work on new ideas such as using data quality for artificial intelligence, designing value quantification approaches, and data monetization opportunities.
Develop and manage the organization's master data and reference data strategy, standards and processes to ensure consistency, accuracy, and reliability of master data across all business systems, and support the integration and synchronization of reference data across various platforms.
Work with the business to standardize and harmonize master data definitions and processes, and identify and document master data and reference data entities.
Provide thought leadership on the right tools and approaches needed for data management in SE to make us future-proof.

What You Bring

Bachelor's degree in Information Technology, Data Science, Business Administration, or a related field (Master's degree preferred).
Minimum of 8 years of experience in data management and data quality roles in a large-scale enterprise setup, with at least 3 years in a leadership capacity.
Proven track record of developing and implementing enterprise data quality and data management strategies and initiatives.
Excellent leadership, communication, and interpersonal skills.
Strong analytical, problem-solving, and decision-making abilities.
Ability to work effectively in a fast-paced, dynamic environment and manage multiple priorities.
Solid understanding of data quality, master data, and reference data management principles and practices.
Proficiency with data management tools and technologies, such as SQL, ETL tools, data governance platforms, and MDM solutions like Collibra, Informatica, Ataccama, or Talend.
Experience with data modeling, data architecture, and database management systems is preferred.

About The Team

Our Corporate and Global Functions are essential in driving the company's pivotal initiatives and ensuring operational excellence across various groups, business areas, and regions. These roles support our vision to become the most valued energy technology company in the world. As part of our team, you contribute to our vision by shaping the global energy transition, partnering with our internal and external collaborators, and conducting business responsibly and in compliance with legal requirements and regulations.

Who is Siemens Energy?

At Siemens Energy, we are more than just an energy technology company. With ~100,000 dedicated employees in more than 90 countries, we develop the energy systems of the future, ensuring that the growing energy demand of the global community is met reliably and sustainably. The technologies created in our research departments and factories drive the energy transition and provide the base for one sixth of the world's electricity generation. Our global team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation. Find out how you can make a difference at Siemens Energy: https://www.siemens-energy.com/employeevideo

Our Commitment to Diversity

Lucky for us, we are not all the same. Through diversity we generate power. We run on inclusion and our combined creative energy is fueled by over 130 nationalities. Siemens Energy celebrates character – no matter what ethnic background, gender, age, religion, identity, or disability. We energize society, all of society, and we do not discriminate based on our differences.

Rewards/Benefits

Opportunities to work with a distributed team
Opportunities to work on and lead a variety of innovative projects
Medical benefits
Time off/Paid holidays and parental leave
Continual learning through the Learn@Siemens-Energy platform

https://jobs.siemens-energy.com/jobs

Posted 3 weeks ago

Apply

2.0 - 7.0 years

8 - 18 Lacs

Hyderabad

Work from Office

Naukri logo

Data engineering or ETL development
Talend, IBM DataStage, or other ETL tools
PostgreSQL, SQL Server, Oracle
Git
AWS, Azure, or GCP
Knowledge of data lakes and big data frameworks
Scripting (e.g., Python, Shell) for automation
Agile/Scrum

Posted 3 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Job Title: Data Quality (DQ) Analyst
Location: India
Experience: Minimum 3-5 years

Job Description

We are hiring Data Quality Analysts to support AIG's data initiatives, with a specific focus on building and maintaining DQ rules across the insurance value chain. The roles are India-based and will play a key part in improving data accuracy, consistency, and reliability in the underwriting and claims domains.

Key Responsibilities

Develop and implement data quality rules and validation logic across insurance datasets.
Ensure data integrity across key areas, with a strong focus on claims and underwriting processes.
Collaborate with business analysts, data stewards, and IT teams to define DQ requirements and resolve data issues.
Perform root cause analysis on data quality issues and drive corrective actions.
Document DQ rules, processes, and metadata for transparency and audit readiness.

Required Skills: Data profiling, data quality rule creation, data validation, root cause analysis, insurance value chain, claims, underwriting, strong SQL proficiency, ETL concepts, data warehousing, problem-solving, attention to detail, collaboration with cross-functional teams, Informatica Data Quality, Talend, Ataccama, offshore/India delivery model.
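The "data quality rules and validation logic" responsibility above typically boils down to executable checks against the data. Below is a minimal, hypothetical sketch in Python, using the standard library's sqlite3 so it runs anywhere; the claims table, column names, and rules are invented for illustration and are not any insurer's actual rule set.

```python
# A minimal sketch of SQL-based DQ rules. Table and column names
# (claims, policy_id, claim_amount) are hypothetical placeholders.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (claim_id INTEGER, policy_id TEXT, claim_amount REAL);
    INSERT INTO claims VALUES (1, 'P-100', 2500.0), (2, NULL, 900.0), (3, 'P-101', -50.0);
""")

# Each DQ rule is a name plus a SQL predicate matching *failing* rows.
dq_rules = {
    "policy_id_not_null": "policy_id IS NULL",
    "claim_amount_positive": "claim_amount <= 0",
}

total = conn.execute("SELECT COUNT(*) FROM claims").fetchone()[0]
for rule, predicate in dq_rules.items():
    failed = conn.execute(f"SELECT COUNT(*) FROM claims WHERE {predicate}").fetchone()[0]
    print(f"{rule}: {failed}/{total} rows failed")
```

In practice, tools such as Informatica Data Quality, Talend, or Ataccama express the same rule-plus-failing-rows pattern through their own rule editors and scorecards.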

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Linkedin logo

Role Description

Key Responsibilities:

Develop And Maintain Web Applications

Build dynamic, user-centric web applications using React, React Hooks, and modern web technologies like HTML5 and CSS3. Ensure that the application is scalable, maintainable, and easy to navigate for end-users.

Collaborate With Cross-Functional Teams

Work closely with designers and product teams to bring UI/UX designs to life, ensuring the design vision is executed effectively using HTML and CSS. Ensure the application is responsive, performing optimally across all devices and browsers.

State Management

Utilize Redux to manage and streamline complex application states, ensuring seamless data flow and smooth user interactions.

Component Development

Develop reusable, modular, and maintainable React components using React Hooks and CSS/SCSS to style components effectively. Build component libraries and implement best practices to ensure code maintainability and reusability.

Role Proficiency

This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks and DataProc, with coding expertise in Python, PySpark and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes

Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance and performance using design patterns and reusing proven solutions.
Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
Document and communicate milestones/stages for end-to-end delivery.
Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
Validate results with user representatives, integrating the overall solution seamlessly.
Develop and manage data storage solutions, including relational databases, NoSQL databases and data lakes.
Stay updated on the latest trends and best practices in data engineering, cloud technologies and big data tools.
Influence and improve customer satisfaction through effective data solutions.

Measures Of Outcomes

Adherence to engineering processes and standards
Adherence to schedule/timelines
Adherence to SLAs where applicable
# of defects post delivery
# of non-compliance issues
Reduction of recurrence of known defects
Quick turnaround of production bugs
Completion of applicable technical/domain certifications
Completion of all mandatory training requirements
Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
Average time to detect, respond to, and resolve pipeline failures or data issues
Number of data security incidents or compliance breaches

Outputs Expected

Code Development:
Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates and checklists. Review code for team members and peers.

Documentation:
Create and review templates, checklists, guidelines and standards for design processes and development.
Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results.

Configuration:
Define and govern the configuration management plan. Ensure compliance within the team.

Testing:
Review and create unit test cases, scenarios and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.

Domain Relevance:
Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.

Project Management:
Manage the delivery of modules effectively.

Defect Management:
Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.

Estimation:
Create and provide input for effort and size estimation for projects.

Knowledge Management:
Consume and contribute to project-related documents, SharePoint libraries and client universities. Review reusable documents created by the team.

Release Management:
Execute and monitor the release process to ensure smooth transitions.

Design Contribution:
Contribute to the creation of high-level design (HLD), low-level design (LLD) and system architecture for applications, business components and data models.

Customer Interface:
Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.

Team Management:
Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.

Certifications:
Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples

Proficiency in SQL, Python or other programming languages used for data manipulation.
Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc and Azure ADF.
Hands-on experience with cloud platforms like AWS, Azure or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Experience in performance tuning of data processes.
Expertise in designing and optimizing data warehouses for cost efficiency.
Ability to apply and optimize data models for efficient storage, retrieval and processing of large datasets.
Capacity to clearly explain and communicate design and development aspects to customers.
Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples

Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF and ADLF.
Proficiency in SQL for analytics, including windowing functions.
Understanding of data schemas and models relevant to various business contexts.
Familiarity with domain-related data and its implications.
Expertise in data warehousing optimization techniques.
Knowledge of data security concepts and best practices.
Familiarity with design patterns and frameworks in data engineering.
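To make the "windowing functions" knowledge example above concrete, here is a small, hypothetical PySpark sketch of a windowed running total; the orders data and column names are invented, and it assumes a local pyspark installation.

```python
# A hedged sketch of a windowed aggregation in PySpark.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("window-demo").getOrCreate()

orders = spark.createDataFrame(
    [("c1", "2024-01-01", 120.0), ("c1", "2024-01-05", 80.0), ("c2", "2024-01-02", 200.0)],
    ["customer_id", "order_date", "amount"],
)

# Running total of spend per customer, ordered by date: a typical
# windowed aggregation used in analytical data models.
w = Window.partitionBy("customer_id").orderBy("order_date")
orders.withColumn("running_spend", F.sum("amount").over(w)).show()

spark.stop()
```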
Additional Comments "Frontend developer Required Skills and Experience: React.js Proficiency: - In-depth knowledge of React.js, JSX, React Hooks, and React Router. - Experience with state management using Redux or similar libraries. - Familiar with React performance optimization techniques, including lazy loading, memoization, and code splitting. - Experience with tools like react-testing-library, NPM (vite, Yup, Formik). CSS Expertise: - Strong proficiency in CSS, including the use of third-party frameworks like Material-UI (MUI) and Tailwind CSS for styling. - Ability to create responsive, visually appealing layouts with modern CSS practices. JavaScript/ES6+ Expertise: - Strong command of modern JavaScript (ES6+), including async/await, destructuring, and class-based components. - Familiarity with other JavaScript frameworks and libraries such as TypeScript is a bonus. Version Control: - Proficient with Git and platforms like GitHub or GitLab, including managing pull requests and version control workflows. API Integration: - Experienced in interacting with RESTful APIs. - Understanding of authentication mechanisms like JWT. Testing: - Able to write unit, integration, and end-to-end tests using tools such as react-testing-library. ------------------------------------------------------------------------------------------------------------------- Basic Qualifications: - At least 3 years of experience working with JavaScript frameworks, particularly React.js. - Experience working in cloud environments (AWS, Azure, Google Cloud) is a plus. - Basic understanding of backend technologies such as Python is advantageous." Skills Cloud Services,Backend Systems,Css Show more Show less

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Hyderabad

Work from Office

Naukri logo

Data engineering, data management, Snowflake, SQL, data modeling, and cloud-native data architecture
AWS, Azure, or Google Cloud (Snowflake on cloud platforms)
ETL tools such as Informatica, Talend, or dbt
Python or Shell scripting

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Job Title: Lead Data Engineer – C12 / Assistant Vice President (India)

The Role

The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities

Developing and supporting scalable, extensible, and highly available data solutions
Deliver on critical business priorities while ensuring alignment with the wider architectural vision
Identify and help address potential risks in the data supply chain
Follow and contribute to technical standards
Design and develop analytical data models

Required Qualifications & Work Experience

First Class Degree in Engineering/Technology (4-year graduate course)
8 to 12 years' experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud-native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
An inclination to mentor; an ability to lead and deliver medium-sized components independently

Technical Skills (Must Have)

ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Expertise around data warehousing concepts, and relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
Data Governance: A strong grasp of principles and practice, including data quality, security, privacy and compliance

Technical Skills (Valuable)

Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center and Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta
Others: Experience of using a job scheduler, e.g., Autosys. Exposure to Business Intelligence tools, e.g., Tableau and Power BI

Certification on any one or more of the above topics would be an advantage.

------------------------------------------------------

Job Family Group: Technology

------------------------------------------------------

Job Family: Digital Software Engineering

------------------------------------------------------

Time Type: Full time

------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi.

View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

22 - 25 Lacs

Hyderabad

Hybrid

Naukri logo

Design, build and maintain complex ELT/ETL jobs that deliver business value.
Extract, transform and load data from various sources, including databases, APIs, and flat files, using IICS or Python/SQL.
Translate high-level business requirements into technical specs.
Conduct unit testing, integration testing, and system testing of data integration solutions to ensure accuracy and quality.
Ingest data from disparate sources into the data lake and data warehouse.
Cleanse and enrich data and apply adequate data quality controls.
Provide technical expertise and guidance to team members on Informatica IICS/IDMC and data engineering best practices to guide the future development of the company's Data Platform.
Develop re-usable tools to help streamline the delivery of new projects.
Collaborate closely with other developers and provide mentorship.
Evaluate and recommend tools, technologies, processes and reference architectures.
Work in an Agile development environment, attending daily stand-up meetings and delivering incremental improvements.
Participate in code reviews, ensure all solutions are aligned with architectural and requirement specifications, and provide feedback on code quality, design, and performance.
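Since the role pairs pipeline development with unit testing, a sketch like the following illustrates one common practice: keeping cleansing logic in small, pure Python functions that can be tested in isolation. The field names and rules here are hypothetical; in IICS/IDMC the equivalent logic would typically live in a mapping.

```python
# A minimal sketch of a unit-testable cleansing step for an ETL job.
from datetime import datetime

def cleanse_record(raw: dict) -> dict:
    """Trim and upper-case the key, normalize the date, default missing amounts."""
    return {
        "customer_id": raw["customer_id"].strip().upper(),
        "order_date": datetime.strptime(raw["order_date"], "%d/%m/%Y").date().isoformat(),
        "amount": float(raw.get("amount") or 0.0),
    }

def test_cleanse_record():
    out = cleanse_record({"customer_id": " c42 ", "order_date": "05/01/2024", "amount": None})
    assert out == {"customer_id": "C42", "order_date": "2024-01-05", "amount": 0.0}

test_cleanse_record()
print("cleanse_record passed its unit test")
```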

Posted 3 weeks ago

Apply

3.0 - 7.0 years

10 - 18 Lacs

Pune

Work from Office

Naukri logo

What Ibexlabs Does

Ibexlabs is an AWS Advanced Tier Consulting Partner with multiple competencies, including Security, DevOps, Healthcare, and Managed Services. Our team of dedicated and highly skilled engineers is passionate about helping customers accelerate their cloud transformation while ensuring security and compliance with industry best practices. As a rapidly growing company, we are seeking talented individuals to join us and contribute to our continued success.

Position Details

Job Purpose

We are seeking a skilled Talend Developer to design, develop, and implement ETL processes using Talend Data Integration tools. The ideal candidate will have hands-on experience working with large datasets, building scalable data pipelines, and ensuring data integrity and quality across complex systems.

Responsibilities and Requirements

Design, develop, and maintain ETL processes using Talend (Open Studio / Talend Data Fabric).
Collaborate with data architects, analysts, and business stakeholders to gather and understand data requirements.
Extract data from a variety of sources (databases, flat files, APIs, cloud platforms) and load it into target systems (data warehouses, Snowflake, etc.).
Optimize and troubleshoot ETL workflows for performance and reliability.
Implement data quality checks, transformations, and validations.
Monitor daily ETL processes and proactively resolve issues.
Create and maintain technical documentation related to ETL processes and data pipelines.
Ensure compliance with data governance, security, and privacy policies.
Support data migration and integration projects from legacy systems to modern platforms.

Skills and Qualifications

Bachelor's degree in Computer Science, Information Systems, or a related field.
3+ years of hands-on experience with Talend (preferably Talend Data Integration or Talend Big Data).
Strong knowledge of ETL concepts, data modeling, and data warehousing.
Proficient in SQL and working with relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL).
Experience with cloud data platforms such as Snowflake, AWS, Azure, or GCP is a plus.
Familiarity with big data technologies (Hadoop, Spark) is a plus.
Strong analytical and problem-solving skills.
Excellent verbal and written communication skills.

Why should you be interested in this opportunity?

Your freedom and opportunity to grow rapidly in your career.
You will be fully empowered by tools and knowledge to grow in your career, as well as helping your team members grow.
A culture of respect, humility, growth mindset, and fun in the team.
Get rewarded and recognized for your work and effort.
Training and career development benefits.
Life insurance, paid parental leave and vacation days.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description For Senior Data Engineer

Objectives and Purpose

The Senior Data Engineer ingests, builds, and supports large-scale data architectures that serve multiple downstream systems and business users. This individual supports the Data Engineer Leads and partners with Visualization on data quality and troubleshooting needs.

The Senior Data Engineer will:

Clean, aggregate, and organize data from disparate sources and transfer it to data warehouses.
Support development, testing, and maintenance of data pipelines and platforms, to enable data quality to be utilized within business dashboards and tools.
Create, maintain, and support the data platform and infrastructure that enables the analytics front-end; this includes the testing, maintenance, construction, and development of architectures such as high-volume, large-scale data processing and databases with proper verification and validation processes.

Your Key Responsibilities

Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations, using AWS native technologies, to support continuing increases in data source, volume, and complexity.
Define data requirements, and gather and mine data, while validating the efficiency of data tools in the Big Data Environment.
Lead the evaluation, implementation and deployment of emerging tools and processes to improve productivity.
Implement processes and systems to provide accurate and available data to key stakeholders, downstream systems, and business processes.
Partner with Business Analysts and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives.
Coordinate with Data Scientists to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling.
Mentor and coach junior Data Engineers on data standards and practices, promoting the values of learning and growth.
Foster a culture of sharing, re-use, design for scale stability, and operational efficiency of data and analytical solutions.

To qualify for the role, you must have the following:

Essential Skillsets

Bachelor's degree in Engineering, Computer Science, Data Science, or a related field
5-9 years of experience in software development, data science, data engineering, ETL, and analytics reporting development
Experience in building and maintaining data and system integrations using dimensional data modelling and optimized ETL pipelines
Experience in designing and developing ETL pipelines using ETL tools like IICS, DataStage, Ab Initio, Talend etc.
Proven track record of designing and implementing complex data solutions
Demonstrated understanding and experience using:
Data engineering programming languages (i.e., Python, SQL)
Distributed data frameworks (e.g., Spark)
Cloud platform services (AWS preferred)
Relational databases
DevOps and continuous integration
AWS knowledge of services like Lambda, DMS, Step Functions, S3, EventBridge, CloudWatch, Aurora RDS or related AWS ETL services
Knowledge of data lakes and data warehouses
Databricks/Delta Lakehouse architecture
Code management platforms like GitHub/GitLab, etc.
Understanding of database architecture, data modelling concepts and administration
Hands-on experience with Spark Structured Streaming for building real-time ETL pipelines
Utilizes the principles of continuous integration and delivery to automate the deployment of code changes to elevated environments, fostering enhanced code quality, test coverage, and automation of resilient test cases
Proficient in programming languages (e.g., SQL, Python, PySpark) to design, develop, maintain, and optimize data architecture/pipelines that fit business goals
Strong organizational skills with the ability to work on multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions
Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners
Strong problem-solving and troubleshooting skills
Ability to work in a fast-paced environment and adapt to changing business priorities

Desired Skillsets

Master's degree in engineering specialized in Computer Science, Data Science, or a related field
Demonstrated understanding and experience using:
Knowledge of CDK
Experience with the IICS data integration tool
Job orchestration tools like Tidal/Airflow or similar
Knowledge of NoSQL
Proficiency in leveraging the Databricks Unity Catalog for effective data governance and implementing robust access control mechanisms is highly advantageous
Databricks Certified Data Engineer Associate
AWS Certified Data Engineer - Associate

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
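For the Spark Structured Streaming skillset listed above, a minimal sketch of the read-transform-write pattern might look like the following. It uses Spark's built-in rate test source so it is self-contained; the checkpoint path is a placeholder, and a production job would write to Delta or Parquet rather than the console.

```python
# A hedged sketch of a Spark Structured Streaming pipeline:
# read a stream, apply a transform, write out incrementally.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming-etl-sketch").getOrCreate()

# The built-in rate source emits (timestamp, value) rows for testing.
events = spark.readStream.format("rate").option("rowsPerSecond", 5).load()

transformed = events.withColumn("is_even", (F.col("value") % 2 == 0))

query = (
    transformed.writeStream
    .format("console")  # in production this would be Delta/Parquet
    .outputMode("append")
    .option("checkpointLocation", "/tmp/checkpoints/demo")  # placeholder path
    .start()
)
query.awaitTermination(10)  # run briefly for the sketch, then stop
query.stop()
spark.stop()
```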

Posted 3 weeks ago

Apply

Exploring Talend Jobs in India

Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

These cities have a high concentration of IT companies and organizations that frequently hire for Talend roles.

Average Salary Range

The average salary range for Talend professionals in India varies based on experience levels:

- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum

Career Path

A typical career progression in the field of Talend may follow this path:

- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect

As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.

Related Skills

In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:

- Data Warehousing
- ETL (Extract, Transform, Load) processes
- SQL
- Big Data technologies (e.g., Hadoop, Spark)

Interview Questions

  • What is Talend and how does it differ from traditional ETL tools? (basic)
  • Can you explain the difference between tMap and tJoin components in Talend? (medium)
  • How do you handle errors in Talend jobs? (medium)
  • What is the purpose of a context variable in Talend? (basic)
  • Explain the difference between incremental and full loading in Talend; see the sketch after this list. (medium)
  • How do you optimize Talend jobs for better performance? (advanced)
  • What are the different deployment options available in Talend? (medium)
  • How do you schedule Talend jobs to run at specific times? (basic)
  • Can you explain the use of tFilterRow component in Talend? (basic)
  • What is metadata in Talend and how is it used? (medium)
  • How do you handle complex transformations in Talend? (advanced)
  • Explain the concept of schema in Talend. (basic)
  • How do you handle duplicate records in Talend? (medium)
  • What is the purpose of the tLogRow component in Talend? (basic)
  • How do you integrate Talend with other systems or applications? (medium)
  • Explain the use of tNormalize component in Talend. (medium)
  • How do you handle null values in Talend transformations? (basic)
  • What is the role of the tRunJob component in Talend? (medium)
  • How do you monitor and troubleshoot Talend jobs? (medium)
  • Can you explain the difference between tMap and tMapLookup components in Talend? (medium)
  • How do you handle changing business requirements in Talend jobs? (advanced)
  • What are the best practices for version controlling Talend jobs? (advanced)
  • How do you handle large volumes of data in Talend? (medium)
  • Explain the purpose of the tAggregateRow component in Talend. (basic)
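For the incremental-versus-full-loading question above, the underlying pattern is worth internalizing before the interview. Talend implements it visually (often with a context variable holding a last-run watermark), but a hypothetical Python sketch of the two strategies looks like this; the orders table and watermark value are invented for illustration.

```python
# A language-agnostic sketch of full vs. incremental loading.
import sqlite3

src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE orders (id INTEGER, updated_at TEXT);
    INSERT INTO orders VALUES (1, '2024-01-01'), (2, '2024-02-01'), (3, '2024-03-01');
""")

def full_load():
    # Full load: re-read everything, every run.
    return src.execute("SELECT * FROM orders").fetchall()

def incremental_load(last_run: str):
    # Incremental load: only rows changed since the stored watermark,
    # analogous to filtering on a Talend context variable.
    return src.execute(
        "SELECT * FROM orders WHERE updated_at > ?", (last_run,)
    ).fetchall()

print("full:", full_load())
print("incremental since 2024-01-15:", incremental_load("2024-01-15"))
```

Full loads are simpler and self-healing but expensive at volume; incremental loads are cheaper per run but require reliable change tracking (timestamps, CDC, or keys) and careful handling of late-arriving updates.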

Closing Remark

As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!
