Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role
We are looking for a SQL Expert with around 5 years of hands-on experience in designing, developing, and optimizing complex SQL queries, stored procedures, and database solutions. You will play a key role in supporting our data-driven applications, ensuring efficient data processing, performance tuning, and robust database design. This is a critical role working alongside product, engineering, and analytics teams to deliver high-quality, reliable, and scalable data.

Responsibilities:
- Design and develop complex SQL queries, stored procedures, views, and functions.
- Optimize query performance, indexing strategies, and database tuning.
- Develop and maintain ETL/ELT pipelines for data processing and transformation.
- Collaborate with developers to design scalable and normalized database schemas.
- Analyze and troubleshoot database performance issues and recommend improvements.
- Ensure data integrity, consistency, and compliance across systems.
- Create and maintain comprehensive documentation of data models and processes.
- Support reporting and analytics teams by providing clean, optimized datasets.
- Work with large datasets and understand partitioning, sharding, and parallel processing.

Skills & Experience:
- 5+ years of hands-on experience with SQL (SQL Server, MySQL, PostgreSQL, Oracle, or similar).
- Strong knowledge of advanced SQL concepts: window functions, CTEs, indexing, query optimization.
- Experience in writing and optimizing stored procedures, triggers, and functions.
- Familiarity with data warehousing concepts, dimensional modeling, and ETL processes.
- Ability to diagnose and resolve database performance issues.
- Experience working with large, complex datasets and ensuring high performance.
- Strong understanding of relational database design and normalization.
- Solid experience with tools like SSIS, Talend, Apache Airflow, or similar ETL frameworks (nice to have).
- Familiarity with cloud databases (AWS RDS, BigQuery, Snowflake, Azure SQL) is a plus.
- Good communication and documentation skills.

Qualifications:
- Experience with BI/reporting tools (Power BI, Tableau, Looker).
- Knowledge of scripting languages (Python, Bash) for data manipulation.
- Understanding of NoSQL databases or hybrid data architectures.
(ref:hirist.tech)
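As a rough illustration of two of the advanced SQL concepts this role lists (CTEs and window functions), the sketch below runs a ranking query against a hypothetical `orders` table using Python's built-in sqlite3 module; the table, columns, and data are invented for the example.

```python
import sqlite3

# Hypothetical "orders" table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', 120.0), ('alice', 80.0),
        ('bob',   200.0), ('bob',   50.0), ('bob', 10.0);
""")

# The CTE computes per-customer totals; the window function then ranks
# customers by total spend without a second pass or a self-join.
query = """
WITH totals AS (
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
)
SELECT customer,
       total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank
FROM totals
ORDER BY spend_rank;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('bob', 260.0, 1), ('alice', 200.0, 2)]
```

The same CTE-plus-window pattern carries over to SQL Server, PostgreSQL, and the other engines the posting names, with only minor dialect differences.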
Posted 2 weeks ago
5.0 - 7.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SnapLogic
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using SnapLogic. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain SnapLogic integrations and workflows to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet those requirements.
- Develop and maintain technical documentation for SnapLogic integrations and workflows.
- Troubleshoot and resolve issues with SnapLogic integrations and workflows.

Professional & Technical Skills:
- Must Have Skills: Strong experience in SnapLogic.
- Good To Have Skills: Experience in other ETL tools like Informatica, Talend, or DataStage.
- Experience in designing, developing, and maintaining integrations and workflows using SnapLogic.
- Experience in analyzing business requirements and developing solutions to meet those requirements.
- Experience in troubleshooting and resolving issues with SnapLogic integrations and workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions using SnapLogic.
- This position is based at our Pune office.

Qualifications: 15 years of full-time education
Posted 2 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Dassault Systemes 3DEXPERIENCE ENOVIA Customization
Good to have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: BE or BTech

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Dassault Systemes 3DEXPERIENCE ENOVIA Customization. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing solutions to meet those requirements.

Key responsibilities:
1. ETL tool understanding and working knowledge: Talend and Informatica, or Azure Data Factory/SQL Server Integration Services.
2. Building data validation scripts.
3. Testing the data extract.
4. Working with the Data Migration Lead on any other aspects of the data migration work stream.

Technical Experience:
1. 4+ years of experience in 3DExperience Enovia and xPDM customization and configuration on 3DEXPERIENCE 2019x or above.
2. Work experience managing the Enovia xPDM integration process and monitoring the xPDM Gateway integration setup.
3. Exposure to 3DEXPERIENCE web services.

Professional Experience:
1. Outstanding all-round communication skills and ability to work collaboratively.

Qualifications: BE or BTech
Posted 2 weeks ago
5.0 - 9.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Kinaxis
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Key Responsibilities
We are looking for talented people with entrepreneurial drive to join our growing team of Integration Consultants. The Integration Consultant is responsible for reviewing the integration of all integrated applications, ensuring all functional, security, integration, performance, quality, and operations requirements are met, and reviewing and integrating the technical architecture requirements.

Technical Experience (Must Have)
- 5+ years of experience in integration modelling, integration design and integration analysis in Kinaxis
- Experienced in integration and solution design and configuration in Kinaxis (either of DP, SP, IBP, MEIO, CT)
- Experienced with database technologies: SQL, NoSQL, Oracle
- Experienced working with REST/SOAP APIs, SSIS, Swagger files, creating web services
- Experienced with ETL and data integration platforms like Talend, Dell Boomi, MuleSoft, Informatica ICS, Mavenlink M-Bridge
- Excellent integration management experience with the design, development and maintenance of enterprise-level integration systems

Technical Experience (Good to Have)
- Strong design, analytical and problem-solving skills.
- Kinaxis certifications completed, viz. Contributor Level 1, Author Level 1 and 2, Automated Algorithms.
- Ability to analyze large amounts of data: R, Python, SQL knowledge
- Functional and system architecture knowledge of SAP ECC, APO and/or IBP implementation

Professional Attributes
- Excellent communication and interpersonal skills to interact with internal and external stakeholders, with a capacity to present, discuss and explain issues coherently and logically, both in writing and verbally
- Hands-on experience configuring the supply chain product and customizing the solution as per client needs
- Ability to create business blueprint documents and validate design by conforming to best industry practice
- Good influencing and persuasion skills, with the ability to enthuse and inspire multidisciplinary teams and build successful relationships at all levels
- Clear decision-making ability, with the facility to judge complex situations and assess when to escalate issues
- Ability to balance conflicting and changing demands through prioritization and a pragmatic approach

Educational Qualification
BTech/BE/MCA; university degree, preferably in IT & Supply Chain, from a business school or industrial engineering school

Additional Information
Open to travel - short/long term

Qualifications: 15 years full time education
Posted 2 weeks ago
4.0 - 5.0 years
6 - 7 Lacs
Karnataka
Work from Office
Develops and manages Oracle data solutions, integrating warehouse management, OBIEE, and ODI for business intelligence.
Posted 2 weeks ago
4.0 - 5.0 years
6 - 7 Lacs
Karnataka
Work from Office
Develop and manage ETL pipelines using Python. Responsible for transforming and loading data efficiently from source to destination systems, ensuring clean and accurate data.
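A minimal sketch of the kind of Python ETL step described above: extract records from a source, clean and transform them, and load them into a destination table. The CSV contents, column names, and target schema are invented for illustration; a real pipeline would read from the actual source and destination systems.

```python
import csv
import io
import sqlite3

# Extract: a canned CSV stands in for the real source system.
raw_csv = io.StringIO(
    "id,name,amount\n"
    "1, Alice ,10.5\n"
    "2,Bob,\n"        # missing amount -> defaulted to 0.0 below
    "3,carol,7.25\n"
)
records = list(csv.DictReader(raw_csv))

# Transform: trim whitespace, normalize name casing, default missing amounts.
clean = [
    (int(r["id"]), r["name"].strip().title(), float(r["amount"] or 0.0))
    for r in records
]

# Load: write the cleaned rows into the destination table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 17.75
```

Production pipelines add the concerns the posting implies (incremental loads, logging, retries, validation), but the extract-transform-load shape stays the same.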
Posted 2 weeks ago
2.0 - 6.0 years
4 - 8 Lacs
Kolkata
Work from Office
The Talend Open Studio role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Talend Open Studio domain.
Posted 2 weeks ago
2.0 - 6.0 years
4 - 8 Lacs
Mumbai
Work from Office
The Talend Open Studio role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Talend Open Studio domain.
Posted 2 weeks ago
3.0 - 8.0 years
14 - 24 Lacs
Pune
Hybrid
Job responsibilities:
- Design, develop, and maintain ETL pipelines using Informatica PowerCenter or Talend to extract, transform, and load data into EDW systems and the data lake.
- Optimize and troubleshoot complex SQL queries and ETL jobs to ensure efficient data processing and high performance.

Technologies: SQL, Informatica PowerCenter, Talend, Big Data (Hadoop ecosystem), Hive. A combination of skills from this stack is required.
Posted 2 weeks ago
4.0 - 9.0 years
13 - 23 Lacs
Pune
Work from Office
Job Title: Data Engineer - SAS DI
Location: Pune, Maharashtra, India
Experience: 4+ years
Job Type: Full-time, Hybrid
Shift: 11 AM - 8 PM, Monday-Friday

What We're Looking For:
- Design, develop, and maintain ETL pipelines using Informatica PowerCenter or Talend to extract, transform, and load data into EDW systems and the data lake.
- Optimize and troubleshoot complex SQL queries and ETL jobs to ensure efficient data processing and high performance.

Technologies: SQL, SAS DI, Informatica PowerCenter, Talend, Big Data, Hive
Posted 2 weeks ago
2.0 - 4.0 years
4 - 6 Lacs
Chennai
Work from Office
The IBM InfoSphere DataStage, Teradata role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage, Teradata domain.
Posted 2 weeks ago
2.0 - 6.0 years
4 - 8 Lacs
Chennai
Work from Office
The Talend Open Studio role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Talend Open Studio domain.
Posted 2 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Data Engineer is accountable for developing high quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Deliver on critical business priorities while ensuring alignment with the wider architectural vision
- Identify and help address potential risks in the data supply chain
- Follow and contribute to technical standards
- Design and develop analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 3 to 4 years’ experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience of building data pipelines.
- Proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend or Informatica
- Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of data warehousing concepts; relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
- File Formats: Exposure to event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management
- Certification on any of the above topics would be an advantage.
Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 2 weeks ago
2.0 - 5.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Talend Big Data
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Talend Big Data.
- Strong understanding of data integration processes and ETL methodologies.
- Experience with big data technologies such as Hadoop and Spark.
- Familiarity with cloud platforms and services related to data processing.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Talend Big Data.
- This position is based at our Pune office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 2 weeks ago
0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
We are Lenovo. We do what we say. We own what we do. We WOW our customers. Lenovo is a US$57 billion revenue global technology powerhouse, ranked #248 in the Fortune Global 500, and serving millions of customers every day in 180 markets. Focused on a bold vision to deliver Smarter Technology for All, Lenovo has built on its success as the world’s largest PC company with a full-stack portfolio of AI-enabled, AI-ready, and AI-optimized devices (PCs, workstations, smartphones, tablets), infrastructure (server, storage, edge, high performance computing and software defined infrastructure), software, solutions, and services. Lenovo’s continued investment in world-changing innovation is building a more equitable, trustworthy, and smarter future for everyone, everywhere. Lenovo is listed on the Hong Kong stock exchange under Lenovo Group Limited (HKSE: 992) (ADR: LNVGY). To find out more visit www.lenovo.com, and read about the latest news via our StoryHub.

Key Responsibilities
- Design and implement data quality frameworks and metrics for large-scale datasets, especially those used in AI/ML models.
- Collaborate with data scientists and machine learning engineers to validate training data and ensure its integrity.
- Develop automated data validation scripts, profiling tools, and dashboards to monitor data quality in real time.
- Investigate and resolve data anomalies, inconsistencies, and quality issues at the source.
- Lead root cause analysis efforts and recommend long-term solutions for recurring data quality issues.
- Document and maintain data dictionaries, lineage, and quality rules for AI-relevant data assets.
- Contribute to the creation and enforcement of data governance policies and best practices.

Skill Set: Talend or Informatica ETL tools, SQL, Python, Power BI, Oracle, Databricks, Informatica Data Quality
- Partner with stakeholders to understand data requirements and translate them into quality specifications.
- Stay up to date with trends in data quality, AI ethics, and data governance, and bring innovative practices to the team.

Experience range: 5 to 8 years, with a Bachelor's/Master's degree.

We are an Equal Opportunity Employer and do not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, national origin, status as a veteran, and basis of disability or any federal, state, or local protected class.
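The automated data validation scripts mentioned in the responsibilities can be sketched as a small rule-based checker: each rule flags rows that violate a quality expectation (nulls, out-of-range values, duplicate keys). The field names, thresholds, and sample records below are invented for illustration.

```python
# Hypothetical records with deliberate quality problems.
records = [
    {"id": 1, "email": "a@x.com", "age": 34},
    {"id": 2, "email": None,      "age": 29},   # null email
    {"id": 2, "email": "c@x.com", "age": -5},   # duplicate id, bad age
]

def check_quality(rows):
    """Return (row_index, issue_name) pairs for every rule violation."""
    issues = []
    seen_ids = set()
    for i, r in enumerate(rows):
        if r["email"] is None:
            issues.append((i, "null_email"))
        if not (0 <= r["age"] <= 120):          # illustrative range rule
            issues.append((i, "age_out_of_range"))
        if r["id"] in seen_ids:
            issues.append((i, "duplicate_id"))
        seen_ids.add(r["id"])
    return issues

report = check_quality(records)
print(report)  # [(1, 'null_email'), (2, 'age_out_of_range'), (2, 'duplicate_id')]
```

In a real framework these rules would be configuration-driven and the output fed into a dashboard or alerting system rather than printed, but the flag-and-report shape is the same.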
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We are looking for an experienced Scrum Master to lead Agile practices for our Finance Domain ETL/Talend team at the client location. The ideal candidate will facilitate sprints, remove blockers, and ensure seamless collaboration between business and technical teams.

Key Responsibilities
- Guide the team in Agile principles and the Scrum framework.
- Facilitate sprint planning, stand-ups, reviews, and retrospectives.
- Collaborate with stakeholders to ensure smooth project execution.
- Identify and resolve impediments for the team.
- Track progress and ensure timely delivery of ETL/Talend-based solutions.

Required Skills
- Strong experience as a Scrum Master in Finance Domain projects.
- Hands-on knowledge of ETL/Talend processes and data pipelines.
- Excellent stakeholder management and communication skills.
- Certified Scrum Master (CSM) or equivalent is preferred.
Posted 2 weeks ago
6.0 years
5 - 12 Lacs
India
On-site
Job Title: Senior Talend Developer with Power Apps Experience
Location: Hyderabad, India
Work Mode: Work from Office (WFO) only
Experience: 6+ Years (Real-time/Hands-on)

Job Summary:
We are looking for a highly skilled Talend Developer with over 6 years of real-time experience, including strong hands-on exposure to Microsoft Power Apps. The ideal candidate will be responsible for designing, developing, and implementing scalable data integration solutions using Talend and building user-centric applications with Power Apps to support business processes.

Key Responsibilities:
- Design and develop robust ETL pipelines using Talend (Open Studio / Talend DI / Talend Cloud).
- Integrate data from various sources like SQL Server, Oracle, SAP, REST APIs, flat files, etc.
- Develop and maintain Power Apps solutions (Canvas & Model-driven apps) that connect with enterprise data sources.
- Work closely with business stakeholders to gather requirements and deliver scalable solutions.
- Collaborate with other developers, architects, and QA to ensure data integrity, performance, and quality.
- Perform unit testing, deployment, and documentation for all developments.
- Ensure best practices in data governance, performance optimization, and data quality.
- Troubleshoot and resolve issues in existing Talend jobs and Power Apps flows.

Required Skills:
- Minimum 6 years of real-time hands-on experience with Talend ETL tools.
- Proficiency in Microsoft Power Apps (Canvas & Model-driven), Power Automate, and connectors.
- Good understanding of data warehousing concepts, data modeling, and SQL.
- Experience with REST/SOAP APIs, JSON, and XML data integration.
- Familiarity with Azure Data Services is a plus.
- Strong debugging, performance tuning, and problem-solving skills.
- Excellent communication and documentation abilities.

Nice to Have:
- Experience with Power BI, SharePoint, or other Power Platform tools.
- Familiarity with Agile/Scrum methodologies.
- Prior experience in BFSI, Manufacturing, or Retail domains.

Job Types: Full-time, Permanent
Pay: ₹540,338.38 - ₹1,244,146.65 per year
Schedule: Day shift
Work Location: In person
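The REST/JSON integration work this role describes usually includes a flattening step: turning a nested API response into tabular rows a load stage can consume. The sketch below uses a canned payload in place of a live API call, and all field names are hypothetical.

```python
import json

# Canned stand-in for a real REST API response.
payload = json.loads("""
{
  "customer": {"id": 42, "name": "Acme"},
  "orders": [
    {"sku": "A1", "qty": 2},
    {"sku": "B7", "qty": 1}
  ]
}
""")

# Flatten nested JSON into one row per order, repeating the parent
# customer fields, so the result is ready for a relational load step.
rows = [
    {
        "customer_id": payload["customer"]["id"],
        "customer_name": payload["customer"]["name"],
        "sku": order["sku"],
        "qty": order["qty"],
    }
    for order in payload["orders"]
]
print(rows[0])  # {'customer_id': 42, 'customer_name': 'Acme', 'sku': 'A1', 'qty': 2}
```

A Talend job would express the same step with components rather than code, but the shape of the transformation is identical.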
Posted 2 weeks ago
0 years
4 - 7 Lacs
Chennai
On-site
Job Title: Senior Data Quality Assurance Analyst
Career Level: D

Introduction to role:
Are you ready to play a pivotal role in transforming data quality within a global pharmaceutical environment? Join our Operations Global Data Office as a Senior Data Quality Assurance Analyst, where you'll be instrumental in developing solutions to enhance data quality within our SAP systems. Your expertise will drive the creation of data quality rules and dashboards, ensuring alignment to standards and governance policies. Collaborate with data stewards and business owners to develop code for monitoring and measuring data quality, focusing on root cause analysis and prevention of data issues. Are you solution-oriented and passionate about testing? This is your chance to make a significant impact!

Accountabilities:
- Develop and support the creation of data quality dashboards in Power BI by extracting data from various global SAP systems into Snowflake.
- Work extensively with collaborators to define requirements for continuous data quality monitoring.
- Provide extensive data analysis and profiling across a wide range of data objects.
- Develop and implement the data quality framework and operating model.
- Focus on high levels of process automation to ensure data and results are up-to-date.
- Conduct extensive data analysis to detect incorrect patterns in critical data early.
- Facilitate matching or linking multiple data sources together for continuous DQ monitoring.
- Embed ongoing data quality monitoring by setting up mechanisms to track issues and trends.
- Conduct root cause analysis to understand causes of poor quality data.
- Train, coach, and support data owners and stewards in managing data quality.

Essential Skills/Experience:
- Experience developing and supporting the creation of data quality dashboards in Power BI, extracting data from various global SAP systems into Snowflake, and developing rules for identifying DQ issues using Acceldata or similar.
- Demonstrated experience and domain expertise within data management disciplines, including the three pillars of data quality, data governance and data architecture.
- Advanced programming skills in T-SQL or similar, to support data quality rule creation.
- Advanced data profiling and analysis skills, evidenced by use of at least one data profiling analysis tool, for example Adera, DataIKU or Acceldata.
- Strong ETL automation and reconciliation experience; expert in extracting, manipulating and joining data in all its various formats.
- Excellent visualization experience, using Power BI or similar for monitoring and reporting data quality issues. A key aspect of the role is to create self-serve data quality dashboards for the business to use for defect remediation and trending.
- Excellent written and verbal communication skills, with the ability to influence others to achieve objectives.
- Experience in Snowflake or similar for data lakes.
- Strong desire to improve the quality of data and to identify the causes impacting good data quality.
- Experience of business and IT partnering for the implementation of data quality KPIs and visualisations.
- Strong team member management skills with good attention to detail.
- Ability to work in a fast-paced, dynamic environment and manage multiple streams of work simultaneously.
- Experience of working in a global organisation, preferably within the pharmaceutical industry.
- Experience of working in global change projects.
- Extensive knowledge of data quality, with the ability to develop and mature the data quality operating model and framework.
- Knowledge of at least one standard data quality tool, for example Acceldata, Alteryx, Aperture, Trillium, Ataccama or SAS Viya.

Desirable Skills/Experience:
- Use of one of the following data lineage or governance tools or similar, for example Talend or Collibra.
- Experience in working in a complex MDG SAP data environment.
- Experience of any of the following for data cleansing: Winshuttle or Aperture.
- Working within a lean environment and knowledge of data governance methodologies and standards.
- Knowledge of automation and scheduling tools.
- Extensive knowledge of risk and data compliance.
- Experience in data observability using AI pattern detection.

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, innovation is at the heart of everything we do. We embrace change by trialing new solutions with patients and business needs in mind. Our diverse workforce is united by curiosity, sharing findings, and scaling fast. Be part of a digitally-enabled journey that impacts society, the planet, and our business by delivering life-changing medicines. Ready to make a difference? Apply now!
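The ETL reconciliation experience this posting asks for boils down to comparing control totals between source and target after a load. A minimal sketch, using sqlite3 and invented table names, compares row counts and sums and reports the discrepancy:

```python
import sqlite3

# Hypothetical source and target tables; one row is deliberately
# "lost" during the load to show a reconciliation failure.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);  -- row 3 dropped in load
""")

def reconcile(conn, src, tgt):
    """Compare row counts and amount control totals between two tables."""
    s_count, s_sum = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    t_count, t_sum = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return {
        "count_match": s_count == t_count,
        "sum_match": s_sum == t_sum,
        "missing_rows": s_count - t_count,
    }

result = reconcile(conn, "src", "tgt")
print(result)  # {'count_match': False, 'sum_match': False, 'missing_rows': 1}
```

Real reconciliation jobs typically extend this with per-partition checks and checksums over key columns, and feed the mismatches into the defect-remediation dashboards the role describes.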
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job description: Job Description Role Purpose The purpose of this role is to design, test and maintain software programs for operating systems or applications which needs to be deployed at a client end and ensure its meet 100% quality assurance parameters ͏ Do 1. Instrumental in understanding the requirements and design of the product/ software Develop software solutions by studying information needs, studying systems flow, data usage and work processes Investigating problem areas followed by the software development life cycle Facilitate root cause analysis of the system issues and problem statement Identify ideas to improve system performance and impact availability Analyze client requirements and convert requirements to feasible design Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements Conferring with project managers to obtain information on software capabilities ͏ 2. Perform coding and ensure optimal software/ module development Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases Modifying software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces. 
Analyzing information to recommend and plan the installation of new systems or modifications of an existing system Ensuring that code is error free or has no bugs and test failure Preparing reports on programming project specifications, activities and status Ensure all the codes are raised as per the norm defined for project / program / account with clear description and replication patterns Compile timely, comprehensive and accurate documentation and reports as requested Coordinating with the team on daily project status and progress and documenting it Providing feedback on usability and serviceability, trace the result to quality risk and report it to concerned stakeholders ͏ 3. Status Reporting and Customer Focus on an ongoing basis with respect to project and its execution Capturing all the requirements and clarifications from the client for better quality work Taking feedback on the regular basis to ensure smooth and on time delivery Participating in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members. Consulting with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code Documenting very necessary details and reports in a formal way for proper understanding of software from client proposal to implementation Ensure good quality of interaction with customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette etc Timely Response to customer requests and no instances of complaints either internally or externally ͏ Deliver No. 
Deliver

No. | Performance Parameter | Measure
1 | Continuous Integration, Deployment & Monitoring of Software | 100% error-free onboarding & implementation; throughput %; adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery; manage software; troubleshoot queries; customer experience; completion of assigned certifications for skill upgradation
3 | MIS & Reporting | 100% on-time MIS & report generation

Mandatory Skills: Talend DI
Experience: 5-8 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Inspire Brands: Inspire Brands is disrupting the restaurant industry through digital transformation and operational efficiencies. The company's technology hub, Inspire Brands Hyderabad Support Center, India, will lead technology innovation and product development for the organization and its portfolio of distinct brands. The Inspire Brands Hyderabad Support Center will focus on developing new capabilities in data science, data analytics, eCommerce, automation, cloud computing, and information security to accelerate the company's business strategy. Inspire Brands Hyderabad Support Center will also host an innovation lab and collaborate with start-ups to develop solutions for productivity optimization, workforce management, loyalty management, payments systems, and more.

Job Description

RESPONSIBILITIES:
- Design, develop, and maintain reliable automated data solutions based on the identification, collection, and evaluation of business requirements, including but not limited to data models, database objects, stored procedures, and views
- Develop new and enhance existing data processing components (data ingest, data transformation, data store, data management, data quality)
- Support and troubleshoot the data environment (including periodic on-call work)
- Document technical artifacts for developed solutions
- Good interpersonal skills; comfort and competence in dealing with different teams within the organization; an ability to interface with multiple constituent groups and build sustainable relationships
- Versatile, creative temperament; ability to think out of the box while defining sound and practical solutions
- Ability to master new skills
- Proactive approach to problem solving, with effective influencing skills
- Familiarity with Agile practices and methodologies

EDUCATION AND EXPERIENCE REQUIREMENTS:
- Four-year degree in Information Systems, Finance/Mathematics, Computer Science, or similar
- 3-5 years of experience in Data Engineering

REQUIRED KNOWLEDGE, SKILLS OR ABILITIES:
- Advanced SQL: queries, scripts, stored procedures, materialized views, and views
- Focus on ELT: load data into the database and perform transformations in the database
- Ability to use analytical SQL functions
- Snowflake experience a plus
- Cloud data warehouse experience (Snowflake, Azure DW, or Redshift): data modeling, analysis, programming
- Experience with DevOps models utilizing a CI/CD tool
- Hands-on experience in the Azure cloud platform (ADLS, Blob)
- Talend, Apache Airflow, Azure Data Factory, and BI tools like Tableau preferred
- Ability to analyze data models

We are looking for a Senior Data Engineer for the Enterprise Data Organization to build and manage data pipelines (data ingest, data transformation, data distribution, quality rules, data storage, etc.) for an Azure cloud-based data platform. The candidate must possess strong technical, analytical, programming, and critical thinking skills.
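As an illustration of the analytical SQL functions this posting calls for, the sketch below computes a per-store running total and a revenue rank with window functions. The table and column names are hypothetical, and an in-memory SQLite database (which supports window functions from version 3.25) stands in for a cloud warehouse such as Snowflake.

```python
import sqlite3

# In-memory SQLite stands in for a cloud warehouse; the schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE daily_sales (store TEXT, day TEXT, revenue REAL);
INSERT INTO daily_sales VALUES
  ('HYD-01', '2024-01-01', 1200.0),
  ('HYD-01', '2024-01-02', 1500.0),
  ('HYD-02', '2024-01-01',  900.0),
  ('HYD-02', '2024-01-02', 1100.0);
""")

# Analytical (window) functions: running total per store, and an overall rank.
rows = conn.execute("""
SELECT store, day, revenue,
       SUM(revenue) OVER (PARTITION BY store ORDER BY day) AS running_total,
       RANK() OVER (ORDER BY revenue DESC)                 AS revenue_rank
FROM daily_sales
ORDER BY store, day
""").fetchall()

for r in rows:
    print(r)
```

Because the transformation runs inside the database, this is also a small example of the ELT style mentioned above: raw rows are loaded first, and aggregation happens in SQL rather than in application code.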
Posted 2 weeks ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description

Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride.

Together with analytics team leaders, you will support our business with excellent data models to uncover trends that can drive long-term business results.

How You Will Contribute

You will:
- Execute the business analytics agenda in conjunction with analytics team leaders
- Work with best-in-class external partners who leverage analytics tools and processes
- Use models/algorithms to uncover signals, patterns, and trends that drive long-term business performance
- Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver

What You Will Bring

A desire to drive your future and accelerate your career, and the following experience and knowledge:
- Using data analysis to make recommendations to analytics leaders
- Understanding of best-in-class analytics practices
- Knowledge of key performance indicators (KPIs) and scorecards
- Knowledge of BI tools like Tableau, Excel, Alteryx, R, Python, etc. is a plus

In This Role

As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes that maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders while staying updated with the latest cloud technologies and best practices.

Role & Responsibilities:
- Design and build: develop and implement scalable, secure, and cost-effective cloud-based data solutions
- Manage data pipelines: develop and maintain pipelines to extract, transform, and load data into data warehouses or data lakes
- Ensure data quality: implement data quality and validation processes to ensure data accuracy and integrity
- Optimize data storage: ensure efficient data storage and retrieval for optimal performance
- Collaborate and innovate: work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices

Technical Requirements:
- Programming: Python
- Database: SQL, PL/SQL, Postgres SQL, BigQuery, stored procedures/routines
- ETL & integration: AecorSoft, Talend, DBT, Databricks (optional), Fivetran
- Data warehousing: SCD, schema types, data marts
- Visualization: Power BI (optional), Tableau (optional), Looker
- GCP cloud services: BigQuery, GCS
- Supply chain: IMS + shipment functional knowledge good to have
- Supporting technologies: Erwin, Collibra, data governance, Airflow

Soft Skills:
- Problem-solving: the ability to identify and solve complex data-related challenges
- Communication: effective communication skills to collaborate with product owners, analysts, and stakeholders
- Analytical thinking: the capacity to analyze data and draw meaningful insights
- Attention to detail: meticulousness in data preparation and pipeline development
- Adaptability: the ability to stay updated with emerging technologies and trends in the data engineering field

Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary

At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about.
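The data quality and validation responsibility described above can be sketched as a rule check that partitions a batch into clean and quarantined records. The record fields and rules here are hypothetical illustrations, not an actual Mondelēz pipeline.

```python
from datetime import date

def validate(record):
    """Return a list of rule violations for one (hypothetical) shipment record."""
    errors = []
    if not record.get("shipment_id"):
        errors.append("missing shipment_id")
    if record.get("quantity", 0) <= 0:
        errors.append("quantity must be positive")
    # ISO date strings compare correctly as plain strings.
    if record.get("ship_date", "") > date.today().isoformat():
        errors.append("ship_date is in the future")
    return errors

batch = [
    {"shipment_id": "S-1", "quantity": 10, "ship_date": "2024-01-05"},
    {"shipment_id": "",    "quantity": -2, "ship_date": "2024-01-06"},
]

# Partition the batch: clean rows continue downstream, bad rows are
# quarantined together with the reasons, so data accuracy is preserved.
clean = [r for r in batch if not validate(r)]
quarantined = [(r, validate(r)) for r in batch if validate(r)]
print(len(clean), "clean,", len(quarantined), "quarantined")
```

In a real pipeline the quarantined rows would typically be written to an error table for remediation rather than silently dropped.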
We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy, and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries, and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen—and happen fast. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Analytics & Modelling, Analytics & Data Science
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Build the future of the AI Data Cloud. Join the Snowflake team.

We're looking for a self-driven, team-oriented Business Systems Analyst/Functional Consultant with strong communication skills to join our Financial Systems team. In this role, you will support and enhance Snowflake's financial systems, with a primary focus on Workday Financials, specifically across Procure to Pay (P2P), Expenses, and Accounts Payable (AP) processes.

WHY JOIN OUR TEAM AT SNOWFLAKE?

As the Business Systems Analyst, you will partner closely with Finance and Accounting teams to gather business requirements, configure system solutions, and drive process improvements. You'll also collaborate with cross-functional stakeholders, including IT, Product, and Operations, to ensure effective execution of finance system initiatives. This role requires a solid understanding of financial processes, a commitment to policy and control adherence, and the ability to deliver scalable, efficient solutions.

IN THIS ROLE AT SNOWFLAKE, YOU WILL:
- Design and maintain systems implementations and automation supporting Procurement, Expenses, and Accounts Payable functions
- Provide training, documentation, and support to Finance business users
- Partner with stakeholders from Finance, IT, Sales, Operations, Product Management, and Engineering teams to support new products, processes, or systems
- Manage the Workday Finance product to support Finance/Accounting programs
- Work closely with the Finance team to understand functional requirements and execute them through successful implementation, testing, and deployment
- Manage projects end to end, ensuring coverage of all facets of testing and a successful end-user experience
- Manage security administration for Finance
- Translate business requirements and ensure that automated solutions meet the needs of the business
- Troubleshoot problems, test, and work with the business to resolve issues
- Understand business requirements, configure the Workday Finance product, and create documentation as needed
- Leverage business knowledge and expertise to identify opportunities for process improvements
- Facilitate review sessions with functional owners, subject matter experts, and end-user representatives
- Coordinate and perform system testing to ensure requirements are met
- Work closely with the project team during user acceptance testing (UAT), including tracking issues through to resolution and securing end-user acceptance sign-off
- Provide guidance and training to finance team members, managers, and employees on processes, tools, and the future capabilities of the product

WE WOULD LOVE TO HEAR FROM YOU IF YOU HAVE:
- At least 5-10 years of business systems analyst experience in a global organization
- At least 5 years of professional experience supporting a Finance/Accounting organization
- A good understanding of financial processes such as Procurement, Expenses, and AP, as well as accounting practices and SOX compliance
- Experience with Workday Financials or similar ERP systems (a must)
- In addition to Workday, previous experience with large-scale ERPs (SAP, Oracle, NetSuite), SaaS applications (Coupa, Xactly, RevPro, Adaptive Insights), and BI tools (Tableau, Talend), which is preferable
- Knowledge of project management methodologies (waterfall or agile), a plus
- The ability to remain composed in a dynamic environment and communicate successfully with team members at all levels
- A BS/BA in Finance, Accounting, IT, or a related field

Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?
For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
Posted 2 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas!

Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com.

Purpose And Scope

As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, data warehousing, and ETL systems. You'll work closely with FoundationX data engineers, analysts, and developers to ensure that our Qlik, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend and Databricks) will be essential for testing data pipelines. This position is based in Bengaluru and will require some on-site work.

Essential Job Responsibilities

Development Ownership:
- Support testing for data warehouse and MI projects
- Collaborate with senior team members
- Administer multi-server environments

Test Strategy And Planning:
- Understand project requirements and data pipelines
- Create comprehensive test strategies and plans
- Participate in data validation and user acceptance testing (UAT)

Data Validation And Quality Assurance:
- Execute manual and automated tests on data pipelines, ETL processes, and models
- Verify data accuracy, completeness, and consistency
- Ensure compliance with industry standards

Regression Testing:
- Validate changes to data pipelines and analytics tools
- Monitor performance metrics
Test Case Design And Execution:
- Create detailed test cases based on requirements
- Collaborate with development teams to resolve issues
- Maintain documentation

Data Security And Privacy:
- Validate access controls and encryption mechanisms
- Ensure compliance with privacy regulations

Collaboration And Communication:
- Work with cross-functional teams
- Communicate test progress and results

Continuous Improvement And Technical Support:
- Optimize data platform architecture
- Provide technical support to internal users
- Stay updated on trends in full-stack development and cloud platforms

Qualifications

Required:
- Bachelor's degree in computer science, information technology, or a related field (or equivalent experience)
- 3-5+ years of proven experience as a tester, developer, or data analyst within a pharmaceutical or similarly regulated environment
- 3-5+ years of experience in BI development and ETL development: Qlik, Power BI (including DAX and Power Automate (MS Flow) or Power BI alerts), or equivalent technologies
- Experience with Qlik Sense, QlikView, and Tableau, and with creating data models
- Familiarity with business intelligence and data warehousing concepts (star schema, snowflake schema, data marts)
- Knowledge of SQL, ETL frameworks, and data integration techniques
- Other complex and highly regulated industry experience will be considered, across diverse areas like Commercial, Manufacturing, and Medical
- Data analysis and automation skills: proficiency in identifying, standardizing, and automating critical reporting metrics and modelling tools
- Exposure to at least 1-2 full, large, complex project life cycles
- Experience with test management software (e.g., qTest, Zephyr, ALM)
- Technical proficiency: strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization
- Manual testing (test case design, execution, defect reporting)
- Awareness of automated testing tools (e.g., Selenium, JUnit)
- Experience with data warehouses and an understanding of BI/DWH systems
- Agile champion: adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery

Preferred:
- Experience working in the pharma industry; an understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous
- Certifications in BI tools or testing methodologies
- Knowledge of cloud-based BI solutions (e.g., Azure, AWS)
- Cross-cultural experience: work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments
- Innovation and creativity: the ability to think innovatively and propose creative solutions to complex technical challenges
- Global perspective: a demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making

Working Environment

At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.

Category: FoundationX

Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans
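One of the data validation tasks above, reconciling source and target after an ETL load, can be sketched as an automated check. The table names and the trivial load step are hypothetical, with an in-memory SQLite database standing in for the real source and warehouse systems.

```python
import sqlite3

# SQLite stands in for the source system and the warehouse; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def load_orders(conn):
    """Hypothetical ETL step: copy source rows into the target table."""
    conn.execute("INSERT INTO tgt_orders SELECT * FROM src_orders")

def reconcile(conn):
    """Compare row counts and amount totals between source and target."""
    src_n, src_sum = conn.execute(
        "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM src_orders").fetchone()
    tgt_n, tgt_sum = conn.execute(
        "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM tgt_orders").fetchone()
    return src_n == tgt_n and src_sum == tgt_sum

load_orders(conn)
print("reconciled:", reconcile(conn))
```

In practice a tester would run checks like this as regression tests after every pipeline change, often comparing checksums or key sets in addition to counts and totals.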
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Senior Data Quality Assurance Analyst

Career Level: D

Introduction to role:

Are you ready to play a pivotal role in transforming data quality within a global pharmaceutical environment? Join our Operations Global Data Office as a Senior Data Quality Assurance Analyst, where you'll be instrumental in developing solutions to enhance data quality within our SAP systems. Your expertise will drive the creation of data quality rules and dashboards, ensuring alignment with standards and governance policies. Collaborate with data stewards and business owners to develop code for monitoring and measuring data quality, focusing on root cause analysis and prevention of data issues. Are you solution-oriented and passionate about testing? This is your chance to make a significant impact!

Accountabilities:
- Develop and support the creation of data quality dashboards in Power BI by extracting data from various global SAP systems into Snowflake
- Work extensively with collaborators to define requirements for continuous data quality monitoring
- Provide extensive data analysis and profiling across a wide range of data objects
- Develop and implement the data quality framework and operating model
- Focus on high levels of process automation to ensure data and results are up to date
- Conduct extensive data analysis to detect incorrect patterns in critical data early
- Facilitate matching or linking multiple data sources together for continuous DQ monitoring
- Embed ongoing data quality monitoring by setting up mechanisms to track issues and trends
- Conduct root cause analysis to understand the causes of poor-quality data
- Train, coach, and support data owners and stewards in managing data quality

Essential Skills/Experience:
- Experience developing and supporting the creation of data quality dashboards in Power BI, by extracting data from various global SAP systems into Snowflake and developing rules for identifying DQ issues using Acceldata or something similar
- Demonstrated experience and domain expertise within data management disciplines, including the three pillars of data quality, data governance, and data architecture
- Advanced programming skills in T-SQL or similar, to support data quality rule creation
- Advanced data profiling and analysis skills, evidenced by use of at least one data profiling/analysis tool, for example Adera, Dataiku, or Acceldata
- Strong ETL automation and reconciliation experience; expertise in extracting, manipulating, and joining data in all its various formats
- Excellent visualization experience, using Power BI or similar, for monitoring and reporting data quality issues; a key aspect of the role is creating self-serve data quality dashboards for the business to use for defect remediation and trending
- Excellent written and verbal communication skills, with the ability to influence others to achieve objectives
- Experience in Snowflake or similar for data lakes
- A strong desire to improve the quality of data and to identify the causes impacting good data quality
- Experience of business and IT partnering for the implementation of data quality KPIs and visualizations
- Strong team member management skills, with good attention to detail
- Ability to work in a fast-paced, dynamic environment and manage multiple streams of work simultaneously
- Experience of working in a global organization, preferably within the pharmaceutical industry
- Experience of working in global change projects
- Extensive knowledge of data quality, with the ability to develop and mature the data quality operating model and framework
- Knowledge of at least one standard data quality tool, for example Acceldata, Alteryx, Aperture, Trillium, Ataccama, or SAS Viya

Desirable Skills/Experience:
- Use of a data lineage or governance tool or similar, for example Talend or Collibra
- Experience working in a complex MDG SAP data environment
- Experience with data cleansing tools such as Winshuttle or Aperture
- Experience working within a lean environment, and knowledge of data governance methodologies and standards
- Knowledge of automation and scheduling tools
- Extensive knowledge of risk and data compliance
- Experience in data observability using AI pattern detection

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace, and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, innovation is at the heart of everything we do. We embrace change by trialing new solutions with patients and business needs in mind. Our diverse workforce is united by curiosity, sharing findings, and scaling fast. Be part of a digitally-enabled journey that impacts society, the planet, and our business by delivering life-changing medicines. Ready to make a difference? Apply now!
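The rule-based DQ monitoring described in this posting can be sketched as a set of declarative rules scored over a batch of records, producing the pass rates a dashboard would display. The material master fields, rules, and thresholds below are hypothetical illustrations, not actual SAP or AstraZeneca rules.

```python
# Declarative data-quality rules evaluated over a batch; every field name
# and rule here is a hypothetical illustration.
rules = {
    "material_id not null": lambda r: bool(r.get("material_id")),
    "plant code is 4 chars": lambda r: len(r.get("plant", "")) == 4,
    "net_weight positive":   lambda r: r.get("net_weight", 0) > 0,
}

records = [
    {"material_id": "M-100", "plant": "IN01", "net_weight": 2.5},
    {"material_id": "",      "plant": "IN1",  "net_weight": 2.0},
    {"material_id": "M-101", "plant": "IN02", "net_weight": -1.0},
]

# Pass rate per rule across the batch -- the metric a DQ dashboard would trend.
scores = {
    name: sum(check(r) for r in records) / len(records)
    for name, check in rules.items()
}
for name, rate in scores.items():
    print(f"{name}: {rate:.0%}")
```

Trending these pass rates over time, and drilling into the failing records, is what supports the root cause analysis and defect remediation work the role emphasizes.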
Posted 2 weeks ago
Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.
Several major Indian cities have a high concentration of IT companies and organizations that frequently hire for Talend roles.
The average salary range for Talend professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
A typical career progression in the field of Talend may follow this path:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.
In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:
- Data Warehousing
- ETL (Extract, Transform, Load) processes
- SQL
- Big Data technologies (e.g., Hadoop, Spark)
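The ETL process mentioned above, extract, transform, load, can be illustrated with a minimal pipeline. The CSV layout and target table are hypothetical, and Python's standard library stands in for the input, mapping, and output components a Talend job would provide.

```python
import csv
import io
import sqlite3

# Hypothetical raw CSV feed; in a Talend job this would be a file input component.
raw = io.StringIO("id,name,amount\n1,alice,10.5\n2,bob,\n3,carol,7.25\n")

# Extract: parse rows from the source.
rows = list(csv.DictReader(raw))

# Transform: drop rows with missing amounts, normalize names, cast types.
clean = [
    (int(r["id"]), r["name"].title(), float(r["amount"]))
    for r in rows
    if r["amount"]
]

# Load: write the transformed rows into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", clean)

print(conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())
```

Tools like Talend generate the equivalent of each of these three steps from a visual job design, which is why familiarity with the underlying extract/transform/load concepts transfers directly to the tool.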
As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!