Home
Jobs

81 Talend Jobs in Bengaluru

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates listings for easy access; you apply directly on the original job portal.

4.0 - 5.0 years

0 Lacs

Bengaluru

On-site

GlassDoor logo

Job Title: ETL Tester
Experience: 4-5 years

We are looking for a highly skilled and detail-oriented ETL Tester with 4-5 years of hands-on experience in validating data pipelines, ETL processes, and data warehousing systems. The ideal candidate will have a strong understanding of data extraction, transformation, and loading processes, and will be responsible for ensuring data accuracy, completeness, and quality across various systems.

Key Responsibilities:
- Review and understand ETL requirements and source-to-target mappings.
- Develop and execute comprehensive test cases, test plans, and test scripts for ETL processes.
- Validate data accuracy, transformations, and data flow between source and target systems.
- Perform data validation, data reconciliation, and back-end/database testing using SQL.
- Identify, document, and track defects using tools like JIRA or HP ALM.
- Work closely with developers, business analysts, and data engineers to resolve issues.
- Automate testing processes where applicable using scripting or ETL testing tools.

Required Skills:
- 4-5 years of hands-on experience in ETL testing.
- Strong SQL skills for writing complex queries and performing data validation.
- Experience with ETL tools (e.g., Informatica, SSIS, Talend).
- Familiarity with data warehousing concepts and data migration projects.
- Proficiency in defect tracking and test management tools (e.g., JIRA, ALM, TestRail).
- Knowledge of automation frameworks or scripting for ETL test automation is a plus.
- Good understanding of Agile/Scrum methodology.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience with cloud-based data platforms (AWS, Azure, GCP) is a plus.
- Exposure to reporting and BI tools (Tableau, Power BI) is an advantage.

Job Type: Full-time
Schedule: Day shift / Morning shift
Work Location: In person
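The source-to-target reconciliation duty above is typically automated with paired SQL checks. A minimal sketch, using in-memory SQLite as a stand-in for the real source and target databases (table and column names here are hypothetical):

```python
# Source-to-target reconciliation: compare row counts and a numeric
# checksum between a source table and its ETL target. SQLite stands in
# for the actual databases; src_orders/tgt_orders are illustrative names.
import sqlite3

def reconcile(conn, source_table, target_table, amount_col):
    """Compare row counts and SUM(amount_col) between source and target."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {source_table}")
    src_count, src_sum = cur.fetchone()
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {target_table}")
    tgt_count, tgt_sum = cur.fetchone()
    return {
        "count_match": src_count == tgt_count,
        "sum_match": src_sum == tgt_sum,
        "source": (src_count, src_sum),
        "target": (tgt_count, tgt_sum),
    }

# Demo: a target that silently dropped one row fails the count check.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
""")
result = reconcile(conn, "src_orders", "tgt_orders", "amount")
```

In practice the same pair of queries would run against the real source and target connections, with the mismatch report attached to the defect ticket.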

Posted 22 hours ago

Apply

1.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Your potential, unleashed. India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team: Deloitte’s Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Your work profile: As an Analyst/Consultant/Senior Consultant in our T&T team, you’ll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations:
- Develop and execute automated test cases for ETL processes.
- Validate data transformation, extraction, and loading accuracy.
- Collaborate with data engineers and QA teams to understand ETL workflows.
- Identify and document defects and inconsistencies.
- Maintain test documentation and support manual testing efforts.
- Design and implement automated ETL test scripts and frameworks.
- Validate end-to-end data flows and transformation logic.
- Collaborate with data architects, developers, and QA teams.
- Integrate ETL testing into CI/CD pipelines where applicable.
- Analyze test results and troubleshoot data issues.
- Lead the architecture and development of advanced ETL automation frameworks.
- Drive best practices in ETL testing and data quality assurance.
- Mentor and guide junior consultants and analysts.
- Collaborate with stakeholders to align testing strategies with business goals.
- Integrate ETL testing within DevOps and CI/CD pipelines.

Desired Qualifications:
- 1 to 9 years’ experience in ETL testing and automation.
- Knowledge of ETL tools such as Informatica, Talend, or DataStage.
- Experience with SQL and database querying.
- Basic scripting or programming skills (Python, Shell, etc.).
- Good analytical and communication skills.
- Strong SQL skills and experience with ETL tools like Informatica, Talend, or DataStage.
- Proficiency in scripting languages for automation (Python, Shell, etc.).
- Knowledge of data warehousing concepts and best practices.
- Strong problem-solving and communication skills.
- Expert knowledge of ETL tools and strong SQL proficiency.
- Experience with automation scripting and data validation techniques.
- Strong leadership, communication, and stakeholder management skills.
- Familiarity with big data technologies and cloud platforms is a plus.

Location and way of working: Base location: Bangalore. This profile involves occasional travelling to client locations. Hybrid is our default way of working; each domain has customized the hybrid approach to its unique needs.

How you’ll grow. Connect for impact: Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report. Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterized by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.

Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognize there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone’s welcome… entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.

Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.

Posted 23 hours ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Naukri logo

Job Description: Tietoevry Create is seeking a skilled Snowflake Developer to join our team in Bengaluru, India. In this role, you will be responsible for designing, implementing, and maintaining data solutions using Snowflake's cloud data platform. You will work closely with cross-functional teams to deliver high-quality, scalable data solutions that drive business value.

- 7+ years of experience in design and development of data warehouse and data integration projects (SSE / TL level).
- Experience of working in an Azure environment.
- Developing ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; writing SQL queries against Snowflake.
- Good understanding of database design concepts: transactional, datamart, data warehouse, etc.
- Expertise in loading from disparate data sets and translating complex functional and technical requirements into detailed design. Will also perform analysis of vast data stores and uncover insights.
- Snowflake data engineers will be responsible for architecting and implementing substantial-scale data intelligence solutions around Snowflake Data Warehouse.
- A solid experience and understanding of architecting, designing, and operationalizing large-scale data & analytics solutions on Snowflake Cloud Data Warehouse is a must.
- Very good articulation skills; flexible and ready to learn new skills.

Additional Information: At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity. Diversity, equity and inclusion (tietoevry.com).
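The Python-driven pipeline work described above follows a common extract-transform-load shape. A minimal sketch, with SQLite standing in for Snowflake (a real pipeline would use the Snowflake Python connector instead; all table and column names here are hypothetical):

```python
# Illustrative ETL step: take raw (name, revenue) rows, apply cleanup
# rules, reject rows failing a basic quality check, and load the result
# into a warehouse table. dim_customer is an invented example table.
import sqlite3

def etl_load(conn, rows):
    """Transform raw (name, revenue) rows and load them into dim_customer."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS dim_customer (name TEXT, revenue_usd REAL)"
    )
    transformed = [
        (name.strip().upper(), round(revenue, 2))  # normalise name, round money
        for name, revenue in rows
        if revenue is not None                     # drop rows with missing revenue
    ]
    conn.executemany("INSERT INTO dim_customer VALUES (?, ?)", transformed)
    conn.commit()
    return len(transformed)

conn = sqlite3.connect(":memory:")
loaded = etl_load(conn, [(" acme ", 1234.567), ("globex", None), ("Initech", 99.9)])
```

The same transform/load split maps directly onto a Snowflake deployment, with `executemany` replaced by a staged bulk load for volume.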

Posted 1 day ago

Apply

1.0 - 5.0 years

7 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Diverse Lynx is looking for a Snowflake Developer to join our dynamic team and embark on a rewarding career journey. A Snowflake Developer is responsible for designing and developing data solutions within the Snowflake cloud data platform. They play a critical role in helping organizations to store, process, and analyze their data effectively and efficiently.

Responsibilities:
- Design and develop data solutions within the Snowflake cloud data platform, including data warehousing, data lake, and data modeling solutions
- Participate in the design and implementation of data migration strategies
- Ensure the quality of custom solutions through the implementation of appropriate testing and debugging procedures
- Provide technical support and troubleshoot issues as needed
- Stay up-to-date with the latest developments in the Snowflake platform and data warehousing technologies
- Contribute to the ongoing improvement of development processes and best practices

Requirements:
- Experience in data warehousing and data analytics
- Strong knowledge of SQL and data warehousing concepts
- Experience with Snowflake, or other cloud data platforms, is preferred
- Ability to analyze and interpret data
- Excellent written and verbal communication skills
- Ability to work independently and as part of a team
- Strong attention to detail and ability to work in a fast-paced environment

Posted 2 days ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on the process of evaluating a system or software application to identify any defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. 
Job Summary: A career in our Managed Services team will provide you with an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Analytics and Insights Managed Services team brings a unique combination of industry expertise, technology, data management, and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while building your skills in exciting new directions. Have a voice at our table to help design, build, and operate the next generation of software and services that manage interactions across all aspects of the value chain. Job Description: To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional, our global leadership development framework. It gives us a single set of expectations across our lines, geographies, and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future. JD for ETL Tester at Associate level: As an ETL Tester, you will be responsible for designing, developing, and executing SQL scripts to ensure the quality and functionality of our ETL processes. You will work closely with our development and data engineering teams to identify test requirements and drive the implementation of automated testing solutions.
Minimum Degree Required: Bachelor's Degree
Degree Preferred: Bachelor's in Computer Engineering
Minimum Years of Experience: 7 year(s) of IT experience
Certifications Required: NA
Certifications Preferred: Automation Specialist for TOSCA, Lambda Test certifications

Required Knowledge/Skills:
- Collaborate with data engineers to understand ETL workflows and requirements.
- Perform data validation and testing to ensure data accuracy and integrity.
- Create and maintain test plans, test cases, and test data.
- Identify, document, and track defects, and work with development teams to resolve issues.
- Participate in design and code reviews to provide feedback on testability and quality.
- Develop and maintain automated test scripts using Python for ETL processes.
- Ensure compliance with industry standards and best practices in data testing.

Qualifications:
- Solid understanding of SQL and database concepts.
- Proven experience in ETL testing and automation.
- Strong proficiency in Python programming.
- Familiarity with ETL tools such as Apache NiFi, Talend, Informatica, or similar.
- Knowledge of data warehousing and data modeling concepts.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Experience with version control systems like Git.

Preferred Knowledge/Skills (demonstrates extensive knowledge and/or a proven record of success in the following areas):
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with CI/CD pipelines and tools like Jenkins or GitLab.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
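Automated Python test scripts for ETL, as called for above, usually start from reusable data-quality rules. A small sketch of two such rules (null checks and key uniqueness); the column names and sample rows are hypothetical, and a real suite would run these under pytest against the warehouse:

```python
# Two reusable data-quality checks for ETL output rows, each returning
# the offending items so a defect report can cite them directly.

def check_not_null(rows, column):
    """Return indices of rows where the given column is None/NULL."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique_key(rows, key):
    """Return key values that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(key)
        (dupes if value in seen else seen).add(value)
    return sorted(dupes)

# Hypothetical sample extract with one NULL email and one duplicated id.
data = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "c@x.com"},
]
null_defects = check_not_null(data, "email")
dupe_defects = check_unique_key(data, "id")
```

Wrapping each rule in a `test_...` function turns the returned lists into pytest assertions with self-describing failure messages.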

Posted 2 days ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Linkedin logo

Sapiens is on the lookout for a Developer (ETL) to become a key player in our Bangalore team. If you're a seasoned ETL pro and ready to take your career to new heights with an established, globally successful company, this role could be the perfect fit.

Location: Bangalore
Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity.

This position will be part of Sapiens’ L&P division; for more information about it, click here: https://sapiens.com/solutions/life-and-pension-software/

What You’ll Do:
- Design and develop core components/services that are flexible, extensible, multi-tier, scalable, high-performance, and reliable applications of an advanced complex software system, called ALIS, both in R&D and Delivery.
- Good understanding of advanced ETL concepts and administration activities to support R&D/projects.
- Experience in understanding different ETL tools (min 4) and advanced transformations; good in Talend and SAP BODS to support R&D/projects.
- Able to resolve all ETL code and administration issues.
- Ability to resolve complex reporting challenges.
- Ability to create full-fledged dashboards with story boards/lines, drill down, linking, etc.; design tables, views, or datamarts to support these dashboards.
- Ability to understand and propose data load strategies which improve performance and visualizations.
- Ability to performance-tune SQL, ETL, reports, and universes.
- Understand the Sapiens Intelligence product and support the points below:
  - Understands the Transaction Layer model for all modules
  - Understands the Universe model for all modules
  - Should have end-to-end Sapiens Intelligence knowledge
  - Should be able to independently demo or give training for the Sapiens Intelligence product
  - Should be an SME in Sapiens Intelligence as a product

What To Have For This Position (Must-have Skills):
- 3 - 5 years of IT experience.
- Should have experience in understanding advanced insurance concepts and good command over all business/functional areas (like NB, claims, finance, etc.).
- Should have experience with developing a complete DWH ETL lifecycle.
- Should have experience in developing ETL processes (ETL control tables, error logging, auditing, data quality, etc.) using ETL tools such as Talend, BODS, SSIS, etc.
- Experience or knowledge in big data related tools (Spark, Hive, Kafka, Hadoop, Hortonworks, Python, R) would be good to have.
- Should have experience in developing with SAP BO, or knowledge of any reporting tool.
- Should be able to implement reusability, parameterization, workflow design, etc.
- Should have experience of interacting with customers, understanding business requirement documents, and translating them into ETL specifications and low/high-level design documents.
- Experience in understanding complex source system data structures, preferably in insurance services (insurance preferred).
- Experience in data analysis, data modeling, and data mart design.
- Strong database development skills like complex SQL queries and complex stored procedures.
- Good verbal and written communication in English; strong interpersonal, analytical, and problem-solving abilities.
- Ability to work with minimal guidance or supervision in a time-critical environment.
- Willingness to travel and work at various customer sites across the globe.

About Sapiens: Sapiens is a global leader in the insurance industry, delivering its award-winning, cloud-based SaaS insurance platform to over 600 customers in more than 30 countries. Sapiens’ platform offers pre-integrated, low-code capabilities to accelerate customers’ digital transformation. With more than 40 years of industry expertise, Sapiens has a highly professional team of over 5,000 employees globally. For more information, visit us at www.sapiens.com. Sapiens is an equal opportunity employer.
We value diversity and strive to create an inclusive work environment that embraces individuals from diverse backgrounds. Disclaimer: Sapiens India does not authorise any third parties to release employment offers or conduct recruitment drives via a third party. Hence, beware of inauthentic and fraudulent job offers or recruitment drives from any individuals or websites purporting to represent Sapiens. Further, Sapiens does not charge any fee or other emoluments for any reason (including, without limitation, visa fees) or seek compensation from educational institutions to participate in recruitment events. Accordingly, please check the authenticity of any such offers before acting on them; where acted upon, you do so at your own risk. Sapiens shall neither be responsible for honouring or making good the promises made by fraudulent third parties, nor for any monetary or any other loss incurred by the aggrieved individual or educational institution. In the event that you come across any fraudulent activities in the name of Sapiens, please feel free to report the incident to sharedservices@sapiens.com.
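The "ETL control tables, error logging, auditing" pattern named in the requirements above is worth illustrating: each pipeline run writes a row recording its status and row counts so failures can be audited and reruns resumed. A hedged sketch with SQLite standing in for the warehouse; the `etl_control` schema and job name are invented for the example:

```python
# Minimal ETL control-table pattern: start_run inserts a RUNNING row,
# finish_run updates it with the final status and loaded row count.
import sqlite3
from datetime import datetime, timezone

def start_run(conn, job_name):
    conn.execute("""CREATE TABLE IF NOT EXISTS etl_control (
        run_id INTEGER PRIMARY KEY AUTOINCREMENT,
        job_name TEXT, started_at TEXT, status TEXT, rows_loaded INTEGER)""")
    cur = conn.execute(
        "INSERT INTO etl_control (job_name, started_at, status, rows_loaded) "
        "VALUES (?, ?, 'RUNNING', 0)",
        (job_name, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()
    return cur.lastrowid

def finish_run(conn, run_id, rows_loaded, status="SUCCESS"):
    conn.execute(
        "UPDATE etl_control SET status = ?, rows_loaded = ? WHERE run_id = ?",
        (status, rows_loaded, run_id),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
run_id = start_run(conn, "load_policies")  # hypothetical job name
finish_run(conn, run_id, rows_loaded=1250)
status, rows = conn.execute(
    "SELECT status, rows_loaded FROM etl_control WHERE run_id = ?", (run_id,)
).fetchone()
```

A failed job would call `finish_run(..., status="FAILED")`, and the next scheduled run can query `etl_control` to decide whether to resume or restart.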

Posted 3 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Linkedin logo

Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas! Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com . This position is based in Bengaluru and will require some on-site work. Purpose And Scope As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, Data Warehousing and ETL systems. You’ll work closely with FoundationX Data Engineers, analysts, and developers to ensure that our QLIK, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend, DataBricks) will be essential for testing data pipelines. Essential Job Responsibilities Development Ownership: Support testing for Data Warehouse and MI projects. Collaborate with senior team members. Administer multi-server environments. Test Strategy And Planning Understand project requirements and data pipelines. Create comprehensive test strategies and plans. Participate in data validation and user acceptance testing (UAT). Data Validation And Quality Assurance Execute manual and automated tests on data pipelines, ETL processes, and models. Verify data accuracy, completeness, and consistency. Ensure compliance with industry standards. Regression Testing Validate changes to data pipelines and analytics tools. Monitor performance metrics. 
Test Case Design And Execution: Create detailed test cases based on requirements. Collaborate with development teams to resolve issues. Maintain documentation. Data Security And Privacy: Validate access controls and encryption mechanisms. Ensure compliance with privacy regulations. Collaboration And Communication: Work with cross-functional teams. Communicate test progress and results. Continuous Improvement And Technical Support: Optimize data platform architecture. Provide technical support to internal users. Stay updated on trends in full-stack development and cloud platforms. Qualifications Required: Bachelor’s degree in computer science, information technology, or a related field (or equivalent experience). 3 - 5+ years' proven experience as a Tester, Developer, or Data Analyst within a pharmaceutical company or a similar regulated environment. 3 - 5+ years' experience in BI development and ETL development using Qlik, Power BI (including DAX and Power Automate (MS Flow) or Power BI alerts), or equivalent technologies. Experience with Qlik Sense, QlikView, and Tableau applications and creating data models. Familiarity with Business Intelligence and Data Warehousing concepts (star schema, snowflake schema, data marts). Knowledge of SQL, ETL frameworks, and data integration techniques. Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing, and Medical. Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools. Exposure to at least 1-2 full large complex project life cycles. Experience with test management software (e.g., qTest, Zephyr, ALM). Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Manual testing (test case design, execution, defect reporting).
Awareness of automated testing tools (e.g., Selenium, JUnit). Experience with data warehouses and understanding of BI/DWH systems. Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery. Preferred: Experience working in the pharma industry. Understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous. Certifications in BI tools or testing methodologies. Knowledge of cloud-based BI solutions (e.g., Azure, AWS). Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments. Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges. Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making. Working Environment: At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas’ Responsible Flexibility Guidelines. Category: FoundationX. Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans.
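The regression-testing duty this posting describes (validating changes to data pipelines) is often done by diffing current pipeline output against a stored baseline snapshot. A minimal sketch; the toy transformation, records, and baseline are all hypothetical:

```python
# Regression check for a pipeline step: rerun the transformation and
# report any rows added or removed relative to an approved baseline.

def transform(records):
    """Toy pipeline step: keep active records, normalise country codes."""
    return sorted(
        (r["id"], r["country"].upper()) for r in records if r["active"]
    )

def regression_diff(current, baseline):
    """Rows added or removed relative to the baseline snapshot."""
    cur, base = set(current), set(baseline)
    return {"added": sorted(cur - base), "removed": sorted(base - cur)}

# Baseline captured from a previously approved run (illustrative values).
baseline = [(1, "DE"), (3, "IN")]

records = [
    {"id": 1, "country": "de", "active": True},
    {"id": 2, "country": "us", "active": False},
    {"id": 3, "country": "in", "active": True},
]
diff = regression_diff(transform(records), baseline)  # empty diff = no drift
```

Any non-empty `added`/`removed` list after a pipeline change becomes the starting point for defect triage, or for refreshing the baseline if the change was intentional.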

Posted 3 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Linkedin logo

Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas! Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com . This position is based in Bengaluru and will require some on-site work. Purpose And Scope As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, Data Warehousing and ETL systems. You’ll work closely with FoundationX Data Engineers, analysts, and developers to ensure that our QLIK, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend, DataBricks) will be essential for testing data pipelines. Essential Job Responsibilities Development Ownership: Support testing for Data Warehouse and MI projects. Collaborate with senior team members. Administer multi-server environments. Test Strategy And Planning Understand project requirements and data pipelines. Create comprehensive test strategies and plans. Participate in data validation and user acceptance testing (UAT). Data Validation And Quality Assurance Execute manual and automated tests on data pipelines, ETL processes, and models. Verify data accuracy, completeness, and consistency. Ensure compliance with industry standards. Regression Testing Validate changes to data pipelines and analytics tools. Monitor performance metrics. 
Test Case Design And Execution Create detailed test cases based on requirements. Collaborate with development teams to resolve issues. Maintain documentation. Data Security And Privacy Validate access controls and encryption mechanisms. Ensure compliance with privacy regulations. Collaboration And Communication Work with cross-functional teams. Communicate test progress and results. Continuous Improvement And Technical Support Optimize data platform architecture. Provide technical support to internal users. Stay updated on trends in full-stack development and cloud platforms. Qualifications Required Bachelor’s degree in computer science, information technology, or related field (or equivalent experience.) 3 -5+ years proven experience as a Tester, Developer or Data Analyst within a Pharmaceutical or working within a similar regulatory environment. 3-5+ years experience in using BI Development, ETL Development, Qlik, PowerBI including DAX and Power Automate (MS Flow) or PowerBI alerts or equivalent technologies. Experience with QLIK Sense and QLIKView, Tableau application and creating data models. Familiarity with Business Intelligence and Data Warehousing concepts (star schema, snowflake schema, data marts). Knowledge of SQL, ETL frameworks and data integration techniques. Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing and Medical. Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools. Exposure to at least 1-2 full large complex project life cycles. Experience with test management software (e.g., qTest, Zephyr, ALM). Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Manual testing (test case design, execution, defect reporting). 
Awareness of automated testing tools (e.g., Selenium, JUnit). Experience with data warehouses and understanding of BI/DWH systems. Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery. Preferred: - Experience working in the Pharma industry. Understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous. Certifications in BI tools or testing methodologies. Knowledge of cloud-based BI solutions (e.g., Azure, AWS) Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making. Working Environment At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas’ Responsible Flexibility Guidelines. \ Category FoundationX Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans Show more Show less

Posted 3 days ago


12.0 - 20.0 years

35 - 50 Lacs

Bengaluru

Hybrid


Data Architect with Cloud Expertise: Data Architecture, Data Integration & Data Engineering. ETL/ELT - Talend, Informatica, Apache NiFi. Big Data - Hadoop, Spark. Cloud platforms (AWS, Azure, GCP), Redshift, BigQuery. Python, SQL, Scala, GDPR, CCPA.

Posted 4 days ago


8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office


Very good experience with the Continuous Flow Graph tool used for point-based development. Design, develop, and maintain ETL processes using Ab Initio tools. Write, test, and deploy Ab Initio graphs, scripts, and other necessary components. Troubleshoot and resolve data processing issues and improve performance. Required education: Bachelor's degree. Preferred education: Master's degree. Required technical and professional expertise: 8 years overall and 5+ years of relevant experience. Extract, transform, and load data from various sources into data warehouses, operational data stores, or other target systems. Work with different data formats, including structured, semi-structured, and unstructured data. Preferred technical and professional experience: effective communication and presentation skills; industry expertise/specialization.

Posted 5 days ago


0 years

2 - 7 Lacs

Bengaluru

On-site


QA Tester to ensure the accuracy and functionality of ETL jobs migrated from Talend to Python and of SQL queries converted to Snowflake. The role involves validating data integrity, performance, and the seamless transition of ETL processes, working closely with developers skilled in Python and SQL. Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork. Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers. Collaborate with back-end web developers and programmers to improve usability. Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly. Perform all testing activities for initiatives across one or more assigned projects, utilizing processes, methods, metrics and software that ensure quality, reliability, and systems safety and security, including Hoverfly component and contract testing and embedded Cassandra component testing on multiple repositories. Test strategy formulation will include decomposing the business and technical requirements into test case scenarios, defining test data requirements, managing test case creation, devising contingency plans and other preparation activities. Development of the test case execution plan, test case execution, managing issues, and status metrics. Working with a global team and responsible for directing/reviewing the test planning and execution work efforts of an offshore team.
Communicating effectively with business units, IT Development, Project Management and other support staff on testing timelines, deliverables, status and other information. Assisting in project quality reviews for your assigned applications. Assessing risk to the project based on execution and validation, and making appropriate recommendations. Ability to interpret quality audits, drive improvements and change, and facilitate test methodology discussions across the business unit. Providing project implementation support on an as-needed basis and assisting with application training of new resources. Ability to create and manage project plans and activity timelines; investigating, monitoring, reporting and driving solutions to issues. Acting as a liaison between the Line of Business testing resources and the development team. Identifying and creating risk mitigation activities, and developing and implementing process improvements. About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.
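Parity testing for a Talend-to-Python/Snowflake migration, as described above, is often done by fingerprinting each pipeline's result set and comparing. A small, order-insensitive sketch; the sample rows are invented, and a real test would fetch them from the legacy and migrated pipelines:

```python
import hashlib
import json

def fingerprint(rows):
    """Order-insensitive fingerprint of a result set for parity checks."""
    canonical = json.dumps(sorted(rows), default=str)
    return hashlib.sha256(canonical.encode()).hexdigest()

legacy_rows = [(1, "A", 100.0), (2, "B", 250.5)]    # output of the legacy Talend job
migrated_rows = [(2, "B", 250.5), (1, "A", 100.0)]  # output of the Python/Snowflake job

# Same rows in a different order still match, because rows are sorted first.
parity = fingerprint(legacy_rows) == fingerprint(migrated_rows)
```

Checksums like this catch silent data drift that row counts alone would miss; column-level aggregates (SUM, MIN/MAX) are a common complement when full-row comparison is too expensive.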

Posted 1 week ago


10.0 - 15.0 years

30 - 35 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office


Data Strategy & Data Governance Manager. Join our team in Technology Strategy for an exciting career opportunity to enable our most strategic clients to realize exceptional business value from technology. Practice: Technology Strategy & Advisory, Capability Network | Areas of Work: Data Strategy | Level: Manager | Location: Bangalore / Gurgaon / Mumbai / Pune / Chennai / Hyderabad / Kolkata | Years of Exp: 10 to 15 years. Explore an Exciting Career at Accenture. Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory. The Practice - A Brief Sketch: The Technology Strategy & Advisory Practice focuses on our clients' most strategic priorities. We help clients achieve growth and efficiency through innovative R&D transformation, aimed at redefining business models using agile methodologies. As part of this high-performing team, you will work on scaling Data & Analytics, and the data that fuels it all, to power every single person and every single process. You will be part of our global team of experts who work on the right scalable solutions and services that help clients achieve their business objectives faster. Business Transformation: assessment of Data & Analytics potential and development of use cases that can transform business. Transforming Businesses: envisioning and designing customized, next-generation data and analytics products and services that help clients shift to new business models designed for today's connected landscape of disruptive technologies. Formulation of Guiding Principles and Components: assessing impact to clients' technology landscape/architecture and ensuring formulation of relevant guiding principles and platform components.
Products and Frameworks: evaluate existing data and analytics products and frameworks and develop options for proposed solutions. Bring your best skills forward to excel in the role: leverage your knowledge of technology trends across Data & Analytics and how they can be applied to address real-world problems and opportunities. Interact with client stakeholders to understand their Data & Analytics problems and priority use-cases, define a problem statement, understand the scope of the engagement, and drive projects to deliver value to the client. Design and guide development of an enterprise-wide Data & Analytics strategy for our clients that includes Data & Analytics architecture, Data on Cloud, Data Quality, Metadata and Master Data strategy. Establish a framework for effective Data Governance across multispeed implementations; define data ownership, standards, policies and associated processes. Define a Data & Analytics operating model to manage data across the organization; establish processes around effective data management ensuring Data Quality & Governance standards as well as roles for Data Stewards. Benchmark against global research benchmarks and leading industry peers to understand the current state and recommend Data & Analytics solutions. Conduct discovery workshops and design sessions to elicit Data & Analytics opportunities and client pain areas. Develop and drive Data Capability Maturity Assessment, Data & Analytics Operating Model and Data Governance exercises for clients. A fair understanding of data platform strategy for data-on-cloud migrations, big data technologies, and large-scale data lake and DW-on-cloud solutions. Utilize strong expertise and certification in any of the Data & Analytics cloud platforms: Google, Azure or AWS. Collaborate with business experts for business understanding, with other consultants and platform engineers for solutions, and with technology teams for prototyping and client implementations.
Create expert content and use advanced presentation, public speaking, content creation and communication skills for C-level discussions. Demonstrate a strong understanding of a specific industry, client or technology and function as an expert to advise senior leadership. Manage budgeting and forecasting activities and build financial proposals. Qualification - Your experience counts! MBA from a tier-1 institute. 5-7 years of strategy consulting experience at a consulting firm. 3+ years of experience on projects showcasing skills across these capabilities: Data Capability Maturity Assessment, Data & Analytics Strategy, Data Operating Model & Governance, Data on Cloud Strategy, Data Architecture Strategy. At least 2 years of experience architecting or designing solutions for any two of these domains: Data Quality, Master Data (MDM), Metadata, data lineage, data catalog. Experience in one or more technologies in the data governance space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation etc. 3+ years of experience in designing end-to-end enterprise Data & Analytics strategic solutions leveraging cloud and non-cloud platforms like AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera, Informatica, Palantir. Deep understanding of the data supply chain and building value realization frameworks for data transformations. 3+ years of experience leading or managing teams effectively, including planning/structuring analytical work, facilitating team workshops, and developing Data & Analytics strategy recommendations as well as developing POCs. Foundational understanding of data privacy is desired. Mandatory knowledge of IT & enterprise architecture concepts through practical experience and knowledge of technology trends, e.g. Mobility, Cloud, Digital, Collaboration. A strong understanding of any of the following industries is preferred: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources or equivalent domains. CDMP certification from DAMA preferred. Cloud Data & AI practitioner certifications (Azure, AWS, Google) desirable but not essential.

Posted 1 week ago


3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office


Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. Responsibilities - A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment, resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. Actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Technical and Professional Requirements: Technology->Data Management - Data Integration->Talend. Preferred Skills: Technology->Data Management - Data Integration->Talend.

Posted 1 week ago


2.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Talend ETL. Good to have skills: NA. Minimum 2 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions to enhance business operations and efficiency. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Develop and implement ETL processes using the Talend ETL tool. - Collaborate with cross-functional teams to gather and analyze data requirements. - Optimize and troubleshoot ETL processes for performance and efficiency. - Create and maintain technical documentation for ETL processes. - Assist in testing and debugging ETL processes to ensure data accuracy. Professional & Technical Skills: - Must have skills: proficiency in Talend ETL. - Strong understanding of data integration concepts. - Experience with data modeling and database design. - Knowledge of SQL and database querying. - Familiarity with data warehousing concepts. Additional Information: - The candidate should have a minimum of 2 years of experience in Talend ETL. - This position is based at our Bengaluru office. - A 15 years full time education is required. Qualification: 15 years full time education.

Posted 1 week ago


5.0 - 10.0 years

2 - 5 Lacs

Bengaluru

Work from Office


Project Role: Quality Engineer (Tester). Project Role Description: Enables full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Performs continuous testing for security, API, and regression suite. Creates automation strategy, automated scripts and supports data and environment configuration. Participates in code reviews, monitors, and reports defects to support continuous improvement activities for the end-to-end testing process. Must have skills: Data Warehouse ETL Testing. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Quality Engineer (Tester), you will enable full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Your typical day will involve performing continuous testing for security, API, and regression suite. You will create automation strategy and automated scripts, and support data and environment configuration. Additionally, you will participate in code reviews, and monitor and report defects to support continuous improvement activities for the end-to-end testing process. Roles & Responsibilities: - Expected to be an SME; collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Conduct thorough testing of data warehouse ETL processes. - Develop and execute test cases, test plans, and test scripts. - Identify and document defects, issues, and risks. - Collaborate with cross-functional teams to ensure quality standards are met.
Professional & Technical Skills: - Must have skills: proficiency in Data Warehouse ETL Testing. - Strong understanding of SQL and database concepts. - Experience with ETL tools such as Informatica or Talend. - Knowledge of data warehousing concepts and methodologies. - Experience in testing data integration, data migration, and data transformation processes. Additional Information: - The candidate should have a minimum of 5 years of experience in Data Warehouse ETL Testing. - This position is based at our Bengaluru office. - 15 years full time education is required. Qualification: 15 years full time education.
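Source-to-target validation in data warehouse ETL testing is commonly written as set-difference queries (MINUS in Oracle, EXCEPT in most other engines). A runnable sketch against an in-memory SQLite database standing in for the warehouse; the staging/dimension table names and columns are illustrative:

```python
import sqlite3

# In-memory stand-in for a staging area and a target dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_customer (id INTEGER, name TEXT);
CREATE TABLE dim_customer (id INTEGER, name TEXT);
INSERT INTO stg_customer VALUES (1, 'Asha'), (2, 'Ravi');
INSERT INTO dim_customer VALUES (1, 'Asha');
""")

# Rows present in staging but missing from the dimension, and vice versa.
missing_in_target = conn.execute(
    "SELECT id, name FROM stg_customer EXCEPT SELECT id, name FROM dim_customer"
).fetchall()
extra_in_target = conn.execute(
    "SELECT id, name FROM dim_customer EXCEPT SELECT id, name FROM stg_customer"
).fetchall()
```

An empty result in both directions is the pass condition; any surviving rows become defect evidence attached to the test case.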

Posted 1 week ago


5.0 - 8.0 years

11 - 18 Lacs

Bengaluru

Work from Office


Key Responsibilities: Design, develop, and maintain ETL processes using tools such as Talend, Informatica, SSIS, or similar. Extract data from various sources, including databases, APIs, and flat files, transforming it to meet business requirements. Load transformed data into target systems while ensuring data integrity and accuracy. Collaborate with data analysts and business stakeholders to understand data needs and requirements. Optimize ETL processes for enhanced performance and efficiency. Debug and troubleshoot ETL jobs, providing effective solutions to data-related issues. Document ETL processes, data models, and workflows for future reference and team collaboration. Qualifications: Bachelor's degree in computer science, Information Technology, or a related field. 3-5 years of experience in ETL development and data integration. Experience with Big Data technologies such as Hadoop or Spark. Knowledge of cloud platforms like AWS, Azure, or Google Cloud and their ETL services. Familiarity with data visualization tools such as Tableau or Power BI. Hands-on experience with Snowflake for data warehousing and analytics.
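The extract/transform/load cycle described above can be sketched end to end in a few lines. This toy pipeline uses an inline CSV string and SQLite in place of real sources and targets, and the `high_value` flag is an invented transformation rule, not anything from the listing:

```python
import csv
import io
import sqlite3

raw = "id,amount\n1,10.50\n2,19.99\n"  # stand-in for a flat-file source

# Extract: parse the flat file into dict rows.
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and derive a flag column.
transformed = [
    (int(r["id"]), float(r["amount"]), float(r["amount"]) > 15)
    for r in rows
]

# Load: write into the target table and verify the landed count.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (id INTEGER, amount REAL, high_value BOOLEAN)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", transformed)
loaded = conn.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0]
```

Real ETL tools (Talend, Informatica, SSIS) wrap exactly these three stages in graphical jobs, adding scheduling, restartability and lineage on top.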

Posted 1 week ago


10.0 - 15.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Experience: 10-15 years. Job Description: Lead execution of the assigned projects and be responsible for end-to-end execution. Lead, guide and support the design and implementation of targeted strategies including identification of change impacts to people, process, policy, and structure, stakeholder identification and alignment, appropriate communication and feedback loops, success measures, training, organizational readiness, and long-term sustainability. Manage the day-to-day activities, including scope, financials (e.g. business case, budget), resourcing (e.g. full-time employees, roles and responsibilities, utilization), timelines and toll gates, and risks. Implement project review and quality assurance to ensure successful execution of goals and stakeholder satisfaction. Consistently report and review progress to the Program Lead, steering group and relevant stakeholders. Will be involved in more than one project or will work across a portfolio of projects. Identify improvement and efficiency opportunities across the projects. Analyze data, evaluate results, and develop recommendations and road maps across multiple workstreams. Build and maintain effective partnerships with key cross-functional leaders and project team members across functions such as Finance & Technology. Experience: Experience of working as a Project Manager/Scrum Master as a service provider (not on internal projects). Knowledge of functional supply chain and planning processes, including ERP/MRP, capacity planning, and managing planning activities with contract manufacturers - good to have. Experience in implementing ERP systems such as SAP and Oracle - good to have, not mandatory.
Experience in systems integration and ETL tools such as Informatica and Talend a plus. Experience with data mapping and systems integration a plus. Functional knowledge of supply chain or after-sales service operations a plus. Outstanding drive, excellent interpersonal skills and the ability to communicate effectively, both verbally and in writing, and to immediately contribute in a team environment. An ability to prioritize and perform well in a fast-paced environment, while maintaining a high level of client focus. Demonstrable track record of delivery and impact in managing/delivering transformation, with a minimum of 6-9 years' experience in project management and business transformation. Experience in managing technology projects (data analysis, visualization, app development etc.) along with at least one function such as Procurement, process improvement, continuous improvement, change management, or operating model design. Has performed the role of a scrum master or managed a project having scrum teams. Has managed projects with stakeholders in a multi-location landscape. Past experience in managing analytics projects will be a huge plus. Education: Understanding and application of Agile and waterfall methodology. Exposure to tools and applications such as Microsoft Project, Jira, Confluence, Power BI, Alteryx. Understanding of Lean Six Sigma. Preferably a postgraduate/MBA, though not mandatory. Expectation: Excellent interpersonal (communication and presentation) and organizational skills. Problem-solving abilities and a can-do attitude. Confident, proactive self-starters, comfortable in managing and engaging others. Effective in engaging, partnering with and influencing stakeholders across the matrix up to VP level. Ability to move fluidly between big picture and detail, always keeping the end goal in mind. Inclination toward collaborative partnership, and able to help establish/be part of high-performing teams for impact. Highly diligent with a close eye for detail. Delivers quality outputs.

Posted 1 week ago


4.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Power BI and AAS expert (Strong SC or Specialist Senior). Should have hands-on experience of data modelling in Azure SQL Data Warehouse and Azure Analysis Services. Should be able to write and test DAX queries. Should be able to generate paginated reports in Power BI. Should have a minimum of 3 years' working experience delivering projects in Power BI. Must Have: 3 to 8 years of experience in designing, developing, and deploying ETL processes on Databricks to support data integration and transformation. Optimize and tune Databricks jobs for performance and scalability. Experience with Scala and/or Python programming languages. Proficiency in SQL for querying and managing data. Expertise in ETL (Extract, Transform, Load) processes. Knowledge of data modeling and data warehousing concepts. Implement best practices for data pipelines, including monitoring, logging, and error handling. Excellent problem-solving skills and attention to detail. Excellent written and verbal communication skills. Strong analytical and problem-solving abilities. Experience with version control systems (e.g., Git) to manage and track changes to the codebase. Document technical designs, processes, and procedures related to Databricks development. Stay current with Databricks platform updates and recommend improvements to existing processes. Good to Have: Agile delivery experience. Experience with cloud services, particularly Azure (Azure Databricks), AWS (AWS Glue, EMR), or Google Cloud Platform (GCP). Knowledge of Agile and Scrum software development methodologies. Understanding of data lake architectures. Familiarity with tools like Apache NiFi, Talend, or Informatica. Skills in designing and implementing data models.
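The "monitoring, logging, and error handling" best practices the listing asks for in pipeline jobs can be illustrated with a small step wrapper. This is a generic Python sketch, not a Databricks API; `run_step` is a hypothetical helper that logs each stage and captures failures instead of crashing the whole job:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl_job")

def run_step(name, fn, *args):
    """Run one pipeline step, logging success or failure and returning (result, error)."""
    try:
        out = fn(*args)
        log.info("step %s succeeded", name)
        return out, None
    except Exception as exc:
        log.error("step %s failed: %s", name, exc)
        return None, exc

# A clean batch parses; a dirty batch is caught and reported, not swallowed.
result, err = run_step("cast_amounts", lambda rows: [float(r) for r in rows], ["1.5", "2.0"])
bad, bad_err = run_step("cast_amounts", lambda rows: [float(r) for r in rows], ["1.5", "oops"])
```

On Databricks the same pattern is usually expressed with job task retries and structured logs shipped to the workspace's monitoring sink; the principle (every step named, every failure logged and surfaced) is identical.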

Posted 1 week ago


4.0 - 9.0 years

0 - 1 Lacs

Bengaluru

Hybrid


Design and implement highly scalable ELK (Elasticsearch, Logstash, and Kibana) stack and ElastiCache solutions. Grafana: create different visualizations and dashboards according to client needs. Experience with scripting languages like JavaScript, Python, PowerShell, etc. Should be able to work with APIs, shards, etc. in Elasticsearch. Architecting data structures using Elasticsearch and ElastiCache. Query languages and writing complex queries with joins that deal with large amounts of data. End-to-end low-level design, development, administration, and delivery of ELK-based reporting solutions. Strong exposure to writing Talend queries and Elastic queries for data analysis. Creating Elasticsearch index templates. Index lifecycle management. Managing and monitoring Elasticsearch clusters. Experience with analyzers and shards. Experience in solving performance issues on large sets of data indexes. Strong expertise in Python scripting. Strong experience in installing and configuring ELK on bare metal and clouds (GCP, AWS & Azure). Strong experience in using Elasticsearch indices, Elasticsearch APIs, Kibana dashboards, Logstash and Beats. Good experience in using or creating plugins for ELK, such as authentication and authorization plugins. Good experience in enhancing open-source ELK for custom capabilities. Experience in provisioning automation frameworks such as Kubernetes or Docker. Experience working with JSON.
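An Elasticsearch index template of the kind mentioned above (shard settings plus an ILM policy reference and field mappings) is just a JSON document. A sketch of the request body as a Python dict; the index pattern, policy name and fields are invented for illustration, and in a real cluster this body would be sent to the `PUT _index_template/<name>` API:

```python
import json

# Hypothetical composable index template for daily application-log indices.
template = {
    "index_patterns": ["app-logs-*"],
    "template": {
        "settings": {
            "number_of_shards": 3,
            "number_of_replicas": 1,
            "index.lifecycle.name": "logs-ilm-policy",  # ILM policy assumed to exist
        },
        "mappings": {
            "properties": {
                "@timestamp": {"type": "date"},
                "message": {"type": "text"},       # analyzed full-text field
                "level": {"type": "keyword"},      # exact-match filterable field
            }
        },
    },
}

body = json.dumps(template)  # what the HTTP client would send
```

Keeping `level` as `keyword` rather than `text` is the usual choice for fields used in filters and aggregations; analyzers apply only to `text` fields.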

Posted 1 week ago


1.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Req ID: 321498. We are currently seeking a Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties: Work closely with the Lead Data Engineer to understand business requirements, and analyse and translate these requirements into technical specifications and solution design. Work closely with the Data Modeller to ensure data models support the solution design. Develop, test and fix ETL code using Snowflake, Fivetran, SQL and stored procedures. Analyse the data and ETL for defects/service tickets raised (for solutions in production). Develop documentation and artefacts to support projects. Minimum Skills Required: ADF, Fivetran (orchestration & integration), SQL, Snowflake DWH.
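Incremental ETL loads into Snowflake of the kind this role maintains are typically written as MERGE statements. The same upsert logic can be sketched with SQLite's `INSERT ... ON CONFLICT` syntax so it runs anywhere; the `dim_product` table and its rows are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_product (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget')")

# Incremental batch from the pipeline: one changed row, one new row.
batch = [(1, "Widget v2"), (2, "Gadget")]
conn.executemany(
    "INSERT INTO dim_product VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
    batch,
)
rows = sorted(conn.execute("SELECT id, name FROM dim_product").fetchall())
```

In Snowflake the equivalent is `MERGE INTO dim_product USING staging ... WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...`, usually fed by the Fivetran-landed staging tables.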

Posted 1 week ago


6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


ETL testers with automation testing experience in DBT and Snowflake; experience in Talend and Snowflake.

Posted 1 week ago


1.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote


As part of the Astellas commitment to delivering value for our patients, our organization is currently undergoing transformation to achieve this critical goal. This is an opportunity to work on digital transformation and make a real impact within a company dedicated to improving lives. DigitalX, our new information technology function, is spearheading this value-driven transformation across Astellas. We are looking for people who excel in embracing change, can manage technical challenges and have exceptional communication skills. This position is based in Bengaluru and will require some on-site work. Purpose and Scope: As a Junior Data Engineer, you will play a crucial role in assisting in the design, build, and maintenance of our data infrastructure, focusing on BI and DWH capabilities. Working with the Senior Data Engineer, your foundational expertise in BI, Databricks, PySpark, SQL, Talend and other related technologies will be instrumental in driving data-driven decision-making across the organization. You will play a pivotal role in building, maintaining and enhancing our systems across the organization. This is a fantastic global opportunity to use your proven agile delivery skills across a diverse range of initiatives, utilize your development skills, and contribute to the continuous improvement/delivery of critical IT solutions. Essential Job Responsibilities: Collaborate with FoundationX Engineers to design and maintain scalable data systems. Assist in building robust infrastructure using technologies like Power BI, Qlik or alternatives, Databricks, PySpark, and SQL. Contribute to ensuring system reliability by incorporating accurate business-driving data. Gain experience in BI engineering through hands-on projects. Data Modelling and Integration: Collaborate with cross-functional teams to analyse requirements and create technical designs, data models, and migration strategies.
Design, build, and maintain physical databases, dimensional data models, and ETL processes specific to pharmaceutical data. Cloud Expertise: Evaluate and influence the selection of cloud-based technologies such as Azure, AWS, or Google Cloud. Implement data warehousing solutions in a cloud environment, ensuring scalability and security. BI Expertise: Leverage and create Power BI, Qlik or equivalent technology for data visualization, dashboards, and self-service analytics. Data Pipeline Development: Design, build, and optimize data pipelines using Databricks and PySpark. Ensure data quality, reliability, and scalability. Application Transition: Support the migration of internal applications to Databricks (or equivalent) based solutions. Collaborate with application teams to ensure a seamless transition. Mentorship and Leadership: Lead and mentor junior data engineers. Share best practices, provide technical guidance, and foster a culture of continuous learning. Data Strategy Contribution: Contribute to the organization's data strategy by identifying opportunities for data-driven insights and improvements. Participate in smaller, focused mission teams to deliver value-driven solutions aligned to our global and bold move priority initiatives and beyond. Design, develop and implement robust and scalable data analytics using modern technologies. Collaborate with cross-functional teams and practices across the organisation, including Commercial, Manufacturing, Medical, DataX, GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions. Provide technical support to internal users, troubleshooting complex issues and restoring system uptime as soon as possible. Champion continuous improvement initiatives, identifying opportunities to optimise the performance, security and maintainability of existing data and platform architecture and other technology investments.
Participate in the continuous delivery pipeline. Adhering to DevOps best practises for version control automation and deployment. Ensuring effective management of the FoundationX backlog. Leverage your knowledge of data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization. Stay-up to date on the latest trends and technologies in data engineering and cloud platforms. Qualifications Required Bachelor's degree in computer science, Information Technology, or related field (master’s preferred) or equivalent experience 1-3+ years of experience in data engineering with a strong understanding of BI technologies, PySpark and SQL, building data pipelines and optimization. 1-3 +years + experience in data engineering and integration tools (e.g., Databricks, Change Data Capture) 1-3+ years + experience of utilizing cloud platforms (AWS, Azure, GCP). A deeper understanding/certification of AWS and Azure is considered a plus. Experience with relational and non-relational databases. Any relevant cloud-based integration certification at foundational level or above. (Any QLIK or BI certification, AWS certified DevOps engineer, AWS Certified Developer, Any Microsoft Certified Azure qualification, Proficient in RESTful APIs, AWS, CDMP, MDM, DBA, SQL, SAP, TOGAF, API, CISSP, VCP or any relevant certification) Experience in MuleSoft (Anypoint platform, its components, Designing and managing API-led connectivity solutions). Experience in AWS (environment, services and tools), developing code in at least one high level programming language. 
Experience with continuous integration and continuous delivery (CI/CD) methodologies and tools Experience with Azure services related to computing, networking, storage, and security Understanding of cloud integration patterns and Azure integration services such as Logic Apps, Service Bus, and API Management Preferred Subject Matter Expertise: possess a strong understanding of data architecture/ engineering/operations/ reporting within Life Sciences/ Pharma industry across Commercial, Manufacturing and Medical domains. Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing and Medical. Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement. Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery. Working Environment At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas’ Responsible Flexibility Guidelines. \ Category FoundationX Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans Show more Show less
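The pipeline responsibilities in the listing above (build data pipelines, ensure data quality) can be sketched in miniature. This is an illustrative plain-Python stand-in for a Databricks/PySpark job, not the employer's actual stack; the record fields and the quality rule are assumptions.

```python
# Minimal extract-transform-load sketch with a data-quality gate.
# Field names ("id", "amount") and the rejection rule are illustrative.

def extract(rows):
    """Pretend source system: return raw records as dicts."""
    return list(rows)

def transform(rows):
    """Normalize types and drop records that fail basic quality checks."""
    clean = []
    for r in rows:
        if r.get("id") is None or r.get("amount") is None:
            continue  # quality gate: reject incomplete records
        clean.append({"id": int(r["id"]), "amount": round(float(r["amount"]), 2)})
    return clean

def load(rows, target):
    """Append validated rows to the target store (a plain list here)."""
    target.extend(rows)
    return len(rows)

source = [{"id": "1", "amount": "10.5"},
          {"id": None, "amount": "3"},      # fails the quality gate
          {"id": "2", "amount": "7.25"}]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded)        # 2 — the incomplete record was rejected
print(warehouse[0])  # {'id': 1, 'amount': 10.5}
```

In a real Databricks job the same extract/validate/load shape would be expressed over DataFrames rather than Python lists, but the staging of the logic is the same.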

Posted 1 week ago


5.0 - 10.0 years

2 - 5 Lacs

Bengaluru

Work from Office


Contract duration: 6 months
Experience: 5+ years
Location: WFH (should have a good internet connection)
Requirements:
-Snowflake knowledge (must have)
-SQL knowledge (must have)
-Data modeling (must have)
-Data warehouse concepts and DW design best practices (must have)
-SAP knowledge (good to have)
-SAP functional knowledge (good to have)
-Informatica IDMC (good to have)
-Able to work autonomously
-Good communication skills, team player, self-motivated, strong work ethic
-Flexibility in working hours until 12pm Central time (overlap with the US team)
-Confidence, proactiveness, and the ability to propose alternatives to mitigate tools/expertise gaps (fast learner)
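The must-have skills in this listing — SQL, data modeling, and data-warehouse design — come together in a star schema: a central fact table joined to descriptive dimension tables. A minimal sketch using SQLite as a stand-in for Snowflake (table and column names are illustrative assumptions, not from the listing):

```python
import sqlite3

# In-memory database standing in for a warehouse.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,  -- surrogate key
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        qty         INTEGER,
        amount      REAL
    );
""")
con.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(10, 1, 2, 20.0), (11, 1, 1, 10.0), (12, 2, 3, 45.0)])

# Typical warehouse query: aggregate the fact table over a dimension attribute.
rows = con.execute("""
    SELECT d.product_name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.product_name ORDER BY d.product_name
""").fetchall()
print(rows)  # [('Gadget', 45.0), ('Widget', 30.0)]
```

The same DDL and query translate almost verbatim to Snowflake SQL; the surrogate-key split between fact and dimension is the design-best-practice the listing asks about.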

Posted 1 week ago


10.0 years

5 - 10 Lacs

Bengaluru

On-site


Location: Bangalore - Karnataka, India - EOIZ Industrial Area
Worker Type Reference: Regular - Permanent
Pay Rate Type: Salary
Career Level: T4(A)
Job ID: R-45392-2025
Description & Requirements
Introduction: A Career at HARMAN - HARMAN Technology Services (HTS)
We're a global, multi-disciplinary team that's putting the innovative power of technology to work and transforming tomorrow. At HARMAN HTS, you solve challenges by creating innovative solutions:
-Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity's needs.
-Work at the convergence of cross-channel UX, cloud, insightful data, IoT, and mobility.
-Empower companies to create new digital business models, enter new markets, and improve customer experiences.
About the Role: We are seeking an experienced Azure Data Architect who will develop and implement data engineering projects, including an enterprise data hub, data lakehouse, or big data platform.
What You Will Do:
-Create data pipelines for more efficient and repeatable data science projects.
-Design and implement data architecture solutions that support business requirements and meet organizational needs.
-Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams.
-Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems.
-Develop and implement data governance policies and procedures to ensure that data is managed securely and efficiently.
-Develop and maintain a deep understanding of data platforms, technologies, and tools, and evaluate new technologies and solutions to improve data management processes.
-Ensure compliance with regulatory and industry standards for data management and security.
-Develop and maintain data models, data warehouses, data lakes, and data marts to support data analysis and reporting.
-Ensure data quality, accuracy, and consistency across all data sources.
-Knowledge of ETL and data integration tools such as Informatica, Qlik, Talend, and Apache NiFi.
-Experience with data modeling and design tools such as ERwin, PowerDesigner, or ER/Studio.
-Knowledge of data governance, data quality, and data security best practices.
-Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform.
-Familiarity with programming languages such as Python, Java, or Scala.
-Experience with data visualization tools such as Tableau, Power BI, or QlikView.
-Understanding of analytics and machine learning concepts and tools.
-Knowledge of project management methodologies and tools to manage and deliver complex data projects.
-Skilled in using relational database technologies such as MySQL, PostgreSQL, and Oracle, as well as NoSQL databases such as MongoDB and Cassandra.
-Strong expertise in cloud-based data services such as AWS S3/AWS Glue and AWS Redshift, and the Iceberg/Parquet file formats.
-Knowledge of big data technologies such as Hadoop, Spark, Snowflake, Databricks, and Kafka to process and analyze large volumes of data.
-Proficient in data integration techniques to combine data from various sources into a centralized location.
-Strong data modeling, data warehousing, and data integration skills.
What You Need:
-10+ years of experience in the information technology industry with a strong focus on data engineering and architecture, preferably as a data engineering lead.
-8+ years of data engineering or data architecture experience in successfully launching, planning, and executing advanced data projects.
-Experience in working on RFPs/proposals, presales activities, business development, and overseeing delivery of data projects is highly desired.
-A master's or bachelor's degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics.
-Demonstrated ability to manage data projects and diverse teams.
-Experience in creating data and analytics solutions.
-Experience in building data solutions in one or more domains: Industrial, Healthcare, Retail, Communication.
-Problem-solving, communication, and collaboration skills.
-Good knowledge of data visualization and reporting tools.
-Ability to normalize and standardize data as per key KPIs and metrics.
-Develop and implement data engineering projects, including a data lakehouse or big data platform.
What Is Nice to Have:
-Knowledge of Azure Purview.
-Knowledge of Azure Data Fabric.
-Ability to define reference data architecture.
-SnowPro Advanced certification (Snowflake).
-Cloud-native data platform experience on the AWS or Microsoft stack.
-Knowledge of the latest data trends, including data fabric and data mesh.
-Robust knowledge of ETL, data transformation, and data standardization approaches.
-Key contributor to the growth of the COE, influencing client revenues through data and analytics solutions.
-Lead the selection, deployment, and management of data tools, platforms, and infrastructure.
-Ability to technically guide a team of data engineers.
-Oversee the design, development, and deployment of data solutions.
-Define, differentiate, and strategize new data services/offerings and create reference architecture assets.
-Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc.
-Guide and inspire the organization about the business potential and opportunities around data.
-Network with domain experts.
-Collaborate with client teams to understand their business challenges and needs.
-Develop and propose data solutions tailored to client-specific requirements.
-Influence client revenues through innovative solutions and thought leadership.
-Lead client engagements from project initiation to deployment.
-Build and maintain strong relationships with key clients and stakeholders.
-Build reusable methodologies, pipelines, and models.
What Makes You Eligible:
-Build and manage a high-performing team of data engineers and other specialists.
-Foster a culture of innovation and collaboration within the data team and across the organization.
-Demonstrate the ability to work in diverse, cross-functional teams in a dynamic business environment.
-Candidates should be confident, energetic self-starters with strong communication skills.
-Candidates should exhibit superior presentation skills and the ability to present compelling solutions which guide and inspire.
-Provide technical guidance and mentorship to the data team.
-Collaborate with other stakeholders across the company to align the vision and goals.
-Communicate and present data capabilities and achievements to clients and partners.
-Stay updated on the latest trends and developments in the data domain.
What We Offer:
-Access to employee discounts on world-class HARMAN/Samsung products (JBL, Harman Kardon, AKG, etc.).
-Professional development opportunities through HARMAN University's business and leadership academies.
-An inclusive and diverse work environment that fosters and encourages professional and personal development.
-"Be Brilliant" employee recognition and rewards program.
You Belong Here: HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you – all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.
About HARMAN: Where Innovation Unleashes Next-Level Technology Ever since the 1920s, we’ve been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today’s most sought-after performers, while our digital transformation solutions serve humanity by addressing the world’s ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners and each other. If you’re ready to innovate and do work that makes a lasting impact, join our talent community today! Important Notice: Recruitment Scams Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, personal financial information or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to us through: harmancareers@harman.com. HARMAN is proud to be an Equal Opportunity / Affirmative Action employer. 
All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.

Posted 1 week ago


5.0 years

0 Lacs

Bengaluru

On-site


Job Description
Position Overview: We are seeking a highly skilled and experienced Data Architect with expertise in cloud-based solutions. The ideal candidate will design, implement, and optimize our data architecture to meet the organization's current and future needs. This role requires a strong background in data modeling, transformation, and governance, along with hands-on experience with modern cloud platforms and tools such as Snowflake, Spark, data lakes, and data warehouses. The successful candidate will also establish and enforce standards and guidelines across data platforms to ensure consistency, scalability, and best practices. Exceptional communication skills are essential to collaborate with cross-functional teams and stakeholders.
Key Responsibilities:
-Design and Implementation: Architect and implement scalable, secure, and high-performance cloud data platforms, integrating data lakes, data warehouses, and databases. Develop comprehensive data models to support analytics, reporting, and operational needs.
-Data Integration and Transformation: Lead the design and execution of ETL/ELT pipelines using tools such as Talend/Matillion, SQL, and big data stacks (Hadoop, AWS EMR, Apache Spark) to process and transform data efficiently. Integrate diverse data sources into cohesive and reusable datasets for business intelligence and machine learning purposes.
-Standards and Guidelines: Establish, document, and enforce standards and guidelines for data architecture, data modeling, transformation, and governance across all data platforms. Ensure consistency and best practices in data storage, integration, and security throughout the organization.
-Data Governance: Establish and enforce data governance standards, ensuring data quality, security, and compliance with regulatory requirements. Implement processes and tools to manage metadata, lineage, and data access controls.
-Cloud Expertise: Utilize Snowflake for advanced analytics and data storage needs, ensuring optimized performance and cost efficiency. Leverage modern cloud platforms to manage data lakes and ensure seamless integration with other services.
-Collaboration and Communication: Partner with business stakeholders, data engineers, and analysts to gather requirements and translate them into technical designs. Clearly communicate architectural decisions, trade-offs, and progress to both technical and non-technical audiences.
-Continuous Improvement: Stay updated on emerging trends in cloud and data technologies, recommending innovations to enhance the organization's data capabilities. Optimize existing architectures to improve scalability, performance, and maintainability.
Qualifications
Technical Skills:
-Strong expertise in data modeling (conceptual, logical, physical) and data architecture design principles.
-Proficiency in Talend/Matillion, SQL, big data tooling (Hadoop, AWS EMR, Apache Spark), Snowflake, and cloud-based data platforms.
-Experience with data lakes, data warehouses, and both relational (PostgreSQL/Oracle) and NoSQL (Couchbase/Cassandra) databases.
-Solid understanding of data transformation techniques and ETL/ELT pipelines.
-Proficiency in DevOps/DataOps/MLOps tools.
Standards and Governance:
-Experience establishing and enforcing data platform standards, guidelines, and governance frameworks.
-Proven ability to align data practices with business goals and regulatory compliance.
Communication:
-Exceptional written and verbal communication skills to interact effectively with technical teams and business stakeholders.
Experience:
-5+ years of experience in data architecture, with a focus on cloud technologies.
-Proven track record of delivering scalable, cloud-based data solutions.
Education:
-Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Preferred Qualifications:
-Certification in Snowflake, AWS data services, any RDBMS/NoSQL, AI/ML, or data governance.
-Familiarity with machine learning workflows and data pipelines.
-Experience working in Agile development environments.
Job Type: Full-time
Schedule: Monday to Friday; night shift; rotational shift
Work Location: In person
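Several listings on this page (and the ETL Tester role at the top) call for source-to-target data reconciliation as part of ETL/ELT validation. A hedged sketch of one common approach — comparing row counts plus an order-independent content checksum — with illustrative column names that are assumptions, not from any listing:

```python
import hashlib

def checksum(rows):
    """Order-independent fingerprint of a dataset keyed on id and amount."""
    digests = sorted(
        hashlib.sha256(f"{r['id']}|{r['amount']}".encode()).hexdigest()
        for r in rows
    )
    return hashlib.sha256("".join(digests).encode()).hexdigest()

def reconcile(source, target):
    """Return a list of discrepancies between source and target datasets."""
    issues = []
    if len(source) != len(target):
        issues.append(f"row count mismatch: {len(source)} vs {len(target)}")
    if checksum(source) != checksum(target):
        issues.append("content checksum mismatch")
    return issues

src = [{"id": 1, "amount": 10.5}, {"id": 2, "amount": 7.25}]
tgt = [{"id": 2, "amount": 7.25}, {"id": 1, "amount": 10.5}]  # same data, new order
print(reconcile(src, tgt))  # [] — datasets match despite the ordering
print(reconcile(src, tgt + [{"id": 3, "amount": 1.0}]))  # count + checksum issues
```

In practice the same counts and checksums would be computed by SQL pushed down to the source and target databases rather than in Python, but the comparison logic is identical.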

Posted 1 week ago


Exploring Talend Jobs in Bengaluru

Are you a job seeker looking to dive into the world of data integration and management? Bengaluru, also known as the Silicon Valley of India, offers a wealth of opportunities for Talend professionals. With a booming IT sector and high demand for skilled data engineers, Bengaluru is a hotspot for Talend jobs.

Job Market Overview

  • Major Hiring Companies: Companies like Infosys, Wipro, Accenture, and IBM are actively hiring Talend professionals in Bengaluru.
  • Salary Ranges: Talend developers in Bengaluru can expect to earn between INR 6-12 lakhs per annum, depending on their experience and skill level.
  • Job Prospects: The job market for Talend professionals in Bengaluru is promising, with steady growth in demand for data integration and management experts.

Key Industries in Demand

  • IT: The IT industry in Bengaluru is a major employer of Talend professionals.
  • E-commerce: With the rise of e-commerce platforms, there is a high demand for data integration specialists in this sector.
  • Healthcare: Healthcare organizations are increasingly relying on data management solutions, creating opportunities for Talend professionals.

Cost of Living Context

Bengaluru offers a lower cost of living than other major Indian cities, making it an attractive destination for job seekers. Affordable housing options and a vibrant social scene make it an ideal city in which to kickstart your Talend career.

Remote Work Opportunities

In the wake of the COVID-19 pandemic, many companies in Bengaluru are offering remote work options for Talend professionals. This flexibility allows you to work from the comfort of your home while still enjoying the benefits of a thriving job market.

Transportation Options

Bengaluru boasts a well-connected public transportation system, including buses, metro, and cabs, making it easy for job seekers to commute to their workplaces.

Emerging Trends and Future Prospects

As technology continues to evolve, Talend professionals in Bengaluru can expect growing demand for their skills. Emerging trends like cloud-based data integration and AI-driven analytics are shaping the future job market for Talend experts.

If you are looking to embark on a rewarding career in data integration and management, Bengaluru is the place to be. Don't miss out on the exciting Talend jobs in Bengaluru – apply now and take your career to new heights!
