81 Talend Jobs in Bengaluru - Page 3

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Source: LinkedIn

Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas! Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com.

Purpose and Scope
As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, Data Warehousing and ETL systems. You'll work closely with FoundationX Data Engineers, analysts, and developers to ensure that our QLIK, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend and Databricks) will be essential for testing data pipelines. This position is based in Bengaluru and will require some on-site work.

Essential Job Responsibilities
Development Ownership: Support testing for Data Warehouse and MI projects. Collaborate with senior team members. Administer multi-server environments.
Test Strategy and Planning: Understand project requirements and data pipelines. Create comprehensive test strategies and plans. Participate in data validation and user acceptance testing (UAT).
Data Validation and Quality Assurance: Execute manual and automated tests on data pipelines, ETL processes, and models. Verify data accuracy, completeness, and consistency (an illustrative reconciliation sketch follows this listing). Ensure compliance with industry standards.
Regression Testing: Validate changes to data pipelines and analytics tools. Monitor performance metrics.
Test Case Design and Execution: Create detailed test cases based on requirements. Collaborate with development teams to resolve issues. Maintain documentation.
Data Security and Privacy: Validate access controls and encryption mechanisms. Ensure compliance with privacy regulations.
Collaboration and Communication: Work with cross-functional teams. Communicate test progress and results.
Continuous Improvement and Technical Support: Optimize data platform architecture. Provide technical support to internal users. Stay updated on trends in full-stack development and cloud platforms.

Qualifications
Required:
Bachelor's degree in computer science, information technology, or a related field (or equivalent experience).
3-5+ years of proven experience as a Tester, Developer or Data Analyst within a pharmaceutical or similar regulatory environment.
3-5+ years' experience in BI development and ETL development using Qlik and Power BI (including DAX and Power Automate (MS Flow) or Power BI alerts) or equivalent technologies.
Experience with Qlik Sense, QlikView and Tableau, and with creating data models.
Familiarity with Business Intelligence and Data Warehousing concepts (star schema, snowflake schema, data marts).
Knowledge of SQL, ETL frameworks and data integration techniques.
Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing and Medical.
Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools. Exposure to at least 1-2 full, large, complex project life cycles. Experience with test management software (e.g., qTest, Zephyr, ALM).
Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Manual testing (test case design, execution, defect reporting). Awareness of automated testing tools (e.g., Selenium, JUnit). Experience with data warehouses and understanding of BI/DWH systems.
Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
Preferred:
Experience working in the pharma industry; understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous.
Certifications in BI tools or testing methodologies.
Knowledge of cloud-based BI solutions (e.g., Azure, AWS).
Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.
Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges.
Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.

Working Environment
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.

Category: FoundationX
Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans
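For illustration only (not part of the posting): the data-validation work described above often reduces to reconciling row counts and key aggregates between a source system and its warehouse target. The sketch below assumes hypothetical table and column names and uses SQLite connections as stand-ins for the actual source and target engines.

```python
# Illustrative source-to-target reconciliation check; all names are hypothetical.
import sqlite3

def fetch_one(conn, sql):
    return conn.execute(sql).fetchone()[0]

def reconcile(source, target, table):
    """Compare row counts, null keys, and a sum aggregate across layers."""
    checks = {
        "row_count": f"SELECT COUNT(*) FROM {table}",
        "null_keys": f"SELECT COUNT(*) FROM {table} WHERE id IS NULL",
        "amount_sum": f"SELECT ROUND(SUM(amount), 2) FROM {table}",
    }
    failures = []
    for name, sql in checks.items():
        src, tgt = fetch_one(source, sql), fetch_one(target, sql)
        if src != tgt:
            failures.append(f"{name}: source={src} target={tgt}")
    return failures

if __name__ == "__main__":
    with sqlite3.connect("source.db") as src, sqlite3.connect("warehouse.db") as tgt:
        problems = reconcile(src, tgt, "sales")
        print("PASS" if not problems else "\n".join(problems))
```

In practice the same checks run against the real source and target engines, and the check list grows to cover referential integrity and domain rules.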

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Job Description Summary
Responsible for designing, building, delivering and maintaining software applications and services, and for the software lifecycle, including activities such as requirement analysis, documentation/procedures and implementation.

Roles and Responsibilities
In this role, you will:
Collaborate with system engineers, frontend developers and software developers to implement solutions that are aligned with and extend shared platforms and solutions.
Apply principles of SDLC and methodologies like Lean/Agile/XP, CI, software and product security, scalability, documentation practices, refactoring and testing techniques.
Write code that meets standards and delivers desired functionality using the technology selected for the project.
Build features such as web services and queries on existing tables.
Understand performance parameters and assess application performance.
Work on core data structures and algorithms and implement them using the language of choice.

Education Qualification
Bachelor's degree in Computer Science or a "STEM" major (Science, Technology, Engineering and Math) with basic experience.

Desired Characteristics
Technical Expertise:
Experience: 3+ years.
Frontend: Angular & React, .NET (mandatory).
Backend: Talend ETL tool (mandatory).
Search engine: Solr.
DB: Microsoft SQL Server, Postgres.
Build & deployment tools: Jenkins, CruiseControl, Octopus.
Aware of methods and practices such as Lean/Agile/XP; prior work experience in an agile environment, or introductory training on Lean/Agile. Aware of and able to apply continuous integration (CI). General understanding of the impact of technology choice on the software development lifecycle.

Business Acumen:
Has the ability to break down problems and estimate time for development tasks. Understands the technology landscape, stays up to date on current technology trends and new technology, and brings new ideas to the team. Displays understanding of the project's value proposition for the customer and shows commitment to deliver the best value proposition for the targeted customer. Learns the organization's vision statement and decision-making framework, and understands how team and personal goals/objectives contribute to the organization vision.

Personal/Leadership Attributes:
Voices opinions and presents clear rationale; uses data or factual evidence to influence. Completes assigned tasks on time and with high quality, and takes independent responsibility for assigned deliverables. Seeks to understand problems thoroughly before implementing solutions, and asks questions to clarify requirements when ambiguities are present. Identifies opportunities for innovation and offers new ideas; takes the initiative to experiment with new software frameworks. Adapts to new environments and changing requirements, pivots quickly as needed, and, when coached, responds to need and seeks information from other sources.

Inclusion and Diversity
GE HealthCare is an Equal Opportunity Employer where inclusion matters. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. We expect all employees to live and breathe our behaviors: to act with humility and build trust; lead with transparency; deliver with focus; and drive ownership, always with unyielding integrity. Our total rewards are designed to unlock your ambition by giving you the boost and flexibility you need to turn your ideas into world-changing realities. Our salary and benefits are everything you'd expect from an organization with global strength and scale, and you'll be surrounded by career opportunities in a culture that fosters care, collaboration and support.

Additional Information
Relocation Assistance Provided: No

Posted 3 weeks ago

Apply

9.0 - 14.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

This role involves the development and application of engineering practice and knowledge in designing, managing and improving processes for industrial operations, including procurement, supply chain, and facilities engineering and maintenance. Project and change management of industrial transformations are also included in this role.

Grade-Specific Focus: Industrial Operations Engineering. Fully competent in own area. Acts as a key contributor in a more complex/critical environment. Proactively acts to understand and anticipate client needs. Manages costs and profitability for a work area. Manages own agenda to meet agreed targets. Develops plans for projects in own area. Looks beyond the immediate problem to the wider implications. Acts as a facilitator and coach, and moves teams forward.

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

As a Data Architect, you will design and implement scalable, cloud-native data solutions that handle petabyte-scale datasets. You will lead architecture discussions, build robust data pipelines, and work closely with cross-functional teams to deliver enterprise-grade data platforms. Your work will directly support analytics, AI/ML, and real-time data processing needs across global clients.

Key Responsibilities
Translate complex data and analytics requirements into scalable technical architectures. Design and implement cloud-native architectures for real-time and batch data processing. Build and maintain large-scale data pipelines and frameworks using modern orchestration tools (e.g., Airflow, Oozie). Define strategies for data modeling, integration, metadata management, and governance. Optimize data systems for cost-efficiency, performance, and scalability. Leverage cloud services (AWS, Azure, GCP) including Azure Synapse, AWS Redshift, BigQuery, etc. Implement data governance frameworks covering quality, lineage, cataloging, and access control. Work with modern big data technologies (e.g., Spark, Kafka, Databricks, Snowflake, Hadoop). Collaborate with data engineers, analysts, DevOps, and business stakeholders. Evaluate and adopt emerging technologies to improve data architecture. Provide architectural guidance in cloud migration and modernization projects. Lead and mentor engineering teams and provide technical thought leadership.

Skills and Experience
Bachelor's or Master's in Computer Science, Engineering, or a related field. 10+ years of experience in data architecture, engineering, or platform roles. 5+ years of experience with cloud data platforms (Azure, AWS, or GCP). Proven experience building scalable enterprise data platforms (data lakes/warehouses). Strong expertise in distributed computing, data modeling, and pipeline optimization. Proficiency in SQL and NoSQL databases (e.g., Snowflake, SQL Server, Cosmos DB, DynamoDB). Experience with data integration tools like Azure Data Factory, Talend, or Informatica. Hands-on experience with real-time streaming technologies (Kafka, Kinesis, Event Hub). Expertise in scripting/programming languages such as Python, Spark, Java, or Scala. Deep understanding of data governance, security, and regulatory compliance (GDPR, HIPAA, CCPA). Strong communication, presentation, and stakeholder management skills. Ability to lead multiple projects simultaneously in an agile environment. (ref:hirist.tech)
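For illustration only: the posting above calls for orchestration tools such as Airflow. A minimal Airflow 2.x DAG wiring an extract-transform-load sequence might look like the sketch below; the DAG id, schedule, and task callables are hypothetical (in Airflow versions before 2.4 the `schedule` argument is `schedule_interval`).

```python
# Minimal Airflow DAG sketch for a daily batch pipeline; names are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    ...  # pull the day's partition from the source system

def transform(**_):
    ...  # apply conformance rules and deduplicate

def load(**_):
    ...  # merge into the warehouse target

with DAG(
    dag_id="daily_sales_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load
```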

Posted 4 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Responsibilities
Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends and prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver (performance parameters and measures):
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT.
2. Team Management: Productivity, efficiency, absenteeism.
3. Capability Development: Triages completed, Technical Test performance.

Mandatory Skills: Snowflake.
Experience: 5-8 years.

Posted 4 weeks ago

Apply

0.0 - 5.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Title: Data Engineer - DBT (Data Build Tool)
Experience: 0-5 Years
Location: Bengaluru

Job Responsibilities
Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS, covering requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, and the design of data integration and publication pipelines. Develop Snowflake deployment and usage best practices. Help educate the rest of the team on the capabilities and limitations of Snowflake. Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines. Design, build, test, and maintain data management systems. Work in sync with internal and external team members such as data architects, data scientists and data analysts to handle all sorts of technical issues. Act as a technical leader within the team. Work in an Agile/Lean model and deliver quality deliverables on time. Translate complex functional requirements into technical solutions.

Expertise and Qualifications
Essential skills, education and experience:
Should have a B.E./B.Tech./MCA or equivalent degree along with 4-7 years of experience in Data Engineering.
Strong experience in DBT concepts such as model building and configurations, incremental load strategies, macros and DBT tests (an incremental-load sketch follows this listing).
Strong experience in SQL.
Strong experience in AWS.
Creation and maintenance of optimum data pipeline architecture for ingestion and processing of data.
Creation of necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3 and Snowflake.
Experience with data storage technologies such as Amazon S3, SQL and NoSQL stores.
Data modeling technical awareness.
Experience in working with stakeholders in different time zones.
Good to have: AWS data services development experience; working knowledge of big data technologies; experience collaborating with data quality and data governance teams; exposure to reporting tools like Tableau; Apache Airflow and Apache Kafka; payments domain knowledge; in-depth understanding of CRM, accounting, etc.; regulatory reporting exposure.

Other skills
Good communication skills; team player; problem solver; willing to learn new technologies, share ideas and assist other team members as needed. Strong analytical and problem-solving skills; ability to define problems, collect data, establish facts, and draw conclusions.
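For illustration only: the incremental load strategies this role references are commonly expressed as a MERGE against the warehouse. The sketch below submits one via the Snowflake Python connector; the account, credentials, and table and column names are hypothetical.

```python
# Illustrative incremental (upsert) load into Snowflake; all names hypothetical.
import snowflake.connector

MERGE_SQL = """
MERGE INTO analytics.orders AS tgt
USING staging.orders_delta AS src
  ON tgt.order_id = src.order_id
WHEN MATCHED AND src.updated_at > tgt.updated_at THEN UPDATE SET
  tgt.status = src.status,
  tgt.amount = src.amount,
  tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, status, amount, updated_at)
  VALUES (src.order_id, src.status, src.amount, src.updated_at)
"""

conn = snowflake.connector.connect(
    account="xy12345",      # hypothetical account
    user="etl_user",        # hypothetical credentials
    password="...",
    warehouse="ETL_WH",
)
try:
    conn.cursor().execute(MERGE_SQL)
finally:
    conn.close()
```

In a DBT project the same pattern is usually declared as an incremental model and the MERGE is generated for you; the raw SQL above just makes the mechanics visible.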

Posted 4 weeks ago

Apply

10.0 - 15.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Title: SQL, AWS Redshift, PostgreSQL
Experience: 10-15 Years
Location: Bangalore
Skills: SQL, AWS Redshift, PostgreSQL

Posted 4 weeks ago

Apply

4.0 - 6.0 years

7 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

About Your Role
This role serves as a member of the Fidelity India team under the PSO umbrella, supporting the Fidelity Clearing Canada (FCC) technology team in a technical support and developer capacity. You will work to embed innovation across the business, maintaining consistency and standards to maximise business benefits. You will ensure the seamless operation of automated workflows, scripts, and orchestration processes for FCC applications currently in production. This includes proactive monitoring, rapid incident resolution, scripting and automation development, collaboration with cross-functional teams, and continuous improvement efforts to enhance system performance, reliability, and security. The goal is to maintain optimal functionality, minimize downtime, and contribute to the overall efficiency of automated systems, aligning with organizational objectives and standards. You will also be responsible for the development, enhancement, and maintenance of application solutions for internal and external clients. You will work to progress Fidelity's PSO and FCC Technology support team agenda through:

Application Development Support
Ensure that all requests raised by clients and users are handled timely and appropriately by possessing technical knowledge of operating systems, applications, and the software development lifecycle. Provide technical support to teams within the organization, and to external clients when required. Update technical documents and procedures to reflect the current state. Provide support for application deployments. Assist with systems integration when needed. Collaborate with the on-site application development support team.

Appian Application Support
Troubleshoot, fix and enhance defects raised in Appian-based applications. Understand the differences between REST and SOAP and the basic design principles of integrating with web services. Debug issues in interfaces, process models and integrations and provide short-term and long-term solutions. Identify chokepoints and provide design recommendations to enhance the performance of the application. Provide technical guidance to junior developers as and when required.

Defect Remediation
Remediate defects based on business and client priorities to address service disruptions, incidents, and problems.

Client Experience
Deliver quality customer service interactions to our internal and external customers to create a positive experience. Take ownership of solving a customer's problem promptly; use all available resources to achieve the best outcome.

About You
Skills and Knowledge
Strong technical insight and experience to inform, guide, challenge and support technical decisions. Strong analytical, conceptual, and innovative problem-solving abilities. Strong attention to detail. Ability to work independently within a team environment. Excellent communication skills, both written and oral; ability to effectively communicate technical material to non-technical users. Goal-oriented and a self-starter. Ability to quickly learn, adapt and change to meet the needs of a changing environment. Ability to explain complex ideas to those with limited IT and systems knowledge. Excellent problem-solving skills. Customer service oriented. Ability to work in a fast-paced environment without direct supervision. Development or support experience in the Canadian financial industry is an asset. Track record of actively seeking opportunities for process improvements, efficiency gains, and system optimizations in the context of automation and orchestration.

Experience and Qualifications
Job-related experience, minimum requirement: 4+ years.
Must have: 3+ years of experience as a developer or programmer/support engineer, including 2+ years of experience in the brokerage securities/asset management industry. 2+ years of Appian BPM (or similar) hands-on development experience. 1+ years of experience and intermediate-level knowledge of Java/J2EE development, including Spring, Hibernate, MyBatis, JPA, RESTful APIs and Spring Boot. Strong hands-on knowledge of SQL and database platforms such as MySQL, SQL Server and Oracle, including database design. Handy knowledge of the Unix/Linux operating system and shell scripts. Exposure to automated testing, DevOps and change management concepts. Experience with Agile development methodologies.
Nice to have: Experience with PowerBI, Talend, ETL, data warehouses and Control-M; uniFide, Salesforce or the Dataphile platform would be an asset. Working knowledge of HTML and adaptive/responsive design. DocuSign and document management platforms. Atlassian stack (JIRA, Confluence). Hands-on expertise in creating high-performance web applications leveraging React and Angular 2; some knowledge of concepts such as TypeScript, the Bootstrap grid system, dependency injection and SPAs (Single Page Applications). Experience with cloud-based implementations. Experience in setting up Secure File Transfer Protocol (SFTP) and file delivery; AWS would be an asset.
Education: First degree level (Bachelor's degree) or equivalent in Computer Science. Knowledge of the financial service industry.

Dynamic Working
This role is categorised as Hybrid (Office/Remote).

Posted 4 weeks ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Job Title: Data Architect – Data Integration & Engineering
Location: Hybrid
Experience: 8+ years

Job Summary:
We are seeking an experienced Data Architect specializing in data integration, data engineering, and hands-on coding to design, implement, and manage scalable and high-performance data solutions. The ideal candidate should have expertise in ETL/ELT, cloud data platforms, big data technologies, and enterprise data architecture.

Key Responsibilities:
1. Data Architecture & Design: Develop enterprise-level data architecture solutions, ensuring scalability, performance, and reliability. Design data models (conceptual, logical, physical) for structured and unstructured data. Define and implement data integration frameworks using industry-standard tools. Ensure compliance with data governance, security, and regulatory policies (GDPR, HIPAA, etc.).
2. Data Integration & Engineering: Implement ETL/ELT pipelines using Informatica, Talend, Apache NiFi, or DBT. Work with batch and real-time data processing tools such as Apache Kafka, Kinesis, and Apache Flink (a streaming sketch follows this listing). Integrate and optimize data lakes, data warehouses, and NoSQL databases.
3. Hands-on Coding & Development: Write efficient and scalable code in Python, Java, or Scala for data transformation and processing. Optimize SQL queries, stored procedures, and indexing strategies for performance tuning. Build and maintain Spark-based data processing solutions in Databricks and Cloudera ecosystems. Develop workflow automation using Apache Airflow, Prefect, or similar tools.
4. Cloud & Big Data Technologies: Work with cloud platforms such as AWS (Redshift, Glue), Azure (Data Factory, Synapse), and GCP (BigQuery, Dataflow). Manage big data processing using Cloudera, Hadoop, HBase, and Apache Spark. Deploy containerized data services using Kubernetes and Docker. Automate infrastructure using Terraform and CloudFormation.
5. Governance, Security & Compliance: Implement data security, masking, and encryption strategies. Define RBAC (Role-Based Access Control) and IAM policies for data access. Work on metadata management, data lineage, and cataloging.

Required Skills & Technologies:
Data engineering & integration: ETL/ELT tools: Informatica, Talend, Apache NiFi, DBT. Big data ecosystem: Cloudera, HBase, Apache Hadoop, Spark. Data streaming: Apache Kafka, AWS Kinesis, Apache Flink. Data warehouses: Snowflake, AWS Redshift, Google BigQuery, Azure Synapse. Databases: PostgreSQL, MySQL, MongoDB, Cassandra.
Programming & scripting: Languages: Python, Java, Scala. Scripting: Shell, PowerShell, Bash. Frameworks: PySpark, SparkSQL.
Cloud & DevOps: Cloud platforms: AWS, Azure, GCP. Containerization & orchestration: Kubernetes, Docker. CI/CD pipelines: Jenkins, GitHub Actions, Terraform, CloudFormation.
Security & governance: Compliance standards: GDPR, HIPAA, SOC 2. Data cataloging: Collibra, Alation. Access controls: IAM, RBAC, ABAC.

Preferred Certifications:
AWS Certified Data Analytics – Specialty; Microsoft Certified: Azure Data Engineer Associate; Google Professional Data Engineer; Databricks Certified Data Engineer Associate/Professional; Cloudera Certified Data Engineer; Informatica Certified Professional.

Education & Experience:
Bachelor's/Master's degree in Computer Science/MCA, Data Engineering, or a related field. 8+ years of experience in data architecture, integration, and engineering. Proven expertise in designing and implementing enterprise-scale data solutions.
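For illustration only: the real-time side of this role (Kafka plus Spark) often takes the shape below, reading a topic with Spark Structured Streaming and landing parsed events in the lake. The broker address, topic, schema, and paths are hypothetical, and the job assumes the Spark Kafka connector package is available.

```python
# Sketch of a streaming ingestion job: Kafka -> parsed events -> object storage.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
    .option("subscribe", "orders")                      # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://lake/orders/")               # hypothetical paths
    .option("checkpointLocation", "s3a://lake/_chk/orders/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```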

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

As part of the Astellas commitment to delivering value for our patients, our organization is currently undergoing transformation to achieve this critical goal. This is an opportunity to work on digital transformation and make a real impact within a company dedicated to improving lives. DigitalX, our new information technology function, is spearheading this value-driven transformation across Astellas. We are looking for people who excel in embracing change, manage technical challenges and have exceptional communication skills. We are seeking committed and talented MDM Engineers to join our new FoundationX team, which lies at the heart of DigitalX. As a member of FoundationX, you will play a critical role in ensuring our MDM systems are operational and scalable and continue to contain the right data to drive business value, and a pivotal role in building, maintaining and enhancing those systems. This position is based in India and may require on-site work from time to time. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.

Purpose and Scope
As a Data and Analytics Tester, you will play a critical role in validating the accuracy, functionality, and performance of our BI, Data Warehousing and ETL systems. You'll work closely with FoundationX Data Engineers, analysts, and developers to ensure that our QLIK, Power BI, and Tableau reports meet high standards. Additionally, your expertise in ETL tools (such as Talend and Databricks) will be essential for testing data pipelines.

Essential Job Responsibilities
Development Ownership: Support testing for Data Warehouse and MI projects. Collaborate with senior team members. Administer multi-server environments.
Test Strategy and Planning: Understand project requirements and data pipelines. Create comprehensive test strategies and plans. Participate in data validation and user acceptance testing (UAT).
Data Validation and Quality Assurance: Execute manual and automated tests on data pipelines, ETL processes, and models. Verify data accuracy, completeness, and consistency. Ensure compliance with industry standards.
Regression Testing: Validate changes to data pipelines and analytics tools. Monitor performance metrics.
Test Case Design and Execution: Create detailed test cases based on requirements. Collaborate with development teams to resolve issues. Maintain documentation.
Data Security and Privacy: Validate access controls and encryption mechanisms. Ensure compliance with privacy regulations.
Collaboration and Communication: Work with cross-functional teams. Communicate test progress and results.
Continuous Improvement and Technical Support: Optimize data platform architecture. Provide technical support to internal users. Stay updated on trends in full-stack development and cloud platforms.

Qualifications
Required:
Bachelor's degree in computer science, information technology, or a related field (or equivalent experience).
3-5+ years of proven experience as a Tester, Developer or Data Analyst within a pharmaceutical or similar regulatory environment.
3-5+ years' experience in BI development and ETL development using Qlik and Power BI (including DAX and Power Automate (MS Flow) or Power BI alerts) or equivalent technologies.
Experience with Qlik Sense, QlikView and Tableau, and with creating data models.
Familiarity with Business Intelligence and Data Warehousing concepts (star schema, snowflake schema, data marts).
Knowledge of SQL, ETL frameworks and data integration techniques.
Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing and Medical.
Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools. Exposure to at least 1-2 full, large, complex project life cycles. Experience with test management software (e.g., qTest, Zephyr, ALM).
Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Manual testing (test case design, execution, defect reporting). Awareness of automated testing tools (e.g., Selenium, JUnit). Experience with data warehouses and understanding of BI/DWH systems.
Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
Preferred:
Experience working in the pharma industry; understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous.
Certifications in BI tools or testing methodologies.
Knowledge of cloud-based BI solutions (e.g., Azure, AWS).
Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.
Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges.
Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.

Category: FoundationX
Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans

Posted 1 month ago

Apply

8.0 - 12.0 years

15 - 27 Lacs

Mumbai, Pune, Bengaluru

Work from Office

Source: Naukri

Role & Responsibilities
Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions like AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.

Requirements:
• Experience in projects involving processing of data pipelines using Databricks Spark SQL on Hadoop/cloud distributions such as AWS EMR, Databricks, Cloudera, etc.
• Very proficient in large-scale data operations using Databricks and overall very comfortable using Python.
• Familiarity with AWS compute, storage and IAM concepts.
• Experience in working with an S3 data lake as the storage tier.
• Any ETL background (Talend, AWS Glue, etc.) is a plus but not required.
• Cloud warehouse experience (Snowflake, etc.) is a huge plus.
• Carefully evaluates alternative risks and solutions before taking action, and optimizes the use of all available resources.
• Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices and procedures of the corporation, department and business unit.

Skills:
• Hands-on experience with Databricks, Spark SQL and the AWS cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
• Experience in shell scripting.
• Exceptionally strong analytical and problem-solving skills.
• Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
• Strong experience with relational databases and data access methods, especially SQL.
• Excellent collaboration and cross-functional leadership skills.
• Excellent communication skills, both written and verbal.
• Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
• Ability to leverage data assets to respond to complex questions that require timely answers.
• Working knowledge of migrating relational and dimensional databases to the AWS cloud platform.

Mandatory Skills: Apache Spark, Databricks, Java, Python, Scala, Spark SQL.
Note: Need only immediate joiners/candidates serving notice period. Interested candidates can apply.
Regards, HR Manager
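For illustration only: a batch job of the kind this listing describes (Databricks/PySpark over S3) commonly reads raw objects, deduplicates and filters, then writes a partitioned curated layer. Bucket names and columns below are hypothetical.

```python
# Sketch of a PySpark batch ETL step on S3; all names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3-batch-etl").getOrCreate()

raw = spark.read.json("s3a://raw-bucket/events/2024/")

curated = (
    raw.dropDuplicates(["event_id"])                 # remove replayed events
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("amount") > 0)                  # basic domain rule
)

(curated.write.mode("overwrite")
    .partitionBy("event_date")                       # partition for downstream reads
    .parquet("s3a://curated-bucket/events/"))
```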

Posted 1 month ago

Apply

4.0 - 5.0 years

6 - 10 Lacs

Kochi, Bengaluru

Work from Office

Source: Naukri

4+ years of experience. Work from office; first preference Kochi, second preference Bangalore. Good experience in any ETL tool. Good knowledge of Python. Integration experience. Good attitude and cross-skilling ability.

Posted 1 month ago

Apply

12.0 - 22.0 years

25 - 40 Lacs

Bangalore Rural, Bengaluru

Work from Office

Source: Naukri

Role & Responsibilities
Requirements:
Data Modeling (Conceptual, Logical, Physical): minimum 5 years
Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL): minimum 5 years
Cloud Platforms (AWS, Azure, GCP): minimum 3 years
ETL Tools (Informatica, Talend, Apache NiFi): minimum 3 years
Big Data Technologies (Hadoop, Spark, Kafka): minimum 5 years
Data Governance & Compliance (GDPR, HIPAA): minimum 3 years
Master Data Management (MDM): minimum 3 years
Data Warehousing (Snowflake, Redshift, BigQuery): minimum 3 years
API Integration & Data Pipelines: good to have
Performance Tuning & Optimization: minimum 3 years
Business Intelligence (Power BI, Tableau): minimum 3 years

Job Description:
We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/DataBricks/. Experience and deep knowledge of at least one of these 3 platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights.

Key Responsibilities:
1. Data Governance & Management: Establish and maintain a data usage hierarchy to ensure structured data access. Define data policies, standards, and governance frameworks to ensure consistency and compliance. Implement data quality management practices to improve accuracy, completeness, and reliability. Oversee metadata and Master Data Management (MDM) to enable seamless data integration across platforms.
2. Data Architecture & Migration: Lead the migration of data systems from legacy infrastructure to Microsoft Fabric. Design scalable, high-performance data architectures that support business intelligence and analytics. Collaborate with IT and engineering teams to ensure efficient data pipeline development.
3. Advanced Analytics & Machine Learning: Identify and define use cases for advanced analytics that align with business objectives. Design and develop machine learning models to drive data-driven decision-making. Work with data scientists to operationalize ML models and ensure real-world applicability.

Required Qualifications:
Proven experience as a Data Architect or in a similar role in data management and analytics. Strong knowledge of data governance frameworks, data quality management, and metadata management. Hands-on experience with Microsoft Fabric and data migration from legacy systems. Expertise in advanced analytics, machine learning models, and AI-driven insights. Familiarity with data modelling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP). Strong communication skills with the ability to translate complex data concepts into business insights.

Preferred candidate profile: Immediate joiner.

Posted 1 month ago

Apply

6 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Source: LinkedIn

Position: ETL Support Engineer
Experience: 6+ Years
Location: Bengaluru
Work Mode: Remote
Employment Type: Contract
Notice Period: Immediate to 15 days
Must-Have Skills: Banking Domain, Git, Snowflake, ETL, JIRA/ServiceNow, SQL

Job Responsibilities:
· Understanding of the ETL process.
· Perform functional, integration and regression testing for ETL processes.
· Validate and ensure data quality and consistency across different data sources and targets (see the test sketch after this listing).
· Develop and execute test cases for ETL workflows and data pipelines.
· Load testing: ensure that the data warehouse can handle the volume of data being loaded and queried under normal and peak conditions.
· Scalability: test the scalability of the data warehouse in terms of data growth and system performance.

Required Qualifications:
· Bachelor's degree in Computer Science, Engineering, Mathematics or a related discipline, or its foreign equivalent.
· 6+ years of experience with ETL support and development.
· ETL tools: experience with popular ETL tools like Talend and Microsoft SSIS.
· Experience with relational databases (e.g., SQL Server, Postgres).
· Experience with the Snowflake data warehouse.
· Proficiency in writing complex SQL queries for data validation, comparison, and manipulation.
· Familiarity with version control systems like Git/GitHub to manage changes in test cases and scripts.
· Knowledge of defect tracking tools like JIRA and ServiceNow.
· Banking domain experience is a must.
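For illustration only: the SQL-based validation this role centres on is often automated as a small test suite. The sketch below uses pytest with SQLite as a stand-in for the real warehouse; the table and column names are hypothetical.

```python
# Illustrative automated ETL validation tests; names are hypothetical.
import sqlite3
import pytest

@pytest.fixture
def warehouse():
    conn = sqlite3.connect("warehouse.db")  # stand-in for the real warehouse
    yield conn
    conn.close()

def test_no_duplicate_business_keys(warehouse):
    dupes = warehouse.execute(
        "SELECT account_id, COUNT(*) FROM dim_account "
        "GROUP BY account_id HAVING COUNT(*) > 1"
    ).fetchall()
    assert dupes == [], f"duplicate keys: {dupes}"

def test_amounts_within_expected_range(warehouse):
    bad = warehouse.execute(
        "SELECT COUNT(*) FROM fact_txn WHERE amount < 0 OR amount > 1e9"
    ).fetchone()[0]
    assert bad == 0
```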

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Job Description
Extensive experience in designing and maintaining data architectures and data processing pipelines at large scale. Familiarity with databases and ETL processes, along with data warehousing concepts. Proven experience in designing and developing data pipelines using technologies like Apache Spark or Apache Kafka, or ETL tools like Talend/Informatica or similar frameworks. Strong understanding of relational and NoSQL databases, data modelling and ETL processes.

Assist in designing, developing and maintaining data pipelines. Collaborate with senior team members to understand data requirements and contribute to the implementation of data solutions. Participate in troubleshooting and resolving data-related issues. Monitor dashboards and respond to alerts. Lead the design and implementation of complex and scalable data pipelines. Optimize and tune existing data platforms for enhanced performance and scalability.

Proficiency in programming languages like Python, PySpark and SQL. Experience with cloud platforms, preferably AWS; trained in AWS technologies, with knowledge of ETL tools like Talend and Informatica. Ability to work in a fast-paced environment. Strong business acumen, including written and verbal communication skills. Strong interpersonal and organizational skills. Excellent verbal and non-verbal communication skills, including the ability to communicate with people at all levels. Requires a Bachelor's degree in a technical or equivalent discipline.

This is a summary of the primary accountabilities and requirements for this position. The company reserves the right to modify or amend accountabilities and requirements at any time at its sole discretion based on business needs. Any part of this job description is subject to possible modification to reasonably accommodate individuals with disabilities.

About Us
Mouser Electronics, founded in 1964, is a globally authorized distributor of semiconductors and electronic components for over 1,200 industry-leading manufacturer brands. This year marks the company's 60th anniversary. We specialize in the rapid introduction of the newest products and technologies targeting the design engineer and buyer communities. Mouser has 28 offices located around the globe. We conduct business in 23 different languages and 34 currencies. Our global distribution centre is equipped with state-of-the-art wireless warehouse management systems that enable us to process orders 24/7, and deliver nearly perfect pick-and-ship operations.
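For illustration only: a Kafka-fed pipeline stage like those mentioned above can start as simply as the consumer sketch below (using the kafka-python package); the topic, broker, and field names are hypothetical, and the load step is stubbed.

```python
# Minimal consumer sketch for a Kafka-fed pipeline stage; names hypothetical.
import json
from kafka import KafkaConsumer  # kafka-python package

consumer = KafkaConsumer(
    "part-price-updates",                 # hypothetical topic
    bootstrap_servers="broker:9092",      # hypothetical broker
    group_id="pricing-loader",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for msg in consumer:
    record = msg.value
    if record.get("part_id"):             # basic validation before hand-off
        print(record["part_id"], record.get("price"))  # stand-in for the load step
```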

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Key Responsibilities
Work closely with clients to understand their business requirements and design data solutions that meet their needs. Develop and implement end-to-end data solutions that include data ingestion, data storage, data processing, and data visualization components. Design and implement data architectures that are scalable, secure, and compliant with industry standards. Work with data engineers, data analysts, and other stakeholders to ensure the successful delivery of data solutions. Participate in presales activities, including solution design, proposal creation, and client presentations. Act as a technical liaison between the client and our internal teams, providing technical guidance and expertise throughout the project lifecycle. Stay up to date with industry trends and emerging technologies related to data architecture and engineering. Develop and maintain relationships with clients to ensure their ongoing satisfaction and identify opportunities for additional business. Understand the entire end-to-end AI lifecycle, from ingestion to inferencing, along with operations. Exposure to emerging Gen AI technologies. Exposure to the Kubernetes platform, with hands-on experience deploying and containerizing applications. Good knowledge of data governance, data warehousing, and data modelling.

Requirements
Bachelor's or Master's degree in Computer Science, Data Science, or a related field. 10+ years of experience as a Data Solution Architect, with a proven track record of designing and implementing end-to-end data solutions. Strong technical background in data architecture, data engineering, and data management. Extensive experience working with any of the Hadoop flavours, preferably Data Fabric. Experience with presales activities such as solution design, proposal creation, and client presentations. Familiarity with cloud-based data platforms (e.g., AWS, Azure, Google Cloud) and related technologies such as data warehousing, data lakes, and data streaming. Experience with Kubernetes and the Gen AI tools and tech stack. Excellent communication and interpersonal skills, with the ability to effectively communicate technical concepts to both technical and non-technical audiences. Strong problem-solving skills, with the ability to analyze complex data systems and identify areas for improvement. Strong project management skills, with the ability to manage multiple projects simultaneously and prioritize tasks effectively.

Tools and Tech Stack
Hadoop Ecosystem (Data Architecture and Engineering): Preferred: Cloudera Data Platform (CDP) or Data Fabric. Tools: HDFS, Hive, Spark, HBase, Oozie.
Data Warehousing: Cloud-based: Azure Synapse, Amazon Redshift, Google BigQuery, Snowflake, Azure Databricks. On-premises: Teradata, Vertica.
Data Integration and ETL Tools: Apache NiFi, Talend, Informatica, Azure Data Factory, Glue.
Cloud Platforms: Azure (preferred for its Data Services and Synapse integration), AWS, or GCP.
Cloud-native Components: Data Lakes: Azure Data Lake Storage, AWS S3, or Google Cloud Storage. Data Streaming: Apache Kafka, Azure Event Hubs, AWS Kinesis (illustrated in the sketch after this listing).
HPE Platforms: Data Fabric, AI Essentials or Unified Analytics, HPE MLDM and HPE MLDE.
AI and Gen AI Technologies: AI Lifecycle Management (MLOps): MLflow, Kubeflow, Azure ML, SageMaker, Ray. Inference tools: TensorFlow Serving, KServe, Seldon. Generative AI Frameworks: Hugging Face Transformers, LangChain. Tools: OpenAI API (e.g., GPT-4).
Kubernetes Orchestration and Deployment: Platforms: Azure Kubernetes Service (AKS), Amazon EKS, Google Kubernetes Engine (GKE), or open-source Kubernetes. Tools: Helm.
CI/CD for Data Pipelines and Applications: Jenkins, GitHub Actions, GitLab CI, or Azure DevOps.
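As a concrete illustration of the data streaming layer named in this stack, below is a minimal PySpark Structured Streaming sketch that reads from Apache Kafka. The broker address and topic are placeholders, and the spark-sql-kafka connector package is assumed to be on the classpath.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka_stream").getOrCreate()

# Subscribe to a Kafka topic (broker and topic names are placeholders)
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load())

# Kafka delivers key/value as binary; cast the payload to string for processing
decoded = events.selectExpr("CAST(value AS STRING) AS payload")

# Console sink for inspection; a production job would write to a lake or warehouse
query = decoded.writeStream.format("console").start()
query.awaitTermination()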

Posted 1 month ago

Apply

2 - 5 years

2 - 5 Lacs

Bengaluru

Work from Office


Databricks Engineer Full-time Department: Digital, Data and Cloud

Company Description
Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023 and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023. As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these we've seen significant growth across our practices and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.

About The Role
This is an exciting opportunity for an experienced developer of large-scale data solutions. You will join a team delivering a transformative cloud-hosted data platform for a key Version 1 customer. The ideal candidate will have a proven track record as a senior, self-starting data engineer in implementing data ingestion and transformation pipelines for large-scale organisations. We are seeking someone with deep technical skills in a variety of technologies, specifically Spark performance tuning/optimisation and Databricks, to play an important role in developing and delivering early proofs of concept and production implementations. You will ideally have experience in building solutions using a variety of open source tools and Microsoft Azure services, and a proven track record in delivering high-quality work to tight deadlines.

Your main responsibilities will be: Designing and implementing highly performant, metadata-driven data ingestion and transformation pipelines from multiple sources using Databricks (sketched after this listing). Streaming and batch processes in Databricks. Spark performance tuning/optimisation. Providing technical guidance for complex geospatial problems and Spark dataframes. Developing scalable and re-usable frameworks for ingestion and transformation of large data sets. Data quality system and process design and implementation. Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times. Working with other members of the project team to support delivery of additional project components (reporting tools, API interfaces, search). Evaluating the performance and applicability of multiple tools against customer requirements. Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.

Qualifications
Direct experience of building data pipelines using Azure Data Factory and Databricks. Experience required is 6 to 8 years. Building data integration with Python. Databricks Engineer certification. Microsoft Azure Data Engineer certification. Hands-on experience designing and delivering solutions using the Azure Data Analytics platform. Experience building data warehouse solutions using ETL / ELT tools like Informatica and Talend. Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching.

Nice to have: Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet or Terraform. Experience working with structured and unstructured data, including imaging and geospatial data. Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j). Experience with Azure Event Hub, IoT Hub, Apache Kafka, or NiFi for use with streaming / event-based data.

Additional Information
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up-to-date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
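The metadata-driven pattern this role centres on is straightforward to sketch: a small control table (or list) describes each source, and one generic job loops over it. The PySpark sketch below is a simplified illustration; the storage paths and table names are hypothetical.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata_driven_ingest").getOrCreate()

# Control metadata: in practice this would live in a config table, not in code
pipelines = [
    {"source": "abfss://raw@examplelake.dfs.core.windows.net/sales/", "format": "csv", "target": "bronze.sales"},
    {"source": "abfss://raw@examplelake.dfs.core.windows.net/stores/", "format": "json", "target": "bronze.stores"},
]

# One generic loader handles every source described by the metadata
for p in pipelines:
    df = (spark.read.format(p["format"])
          .option("header", True)  # only meaningful for CSV; ignored by other readers
          .load(p["source"]))
    df.write.mode("append").saveAsTable(p["target"])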

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


What We Offer
At Magna, you can expect an engaging and dynamic environment where you can help to develop industry-leading automotive technologies. We invest in our employees, providing them with the support and resources they need to succeed. As a member of our global team, you can expect exciting, varied responsibilities as well as a wide range of development prospects. Because we believe that your career path should be as unique as you are.

Group Summary
Magna is more than one of the world's largest suppliers in the automotive space. We are a mobility technology company built to innovate, with a global, entrepreneurial-minded team. With 65+ years of expertise, our ecosystem of interconnected products combined with our complete vehicle expertise uniquely positions us to advance mobility in an expanded transportation landscape.

Job Introduction
As a Master Data Management Junior Analyst, your primary responsibility will be to ensure the accuracy, completeness, and consistency of the company's master data. This includes data related to customers, products, suppliers, and other critical business entities. You will assist with tasks such as Power BI visualization, data analysis, and ETL processes for various groups and divisions; maintain the quality of the governed data; and map new data from different ERPs to a universal standardized language and business groups. We are looking for a team-oriented person who is also able to work on initiatives independently to achieve results. We require a willingness to learn and grow in a fast-paced global environment.

Major Responsibilities
Keep an open-minded attitude and explore new ways in which data can be harmonized (a small example follows this listing); adapt to changing priorities and work in a fast-paced environment. Support Magna in improving data quality, identifying and resolving data inconsistencies and anomalies across different master data domains. Collaborate with IT teams to implement and maintain master data management systems. Analyze source-system data to make meaningful connections, participating in data migration and integration projects using SQL or basic Excel analysis tools. Administer and monitor data integration flows (Talend and Automic). Support the business in daily activities to improve data quality and processes, such as meeting with different Magna groups to support their master data needs. Follow Magna Master Data Management Governance Standards by analyzing ERP data and making meaningful connections between the data and our global applications. Work together with various cross-functions (e.g. Finance, Real Estate, Purchasing) to support existing master data processes.

Work Experience
College diploma or university degree in the field of computer science. Experience in a data analysis role is preferred.

Knowledge and Education
Knowledge of a data visualization tool (Power BI) and of DAX queries is an advantage. Database: familiarity with database concepts and experience working with relational databases. Data analysis: proficiency in SQL and MS Excel is mandatory for this role. Technical aptitude: a basic understanding of programming concepts and experience with data manipulation tools, such as ETL (Extract, Transform, Load) tools (Talend), can be beneficial; Java knowledge is an advantage. Excellent English communication skills (written and verbal) and good German communication skills. Knowledge of Master Data Management best practices is an advantage.

Skills and Competencies
Excellent MS Office knowledge. Proficiency in data analysis tools such as SQL and Excel. Basic Power BI knowledge is an advantage. MDM governance experience is an advantage. Basic understanding of data integration within application frameworks. Strong English communication skills. A "hands-on" mentality. Functional understanding of processes. Planning, organizing, and delegation. Communication skills. Teamwork. Critical thinking, problem-solving and decision-making skills. Negotiation skills and the political acuity to manage conflicts and balance project and stakeholder requirements for a successful outcome. Adaptability and stress tolerance.

Awareness, Unity, Empowerment
At Magna, we believe that a diverse workforce is critical to our success. That's why we are proud to be an equal opportunity employer. We hire on the basis of experience and qualifications, and in consideration of job requirements, regardless of, in particular, color, ancestry, religion, gender, origin, sexual orientation, age, citizenship, marital status, disability or gender identity. Magna takes the privacy of your personal information seriously. We discourage you from sending applications via email to comply with GDPR requirements and your local Data Privacy Law.

Worker Type
Regular / Permanent

Group
Magna Corporate
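A small illustration of the harmonization work described above: standardizing supplier names arriving from different ERPs and flagging likely duplicates. This is a simplified pandas sketch with made-up records and rules, not Magna's actual standards.

import pandas as pd

# Toy supplier records as they might arrive from two different ERPs
suppliers = pd.DataFrame({
    "supplier_id": [101, 202, 303],
    "name": ["Acme Corp.", "ACME Corporation", "Globex Ltd"],
})

# Normalize: uppercase, strip punctuation, drop common legal suffixes (simplified rule)
suppliers["norm_name"] = (suppliers["name"].str.upper()
    .str.replace(r"[^A-Z0-9 ]", "", regex=True)
    .str.replace(r"\b(CORPORATION|CORP|INC|LTD)\b", "", regex=True)
    .str.strip())

# Records that collapse to the same normalized name are duplicate candidates
dupes = suppliers[suppliers.duplicated("norm_name", keep=False)]
print(dupes)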

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Talend ETL, Apache Spark, PySpark
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams, make team decisions, and provide solutions to problems. Engaging with multiple teams, you will contribute to key decisions and provide solutions for your immediate team and across multiple teams. In this role, you will have the opportunity to showcase your creativity and technical expertise in designing and building applications.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Design, build, and configure applications to meet business process and application requirements. Collaborate with cross-functional teams to gather and define application requirements. Develop and implement software solutions using the Databricks Unified Data Analytics Platform. Perform code reviews and ensure adherence to coding standards. Troubleshoot and debug applications to identify and resolve issues. Optimize application performance and ensure scalability. Document technical specifications and user manuals for applications. Stay updated with emerging technologies and industry trends. Train and mentor junior developers to enhance their technical skills.

Professional & Technical Skills: Must-have skills: proficiency in the Databricks Unified Data Analytics Platform. Good-to-have skills: experience with PySpark, Apache Spark, and Talend ETL. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms (see the sketch after this listing). Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information: The candidate should have a minimum of 5 years of experience with the Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. 15 years of full-time education is required.

Qualifications: 15 years full time education
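Since the posting calls out linear regression on Databricks, here is a minimal spark.ml sketch of that pattern: assemble feature columns into a vector, then fit. The training data is a toy example, not from the posting.

from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.appName("lr_sketch").getOrCreate()

# Toy training data: two features and a numeric label
df = spark.createDataFrame(
    [(1.0, 2.0, 3.1), (2.0, 1.0, 4.0), (3.0, 4.0, 7.2), (4.0, 3.0, 8.1)],
    ["f1", "f2", "label"])

# spark.ml expects all features packed into a single vector column
assembled = VectorAssembler(inputCols=["f1", "f2"], outputCol="features").transform(df)

model = LinearRegression(featuresCol="features", labelCol="label").fit(assembled)
print(model.coefficients, model.intercept)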

Posted 1 month ago

Apply

8 - 11 years

20 - 25 Lacs

Bengaluru

Work from Office


We are looking for a skilled Guidewire Manager with a strong background in the insurance domain and extensive knowledge of traditional ETL tools. The ideal candidate will have 8-11 years of experience. ### Roles and Responsibility Lead and manage Guidewire implementation projects, ensuring alignment with business objectives and technical requirements. Oversee the design, development, and maintenance of data warehousing solutions. Collaborate with cross-functional teams to gather and analyze business requirements. Develop and implement ETL processes using tools such as Informatica PowerCenter, SSIS, SAP BODS, and Talend. Ensure data quality, integrity, and security across all data warehousing and ETL processes (a reconciliation sketch follows this listing). Provide technical guidance and mentorship to team members. ### Job Requirements Bachelor's degree in Computer Science, Information Technology, or a related field. Strong background in the insurance domain. Hands-on experience with ETL tools such as Informatica PowerCenter, SSIS, SAP BODS, and Talend. Excellent understanding of data warehousing architecture and best practices. Proven leadership and project management skills. Strong analytical and problem-solving abilities. Experience with Guidewire implementation projects. Knowledge of additional ETL tools and technologies. Certification in relevant ETL tools or data warehousing technologies. Good exposure to any ETL tool. Knowledge of life insurance is a plus. Understanding of Business Intelligence, Data Warehousing, and Data Modelling. Must have led a team of at least 4 members. Prior client-facing experience; self-motivated and collaborative.
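One routine piece of the data-quality work this listing mentions is source-to-target reconciliation after an ETL load. The sketch below uses sqlite3 purely as a stand-in for real source and warehouse connections; the tables and toy data are hypothetical.

import sqlite3

def row_count(conn, table):
    # Count rows in a table (table names here are hypothetical)
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

# In-memory stand-ins for a source system and a warehouse, seeded with toy data
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.execute("CREATE TABLE policies (id INTEGER)")
src.executemany("INSERT INTO policies VALUES (?)", [(i,) for i in range(100)])
tgt.execute("CREATE TABLE dw_policies (id INTEGER)")
tgt.executemany("INSERT INTO dw_policies VALUES (?)", [(i,) for i in range(100)])

# A basic post-load check: row counts must match between source and target
src_rows = row_count(src, "policies")
tgt_rows = row_count(tgt, "dw_policies")
if src_rows != tgt_rows:
    raise ValueError(f"Reconciliation failed: source={src_rows}, target={tgt_rows}")
print("Reconciliation passed:", src_rows, "rows")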

Posted 1 month ago

Apply

4 - 8 years

18 - 22 Lacs

Bengaluru

Work from Office


Explore an Exciting Career at Accenture
Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.

The Practice - A Brief Sketch:
The Technology Strategy & Advisory Practice focuses on the clients' most strategic priorities. We help clients achieve growth and efficiency through innovative R&D transformation, aimed at redefining business models using agile methodologies. As part of this high-performing team, you will work on scaling Data & Analytics, and the data that fuels it all, to power every single person and every single process. You will be part of our global team of experts who work on the right scalable solutions and services that help clients achieve their business objectives faster.

Business Transformation: Assessment of Data & Analytics potential and development of use cases that can transform business.
Transforming Businesses: Envisioning and designing customized, next-generation data and analytics products and services that help clients shift to new business models designed for today's connected landscape of disruptive technologies.
Formulation of Guiding Principles and Components: Assessing impact to the client's technology landscape/architecture and ensuring formulation of relevant guiding principles and platform components.
Products and Frameworks: Evaluate existing data and analytics products and frameworks and develop options for proposed solutions.

Bring your best skills forward to excel in the role:
Leverage your knowledge of technology trends across Data & Analytics and how they can be applied to address real-world problems and opportunities. Interact with client stakeholders to understand their Data & Analytics problems and priority use-cases, define a problem statement, understand the scope of the engagement, and drive projects to deliver value to the client. Design and guide development of an enterprise-wide Data & Analytics strategy for our clients, including Data & Analytics architecture, data on cloud, data quality, metadata and master data strategy. Establish a framework for effective data governance across multi-speed implementations; define data ownership, standards, policies and associated processes. Define a Data & Analytics operating model to manage data across the organization; establish processes around effective data management, ensuring data quality and governance standards as well as roles for data stewards. Benchmark against global research benchmarks and leading industry peers to understand the current state and recommend Data & Analytics solutions. Conduct discovery workshops and design sessions to elicit Data & Analytics opportunities and client pain areas. Develop and drive Data Capability Maturity Assessment, Data & Analytics Operating Model & Data Governance exercises for clients. Collaborate with business experts for business understanding, with other consultants and platform engineers for solutions, and with technology teams for prototyping and client implementations. Create expert content and use advanced presentation, public speaking, content creation and communication skills for C-level discussions. Demonstrate strong understanding of a specific industry, client or technology and function as an expert to advise senior leadership. Manage budgeting and forecasting activities and build financial proposals.

Qualifications
Your experience counts! MBA from a tier 1 institute. 3+ years of strategy consulting experience at a consulting firm. Experience on projects showcasing skills across any two of these capabilities: Data Capability Maturity Assessment, Data & Analytics Strategy, Data Operating Model & Governance, Data on Cloud Strategy, Data Architecture Strategy. Desirable to have skills in any two of these domains: data quality, master data (MDM), metadata, data lineage, data catalog. Experience with one or more technologies in the data governance space is preferred: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation etc. Mandatory knowledge of IT concepts through practical experience and knowledge of technology trends, e.g. mobility, cloud, digital, collaboration. A strong understanding of any of the following industries is preferred: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources or equivalent domains. Cloud Data & AI practitioner certifications (Azure, AWS, Google) desirable but not essential. CDMP certification from DAMA desirable.

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Data Analysis & Interpretation
Good to have skills: Snowflake Data Warehouse
Minimum 5 year(s) of experience is required
Educational Qualification: Minimum 15 years of full-time education

Job Requirements:
Key Responsibilities:
1. Familiarize themselves with Accenture standards and policies for working in a project and client environment.
2. Work with the Project Manager and Project Lead to get client user accounts created.
3. Lead the overall Snowflake transformation journey for the customer.
4. Design and develop the new solution in the Snowflake data warehouse (a short connector sketch follows this listing).
5. Prepare the test strategy and an implementation plan for the solution.
6. Play the role of an end-to-end data engineer.

Technical Experience:
1. 2 years of hands-on experience in Snowflake data warehouse design and development projects specifically.
2. 4 years of hands-on experience in SQL programming and PL/SQL.
3. 1 year of experience in JavaScript or other programming languages (Python, ReactJS, Angular).
4. Good understanding of cloud data warehouse and data warehousing concepts, including dimensional modelling.
5. 1 year of experience in ETL technologies: Informatica, DataStage, Talend, SAP BODS, Ab Initio, etc.

Professional Attributes:
1. Should be fluent in English communication.
2. Should have handled direct client interactions in the past.
3. Should be clear in written communication.
4. Should have strong interpersonal skills.
5. Should be conscious of European professional etiquette.

Educational Qualification: Minimum 15 years of full-time education
Additional Info: Exposure to AWS, Amazon S3 and other Amazon cloud-hosting products related to analytics or databases.
Qualifications: Minimum 15 years of full-time education
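Because the role leads a Snowflake build-out, a minimal Python sketch of querying Snowflake is shown below, assuming the snowflake-connector-python package is installed. All connection parameters and table names are placeholders, not client details.

import snowflake.connector

# Connection parameters are placeholders
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # A typical dimensional-model query: a fact table joined to a dimension (tables hypothetical)
    cur.execute("""
        SELECT d.region, SUM(f.amount) AS total_sales
        FROM fact_sales f
        JOIN dim_store d ON f.store_id = d.store_id
        GROUP BY d.region
    """)
    for region, total in cur:
        print(region, total)
finally:
    cur.close()
    conn.close()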

Posted 1 month ago

Apply

3 - 5 years

30 - 35 Lacs

Bengaluru

Work from Office


About The Role
Job Title: Enterprise Performance Management (Consolidation) - Consultant - S&C GN-CFO&EV
Management Level: 09 - Consultant
Location: Gurgaon, Mumbai, Bangalore, Pune, Hyderabad
Must have skills: Anaplan, Oracle EPM, SAP GR, SAC, OneStream, Tagetik, Workiva
Good to have skills: FP&A, data visualization tools

Job Summary:
Prepare and facilitate sessions on application design and process design. Apply financial concepts to translate functional requirements into functional/technical solution design. Design and develop application components/objects in one of the EPM technologies (Oracle FCCS/HFM, OneStream, Tagetik, etc.) based on the application design. Independently troubleshoot and resolve application/functional process challenges in a timely manner; map complex processes into logical design components for future-state processes. Lead individual work streams associated with a consolidation implementation; examples include consolidations process lead, application and unit testing lead, training lead, and UAT lead. Assist with conversion and reconciliation of financial data for consolidations. Prepare key deliverables such as design documents, test documentation, training materials and administration/procedural guides.

Roles & Responsibilities:
Strong understanding of accounting/financial close and consolidation concepts. Proven ability to work creatively and analytically in a problem-solving environment. Strong hands-on experience in any one of the consolidation tools (Oracle FCCS/HFM, OneStream, Tagetik, etc.). Strong communication (written and verbal), analytical and organizational skills. Proven success in contributing to a team-oriented environment; client experience preferred.

Professional & Technical Skills:
2-3 full implementations of consolidation solutions. The candidate should have 3-5 years of relevant experience in implementing financial consolidation solutions in at least one of the EPM tools (Oracle FCCS/HFM, OneStream, Tagetik, SAP Group Reporting, etc.) and financial consolidation processes. Strong hands-on experience with data conversion and reconciliation. Experience with HFM, HFR and FDMEE is a plus.

Additional Information:
An opportunity to work on transformative projects with key G2000 clients. Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional. Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge and capabilities. Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization.

About Our Company | Accenture

Qualifications
Experience: 3-5 years
Educational Qualification: MBA (Finance) or CA or CMA

Posted 1 month ago

Apply

Exploring Talend Jobs in Bengaluru

Are you a job seeker looking to dive into the world of data integration and management? Bengaluru, also known as the Silicon Valley of India, offers a plethora of opportunities for Talend professionals. With a booming IT sector and a high demand for skilled data engineers, Bengaluru is a hotspot for Talend jobs.

Job Market Overview

  • Major Hiring Companies: Companies like Infosys, Wipro, Accenture, and IBM are actively hiring Talend professionals in Bengaluru.
  • Salary Ranges: Talend developers in Bengaluru can expect to earn between INR 6-12 lakhs per annum, depending on their experience and skill level.
  • Job Prospects: The job market for Talend professionals in Bengaluru is promising, with steady growth in demand for data integration and management experts.

Key Industries in Demand

  • IT: The IT industry in Bengaluru is a major employer of Talend professionals.
  • E-commerce: With the rise of e-commerce platforms, there is high demand for data integration specialists in this sector.
  • Healthcare: Healthcare organizations are increasingly relying on data management solutions, creating opportunities for Talend professionals.

Cost of Living Context

Bengaluru offers a lower cost of living than other major cities in India, making it an attractive destination for job seekers. Affordable housing options and a vibrant social scene make it an ideal city in which to kickstart your Talend career.

Remote Work Opportunities

In the wake of the COVID-19 pandemic, many companies in Bengaluru offer remote work options for Talend professionals. This flexibility allows you to work from the comfort of your home while still enjoying the benefits of a thriving job market.

Transportation Options

Bengaluru boasts a well-connected public transportation system, including buses, metro, and cabs, making it easy for job seekers to commute to their workplaces.

Emerging Trends and Future Prospects

As technology continues to evolve, Talend professionals in Bengaluru can expect to see an increase in demand for their skills. Emerging trends like cloud-based data integration and AI-driven analytics are shaping the future job market for Talend experts.

If you are looking to embark on a rewarding career in data integration and management, Bengaluru is the place to be. Don't miss out on the exciting Talend jobs in Bengaluru – apply now and take your career to new heights!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
