
836 Talend Jobs - Page 10

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

20 - 35 Lacs

Noida

Work from Office

Source: Naukri

Solutions Lead (Implementation Services Team), Full Time (Permanent), Noida, India

Role & responsibilities
- Collaborate with customers' Business and IT teams to define and gather solution requirements for custom development, B2B/ETL/EAI, and cloud integration initiatives using the Adeptia Integration Platform.
- Analyze, interpret, and translate customer business needs into scalable and maintainable technical solution designs, aligned with best practices and the capabilities of the Adeptia platform.
- Storyboard and present solutions to customers and prospects, ensuring a clear understanding of proposed designs and technical workflows.
- Provide end-to-end project leadership, including planning, tracking deliverables, and coordinating efforts with offshore development teams as required.
- Review implementation designs and provide architectural guidance and best practices to the implementation team to ensure high-quality execution.
- Actively assist and mentor customers in configuring and implementing the Adeptia platform, ensuring alignment with technical and business objectives.
- Offer expert recommendations on design and configuration to ensure successful deployment and long-term maintainability of customer solutions.
- Define clear project requirements, create work breakdown structures, and establish realistic delivery timelines.
- Delegate tasks effectively and manage progress against daily, weekly, and monthly targets, ensuring the team remains focused and productive.
- Serve as a liaison among customers, internal stakeholders, and offshore teams to maintain alignment, track progress, and ensure delivery meets both quality and timeline expectations.
- Monitor project baselines, identify and mitigate risks, and lead participation in all Agile ceremonies, including sprint grooming, planning, reviews, and retrospectives.
- Maintain a hands-on technical role, contributing to development activities and conducting detailed code reviews to ensure technical soundness and optimal performance.
- Take full ownership of assigned projects, driving them to successful, on-time delivery with high quality standards.

Preferred candidate profile

Technical
- Proven experience in designing and developing integration solutions involving Cloud/SaaS applications, APIs, SDKs, and legacy systems.
- Skilled in implementing SOA/EAI principles and integration patterns in B2B, ETL, EAI, and cloud integration using platforms such as Adeptia, Talend, MuleSoft, or similar tools.
- Good hands-on experience with Core Java (version 8+) and widely used Java frameworks including Spring (version 6+) and Hibernate (version 6+).
- Proficient in SOA, RESTful and SOAP web services, and related technologies including JMS, SAAJ, JAXP, and XML technologies (XSD, XPath, XSLT, parsing); see the XPath sketch after this listing.
- Strong command of SQL and RDBMS (e.g., Oracle, MySQL, PostgreSQL).
- Solid understanding of Enterprise Service Bus (ESB) concepts and messaging technologies such as Kafka and RabbitMQ.
- Familiar with transport protocols including HTTPS, Secure FTP, POP/IMAP/SMTP, and JDBC.
- Skilled in working with Windows and Linux operating systems, and experienced with application servers such as JBoss, Jetty, and Tomcat.
- Solid understanding of security best practices, including authentication, authorization, data encryption, and compliance frameworks relevant to enterprise integrations.
- Basic understanding of modern JavaScript frameworks such as React, with the ability to collaborate effectively on front-end and full-stack development scenarios.

Non-Technical
- Strong communication and interpersonal skills with over 5 years of direct client-facing experience.
- Adept at gathering, clarifying, and understanding business requirements and translating them into actionable technical specifications.
- Skilled in aligning cross-functional teams to ensure effective execution and stakeholder satisfaction.
- Over 4 years of hands-on experience with Agile methodologies, including participation in daily standups, product backlog refinement, sprint planning, retrospectives, storyboarding, and user story writing. Proven ability to deliver high-quality results in fast-paced and iterative environments.
- More than 4 years of experience as a team lead, currently managing a team: responsible for task delegation, delivery tracking, performance feedback, and fostering team collaboration across time zones.
- Experienced in creating comprehensive requirement documents, architecture design documents, and both high-level and low-level design specifications to support technical teams and business stakeholders.
- Demonstrated success in remote/distributed client environments, delivering under tight timelines and high-pressure scenarios. Known for a proactive mindset and solution-oriented approach to project challenges.
- Highly disciplined and committed to maintaining quality standards, with a focus on continuous testing, adherence to best practices, and application of standard design patterns.
- Exceptional attention to detail, paired with strong analytical, troubleshooting, and problem-solving skills that contribute to robust and reliable solution delivery.
- Fluent in English, with the ability to clearly articulate thoughts, present ideas, and create well-structured documentation and visual design artifacts.
- Experienced in creating work schedules, providing performance feedback, and mentoring junior team members. Skilled in training new resources and managing resource allocation to optimize team productivity and performance.

Good to have skills

Technical
- Familiarity with widely used integration standards such as EDI X12, EDIFACT, ebXML, iDoc, BAPI, etc., for B2B and enterprise system communication.
- Experience in implementing security features including API security, authentication and authorization mechanisms, and message- and transport-level encryption.
- Understanding and experience with various Adeptia security models, including Native, LDAP, SAML, OAuth, IDP, multiple IDP, Kerberos, encryption/decryption techniques, EAR packaging, KeyTools, digital signatures, and encoding/decoding mechanisms.
- Hands-on experience with design and prototyping tools such as Mockups, Draw.io, and Enterprise Architect to create process diagrams, solution designs, and documentation.
- Working knowledge of DevOps tools and practices, including Continuous Integration and Continuous Deployment (CI/CD) pipelines. Familiar with tools such as Git/GitHub, Maven, and Jenkins for version control, dependency management, and automated builds.
- In-depth experience with various deployment models, including clustered and non-clustered environments, multi-zone deployments, disaster recovery (DR) setups, DMZ configurations, load balancers, and cloud-based architectures.

Non-Technical
- Strong understanding of creating and maintaining development and quality assurance processes to ensure high standards in project delivery.
- Knowledge of implementing high-availability solutions, including the design and management of real-time and batch processing systems.
- Self-driven, adaptable, and highly responsible, with a strong focus on delivering quality outcomes while managing shifting priorities.
- Skilled at working closely with cross-functional teams to drive project success, ensuring clear communication and effective prioritization under tight deadlines.
- Adept at balancing multiple tasks under pressure and delivering results within strict timelines.
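The technical profile above leans on XML tooling (XSD, XPath, XSLT). As a small, hedged illustration of the kind of XPath extraction that underpins such integrations (not Adeptia-specific code), here is a Python sketch using lxml; the document shape and element names are invented for the example:

```python
# Minimal XPath extraction sketch over a hypothetical order document.
# Element and attribute names are invented; this is not an Adeptia API.
from lxml import etree

doc = etree.fromstring(b"""
<orders>
  <order id="1001"><customer>Acme</customer><total>250.00</total></order>
  <order id="1002"><customer>Globex</customer><total>975.50</total></order>
</orders>
""")

# XPath selects every order element; attributes and children are read per node.
for order in doc.xpath("//order"):
    oid = order.get("id")
    customer = order.findtext("customer")
    total = float(order.findtext("total"))
    print(f"order {oid}: {customer} -> {total:.2f}")
```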

Posted 1 week ago

Apply

1.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Source: LinkedIn

About Kinaxis
Elevate your career journey by embracing a new challenge with Kinaxis. We are experts in tech, but it's really our people who give us the passion to always seek ways to do things better. As such, we're serious about your career growth and professional development, because people matter at Kinaxis.

In 1984, we started out as a team of three engineers based in Ottawa, Canada. Today, we have grown into a global organization with over 2,000 employees around the world, supporting 40,000+ users in over 100 countries. As a global leader in end-to-end supply chain management, we enable supply chain excellence for all industries. We are expanding our team in Chennai and around the world as we continue to innovate and revolutionize how we support our customers. Our journey in India began in 2020 and we have been growing steadily since then. Building a high-trust and high-performance culture is important to us, and we are proud to be Great Place to Work® Certified™. Our state-of-the-art office, located in the World Trade Centre in Chennai, offers our growing team space for expansion and collaboration.

Location: India

About The Role
The Technology Consultant, Entry is a professional-level role responsible for supporting data integration activities throughout the deployment of Kinaxis solutions. The incumbent has a foundational level of technical and domain knowledge and can navigate Kinaxis' technology solutions and processes for data management and integration. They understand Kinaxis customers' most pressing supply chain needs and the Kinaxis product offerings, so that customers can start to experience the immediate value of Kinaxis solutions.

What You Will Do
- Participate in deep-dive customer business requirements discovery sessions and develop integration requirements specifications, with guidance from senior consultants.
- Perform integration configuration: mapping, loading, transforming, and validating the data required to support each customer's unique system landscape for standard deployments.
- Demonstrate knowledge and proficiency in the Kinaxis Integration Platform Suite, the RR data model, and REST-based API integration capabilities (a generic REST sketch follows this listing), and support the client in identifying and implementing the solutions best suited to individual data flows, under the guidance of senior consultants.
- Assist data management and integration activities, including validation and testing of the solutions.
- Collaborate with Kinaxis Support and/or Cloud Services teams to address client queries around security risks or security incidents.
- Support deployment workshops to help customers achieve immediate value from their investment.
- Liaise directly with customers and internal SMEs such as the Technology Architect through the project lifecycle.

Skills and Qualifications We Need
- Bachelor's degree in Industrial Engineering, Supply Chain, Operations Research, Computer Science, Computer Engineering, Statistics, Information Technology, or a related field.
- Passion for working in customer-facing roles, with strong interpersonal, communication, and presentation skills.
- 1-3 years of experience in implementing or deploying software applications in the supply chain management space, or experience in data integration activities for enterprise-level systems.
- Understanding of the software deployment life cycle, including business requirements definition, review of functional specifications, development of test plans, testing, user training, and deployment.
- Self-starter who shows initiative in their work and learning and can excel in a fast-paced work environment.
- Excellent problem-solving and critical thinking skills; able to synthesize a high volume of complex information to determine the best course of action.
- Works well in a team environment and can work effectively with people at all levels of an organization.
- Ability to communicate complex ideas effectively in English, both verbally and in writing.
- Ability to work virtually and plan for up to 70% travel.

What We Are Looking For
- Technical skills such as SQL, R, JavaScript, Python, etc. Experience working with relational databases and JavaScript is an asset.
- Experience working with supply chain processes and manufacturing planning solutions such as RapidResponse, SAP, Oracle, or Blue Yonder applications to support supply chain activities.
- Progressive experience with ETL tools such as Talend, OWB, SSIS, SAP Data Services, etc.
- Some database-level experience extracting data from enterprise-class ERP systems including SAP/APO, Oracle, and JDE.
- Some experience with connection functionality to SAP (through BAPI/RFC), databases, files, web, and SOAP.
- Open to travelling 75% on average (and occasionally 100%) and able to work effectively when working remotely from client sites.

#EntryLevel #Intermediate

Work With Impact: Our platform directly helps companies power the world's supply chains. We see the results of what we do out in the world every day: when we see store shelves stocked, when medications are available for our loved ones, and so much more.

Work with Fortune 500 Brands: Companies across industries trust us to help them take control of their integrated business planning and digital supply chain. Some of our customers include Ford, Unilever, Yamaha, P&G, Lockheed Martin, and more.

Social Responsibility at Kinaxis: Our Diversity, Equity, and Inclusion Committee weighs in on hiring practices and talent assessment training materials, and on mandatory training on unconscious bias and inclusion fundamentals. Sustainability is key to what we do, and we're committed to a net-zero operations strategy for the long term. We are involved in our communities and support causes where we can make the most impact.

People matter at Kinaxis, and these are some of the perks and benefits we created for our team:
- Flexible vacation and Kinaxis Days (company-wide day off on the last Friday of every month)
- Flexible work options
- Physical and mental well-being programs
- Regularly scheduled virtual fitness classes
- Mentorship programs and training and career development
- Recognition programs and referral rewards
- Hackathons

For more information, visit the Kinaxis website at www.kinaxis.com or the company's blog at http://blog.kinaxis.com.

Kinaxis welcomes candidates to apply to our inclusive community. We provide accommodations upon request to ensure fairness and accessibility throughout our recruitment process for all candidates, including those with specific needs or disabilities. If you require an accommodation, please reach out to us at recruitmentprograms@kinaxis.com. Please note that this contact information is strictly for accessibility requests and cannot be used to inquire about application statuses.

Kinaxis is committed to ensuring a fair and transparent recruitment process. We use artificial intelligence (AI) tools in the initial step of the recruitment process to compare submitted resumes against the job description, to identify candidates whose education, experience, and skills most closely match the requirements of the role. After the initial screening, all subsequent decisions regarding your application, including final selection, are made by our human recruitment team. AI does not make any final hiring decisions.
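The role above highlights REST-based API integration. As a generic, hedged sketch of that pattern (not the Kinaxis API; the endpoint, auth header, and resource names are placeholders), paging through a REST collection in Python might look like this:

```python
# Generic REST data-pull sketch. All URLs, headers, and fields are
# placeholders invented for illustration.
import requests

BASE_URL = "https://example.com/api/v1"  # placeholder endpoint
HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}

def fetch_records(resource: str, page_size: int = 100):
    """Page through a REST collection and yield individual records."""
    page = 1
    while True:
        resp = requests.get(f"{BASE_URL}/{resource}",
                            headers=HEADERS,
                            params={"page": page, "per_page": page_size},
                            timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:  # empty page signals the end of the collection
            break
        yield from batch
        page += 1

for rec in fetch_records("supply-orders"):
    print(rec)
```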

Posted 1 week ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

Technology Consultant, Entry at Kinaxis (Hyderabad). The description of this listing is identical, word for word, to the Kinaxis Technology Consultant, Entry posting above; see that listing for the full company overview, responsibilities, qualifications, and benefits.

Posted 1 week ago

Apply

1.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Source: LinkedIn

Technology Consultant, Entry at Kinaxis (Pune). The description of this listing is identical, word for word, to the Kinaxis Technology Consultant, Entry posting above; see that listing for the full company overview, responsibilities, qualifications, and benefits.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Source: LinkedIn

Position: Lead Data Engineer
Experience: 7+ years
Location: Remote

Summary
We are looking for a Lead Data Engineer responsible for ETL processes and documentation in building scalable data warehouses and analytics capabilities. This role involves maintaining existing systems, developing new features, and implementing performance improvements.

Key Responsibilities
- Build ETL pipelines using Fivetran and dbt for internal and client projects across platforms like Azure, Salesforce, and AWS.
- Monitor active production ETL jobs.
- Create and maintain data lineage documentation to ensure complete system traceability.
- Develop design/mapping documents for clear and testable development, QA, and UAT.
- Evaluate and implement new data integration tools based on current and future requirements.
- Identify and eliminate process redundancies to streamline data operations.
- Work with the Data Quality Analyst to implement validation checks across ETL jobs (see the sketch after this listing).
- Design and implement large-scale data warehouses, BI solutions, and Master Data Management (MDM) systems, including Data Lakes/Data Vaults.

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Software Engineering, Math, or a related field.
- 6+ years of experience in data engineering, business analytics, or software development.
- 5+ years of experience with strong SQL development skills.
- Hands-on experience with Snowflake and Azure Data Factory (ADF).
- Proficient in ETL toolsets such as Informatica, Talend, dbt, and ADF.
- Experience with PHI/PII data and the healthcare domain is preferred.
- Strong analytical and critical thinking skills.
- Excellent written and verbal communication.
- Ability to manage time and prioritize tasks effectively.
- Familiarity with scripting and open-source platforms (e.g., Python, Java, Linux, Apache, Chef).
- Experience with BI tools like Power BI, Tableau, or Cognos.
- Exposure to Big Data technologies: Snowflake (Snowpark), Apache Spark, Hadoop, Hive, Sqoop, Pig, Flume, HBase, MapReduce.
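The responsibilities above include implementing validation checks across ETL jobs. Here is a minimal sketch of a post-load reconciliation against Snowflake using snowflake-connector-python; the account, credentials, and table names are placeholders, not the team's actual framework:

```python
# Hedged post-load validation sketch: compare staging vs. target row
# counts in Snowflake. All identifiers below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_monitor",     # placeholder
    password="...",         # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
)

def row_count(table: str) -> int:
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        cur.close()

src, tgt = row_count("STAGING.ORDERS"), row_count("MART.FCT_ORDERS")
if src != tgt:
    raise RuntimeError(f"Row-count mismatch: staging={src}, target={tgt}")
print(f"Validation passed: {src} rows reconciled.")
```

Real pipelines would extend this to checksums, null-rate thresholds, and per-partition counts, but the reconciliation shape stays the same.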

Posted 1 week ago

Apply

8.0 - 12.0 years

15 - 27 Lacs

Mumbai, Pune, Bengaluru

Work from Office

Source: Naukri

Role & responsibilities
Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions like AWS. Must have: AWS, Databricks. Good to have: PySpark, Snowflake, Talend.

Requirements
- Experience processing data pipelines using Databricks Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera, etc. (see the PySpark sketch after this listing).
- Very proficient in large-scale data operations using Databricks, and overall very comfortable using Python.
- Familiarity with AWS compute, storage, and IAM concepts.
- Experience working with an S3 data lake as the storage tier.
- Any ETL background (Talend, AWS Glue, etc.) is a plus but not required.
- Cloud warehouse experience (Snowflake, etc.) is a huge plus.
- Carefully evaluates alternative risks and solutions before taking action.
- Optimizes the use of all available resources.
- Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.

Skills
- Hands-on experience with Databricks, Spark SQL, and the AWS cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
- Experience with shell scripting.
- Exceptionally strong analytical and problem-solving skills.
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
- Strong experience with relational databases and data access methods, especially SQL.
- Excellent collaboration and cross-functional leadership skills.
- Excellent communication skills, both written and verbal.
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
- Ability to leverage data assets to respond to complex questions that require timely answers.
- Working knowledge of migrating relational and dimensional databases to the AWS cloud platform.

Mandatory Skills: Apache Spark, Databricks, Java, Python, Scala, Spark SQL.

Note: Only immediate joiners or candidates serving their notice period. Interested candidates can apply.
Regards, HR Manager
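As a minimal sketch of the pipeline pattern the requirements describe (read raw data from an S3 data lake, transform with Spark SQL functions, write back to a curated zone), assuming a Databricks or EMR runtime; bucket paths and column names are invented:

```python
# Hedged PySpark sketch of an S3 -> transform -> S3 batch step.
# Paths, columns, and the aggregation are invented for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-agg").getOrCreate()

orders = spark.read.parquet("s3://example-lake/raw/orders/")  # placeholder path

# Keep completed orders, then roll up to one row per day.
daily = (orders
         .filter(F.col("status") == "COMPLETE")
         .groupBy("order_date")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("order_count")))

daily.write.mode("overwrite").parquet("s3://example-lake/curated/orders_daily/")
```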

Posted 1 week ago

Apply

8.0 years

0 Lacs

India

Remote

Source: LinkedIn

Job Type: Full-time, remote work arrangement, 12-month contract (extendable/renewable) role
Reporting Line: Singapore
Expected Start Date: September 2025

Our client, a globally renowned business school consistently ranked among the top in the world, with campuses across the US, Europe, the Middle East, and Asia, is hiring a Talend Technical Lead / Architect. This remote role reports to the Head of Data Operations and Digital Solutions Teams, based in Singapore.

We are seeking a highly experienced Senior Talend Architect to lead the development, maintenance, and innovation of a scalable enterprise data management platform. The ideal candidate will have at least 8 years of hands-on experience with the Talend platform, including Talend Studio, TAC, TMC, API Services, Data Stewardship, Data Preparation, and Data Catalog. You should bring a minimum of 3 years in a senior or architect-level role, guiding enterprise-grade implementations.

Other responsibilities include, but are not limited to:
- Design and implement scalable data architecture and integration solutions using Talend
- Guide and mentor the development team on Talend best practices and architecture
- Ensure high data quality, cleansing, and enrichment processes (see the sketch after this listing)
- Collaborate with cross-functional teams to support data consumption across tools like Power BI, Salesforce, and Eloqua
- Contribute to Agile development processes using SCRUM and Jira

The ideal candidate must have:
- Data modeling, integration, and governance
- Talend job and flow development
- Solid understanding of databases (Oracle, MSSQL, AWS)
- Programming languages: Java, SQL, SOQL, Python
- Software development methodologies: BDD, TDD, Agile
- Version control: Git/GitHub
- Experience with service management tools (e.g., Jira)
- Familiarity with cloud environments and various operating systems

Key Competencies:
- Fluent in English with excellent written and verbal communication
- Strong analytical and architectural problem-solving skills
- Able to convey complex technical topics to non-technical stakeholders
- Organized, autonomous, and self-driven with a high standard of work
- Strong documentation skills and attention to detail
- Team player with the ability to work in multicultural environments

We carefully review every application and will reach out to shortlisted candidates who best match our clients' requirements. If you believe this role aligns with your background, you're welcome to apply directly through this job ad or send your CV to our Hiring Team at hiring@weconnectsearch.com.

For other exciting opportunities in Technology, Sales & Marketing, Business Support, Real Estate, or Education, visit our website at weconnectsearch.com or explore our Careers Page at careers.weconnectsearch.com.

Singapore EA No. 19C9923
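As referenced in the responsibilities above, here is a hedged pandas sketch of a cleansing-and-enrichment step; a Talend job would express this graphically with components rather than code, and the file and column names here are invented:

```python
# Illustrative cleansing/enrichment step in pandas. Inputs, outputs,
# and columns are placeholders, not the client's actual data model.
import pandas as pd

contacts = pd.read_csv("contacts_raw.csv")  # placeholder input

# Cleansing: normalize emails, trim names, drop exact duplicates.
contacts["email"] = contacts["email"].str.strip().str.lower()
contacts["name"] = contacts["name"].str.strip()
contacts = contacts.drop_duplicates(subset=["email"])

# Enrichment: left-join a reference table to add region codes.
regions = pd.read_csv("country_regions.csv")  # placeholder lookup
enriched = contacts.merge(regions, on="country", how="left")

enriched.to_csv("contacts_clean.csv", index=False)
```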

Posted 1 week ago

Apply

8.0 - 13.0 years

22 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Source: Naukri

Location: Bengaluru, Hyderabad, Chennai & Pune

ETL Development Lead (prior lead experience is a must, minimum 1 year)
- Experience leading and mentoring a team of Talend ETL developers.
- Provide technical direction and guidance on ETL/data integration development to the team.
- Design complex data integration solutions using Talend & AWS.
- Collaborate with stakeholders to define project scope, timelines, and deliverables.
- Contribute to project planning, risk assessment, and mitigation strategies.
- Ensure adherence to project timelines and quality standards.
- Strong understanding of ETL/ELT concepts, data warehousing principles, and database technologies.
- Design, develop, and implement ETL (Extract, Transform, Load) processes using Talend Studio and other Talend components.
- Build and maintain robust and scalable data integration solutions to move and transform data between various source and target systems (e.g., databases, data warehouses, cloud applications, APIs, flat files).
- Develop and optimize Talend jobs, workflows, and data mappings to ensure high performance and data quality.
- Troubleshoot and resolve issues related to Talend jobs, data pipelines, and integration processes.
- Collaborate with data analysts, data engineers, and other stakeholders to understand data requirements and translate them into technical solutions.
- Perform unit testing and participate in system integration testing of ETL processes.
- Monitor and maintain Talend environments, including job scheduling and performance tuning.
- Document technical specifications, data flow diagrams, and ETL processes.
- Stay up to date with the latest Talend features, best practices, and industry trends.
- Participate in code reviews and contribute to the establishment of development standards.
- Proficiency in Talend Studio, Talend Administration Center/TMC, and other Talend components.
- Experience working with various data sources and targets, including relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL), NoSQL databases, the AWS cloud platform, APIs (REST, SOAP), and flat files (CSV, TXT).
- Strong SQL skills for data querying and manipulation.
- Experience with data profiling, data quality checks, and error handling within ETL processes (see the reject-flow sketch after this listing).
- Familiarity with job scheduling tools and monitoring frameworks.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively within a team environment.
- Basic understanding of AWS services: EC2, S3, EFS, EBS, IAM, AWS roles, CloudWatch Logs, VPC, security groups, Route 53, network ACLs, Amazon Redshift, Amazon RDS, Amazon Aurora, Amazon DynamoDB.
- Understanding of AWS data integration services: Glue, Data Pipeline, Amazon Athena, AWS Lake Formation, AppFlow, Step Functions.
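The reject-flow sketch referenced above: Talend exposes this pattern as reject links on components such as tMap, where valid rows continue down the pipeline and invalid rows are diverted with a reason code. Here it is in plain Python for illustration, with invented file and field names:

```python
# Hedged reject-flow sketch: split an input file into valid rows and
# rejects with a reason column. orders.csv and its fields are invented.
import csv

REQUIRED = ("customer_id", "order_date", "amount")

def validate(row: dict):
    """Return (is_valid, reason) for a single input row."""
    for field in REQUIRED:
        if not row.get(field):
            return False, f"missing {field}"
    try:
        float(row["amount"])
    except ValueError:
        return False, "non-numeric amount"
    return True, ""

with open("orders.csv", newline="") as src, \
     open("orders_valid.csv", "w", newline="") as ok, \
     open("orders_rejects.csv", "w", newline="") as bad:
    reader = csv.DictReader(src)
    valid_w = csv.DictWriter(ok, fieldnames=reader.fieldnames)
    reject_w = csv.DictWriter(bad, fieldnames=list(reader.fieldnames) + ["reject_reason"])
    valid_w.writeheader()
    reject_w.writeheader()
    for row in reader:
        is_valid, reason = validate(row)
        if is_valid:
            valid_w.writerow(row)
        else:
            reject_w.writerow({**row, "reject_reason": reason})
```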

Posted 1 week ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

JR0124387 Manager, Technology Operations – Pune, India

Do you view integrations in the retail financial industry as more than just APIs and microservices, and instead see them as an opportunity to improve the digital banking customer experience? Do you feel empowered to quickly respond to product enhancement requests that impact integrations, functionality issues, and the implementation of integrations, so that new digital banking capabilities are delivered to customers in an agile fashion? If so, why not consider joining Western Union as Manager, Technology Operations. Western Union powers your pursuit.

We are seeking a highly experienced and technically sound full stack developer at the manager level to lead and oversee L1 and L2 application support and enhancement activities for our core systems. This role focuses on issue resolution, maintenance, minor enhancements, and ensuring the stability and performance of production applications. While hands-on development will be limited, a strong full stack background is essential to guide the team effectively and address complex issues when needed.

Role Responsibilities
- Lead and manage L1 and L2 support teams to ensure timely and effective resolution of application issues and service requests.
- Perform root cause analysis for recurring issues and proactively work on permanent fixes and enhancements.
- Oversee minor enhancements, patches, and configuration changes in collaboration with business and tech stakeholders.
- Manage incident response, escalation handling, and communication with business teams.
- Ensure application performance, uptime, and SLA adherence through monitoring and continuous improvement.
- Review and enhance support documentation, knowledge base articles, and operational runbooks.
- Collaborate with QA, DevOps, Infrastructure, and Development teams for deployment and environment stability.
- Serve as a technical point of escalation and provide hands-on support when needed.
- Track support metrics, prepare periodic reports, and recommend process improvements.

Role Requirements
- 10+ years of experience in software development and support with technologies such as Java, the Spring Boot framework, Cassandra DB or similar databases, multithreading, serialization, externalization, collections, etc., including at least 3 years in a leadership or managerial role.
- Experience with J2EE technologies, JDBC, ORM, JAXB, DevOps tools, and Kafka.
- Experience with APIs, REST/JSON, microservices, XML, XSLT, etc.
- Sound understanding of application architecture and system integrations.
- Proven experience in managing support operations (L1/L2), production support, or enhancement teams.
- Experience with incident management tools (e.g., ServiceNow, JIRA), monitoring tools, and CI/CD pipelines.
- Excellent analytical, troubleshooting, and problem-solving skills.
- Strong communication and stakeholder management abilities.
- Exposure to ITIL or similar service management frameworks is a plus.
- Experience with unit testing and mocking frameworks.
- Ability to evaluate business needs and achieve alignment among stakeholders.
- Excellent verbal and written communication skills for managing both direct and indirect talent effectively.
- Strong technical skills in architecture patterns and solution design & development.
- Proven ability to understand the business and contribute to a technology direction that drives measurable business improvements.
- Excellent understanding of computer science fundamentals, data structures, algorithms, OOP, and OOA/D.
- Hands-on experience designing, building, and deploying microservices-based architectures and platforms at scale.
- Ability to work in a fast-paced, iterative development environment, adapt to changing business priorities, and thrive under pressure.
- Team player with strong analytical, verbal, and written communication skills.
- Excellent decision-making, communication, and collaboration skills.
- Demonstrated capability in assessing business needs while providing creative and effective solutions in conformance with emerging technology standards.

Tech Stack and Tools
- REST APIs and microservices using Java, Spring Boot, Cassandra DB
- Kafka event streaming (see the consumer sketch after this listing)
- Cloud banking platform and process orchestrators like Mambu
- Card issuing and processing platforms like Marqeta, Pismo, etc.
- React Native
- GitLab, JIRA, Cloudbees, OpenSearch, SwaggerHub, Snowflake, Talend, AWS Cloud, Spinnaker, CI/CD

We make financial services accessible to humans everywhere. Join us for what's next. Western Union is positioned to become the world's most accessible financial services company, transforming lives and communities. We're a diverse and passionate customer-centric team of over 8,000 employees serving 200 countries and territories, reaching customers and receivers around the globe. More than moving money, we design easy-to-use products and services for our digital and physical financial ecosystem that help our customers move forward. Just as we help our global customers prosper, we support our employees in achieving their professional aspirations. You'll have plenty of opportunities to learn new skills and build a career, as well as receive a great compensation package. If you're ready to help drive the future of financial services, it's time for Western Union. Learn more about our purpose and people at https://careers.westernunion.com/.

Benefits
You will also have access to short-term incentives, multiple health insurance options, accident and life insurance, and best-in-class development platforms, to name a few (https://careers.westernunion.com/global-benefits/). Please see the location-specific benefits below, and note that your recruiter may share additional role-specific benefits during your interview process or in an offer of employment.

Your India-specific benefits include:
- Employees Provident Fund (EPF)
- Gratuity payment
- Public holidays
- Annual leave, sick leave, compensatory leave, and maternity/paternity leave
- Annual health check-up
- Hospitalization insurance coverage (Mediclaim)
- Group life insurance, group personal accident insurance coverage, business travel insurance
- Cab facility
- Relocation benefit

Western Union values in-person collaboration, learning, and ideation whenever possible. We believe this creates value through common ways of working and supports the execution of enterprise objectives, which will ultimately help us achieve our strategic goals. By connecting face-to-face, we are better able to learn from our peers, problem-solve together, and innovate. Our Hybrid Work Model categorizes each role into one of three categories. Western Union has determined the category of this role to be Hybrid. This is defined as a flexible working arrangement that enables employees to divide their time between working from home and working from an office location. The expectation for Hybrid roles in the Philippines is to work from the office at least 70% of the employee's working days per month.

We are passionate about diversity. Our commitment is to provide an inclusive culture that celebrates the unique backgrounds and perspectives of our global teams while reflecting the communities we serve. We do not discriminate based on race, color, national origin, religion, political affiliation, sex (including pregnancy), sexual orientation, gender identity, age, disability, marital status, or veteran status. The company will provide accommodation to applicants, including those with disabilities, during the recruitment process, following applicable laws.

Estimated Job Posting End Date: 06-13-2025. This application window is a good-faith estimate of the time that this posting will remain open. This posting will be promptly updated if the deadline is extended or the role is filled.
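The Kafka consumer sketch referenced in the tech stack above, using the kafka-python client for illustration (the role itself is Java/Spring Boot); the topic, brokers, and event fields are placeholders:

```python
# Hedged Kafka consumer sketch with kafka-python. Topic name, broker
# addresses, and the event schema are invented for illustration.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "payment-events",                      # placeholder topic
    bootstrap_servers=["localhost:9092"],  # placeholder brokers
    group_id="support-dashboard",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # e.g., surface failed transactions to the L2 support queue
    if event.get("status") == "FAILED":
        print(f"ALERT offset={message.offset}: {event}")
```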

Posted 1 week ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

Position Description

Company Profile: At CGI, we're a team of builders. We call our employees members because all who join CGI are building their own company, one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability, and dedicated professionals needed to achieve results for our clients and for our members. Come grow with us. Learn more at www.cgi.com.

This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals, please.

Job Title: Microsoft Power BI
Position: LA
Experience: 7+ years
Category: Software Development/Engineering
Main location: Hyderabad
Position ID: J0425-0317
Employment Type: Full Time

Job Description: Microsoft Power BI
This role involves migrating existing Tableau dashboards and datasets to Power BI. The individual will be responsible for the technical design, development, testing, and deployment of Power BI solutions. They will work closely with stakeholders to ensure a smooth transition and maintain data accuracy and consistency throughout the migration process.

Key Responsibilities:
- Prepare detailed technical specification documents for each dashboard.
- Convert Tableau datasets to Power BI datasets, including data modeling and optimization.
- Develop visuals in Power BI and apply filters.
- Test for data accuracy against existing Tableau reports (see the reconciliation sketch after this listing).
- Develop navigation within dashboards and ensure a uniform UI experience across sheets.
- Publish and test dashboards on the Power BI Service.
- Provide UAT (User Acceptance Testing) support.
- Update documentation as needed.
- Manage the release process, including moving to production.
- Provide post-production support and hypercare.
- Collaborate with Talend and BQ support teams.
- Participate in project management activities, including status reporting and daily stand-ups.

Required Skills and Experience:
- 7+ years of overall experience, including 5+ years of relevant Power BI experience.
- Strong experience with Power BI development and data modeling.
- Familiarity with Tableau and its functionalities.
- Understanding of data warehousing concepts and STAR schema modeling.
- Experience with data validation and testing.
- Ability to create technical documentation.
- Excellent communication and collaboration skills.

Behavioural Competencies:
- Proven experience of delivering process efficiencies and improvements
- Clear and fluent English (both verbal and written)
- Ability to build and maintain efficient working relationships with remote teams
- Demonstrated ability to take ownership of, and accountability for, relevant products and services
- Ability to plan, prioritise, and complete your own work while remaining a team player
- Willingness to engage with and work in other technologies

Note: This job description is a general outline of the responsibilities and qualifications typically associated with the Power BI role. Actual duties and qualifications may vary based on the specific needs of the organization.

CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodations for people with disabilities in accordance with provincial legislation. Please let us know if you require a reasonable accommodation due to a disability during any aspect of the recruitment process, and we will work with you to address your needs.

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value: you'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last, supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
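The data-accuracy reconciliation sketch referenced above: compare an aggregate exported from the legacy Tableau report with the same aggregate computed from the extract feeding Power BI. A hedged pandas illustration; file names, columns, and the 0.01 tolerance are placeholders:

```python
# Hedged migration-validation sketch: reconcile regional revenue totals
# between a Tableau export and the warehouse extract behind Power BI.
import pandas as pd

tableau = pd.read_csv("tableau_export_sales.csv")      # placeholder export
warehouse = pd.read_csv("warehouse_extract_sales.csv") # placeholder extract

lhs = tableau.groupby("region")["revenue"].sum().round(2)
rhs = warehouse.groupby("region")["revenue"].sum().round(2)

diff = (lhs - rhs).abs()
mismatches = diff[diff > 0.01]  # tolerance for rounding differences
if not mismatches.empty:
    print("Regions out of tolerance:\n", mismatches)
else:
    print("All regional totals reconcile.")
```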

Posted 1 week ago

Apply

5.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Overview
Join a global organization with 82,000+ employees around the world as an ETL Databricks Developer based at IQVIA Bangalore. You will be part of IQVIA's world-class technology team and will be involved in the design, development, and enhancement of software programs, cloud applications, and proprietary products. The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred, as is a Databricks certification.

Key Responsibilities:
- Design, develop, and maintain data integration solutions using Databricks.
- Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes (see the Delta MERGE sketch after this listing).
- Optimize and troubleshoot Databricks workflows and performance issues.
- Ensure data quality and integrity throughout the data lifecycle.
- Provide technical guidance and mentorship to junior developers.
- Stay updated with the latest industry trends and best practices in data integration and Databricks.

Required Qualifications:
- Bachelor's degree in computer science or equivalent.
- Minimum of 5 years of hands-on experience with Databricks.
- Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS).
- Well-versed in data warehousing and data lake concepts.
- Proficient in SQL and Python for data manipulation and analysis.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Excellent problem-solving skills.
- Strong communication and collaboration skills.

Preferred Qualifications:
- Certified Databricks Engineer.
- Experience in the life sciences domain.
- Familiarity with Reltio or similar MDM (Master Data Management) tools.
- Experience with data governance and data security best practices.

IQVIA is a leading global provider of clinical research services, commercial insights, and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com.
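Much of the Databricks/MDM work described above reduces to upserting incoming records into curated tables. A hedged sketch of a Delta Lake MERGE on a Databricks runtime follows; the table and column names are invented, and this is not IQVIA's actual pipeline:

```python
# Hedged Delta Lake upsert sketch (assumes a Databricks runtime with
# Delta Lake and an existing target table; all names are placeholders).
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

updates = spark.read.json("/mnt/raw/patients/")  # placeholder source

# MERGE: update existing records by key, insert new ones.
target = DeltaTable.forName(spark, "mdm.patients")
(target.alias("t")
       .merge(updates.alias("s"), "t.patient_id = s.patient_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```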

Posted 1 week ago

Apply

5.0 - 8.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Naukri logo

Job Information
- Job Opening ID: ZR_2335_JOB
- Date Opened: 01/08/2024
- Industry: IT Services
- Work Experience: 5-8 years
- Job Title: Snowflake Developer
- City: Bangalore South
- Province: Karnataka
- Country: India
- Postal Code: 560066
- Number of Positions: 1
- Contract duration: 6 months

Requirements
- Experience: 5+ years
- Location: WFH (should have a good internet connection)
- Snowflake knowledge (must have)
- SQL knowledge (must have)
- Data modeling (must have)
- Data warehouse concepts and DW design best practices (must have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Informatica IDMC (good to have)
- Autonomous; good communication skills, team player, self-motivated, strong work ethics
- Flexibility in working hours: 12pm Central time (overlap with the US team)
- Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tools/expertise gaps (fast learner)

Posted 1 week ago

Apply

6.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Job Information: Job Opening ID: ZR_2384_JOB | Date Opened: 23/10/2024 | Industry: IT Services | Job Title: Snowflake DBA | Work Experience: 6-10 years | City: Bangalore South | Province: Karnataka | Country: India | Postal Code: 560066 | Number of Positions: 1 | Contract duration: 6 months | Locations: Pune/Bangalore/Hyderabad/Indore

Responsibilities:
- Must have experience working as a Snowflake Admin/Developer in Data Warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse, as well as end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server.
- Expertise in Snowflake data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in Snowflake advanced concepts: setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and an understanding of how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling).
- Experience with data security, data access controls, and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.

Good to have:
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, and databases.
- Job Conductor, scheduler, and monitoring.
- GIT repository: creating users and roles and providing access to them.
- Agile methodology and 24/7 Admin and Platform support.
- Estimation of effort based on requirements.
- Strong written communication skills; effective and persuasive in both written and oral communication.
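Several of the "must have" items above (resource monitors, zero-copy clone, time travel) map to one-line Snowflake SQL statements. A minimal sketch using the snowflake-connector-python package, with placeholder credentials and object names:

```python
import snowflake.connector

# Placeholder credentials; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    user="ADMIN_USER", password="***", account="my_account", role="ACCOUNTADMIN"
)
cur = conn.cursor()

# Resource monitor: cap monthly credits, warn at 90%, suspend at 100%.
cur.execute("""
    CREATE RESOURCE MONITOR IF NOT EXISTS monthly_cap
    WITH CREDIT_QUOTA = 100
    TRIGGERS ON 90 PERCENT DO NOTIFY
             ON 100 PERCENT DO SUSPEND
""")

# Zero-copy clone: instant dev copy of production, no extra storage at clone time.
cur.execute("CREATE DATABASE dev_db CLONE prod_db")

# Time travel: query a table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM prod_db.public.orders AT(OFFSET => -3600)")
print(cur.fetchone())

cur.close()
conn.close()
```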

Posted 1 week ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Role Overview: We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our robust data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential.

Key Responsibilities: Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse. Architect and build batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements. Utilize and optimize a wide array of AWS data services, including but not limited to AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, AWS EMR, and others, to build and manage data pipelines. Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions. Ensure data quality, integrity, and security across all data pipelines and storage solutions. Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability. Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs. Implement data governance policies and best practices within the Data Lake and Data Warehouse environments. Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement. Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming.

Required Qualifications: 10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development. Deep expertise in ETL tools: extensive hands-on experience with commercial or open-source ETL tools (Talend). Strong proficiency in data streaming technologies: proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar. Extensive AWS data services experience: proficiency with AWS S3 for data storage and management; hands-on experience with AWS Glue for ETL orchestration and data cataloging; strong knowledge of AWS Redshift for data warehousing and analytics; familiarity with AWS Lake Formation for building secure data lakes; experience with AWS EMR for big data processing is good to have. Data Warehouse (DWH) knowledge: strong background in traditional data warehousing concepts, dimensional modeling (star schema, snowflake schema), and DWH design principles. Programming languages: proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation. Database skills: strong understanding of relational and NoSQL databases. Version control: experience with version control systems (e.g., Git). Problem-solving: excellent analytical and problem-solving skills with a keen eye for detail. Communication: strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
Preferred Qualifications: Certifications in AWS Data Analytics or other relevant areas.
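As a flavor of the real-time side of this role: pushing records into an AWS Kinesis stream from Python takes only a few lines with boto3. A minimal sketch, assuming a hypothetical stream name and event shape:

```python
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Illustrative event; real payloads would come from an upstream source system.
event = {"order_id": "A-1001", "amount": 250.0, "ts": "2024-06-01T12:00:00Z"}

# PartitionKey controls shard assignment; a stable business key keeps
# related events ordered within a shard.
kinesis.put_record(
    StreamName="orders-stream",        # hypothetical stream name
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["order_id"],
)
```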

Posted 1 week ago

Apply

0 years

0 Lacs

Goregaon, Maharashtra, India

On-site

Linkedin logo

Line of Service: Advisory | Industry/Sector: Not Applicable | Specialism: Data, Analytics & AI | Management Level: Senior Associate

Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
Data Quality Strategy and Implementation: Engage with clients to understand their data quality requirements and business goals. Develop and implement data quality frameworks and solutions using tools such as Collibra and IDMC. Provide expert advice on industry best practices and emerging trends in data quality management.
Tool Expertise: Utilize DQ tools such as Collibra, Talend, and IDMC to manage and enhance data quality processes. Configure and customize Collibra workflows and IDMC data management solutions to meet specific client needs. Ensure seamless integration of data quality tools with existing data governance systems.
Monitoring and Continuous Improvement: Establish data quality metrics and KPIs to assess effectiveness and drive continuous improvement. Conduct regular audits and assessments to ensure data quality standards are maintained. Facilitate workshops and training sessions to promote data quality awareness and best practices.
Collaboration and Leadership: Work collaboratively with data architects, data analysts, IT, legal, and compliance teams to integrate data quality into broader data management initiatives. Mentor and guide junior team members, fostering a culture of knowledge sharing and professional growth.
Mandatory Skill Sets: Collibra, Informatica Data Management Cloud (IDMC)
Preferred Skill Sets: Certifications in Collibra and Informatica
Years of Experience Required: 4-7 yrs
Education Qualification: B.Tech/MBA/MCA
Degrees/Field of Study Required: Bachelor of Engineering, Master of Business Administration
Required Skills: Collibra Data Governance
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
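Data quality KPIs of the kind this role establishes are often simple ratios, such as completeness and uniqueness over key columns. A minimal pandas sketch, with an invented customer table, independent of any specific DQ tool like Collibra or IDMC:

```python
import pandas as pd

# Invented sample data with one null email and one duplicate key.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
})

# Completeness: share of non-null values in a required column.
completeness = df["email"].notna().mean()

# Uniqueness: share of distinct values in a key column.
uniqueness = df["customer_id"].nunique() / len(df)

print(f"email completeness: {completeness:.0%}")    # 75%
print(f"customer_id uniqueness: {uniqueness:.0%}")  # 75%
```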

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Presidio, Where Teamwork and Innovation Shape the Future

At Presidio, we're at the forefront of a global technology revolution, transforming industries through cutting-edge digital solutions and next-generation AI. We empower businesses—and their customers—to achieve more through innovation, automation, and intelligent insights.

The Role: The Presidio Senior Engineer will be responsible for driving the development of reliable, scalable, and high-performance data systems. This role requires a strong foundation in cloud platforms, data engineering best practices, and data warehousing. The ideal candidate has hands-on experience in building robust ETL/ELT pipelines.

Responsibilities Include: Design, develop, and maintain scalable ETL/ELT data pipelines for batch and real-time data processing. Build and optimise cloud-native data platforms and data warehouses (e.g., Snowflake, Redshift, BigQuery). Design and implement data models, including normalised and dimensional models (star/snowflake schema). Collaborate with cross-functional teams to gather requirements and deliver reliable data solutions. Ensure data quality, consistency, governance, and security across data platforms. Optimise and tune SQL queries and data workflows for performance and cost efficiency. Lead or mentor junior data engineers and contribute to team-level planning and design.

Must-Have Qualifications: Cloud Expertise: strong experience with at least one cloud platform (AWS, Azure, or GCP). Programming: proficiency in Python, SQL, and shell scripting. Data Warehousing & Modeling: deep understanding of warehousing concepts and best practices. ETL/ELT Pipelines: proven experience building pipelines using orchestration tools like Airflow or dbt. Experience with CI/CD tools and version control (Git). Familiarity with distributed data processing and performance optimisation.

Good-to-Have Skills: Hands-on experience with UI-based ETL tools like Talend, Informatica, or Azure Data Factory. Exposure to visualisation and BI tools such as Power BI, Tableau, or Looker. Knowledge of data governance frameworks and metadata management tools (e.g., Collibra, Alation). Experience in leading data engineering teams or mentoring team members. Understanding of data security, access control, and compliance standards (e.g., GDPR, HIPAA).

Your future at Presidio: Joining Presidio means stepping into a culture of trailblazers—thinkers, builders, and collaborators—who push the boundaries of what's possible. With our expertise in AI-driven analytics, cloud solutions, cybersecurity, and next-gen infrastructure, we enable businesses to stay ahead in an ever-evolving digital world. Here, your impact is real. Whether you're harnessing the power of Generative AI, architecting resilient digital ecosystems, or driving data-driven transformation, you'll be part of a team that is shaping the future. Ready to innovate? Let's redefine what's next—together.

About Presidio: At Presidio, speed and quality meet technology and innovation. Presidio is a trusted ally for organizations across industries with a decades-long history of building traditional IT foundations and deep expertise in AI and automation, security, networking, digital transformation, and cloud computing. Presidio fills gaps, removes hurdles, optimizes costs, and reduces risk. Presidio's expert technical team develops custom applications, provides managed services, enables actionable data insights and builds forward-thinking solutions that drive strategic outcomes for clients globally.
For more information, visit www.presidio.com. Presidio is committed to hiring the most qualified candidates to join our amazing culture. We aim to attract and hire top talent from all backgrounds, including underrepresented and marginalized communities. We encourage women, people of color, people with disabilities, and veterans to apply for open roles at Presidio. Diversity of skills and thought is a key component to our business success. Recruitment Agencies, Please Note: Presidio does not accept unsolicited agency resumes/CVs. Do not forward resumes/CVs to our careers email address, Presidio employees or any other means. Presidio is not responsible for any fees related to unsolicited resumes/CVs.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Responsibilities: 8+ years of experience in data engineering, with a minimum of 5 years in a leadership role. Proficiency in ETL/ELT processes and experience with ETL tools (Talend, Informatica, etc.). Expertise in Snowflake or similar cloud-based data platforms (e.g., Redshift, BigQuery). Strong SQL skills and experience with database tuning, data modeling, and schema design. Familiarity with programming languages like Python or Java for data processing. Knowledge of data governance and compliance standards. Excellent communication and project management skills, with a proven ability to prioritize and manage multiple projects simultaneously.

Location: Gurgaon, 3-4 days work from office. Meals & transport free.

Qualifications: Bachelor's or Master's degree in IT or equivalent. Excellent verbal and written communication skills.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Company Description: 👋🏼We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description

REQUIREMENTS: Total experience 5+ years. Hands-on working experience in data engineering. Strong working experience in SQL, Python or Scala. Deep understanding of cloud design patterns and their implementation. Experience working with Snowflake as a data warehouse solution. Experience with Power BI data integration. Design, develop, and maintain scalable data pipelines and ETL processes. Work with structured and unstructured data from multiple sources (APIs, databases, flat files, cloud platforms). Strong understanding of data modelling, warehousing (e.g., star/snowflake schema), and relational database systems (PostgreSQL, MySQL, etc.). Hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar. Strong problem-solving skills and a passion for continuous improvement. Strong communication skills and the ability to collaborate effectively with cross-functional teams.

RESPONSIBILITIES: Writing and reviewing great quality code. Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements. Mapping decisions to requirements and translating them for developers. Identifying different solutions and narrowing down the best option that meets the client's requirements. Defining guidelines and benchmarks for NFR considerations during project implementation. Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers. Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed. Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it. Understanding and relating technology integration scenarios and applying these learnings in projects. Resolving issues raised during code reviews through exhaustive, systematic analysis of the root cause, and justifying the decisions taken. Carrying out POCs to make sure that the suggested design/technologies meet the requirements.

Qualifications: Bachelor's or master's degree in computer science, Information Technology, or a related field.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

About Us: At Particleblack, we drive innovation through intelligent experimentation with Artificial Intelligence. Our multidisciplinary team—comprising solution architects, data scientists, engineers, product managers, and designers—collaborates with domain experts to deliver cutting-edge R&D solutions tailored to your business. Our ecosystem empowers rapid execution with plug-and-play tools, enabling scalable, AI-powered strategies that fast-track your digital transformation. With a focus on automation and seamless integration, we help you stay ahead—letting you focus on your core, while we accelerate your growth.

Responsibilities & Qualifications:
Data Architecture Design: Develop and implement scalable and efficient data architectures for batch and real-time data processing. Design and optimize data lakes, warehouses, and marts to support analytical and operational use cases.
ETL/ELT Pipelines: Build and maintain robust ETL/ELT pipelines to extract, transform, and load data from diverse sources. Ensure pipelines are highly performant, secure, and resilient to handle large volumes of structured and semi-structured data.
Data Quality and Governance: Establish data quality checks, monitoring systems, and governance practices to ensure the integrity, consistency, and security of data assets. Implement data cataloging and lineage tracking for enterprise-wide data transparency.
Collaboration with Teams: Work closely with data scientists and analysts to provide accessible, well-structured datasets for model development and reporting. Partner with software engineering teams to integrate data pipelines into applications and services.
Cloud Data Solutions: Architect and deploy cloud-based data solutions using platforms like AWS, Azure, or Google Cloud, leveraging services such as S3, BigQuery, Redshift, or Snowflake. Optimize cloud infrastructure costs while maintaining high performance.
Data Automation and Workflow Orchestration: Utilize tools like Apache Airflow, n8n, or similar platforms to automate workflows and schedule recurring data jobs. Develop monitoring systems to proactively detect and resolve pipeline failures.
Innovation and Leadership: Research and implement emerging data technologies and methodologies to improve team productivity and system efficiency. Mentor junior engineers, fostering a culture of excellence and innovation.

Required Skills:
Experience: 7+ years of overall experience in data engineering roles, with at least 2+ years in a leadership capacity. Proven expertise in designing and deploying large-scale data systems and pipelines.
Technical Skills: Proficiency in Python, Java, or Scala for data engineering tasks. Strong SQL skills for querying and optimizing large datasets. Experience with data processing frameworks like Apache Spark, Beam, or Flink. Hands-on experience with ETL tools like Apache NiFi, dbt, or Talend. Experience with pub/sub and stream processing using Kafka, Kinesis, or similar.
Cloud Platforms: Expertise in one or more cloud platforms (AWS, Azure, GCP) with a focus on data-related services.
Data Modeling: Strong understanding of data modeling techniques (dimensional modeling, star/snowflake schemas).
Collaboration: Proven ability to work with cross-functional teams and translate business requirements into technical solutions.

Preferred Skills: Familiarity with data visualization tools like Tableau or Power BI to support reporting teams. Knowledge of MLOps pipelines and collaboration with data scientists.
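Since the listing calls out workflow orchestration with Apache Airflow: a minimal DAG sketch showing a daily extract-then-load dependency, with hypothetical task logic standing in for real source and warehouse calls:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull data from a source API or database.
    print("extracting...")

def load():
    # Placeholder: write transformed data to the warehouse.
    print("loading...")

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_load  # load runs only after extract succeeds
```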

Posted 1 week ago

Apply

10.0 - 19.0 years

8 - 9 Lacs

Thiruvananthapuram

On-site

10 - 19 Years | 10 Openings | Trivandrum

Role description

Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes: Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reusing proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements and create optimal architecture and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards, and debug and test solutions to ensure best-in-class quality. Tune performance of code and align it with the appropriate infrastructure, understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes. Validate results with user representatives, integrating the overall solution. Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes: Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post-delivery. Number of non-compliance issues. Reduction of recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches.

Outputs Expected:
Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, and test cases and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.
Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules.
Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.
Interface with Customer: Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.
Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.

Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning. Experience in data warehouse design and cost improvements. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Communicate and explain design/development aspects to customers. Estimate time and resource requirements for developing/debugging features/components. Participate in RFP responses and solutioning. Mentor team members and guide them in relevant upskilling and certification.

Knowledge Examples: Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, and Azure ADF and ADLF. Proficiency in SQL for analytics and windowing functions. Understanding of data schemas and models. Familiarity with domain-related data. Knowledge of data warehouse optimization techniques. Understanding of data security concepts. Awareness of patterns, frameworks, and automation practices.
Skills: Scala, Python, PySpark

About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
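The skill examples above pair PySpark with SQL analytics and windowing functions. A minimal sketch of a common wrangling pattern, keeping only the latest record per key with a window function (column names and data are illustrative):

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("window-demo").getOrCreate()

df = spark.createDataFrame(
    [("c1", "2024-01-01", 100), ("c1", "2024-02-01", 150), ("c2", "2024-01-15", 80)],
    ["customer_id", "order_date", "amount"],
)

# Rank each customer's orders newest-first, then keep only rank 1.
w = Window.partitionBy("customer_id").orderBy(F.col("order_date").desc())
latest = (
    df.withColumn("rn", F.row_number().over(w))
      .filter(F.col("rn") == 1)
      .drop("rn")
)
latest.show()
```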

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site

Overview: We are seeking a Business Intelligence Analyst to join our team. The ideal candidate will be responsible for analyzing complex data sets to provide insights and support strategic decision-making within the organization.

Responsibilities: Utilize SQL and Python to extract and manipulate data for analysis. Collaborate with stakeholders to gather business requirements and translate them into technical solutions. Work in an Agile environment to deliver BI solutions efficiently. Perform business analysis to identify trends, patterns, and opportunities. Design and optimize databases for efficient data storage and retrieval. Analyze linked data sources to create comprehensive reports. Monitor data quality and integrity, ensuring accurate reporting.

Skills: Proficiency in SQL for data extraction and manipulation. Experience with Python for data analysis and automation. Familiarity with Talend or similar ETL tools. Knowledge of predictive analytics tools. Understanding of Agile methodologies in BI projects. Strong business analysis skills to translate requirements into technical solutions. Ability to design effective database structures for BI applications. Skill in analyzing linked data sources for comprehensive insights. Attention to detail and the ability to monitor data trends closely.

Job Types: Full-time, Permanent. Pay: From ₹40,000.00 per month. Benefits: Health insurance, leave encashment, life insurance. Schedule: Day shift. Supplemental Pay: Yearly bonus. Application Question(s): What is your CTC and notice period? Education: Bachelor's (Required). Work Location: In person.
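For the SQL-plus-Python extraction this role describes, a minimal sketch using pandas, with an in-memory SQLite database standing in for the real warehouse (table and column names are made up):

```python
import sqlite3
import pandas as pd

# Stand-in for a real warehouse connection (e.g., via SQLAlchemy).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('North', 120.0), ('South', 95.5), ('North', 40.0);
""")

# Extract with SQL, then manipulate in Python.
df = pd.read_sql("SELECT region, SUM(amount) AS total FROM sales GROUP BY region", conn)
df["share"] = df["total"] / df["total"].sum()   # simple derived metric
print(df)
```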

Posted 1 week ago

Apply

5.0 - 8.0 years

2 - 3 Lacs

Chennai

On-site

Job Description

The Role: The Data Engineer is accountable for developing high quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities: Developing and supporting scalable, extensible, and highly available data solutions. Deliver on critical business priorities while ensuring alignment with the wider architectural vision. Identify and help address potential risks in the data supply chain. Follow and contribute to technical standards. Design and develop analytical data models.

Required Qualifications & Work Experience: First Class Degree in Engineering/Technology/MCA. 5 to 8 years' experience implementing data-intensive solutions using agile methodologies. Experience of relational databases and using SQL for data querying, transformation and manipulation. Experience of modelling data for analytical consumers. Ability to automate and streamline the build, test and deployment of data pipelines. Experience in cloud native technologies and patterns. A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training. Excellent communication and problem-solving skills.

Technical Skills (Must Have):
ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica.
Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing.
Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design.
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures.
Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala.
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management.

Technical Skills (Valuable):
Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, and Continuous>Flows.
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs.
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls.
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes.
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta.
Others: Basics of job schedulers like Autosys. Basics of entitlement management.

Certification on any of the above topics would be an advantage.
Job Family Group: Technology | Job Family: Digital Software Engineering

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

12.0 years

5 - 11 Lacs

Noida

On-site

Overall 12+ years of experience working on Databases, Data Warehouse, Data Integration and BI/Reporting solutions, with relevant experience in the Life Sciences/Pharma domain.

Education: BE/B.Tech/Master of Computer Application

Technical: Design and implement effective database solutions and data models to store and retrieve data. Hands-on experience in the design of reporting schemas and data marts, and the development of reporting solutions. Prepare scalable database design and architecture in terms of defining multi-tenant schemas, data ingestion, data transformation and data aggregation models. Should have expertise and working experience in at least 2 ETL tools among Informatica, SSIS, Talend and Matillion. Should have expertise and working experience in at least 2 DBMS/appliances among Redshift, SQL Server, PostgreSQL and Oracle. Should have strong Data Warehousing, Reporting and Data Integration fundamentals. Advanced expertise with SQL. Experience with AWS/Azure cloud data stores and their DB/DW-related service offerings. Should have knowledge and experience of Big Data technologies (Hadoop ecosystem) and NoSQL databases. Should have technical expertise and working experience in at least 2 reporting tools among Power BI, Tableau, Jaspersoft and QlikView/QlikSense.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Andhra Pradesh, India

On-site

Linkedin logo

Data Migration Architect

More than 8 years of experience with data architecture, large-scale data modelling, database design, and business requirements analysis. Data migration expertise is a must-have skill.
- Work along with the Senior Data Lead/Architect to develop the migration framework/scripts.
- Responsible for overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, data warehouse, data provisioning, and ETL.
- Gather and analyse business data requirements and model these needs.
- Expert-level understanding of relational database concepts, dimensional database concepts, database architecture and design, and ontology and taxonomy design.
- Experience with using CA Erwin to develop Enterprise Data Models.
- Set standards for data management, analyse the current state and conceive the desired future state, and conceive the projects needed to close the gap between current state and future goals.
- Strong understanding of best practices in data modelling techniques, including in-depth knowledge of the various normalization and dimensional modelling approaches and their appropriate usage in various solutions.
- Provide guidance on technical specifications and data modelling, and review proposed data marts, data models, and other dimensional uses of data within the Data Warehouse.
- Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
- Experience and knowledge of the Talend Data Integration Platform.
- Analyse the account structure, contact, pricing and other related objects to make sure the required data is moved from the source system(s) (Innovative or cForce) to iQuest.
- Map data attribute(s) and create mapping documents as required.
- Create/write ETL (Extract, Transform, Load) jobs to read data from the source and load data to the destination for data migration.
- Cleanse (de-dup, etc.) and write transformation logic for data transformation.
- Develop an error-handling strategy, along with the data lead, to handle exceptions/missing values, and incorporate it into the scripts.
- Develop a rollback mechanism for rinse-and-repeat activities.
- Assist in QA and UAT activities.
- Liaise with the change management teams as required.
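The error-handling and rollback responsibilities above often boil down to wrapping each migration batch in a transaction and routing bad records to an exception list. A minimal, illustrative Python sketch using stdlib sqlite3 as a stand-in for the real source and target systems:

```python
import sqlite3

target = sqlite3.connect("target.db")
target.execute("CREATE TABLE IF NOT EXISTS contacts (id INTEGER PRIMARY KEY, email TEXT)")

# Illustrative extracted batch; one record is missing a required value.
batch = [(1, "a@example.com"), (2, None), (3, "c@example.com")]

rejects = []
try:
    with target:  # opens a transaction; commits on success, rolls back on error
        for rec_id, email in batch:
            if email is None:
                rejects.append((rec_id, "missing email"))  # route to exception file
                continue
            target.execute(
                "INSERT OR REPLACE INTO contacts (id, email) VALUES (?, ?)",
                (rec_id, email.strip().lower()),  # simple cleanse/transform
            )
except sqlite3.Error as exc:
    # Whole batch rolled back: safe to rinse and repeat after fixing the cause.
    print("batch failed, rolled back:", exc)

print("loaded:", target.execute("SELECT COUNT(*) FROM contacts").fetchone()[0])
print("rejected:", rejects)
```

INSERT OR REPLACE keeps the load idempotent, which is what makes the rinse-and-repeat pattern safe to rerun after a rollback.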

Posted 1 week ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

We are looking for a Senior Data Migration Specialist with 7+ years of experience in a similar role.

Roles and responsibilities:
• Develop data migration strategies and plans.
• Perform ETL operations and data transformation.
• Work with cross-functional teams to ensure data consistency and integrity.
• Identify and resolve data quality issues.
• Document migration processes and best practices.

Required Skills:
• Expertise in SQL and ETL tools (Informatica, Talend, SSIS, etc.).
• Experience in handling large-scale data migrations.
• Familiarity with cloud data platforms (AWS, Azure, GCP).

Posted 1 week ago

Apply

Exploring Talend Jobs in India

Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

These cities have a high concentration of IT companies and organizations that frequently hire for Talend roles.

Average Salary Range

The average salary range for Talend professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-20 lakhs per annum

Career Path

A typical career progression in the field of Talend may follow this path:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.

Related Skills

In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:

  • Data Warehousing
  • ETL (Extract, Transform, Load) processes
  • SQL
  • Big Data technologies (e.g., Hadoop, Spark)

Interview Questions

  • What is Talend and how does it differ from traditional ETL tools? (basic)
  • Can you explain the difference between tMap and tJoin components in Talend? (medium)
  • How do you handle errors in Talend jobs? (medium)
  • What is the purpose of a context variable in Talend? (basic)
  • Explain the difference between incremental and full loading in Talend. (medium)
  • How do you optimize Talend jobs for better performance? (advanced)
  • What are the different deployment options available in Talend? (medium)
  • How do you schedule Talend jobs to run at specific times? (basic)
  • Can you explain the use of tFilterRow component in Talend? (basic)
  • What is metadata in Talend and how is it used? (medium)
  • How do you handle complex transformations in Talend? (advanced)
  • Explain the concept of schema in Talend. (basic)
  • How do you handle duplicate records in Talend? (medium)
  • What is the purpose of the tLogRow component in Talend? (basic)
  • How do you integrate Talend with other systems or applications? (medium)
  • Explain the use of tNormalize component in Talend. (medium)
  • How do you handle null values in Talend transformations? (basic)
  • What is the role of the tRunJob component in Talend? (medium)
  • How do you monitor and troubleshoot Talend jobs? (medium)
  • Can you explain the difference between tMap and tMapLookup components in Talend? (medium)
  • How do you handle changing business requirements in Talend jobs? (advanced)
  • What are the best practices for version controlling Talend jobs? (advanced)
  • How do you handle large volumes of data in Talend? (medium)
  • Explain the purpose of the tAggregateRow component in Talend. (basic)

Closing Remark

As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!
