
756 Talend Jobs - Page 24

Note: JobPe aggregates listings for convenient browsing; applications are submitted directly on the original job portal.

12 - 16 years

10 - 14 Lacs

Pune

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Kinaxis
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Key Responsibilities:
- Play the integration architect role on Kinaxis implementation projects.
- Engage with client stakeholders to understand data requirements, carry out fit-gap analysis and work on the overall integration architecture landscape.
- Review and analyze data provided by the client along with its technical and functional intent and interdependencies.
- Guide the integration team to build and deploy effective integration packages to validate and transform customer data for integrated business planning and analytics.

Key Qualifications (Must Have):
- Minimum 10 to 15 years of experience implementing ETL solutions to integrate Kinaxis Rapid Response and similar SCM software in client environments.
- Should have played the Integration Architect role in at least 2 Kinaxis implementation projects.
- Working experience in designing integration packages with ETL tools.
- Experience leading technical design for inbound/outbound processes.
- Advanced understanding of Kinaxis data models, namespaces and partitions.
- Advanced understanding of Rapid Response integration models (Batch, API, Web services, etc.).
- Advanced understanding of Rapid Response system administration.
- Strong collaborator, team player and individual contributor.
- Strong communication skills with comfort in speaking with business stakeholders.
- Strong problem solver with the ability to manage and lead the team to push the solution and drive progress.

Preferable Qualifications (Nice to Have):
- Talend experience will be an added advantage.
- Rapid Response Integration Consultant certification will be a plus.
- SQL/SSIS experience will be an added advantage.

Professional Attributes:
- Proven ability to work creatively and analytically in a problem-solving environment.
- Proven ability to build, manage and foster a team-oriented environment.
- Desire to work in an information systems environment.
- Excellent communication (written and oral) and interpersonal skills.

Educational Qualification: BTech/BE/MCA
Additional Information: Open to travel - short/long term
Qualifications: 15 years full time education
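The "validate and transform customer data" responsibility above can be sketched in plain Python. Everything here is illustrative: the field names (part_no, qty, site), the valid-site list, and the output shape are invented for the example and are not Kinaxis or client schema.

```python
# Hypothetical sketch of the validate-and-transform step an integration
# package performs before loading inbound data into a planning tool.

def validate_record(rec):
    """Return a list of validation errors for one inbound record."""
    errors = []
    if not rec.get("part_no"):
        errors.append("missing part_no")
    if not isinstance(rec.get("qty"), (int, float)) or rec["qty"] < 0:
        errors.append("qty must be a non-negative number")
    if rec.get("site") not in {"PUNE", "CHENNAI"}:  # assumed site list
        errors.append("unknown site")
    return errors

def transform(records):
    """Split inbound records into loadable rows and a reject list."""
    loadable, rejects = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            rejects.append({"record": rec, "errors": errs})
        else:
            # Normalize before load: trimmed, upper-cased keys, numeric qty
            loadable.append({"PartNo": rec["part_no"].strip().upper(),
                             "Quantity": float(rec["qty"]),
                             "Site": rec["site"]})
    return loadable, rejects

inbound = [
    {"part_no": " ab-100 ", "qty": 25, "site": "PUNE"},
    {"part_no": "", "qty": -1, "site": "GOA"},
]
good, bad = transform(inbound)
```

Rejected rows carry their error list, so the integration team can route them to a reconciliation report rather than silently dropping them.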

Posted 1 month ago

Apply

3 - 5 years

30 - 35 Lacs

Gurugram

Work from Office


About The Role
Job Title: Enterprise Performance Management (Consolidation) - Consultant - S&C GN-CFO&EV
Management Level: 09 - Consultant
Location: Gurgaon, Mumbai, Bangalore, Pune, Hyderabad
Must have skills: Anaplan, Oracle EPM, SAP GR, SAC, OneStream, Tagetik, Workiva
Good to have skills: FP&A, data visualization tools

Job Summary:
- Prepare and facilitate sessions on application design and process design.
- Apply financial concepts to translate functional requirements into functional/technical solution design.
- Design and develop application components/objects in one of the EPM technologies (Oracle FCCS/HFM, OneStream, Tagetik, etc.) based on the application design.
- Independently troubleshoot and resolve application/functional process challenges in a timely manner; map complex processes into logical design components for future-state processes.
- Lead individual work streams associated with a consolidation implementation; examples include consolidations process lead, application and unit testing lead, training lead, and UAT lead.
- Assist with conversion and reconciliation of financial data for consolidations.
- Prepare key deliverables such as design documents, test documentation, training materials and administration/procedural guides.

Roles & Responsibilities:
- Strong understanding of accounting/financial close and consolidations concepts.
- Proven ability to work creatively and analytically in a problem-solving environment.
- Strong hands-on experience in any one of the consolidation tools (Oracle FCCS/HFM, OneStream, Tagetik, etc.).
- Strong communication (written and verbal), analytical and organizational skills.
- Proven success in contributing to a team-oriented environment; client experience preferred.

Professional & Technical Skills:
- 2-3 full implementations of consolidation solutions.
- 3-5 years of relevant experience implementing financial consolidation solutions in at least one of the EPM tools (Oracle FCCS/HFM, OneStream, Tagetik, SAP Group Reporting, etc.) and financial consolidation processes.
- Strong hands-on experience with data conversion and reconciliation.
- Experience with HFM, HFR, FDMEE is a plus.

Additional Information:
- An opportunity to work on transformative projects with key G2000 clients.
- Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
- Ability to embed responsible business into everything, from how you serve your clients to how you operate as a responsible professional.
- Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge and capabilities.
- Opportunity to thrive in a culture committed to accelerating equality for all.
- Engage in boundaryless collaboration across the entire organization.

About Our Company | Accenture
Qualifications
Experience: 3-5 years
Educational Qualification: MBA (Finance) or CA or CMA
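The reconciliation of financial data mentioned in this posting amounts to comparing balances loaded into the consolidation system against the source trial balance. A minimal sketch, with invented account names, amounts, and tolerance (real tools such as FCCS, OneStream, or Tagetik have their own load and validation mechanisms):

```python
# Compare per-account balances between a source trial balance and the
# amounts actually loaded into the consolidation target.

def reconcile(source, target, tolerance=0.01):
    """Return {account: (source_amt, target_amt)} for mismatched accounts."""
    diffs = {}
    for account in set(source) | set(target):
        s = source.get(account, 0.0)   # account missing on one side counts as 0
        t = target.get(account, 0.0)
        if abs(s - t) > tolerance:
            diffs[account] = (s, t)
    return diffs

source_tb = {"1000-Cash": 5000.0, "4000-Revenue": -12000.0}
loaded    = {"1000-Cash": 5000.0, "4000-Revenue": -11950.0}
mismatches = reconcile(source_tb, loaded)
```

Running a check like this after each data conversion cycle surfaces load gaps before they reach the consolidated statements.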

Posted 1 month ago

Apply

3 - 5 years

30 - 35 Lacs

Hyderabad

Work from Office


About The Role: Enterprise Performance Management (Consolidation) - Consultant - S&C GN-CFO&EV. Role description identical to the Gurugram listing above.

Posted 1 month ago

Apply

3 - 5 years

30 - 35 Lacs

Bengaluru

Work from Office


About The Role: Enterprise Performance Management (Consolidation) - Consultant - S&C GN-CFO&EV. Role description identical to the Gurugram listing above.

Posted 1 month ago

Apply

3 - 5 years

30 - 35 Lacs

Mumbai

Work from Office


About The Role: Enterprise Performance Management (Consolidation) - Consultant - S&C GN-CFO&EV. Role description identical to the Gurugram listing above.

Posted 1 month ago

Apply

5 - 9 years

13 - 18 Lacs

Bengaluru

Work from Office


Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must have skills: SAP Master Data Governance (MDG) Tool
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: BE (any stream)

Summary: As a Data Architect, you will be responsible for defining the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration using the SAP Master Data Governance (MDG) tool.

Roles & Responsibilities:
- Design and implement data architecture solutions using the SAP MDG tool.
- Collaborate with cross-functional teams to understand business requirements and translate them into data architecture solutions.
- Develop and maintain data models, data dictionaries, and data flow diagrams.
- Ensure data quality and integrity by implementing data governance policies and procedures.
- Stay updated with the latest advancements in data architecture and integration technologies.

Professional & Technical Skills:
- Must have: experience in the SAP MDG tool.
- Good to have: experience in data modeling, data integration, and data governance.
- Strong understanding of data architecture principles and best practices.
- Experience with data modeling tools such as ERwin or ER/Studio.
- Experience with data integration tools such as Informatica or Talend.
- Solid grasp of data governance policies and procedures.

Additional Information: The candidate should have a minimum of 5 years of experience with the SAP MDG tool. The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data architecture solutions. This position is based at our Bengaluru office.

Qualifications: BE (any stream)
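The data-dictionary and data-quality duties above can be illustrated with a small sketch: a dictionary of field specifications driving automated row checks. The dictionary entries and sample rows are invented for the example and are not SAP MDG structures.

```python
# A data dictionary as a machine-readable spec, used to validate rows.
# Field names (material_id, plant, unit_cost) are hypothetical.

DATA_DICTIONARY = {
    "material_id": {"type": str, "required": True},
    "plant":       {"type": str, "required": True},
    "unit_cost":   {"type": float, "required": False},
}

def check_row(row):
    """Return a list of data-dictionary violations for one row."""
    issues = []
    for field, spec in DATA_DICTIONARY.items():
        value = row.get(field)
        if value is None:
            if spec["required"]:
                issues.append(f"{field}: required field missing")
        elif not isinstance(value, spec["type"]):
            issues.append(f"{field}: expected {spec['type'].__name__}")
    return issues

rows = [
    {"material_id": "M-001", "plant": "BLR", "unit_cost": 12.5},
    {"material_id": "M-002", "unit_cost": "12.5"},   # missing plant, wrong type
]
report = [check_row(r) for r in rows]
```

Keeping the dictionary as data (rather than scattered if-statements) is what lets governance teams review and version the rules independently of the pipeline code.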

Posted 1 month ago

Apply

3 - 8 years

18 - 22 Lacs

Bengaluru

Work from Office


Explore an Exciting Career at Accenture
Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.

The Practice - A Brief Sketch:
The Technology Strategy & Advisory Practice focuses on the clients' most strategic priorities. We help clients achieve growth and efficiency through innovative R&D transformation aimed at redefining business models using agile methodologies. As part of this high-performing team, you will work on scaling Data & Analytics, and the data that fuels it all, to power every single person and every single process. You will be part of our global team of experts who work on the right scalable solutions and services that help clients achieve their business objectives faster.

- Business Transformation: Assess Data & Analytics potential and develop use cases that can transform business.
- Transforming Businesses: Envision and design customized, next-generation data and analytics products and services that help clients shift to new business models designed for today's connected landscape of disruptive technologies.
- Formulation of Guiding Principles and Components: Assess impact to the client's technology landscape/architecture and ensure formulation of relevant guiding principles and platform components.
- Products and Frameworks: Evaluate existing data and analytics products and frameworks and develop options for proposed solutions.

Bring your best skills forward to excel in the role:
- Leverage your knowledge of technology trends across Data & Analytics and how they can be applied to address real-world problems and opportunities.
- Interact with client stakeholders to understand their Data & Analytics problems and priority use cases, define a problem statement, understand the scope of the engagement, and drive projects to deliver value to the client.
- Design and guide development of an enterprise-wide Data & Analytics strategy for our clients, including Data & Analytics architecture, data on cloud, data quality, metadata and master data strategy.
- Establish a framework for effective data governance across multispeed implementations; define data ownership, standards, policies and associated processes.
- Define a Data & Analytics operating model to manage data across the organization; establish processes around effective data management, ensuring data quality and governance standards as well as roles for data stewards.
- Benchmark against global research benchmarks and leading industry peers to understand the current state and recommend Data & Analytics solutions.
- Conduct discovery workshops and design sessions to elicit Data & Analytics opportunities and client pain areas.
- Develop and drive Data Capability Maturity Assessments, Data & Analytics operating model and data governance exercises for clients.
- Collaborate with business experts for business understanding, with other consultants and platform engineers for solutions, and with technology teams for prototyping and client implementations.
- Create expert content and use advanced presentation, public speaking, content creation and communication skills for C-level discussions.
- Demonstrate a strong understanding of a specific industry, client or technology and function as an expert advising senior leadership.
- Manage budgeting and forecasting activities and build financial proposals.

Qualifications - Your experience counts!
- MBA from a tier 1 institute.
- 3+ years of strategy consulting experience at a consulting firm.
- Experience on projects showcasing skills across any two of these capabilities: Data Capability Maturity Assessment, Data & Analytics Strategy, Data Operating Model & Governance, Data on Cloud Strategy, Data Architecture Strategy.
- Desirable to have skills in any two of these domains: data quality, master data (MDM), metadata, data lineage, data catalog.
- Experience in one or more technologies in the data governance space is preferred: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc.
- Mandatory knowledge of IT concepts through practical experience and knowledge of technology trends, e.g. mobility, cloud, digital, collaboration.
- A strong understanding of any of the following industries is preferred: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources, or equivalent domains.
- Cloud Data & AI practitioner certifications (Azure, AWS, Google) desirable but not essential.
- CDMP certification from DAMA desirable.

Posted 1 month ago

Apply

5 - 9 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: IBM InfoSphere DataStage
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with the team to develop and implement solutions, ensuring they align with business needs and standards. You will also engage with multiple teams, contribute to key decisions, and provide problem-solving solutions for your team and across multiple teams. With your creativity and expertise in IBM InfoSphere DataStage, you will play a crucial role in developing efficient and effective applications.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, develop, and test applications using IBM InfoSphere DataStage.
- Collaborate with business analysts and stakeholders to gather requirements.
- Ensure applications meet business process and application requirements.
- Troubleshoot and debug applications to resolve issues.
- Create technical documentation for reference and reporting purposes.

Professional & Technical Skills:
- Must have: proficiency in IBM InfoSphere DataStage.
- Strong understanding of ETL concepts and data integration.
- Experience in designing and implementing data integration solutions.
- Knowledge of SQL and database concepts.
- Experience with data warehousing and data modeling.
- Good to have: experience with IBM InfoSphere Information Server; familiarity with other ETL tools such as Informatica or Talend.

Additional Information: The candidate should have a minimum of 5 years of experience in IBM InfoSphere DataStage. This position is based at our Bengaluru office. A 15-year full-time education is required.

Qualifications: 15 years full time education
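A DataStage job is built visually, not scripted like this; the snippet below is only a plain-Python illustration of the extract-transform-load pattern the role centers on, using an in-memory SQLite database as both source and target. Table and column names are invented.

```python
# Minimal ETL sketch: extract rows, clean them up, load into a target table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, amount TEXT)")
conn.executemany("INSERT INTO src_orders VALUES (?, ?)",
                 [(1, "10.50"), (2, "20.00"), (3, None)])
conn.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")

# Extract
rows = conn.execute("SELECT id, amount FROM src_orders").fetchall()
# Transform: cast text amounts to numbers, defaulting NULLs to 0.0
clean = [(oid, float(amt) if amt is not None else 0.0) for oid, amt in rows]
# Load
conn.executemany("INSERT INTO tgt_orders VALUES (?, ?)", clean)

total = conn.execute("SELECT SUM(amount) FROM tgt_orders").fetchone()[0]
```

In DataStage the same three stages would be a source connector, a Transformer stage with a NullToValue-style derivation, and a target connector.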

Posted 1 month ago

Apply

8 - 13 years

13 - 18 Lacs

Pune

Work from Office


Position Summary
We are looking for a highly skilled and experienced Data Engineering Manager to lead our data engineering team. The ideal candidate will possess a strong technical background, strong project management abilities, and excellent client handling/stakeholder management skills. This role requires a strategic thinker who can drive the design, development and implementation of data solutions that meet our clients' needs while ensuring the highest standards of quality and efficiency.

Job Responsibilities:
- Technology Leadership: Lead and guide the team, independently or with little support, to design, implement and deliver complex cloud-based data engineering / data warehousing project assignments.
- Solution Architecture & Review: Expertise in conceptualizing solution architecture and low-level design in a range of data engineering (Matillion, Informatica, Talend, Python, dbt, Airflow, Apache Spark, Databricks, Redshift) and cloud hosting (AWS, Azure) technologies.
- Manage projects in a fast-paced agile ecosystem and ensure quality deliverables within stringent timelines.
- Own risk management: maintain the risk documentation and mitigation plans.
- Drive continuous improvement in a Lean/Agile environment, implementing DevOps delivery approaches encompassing CI/CD, build automation and deployments.
- Communication & Logical Thinking: Demonstrate strong analytical skills, employing a systematic and logical approach to data analysis, problem-solving, and situational assessment; effectively present and defend team viewpoints while securing buy-in from both technical and client stakeholders.
- Client Relationship: Manage client relationships and expectations independently, deliver results back to the client independently, and communicate excellently.

Education: BE/B.Tech or Master of Computer Application

Work Experience:
- 8+ years of working experience, with expertise in at least two ETL tools among Matillion, dbt, PySpark, Informatica, and Talend.
- Expertise and working experience in at least two databases among Databricks, Redshift, Snowflake, SQL Server, Oracle.
- Strong data warehousing, data integration and data modeling fundamentals: star schema, snowflake schema, dimension tables and fact tables.
- Strong experience with SQL building blocks; creating complex SQL queries and procedures.
- Experience with AWS or Azure cloud and their service offerings.
- Awareness of techniques such as data modeling, performance tuning and regression testing.
- Willingness to learn and take ownership of tasks.
- Excellent written/verbal communication and problem-solving skills.
- Understanding of and working experience with pharma commercial data sets like IQVIA, Veeva, Symphony, Liquid Hub, Cegedim, etc. would be an advantage.
- Hands-on experience with scrum methodology (sprint planning, execution and retrospection).

Behavioural Competencies: Teamwork & Leadership; Motivation to Learn and Grow; Ownership; Cultural Fit; Talent Management

Technical Competencies: Problem Solving; Life Science Knowledge; Communication; Agile; PySpark; Data Modelling; Matillion; Designing Technical Architecture; AWS Data Pipeline
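The star-schema fundamentals this posting lists (fact tables keyed to dimension tables) can be shown with a tiny runnable example. Table and column names are invented, and SQLite stands in for Redshift/Snowflake purely so the sketch executes.

```python
# One fact table, one dimension table, and the typical roll-up query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (product_key INTEGER, units INTEGER,
                         FOREIGN KEY (product_key) REFERENCES dim_product);
INSERT INTO dim_product VALUES (1, 'Oncology'), (2, 'Cardiology');
INSERT INTO fact_sales VALUES (1, 100), (1, 50), (2, 70);
""")

# Star-schema query: aggregate the fact table grouped by a dimension attribute
result = conn.execute("""
    SELECT d.category, SUM(f.units)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
```

The point of the shape is exactly this query pattern: measures live in the narrow fact table, descriptive attributes in dimensions, and reports are joins plus GROUP BY.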

Posted 1 month ago

Apply

2 - 5 years

7 - 11 Lacs

Mumbai

Work from Office


What you’ll do
As a Data Engineer – Data Modeling, you will be responsible for:

Data Modeling & Schema Design
- Develop conceptual, logical, and physical data models to support enterprise data requirements.
- Design schema structures for Apache Iceberg tables on Cloudera Data Platform.
- Collaborate with ETL developers and data engineers to optimize data models for efficient ingestion and retrieval.

Data Governance & Quality Assurance
- Ensure data accuracy, consistency, and integrity across data models.
- Support data lineage and metadata management to enhance data traceability.
- Implement naming conventions, data definitions, and standardization in collaboration with governance teams.

ETL & Data Pipeline Support
- Assist in the migration of data from IIAS to Cloudera Data Lake by designing efficient data structures.
- Work with Denodo for data virtualization, ensuring optimized data access across multiple sources.
- Collaborate with teams using Talend Data Quality (DQ) tools to ensure high-quality data in the models.

Collaboration & Documentation
- Work closely with business analysts, architects, and reporting teams to understand data requirements.
- Maintain data dictionaries, entity relationships, and technical documentation for data models.
- Support data visualization and analytics teams by designing reporting-friendly data models.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 4-7 years of experience in data modeling, database design, and data engineering.
- Hands-on experience with ERwin Data Modeler for creating and managing data models.
- Strong knowledge of relational databases (PostgreSQL) and big data platforms (Cloudera, Apache Iceberg).
- Proficiency in SQL and NoSQL database concepts.
- Understanding of data governance, metadata management, and data security principles.
- Familiarity with ETL processes and data pipeline optimization.
- Strong analytical, problem-solving, and documentation skills.

Preferred technical and professional experience
- Experience working on Cloudera migration projects.
- Exposure to Denodo for data virtualization and Talend DQ for data quality management.
- Knowledge of Kafka, Airflow, and PySpark for data processing.
- Familiarity with GitLab, Sonatype Nexus, and CheckMarx for CI/CD and security compliance.
- Certifications in Data Modeling, Cloudera Data Engineering, or IBM Data Solutions.
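Carrying a logical model into a physical schema, as described in this listing, is mostly about making relationships and constraints explicit. A sketch under stated assumptions: the entities (Customer 1-to-many Order) are invented, and SQLite is used so the example runs, whereas in this role the physical DDL would target PostgreSQL or Iceberg, typically generated from an ERwin model.

```python
# Logical model: Customer --1:N--> Order. Physical model: two tables, a
# foreign key, and a check constraint that encode the relationship rules.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # SQLite enforces FKs only when on
conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    total       REAL NOT NULL CHECK (total >= 0)
);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO orders VALUES (10, 1, 99.0)")

# The FK now rejects an order pointing at a nonexistent customer
try:
    conn.execute("INSERT INTO orders VALUES (11, 999, 5.0)")
    fk_enforced = False
except sqlite3.IntegrityError:
    fk_enforced = True
```

Encoding the rules in the schema, rather than in application code, is what keeps the physical model faithful to the logical one as more pipelines write to it.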

Posted 1 month ago

Apply

2 - 5 years

7 - 11 Lacs

Mumbai

Work from Office


Who you areA highly skilled Data Engineer specializing in Data Modeling with experience in designing, implementing, and optimizing data structures that support the storage, retrieval and processing of data for large-scale enterprise environments. Having expertise in conceptual, logical, and physical data modeling, along with a deep understanding of ETL processes, data lake architectures, and modern data platforms. Proficient in ERwin, PostgreSQL, Apache Iceberg, Cloudera Data Platform, and Denodo. Possess ability to work with cross-functional teams, data architects, and business stakeholders ensures that data models align with enterprise data strategies and support analytical use cases effectively. What you’ll doAs a Data Engineer – Data Modeling, you will be responsible for: Data Modeling & Architecture Designing and developing conceptual, logical, and physical data models to support data migration from IIAS to Cloudera Data Lake. Creating and optimizing data models for structured, semi-structured, and unstructured data stored in Apache Iceberg tables on Cloudera. Establishing data lineage and metadata management for the new data platform. Implementing Denodo-based data virtualization models to ensure seamless data access across multiple sources. Data Governance & Quality Ensuring data integrity, consistency, and compliance with regulatory standards, including Banking/regulatory guidelines. Implementing Talend Data Quality (DQ) solutions to maintain high data accuracy. Defining and enforcing naming conventions, data definitions, and business rules for structured and semi-structured data. ETL & Data Pipeline Optimization Supporting the migration of ETL workflows from IBM DataStage to PySpark, ensuring models align with the new ingestion framework. Collaborating with data engineers to define schema evolution strategies for Iceberg tables. Ensuring performance optimization for large-scale data processing on Cloudera. 
Collaboration & Documentation: Working closely with business analysts, architects, and developers to translate business requirements into scalable data models. Documenting the data dictionary, entity relationships, and mapping specifications for data migration. Supporting reporting and analytics teams (Qlik Sense/Tableau) by providing well-structured data models.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Experience in Cloudera migration projects in the banking or financial sector. Knowledge of PySpark, Kafka, Airflow, and cloud-native data processing. Experience with Talend DQ for data quality monitoring.

Preferred technical and professional experience: Familiarity with graph databases (DGraph Enterprise) for data relationships. Experience with GitLab, Sonatype Nexus, and CheckMarx for CI/CD and security compliance. IBM, Cloudera, or AWS/GCP certifications in Data Engineering or Data Modeling.

Posted 1 month ago

Apply

7 - 12 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Talend ETL Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to your team members. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the highest standards of quality and functionality. Your role will require you to balance technical expertise with effective communication, fostering a collaborative environment that encourages innovation and problem-solving.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Facilitate knowledge-sharing sessions to enhance team capabilities. Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills: Must-Have Skills: Proficiency in Talend ETL. Strong understanding of data integration processes and methodologies. Experience with data warehousing concepts and practices. Familiarity with SQL and database management systems. Ability to troubleshoot and optimize ETL processes for performance.

Additional Information: The candidate should have a minimum of 7.5 years of experience in Talend ETL. This position is based at our Bengaluru office. A 15 years full time education is required. Qualification: 15 years full time education

Posted 1 month ago

Apply

18 - 22 years

0 Lacs

Hyderabad, Telangana, India

Hybrid

Linkedin logo

DATAECONOMY is one of the fastest-growing Data & AI companies with a global presence. We are well-differentiated and are known for our thought leadership, out-of-the-box products, cutting-edge solutions, accelerators, innovative use cases, and cost-effective service offerings. We offer products and solutions in Cloud, Data Engineering, Data Governance, AI/ML, DevOps, and Blockchain to large corporates across the globe. We are strategic partners with AWS, Collibra, Cloudera, Neo4j, DataRobot, Global IDs, Tableau, MuleSoft, and Talend.

Job Title: Delivery Head
Experience: 18 - 22 Years
Location: Hyderabad
Notice Period: Immediate joiners are preferred

Job Summary: We are seeking a seasoned Technical Delivery Manager with deep expertise in Data Engineering and Data Science to lead complex data initiatives and drive successful delivery across cross-functional teams. The ideal candidate brings a blend of strategic thinking, technical leadership, and project execution skills, along with hands-on knowledge of modern data platforms, machine learning, and analytics frameworks.
Key Responsibilities:
Program & Delivery Management: Oversee end-to-end delivery of large-scale data programs, ensuring alignment with business goals, timelines, and quality standards. Manage cross-functional project teams including data engineers, data scientists, analysts, and DevOps personnel. Ensure agile delivery through structured sprint planning, backlog grooming, and iterative delivery.
Technical Leadership: Provide architectural guidance and review of data engineering pipelines and machine learning models. Evaluate and recommend modern data platforms (e.g., Snowflake, Databricks, Azure Data Services, AWS Redshift, GCP BigQuery). Ensure best practices in data governance, quality, and compliance (e.g., GDPR, HIPAA).
Stakeholder & Client Management: Act as the primary point of contact for technical discussions with clients, business stakeholders, and executive leadership. Translate complex data requirements into actionable project plans. Present technical roadmaps and delivery status to stakeholders and C-level executives.
Team Development & Mentoring: Lead, mentor, and grow a high-performing team of data professionals. Conduct code and design reviews; promote innovation and continuous improvement.

Key Skills and Qualifications: Bachelor’s or master’s degree in Computer Science, Data Science, Engineering, or a related field. 18–22 years of total IT experience with at least 8–10 years in data engineering, analytics, or data science. Proven experience delivering enterprise-scale data platforms, including: ETL/ELT pipelines using tools like Apache Spark, Airflow, Kafka, Talend, or Informatica; data warehouse and lake architectures (e.g., Snowflake, Azure Synapse, AWS Redshift, Delta Lake); machine learning lifecycle management (e.g., model training, deployment, MLOps using MLflow, SageMaker, or Vertex AI). Strong knowledge of cloud platforms (Azure, AWS, or GCP). Deep understanding of Agile, Scrum, and DevOps principles. Excellent problem-solving, communication, and leadership skills.
Preferred Certifications (Optional but Beneficial): PMP, SAFe Agile, or similar project management certifications. Certifications in cloud platforms (e.g., AWS Certified Data Analytics, Azure Data Engineer Associate). Certified Scrum Master (CSM) or equivalent.

Posted 1 month ago

Apply

4 - 6 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Title: ETL Developer – Python & SQL
Experience: 4 to 6 Years
Location: Chennai, Bangalore
Employment Type: Full-Time
Notice Period: Immediate to 20 Days

Job Summary: We are looking for an experienced ETL Developer with strong expertise in Python, SQL, and ETL processes to join our data engineering team in Chennai. The ideal candidate will design, develop, and maintain efficient data pipelines and contribute to end-to-end data solutions for various business needs.

Key Responsibilities: Design, build, and optimize scalable ETL pipelines using Python and SQL. Work with large datasets across various data sources, including structured and unstructured formats. Analyze and understand complex data systems and flows to support reporting, analytics, and integration requirements. Collaborate with data architects, business analysts, and other stakeholders to ensure high data quality and performance. Monitor, troubleshoot, and resolve data pipeline and integration issues. Create and maintain detailed documentation for processes and data flows.

Required Skills: 4–6 years of hands-on experience in Python programming. Strong expertise in writing complex SQL queries and stored procedures. Experience with ETL tools and frameworks (e.g., Apache Airflow, Talend, or custom ETL frameworks). Familiarity with data warehousing concepts and cloud-based data platforms is a plus. Good understanding of data structures, data modeling, and performance tuning. Experience in Agile/Scrum-based development environments.

Preferred Qualifications: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field. Experience with cloud platforms such as AWS, GCP, or Azure is a plus. Knowledge of version control tools like Git. Strong problem-solving and communication skills.
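The "custom ETL frameworks" this kind of role refers to often amount to a small extract, transform, load script in plain Python and SQL. Below is a minimal, hedged sketch of that pattern; all names (sales, extract, transform, load) are invented for illustration, and an in-memory SQLite database stands in for a real warehouse.

```python
import sqlite3

def extract():
    # Stand-in for reading from a real source system (file, API, database).
    return [
        {"id": 1, "amount": "100.50", "region": "south"},
        {"id": 2, "amount": "bad", "region": "north"},   # malformed row
        {"id": 3, "amount": "75.25", "region": "south"},
    ]

def transform(rows):
    # Cast types, normalise text, and drop rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append((row["id"], float(row["amount"]), row["region"].upper()))
        except (ValueError, KeyError):
            continue  # a real pipeline would route rejects to an error table
    return clean

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

In production, the same extract/transform/load split maps naturally onto Airflow tasks or Talend job steps, with the transform stage carrying the validation rules.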

Posted 1 month ago

Apply

1 - 2 years

0 Lacs

Gurugram, Haryana, India

Remote

Linkedin logo

Junior Data Engineer (Immediate Joiner Only) Location: Remote Working Hours: 5 PM – 2 AM IST shift MUST be ready to join within 15 days' notice max MUST have good communication skills MUST have between 1-2 years of experience MAX, not more than that

Pacific Data Integrators (PDI) is currently hiring a JUNIOR ETL / DATA ENGINEER (GURUGRAM) for its enterprise client projects globally. The opportunity invites entry-level candidates with an excellent academic record who want to build a great career in the Data Engineering field. Do you love turning raw data into actionable insights? Are you passionate about building data pipelines that fuel innovation? At PDI, we empower businesses to unlock the full potential of their data. As a Junior Data Engineer, you'll play a crucial role in this mission by transforming raw data into valuable information that drives our two core pillars: Insight & Prediction, and Diagnostic & Description.

What you'll do:
Build and maintain data pipelines: You'll design, develop, and optimize data pipelines that extract, transform, and load data from various client systems into our data lake and data marts.
Ensure data quality and reliability: You'll implement processes and tools to guarantee the accuracy, completeness, and consistency of our data.
Collaborate with a dynamic team: You'll work closely with data scientists, architects, and client tech teams to deliver impactful data solutions.
Contribute to cutting-edge projects: You'll be involved in building and maintaining data infrastructure that supports predictive modeling, descriptive analytics, and other data-driven initiatives.
Stay ahead of the curve: You'll continuously learn and explore new technologies in the ever-evolving world of data engineering.
What you'll bring:
Foundation in data engineering: You have a good understanding of data warehousing principles, ETL processes, and data modeling techniques.
Programming proficiency: You're comfortable coding in Java and Python, and you have hands-on experience with SQL database design.
Cloud experience: You've worked with at least one major cloud platform (AWS, GCP, or Azure).
Collaborative spirit: You thrive in a team environment and enjoy working with diverse stakeholders.
Problem-solving mindset: You're a creative thinker who can identify and solve complex data challenges.
Eagerness to learn: You're passionate about data and eager to expand your knowledge and skills.

Bonus points: Experience with ETL tools / technologies (e.g., Informatica, Talend, Snowflake). Familiarity with data visualization tools.

Why PDI?
Make a real impact: Your work will directly contribute to the success of our clients and help them achieve their business goals.
Work with cutting-edge technology: You'll have the opportunity to use the latest tools and technologies in the data engineering field.
Grow your career: We offer a supportive and collaborative environment where you can learn and develop your skills.

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu

Work from Office

Indeed logo

Design and build data cleansing and imputation, map to a standard data model, transform to satisfy business rules and statistical computations, and validate data content. Develop, modify, and maintain Python and Unix scripts and complex SQL. Tune the performance of existing code, avoid bottlenecks, and improve performance. Build an end-to-end data flow from sources to entirely curated and enhanced data sets. Develop automated Python jobs for ingesting data from various source systems. Provide technical expertise in the areas of architecture, design, and implementation. Work with team members to create useful reports and dashboards that provide insight, improve/automate processes, or otherwise add value to the team. Write SQL queries for data validation. Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into the data warehouse. Collaborate with data architects, analysts, and other stakeholders to understand data requirements and ensure quality. Optimize and tune ETL processes for performance and scalability. Develop and maintain documentation for ETL processes, data flows, and data mappings. Monitor and troubleshoot ETL processes to ensure data accuracy and availability. Implement data validation and error handling mechanisms. Work with large data sets and ensure data integrity and consistency.

Skills: Python; ETL tools like Informatica, Talend, SSIS, or similar; SQL, MySQL; expertise in Oracle, SQL Server, and Teradata; DevOps, GitLab; experience in AWS Glue or Azure Data Factory.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa.
We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
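The "SQL queries for data validation" duty in the posting above typically boils down to a handful of recurring checks, such as null scans and duplicate-key scans. As a minimal sketch against an in-memory SQLite table (the table and column names here are made up for the example):

```python
import sqlite3

# Build a tiny table with two deliberate quality problems:
# a NULL email on id 2, and id 2 appearing twice.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER, email TEXT);
INSERT INTO customers VALUES (1, 'a@x.com'), (2, NULL), (2, 'b@x.com');
""")

# Validation check 1: rows with a missing (NULL) email.
null_count = conn.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL"
).fetchone()[0]

# Validation check 2: ids that appear more than once.
dup_count = conn.execute(
    "SELECT COUNT(*) FROM ("
    "  SELECT id FROM customers GROUP BY id HAVING COUNT(*) > 1"
    ")"
).fetchone()[0]
```

The same two queries run unchanged on Oracle, SQL Server, or Teradata; in practice each check's result would be logged or used to fail the load when it exceeds a threshold.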

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka

Work from Office

Indeed logo

Design and build data cleansing and imputation, map to a standard data model, transform to satisfy business rules and statistical computations, and validate data content. Develop, modify, and maintain Python and Unix scripts and complex SQL. Tune the performance of existing code, avoid bottlenecks, and improve performance. Build an end-to-end data flow from sources to entirely curated and enhanced data sets. Develop automated Python jobs for ingesting data from various source systems. Provide technical expertise in the areas of architecture, design, and implementation. Work with team members to create useful reports and dashboards that provide insight, improve/automate processes, or otherwise add value to the team. Write SQL queries for data validation. Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into the data warehouse. Collaborate with data architects, analysts, and other stakeholders to understand data requirements and ensure quality. Optimize and tune ETL processes for performance and scalability. Develop and maintain documentation for ETL processes, data flows, and data mappings. Monitor and troubleshoot ETL processes to ensure data accuracy and availability. Implement data validation and error handling mechanisms. Work with large data sets and ensure data integrity and consistency.

Skills: Python; ETL tools like Informatica, Talend, SSIS, or similar; SQL, MySQL; expertise in Oracle, SQL Server, and Teradata; DevOps, GitLab; experience in AWS Glue or Azure Data Factory.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa.
We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 1 month ago

Apply

8 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Senior Data Engineer
Job Location: Hyderabad (Work from Office)
Experience Level: Mid-Senior (8+ Years)

About Kellton: We are a global IT services and digital product design and development company with subsidiaries that serve startup, mid-market, and enterprise clients across diverse industries, including Finance, Healthcare, Manufacturing, Retail, Government, and Nonprofits. At Kellton, we believe that our people are our greatest asset. We are committed to fostering a culture of collaboration, innovation, and continuous learning. Our core values include integrity, customer focus, teamwork, and excellence. To learn more about our organization, please visit us at www.kellton.com

Are you craving a dynamic and autonomous work environment? If so, this opportunity may be just what you're looking for. At our company, we value your critical thinking skills and encourage your input and creative ideas to supply the best talent available. To boost your productivity, we provide a comprehensive suite of IT tools and practices backed by an experienced team to work with.

What you will do:
Provide expertise and guidance as a senior experienced engineer in solution design and strategy for Data Lake and analytics-oriented data operations.
Design, develop, and implement end-to-end data solutions (storage, integration, processing, access) on hyperscaler platforms like AWS and Azure.
Architect and implement integration, ETL, and data movement solutions using AWS Glue, MSK and/or Confluent, and other COTS technologies.
Prepare documentation and designs for data solutions and applications.
Design and implement distributed analytics platforms for analyst teams.
Design and implement streaming solutions using Amazon Kinesis, Kafka, Talend, and Confluent.
Migrate data from traditional relational database systems to AWS relational databases such as Amazon RDS, Aurora, Redshift, DynamoDB, Cloudera, Snowflake, Databricks, etc.
Implement ad-hoc analysis solutions using Athena and other SQL and NoSQL tools like Alteryx.
Propose architectures that consider cost/spend in AWS, and develop recommendations or plans to right-size AWS data infrastructure.

Who you are: Bachelor's degree in Computer Science or Software Engineering. 8+ years of experience in the data domain as an engineer and architect. A solid understanding of AWS and Azure storage solutions such as S3, EFS, and EBS. A solid understanding of AWS and Azure compute solutions such as EC2. Experience implementing solutions on AWS and Azure relational databases such as MSSQL, SSIS, Amazon Redshift, RDS, and Aurora. Experience implementing solutions leveraging ElastiCache and DynamoDB. Experience designing and implementing Enterprise Data Warehouses and Data Marts/Lakes. Experience with Star or Snowflake schema. Experience with R, Python, and other emerging technologies in D&A. Understanding of Slowly Changing Dimensions and the Data Vault model. AWS and Azure certifications are preferred.

What we offer you:
· Existing clients in multiple domains to work with.
· A strong and efficient team committed to quality output.
· The chance to enhance your knowledge and gain industry domain expertise by working in varied roles.
· A team of experienced, fun, and collaborative colleagues.
· A hybrid work arrangement for flexibility and work-life balance (if the client/project allows).
· Competitive base salary and job satisfaction.

Join our team and become part of an exciting company where your expertise and ideas are valued, and where you can make a significant impact in the IT industry. Apply today! Interested applicants, please submit your detailed resume stating your current and expected compensation and notice period to victoria.esther@kellton.com or srahaman@kellton.com

Posted 1 month ago

Apply

5 - 8 years

0 Lacs

Hyderabad, Telangana, India

Hybrid

Linkedin logo

Job Description: Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

Pay and Benefits: Competitive compensation, including base pay and annual incentive. Comprehensive health and life insurance and well-being benefits, based on location. Pension/retirement benefits. Paid time off and personal/family care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The Impact You Will Have in This Role: We are seeking a skilled Talend Developer with expertise in Power BI development and SQL Server to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes using Talend and creating insightful data visualizations with Power BI, and must be an expert in writing stored procedures and queries against MS SQL Server databases.
What You'll Do: Design, develop, and maintain ETL processes using Talend to extract, transform, and load data from various sources. Create and maintain data visualizations and dashboards using Power BI to provide actionable insights to stakeholders. Write high-performance queries on SQL Server databases, ensuring data integrity, performance, and security. Collaborate with cross-functional teams to gather requirements, design solutions, and implement data integration and reporting solutions. Troubleshoot and resolve issues related to ETL processes, data visualizations, and database performance. Collaborate with other team members and analysts through the delivery cycle. Participate in an Agile delivery team that builds high-quality and scalable work products. Support production releases and maintenance windows, working with the Operations team.

Qualifications: Bachelor’s degree in computer science, Information Technology, or a related field.

Talents Needed for Success: A minimum of 3+ years in writing ETL processes. Proven experience as a Talend Developer, with a strong understanding of ETL processes and data integration. Proficiency in Power BI development, including creating dashboards, reports, and data models. Expertise in SQL Server, including database design, optimization, and performance tuning. Strong understanding of agile processes (Kanban and Scrum) and a working knowledge of JIRA is required. Strong analytical and problem-solving skills, with the ability to work independently and as part of a team. Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels.

Additional Qualifications Needed for Success: Talend Expertise: Proficiency in using Talend Studio for data integration, data quality, and file manipulation.
This includes designing and developing ETL processes, creating and managing Talend jobs, and using Talend components for data transformation and integration. Data Integration Knowledge in Talend: Understanding of data integration concepts and best practices. This includes experience with data extraction, transformation, and loading (ETL) processes, as well as knowledge of data warehousing and data modeling. Database Skills: Proficiency in working with various databases, including MS SQL and/or Oracle databases. This includes writing complex SQL queries, understanding database schemas, and performing data migrations. Version Control and Collaboration: Experience with version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence). This is important for managing code changes, collaborating with team members, and tracking project progress. Job Scheduling and Automation: Experience with job scheduling and automation tools. This includes setting up and managing Talend jobs using schedulers like Talend Administration Center (TAC), Autosys, or third-party tools to automate ETL workflows. Data Visualization: Ability to create visually appealing and insightful reports and dashboards. This involves selecting appropriate visualizations, designing layouts, and using custom visuals when necessary in Power BI. Power Query: Expertise in using Power Query for data transformation and preparation. This involves cleaning, merging, and shaping data from various sources. Expertise in scripting languages such as Python, and shell/batch programming, is a plus. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago

Apply

3 - 8 years

4 - 9 Lacs

Bangalore Rural, Bengaluru

Work from Office

Naukri logo

Skills: Elasticsearch, Talend, Grafana Responsibilities: Build dashboards, manage clusters, optimize performance Tech: API, Python, cloud platforms (AWS, Azure, GCP) Preference: Immediate joiners Contact: 6383826448 || jensyofficial23@gmail.com

Posted 1 month ago

Apply

2 - 6 years

12 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive. Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning. Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems. Utilize AWS services including S3, EC2, and EMR to build and manage scalable, secure, and reliable cloud-based solutions. Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting. Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows. Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance. Collaborate closely with DevOps teams utilizing SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle. Communicate effectively with both technical and non-technical stakeholders for handovers, incident management, reporting, etc.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Demonstrated expertise in big data technologies, specifically Apache Spark (focus on Spark SQL) and Hive. Extensive experience with AWS services, including S3, EC2, and EMR. Strong expertise in data warehousing and SQL, with experience in performance optimization. Experience with ETL/ELT implementation (such as Talend). Proficiency in Linux, with a strong background in shell scripting.

Preferred technical and professional experience: Familiarity with scheduling tools like Airflow or Control-M. Experience with metadata-driven frameworks. Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence. Excellent communication skills and a willing attitude towards learning.

Posted 1 month ago

Apply

4 - 9 years

10 - 20 Lacs

Noida

Hybrid

Naukri logo

• Monitor and maintain production data pipelines, ETL/ELT workflows, and batch processing jobs in Talend.
• Analyze and fix stored procedures and data flows to troubleshoot and resolve production incidents, ensuring minimal downtime and accurate data delivery.
• Implement alerting and monitoring solutions to proactively detect and resolve data quality or performance issues.
• Collaborate with Data Engineers, BI Engineers, and Architects to ensure smooth deployments and updates.
• Perform root cause analysis (RCA) for recurring issues and drive long-term solutions.
• Develop and maintain runbooks, documentation, and knowledge bases for data operations and incident response.
• Automate repetitive support tasks and improve operational efficiency through scripting and tooling.
• Provide production support coverage on weekends, as part of a shift-based role.
• Ensure adherence to data governance, privacy, and security standards in all production support activities.

Interested candidates can share their resume at megha.rawat@puresoftware.com

Posted 1 month ago

Apply

1 years

2 - 4 Lacs

Bengaluru, Karnataka

Work from Office

Indeed logo

Designation – Data Ops Engineer Location: Bengaluru

Responsibilities: Experience with the system development lifecycle, such as gathering and refining requirements, data analysis and data profiling, designing and implementing solutions, handling sensitive data sets, production deployment, troubleshooting, and data conversion. Experience in developing ETL solutions for Data Marts, data warehousing, and Operational Data Stores (ODS) on any of the databases or Hadoop environments. At least 1 year of working experience on StreamSets platform tools. At least 1 year of experience on Snowflake and AWS-related technologies. Experience with Talend Administration (TMC) is good to have. Experience working in an Agile environment. Proficiency in Python development for data processing, automation, and scripting tasks. Strong experience with various database development, SQL, PL/SQL, etc. Proficiency in Linux, shell scripts, and job scheduling. Product development experience is a big plus, especially with a focus on data warehousing benchmarking efforts. Good organizational, planning, and project management skills, exhibiting attention to detail, accountability, and follow-through. Should have led a medium-to-large-sized team and independently handled complex projects. Bring design thinking to conceptualize original ideas and transform them into solutions. Responsible for identifying performance bottlenecks and providing solutions for them. Should have strong analytical skills.

Qualifications: Bachelor’s degree in Computer Science or a related technical discipline. An advanced degree in management is a plus.

Job Types: Full-time, Permanent Pay: ₹209,974.79 - ₹400,000.00 per year Benefits: Health insurance, Provident Fund Schedule: Monday to Friday Experience: Databases: 1 year (Required) Work Location: In person Application Deadline: 11/05/2025

Posted 1 month ago

Apply

1 - 3 years

0 Lacs

Kurla, Maharashtra, India

On-site

Linkedin logo

Job Opening: PostgreSQL Developer
Location: Kurla, Mumbai (Onsite)
Experience Required: 1 to 3 years
Employment Type: Full-time

We are seeking a talented and detail-oriented PostgreSQL Developer with 1 to 3 years of experience. The ideal candidate should have the ability to write ETL (Extract, Transform, Load) jobs and execute basic to intermediate SQL queries efficiently.

Key Responsibilities: Design, develop, and maintain ETL processes. Write, test, and optimize SQL queries for data extraction and reporting. Collaborate with data analysts, developers, and other stakeholders to ensure accurate data flow and integration. Assist in troubleshooting database-related issues and support performance tuning.

Requirements: 1 to 3 years of hands-on experience with PostgreSQL. Strong understanding of SQL and relational database concepts. Experience in writing and managing ETL workflows. Ability to work independently and collaboratively in a team environment.

Preferred Qualifications: Familiarity with ETL tools such as Apache Airflow, Talend, or similar. Basic knowledge of data warehousing concepts.

Posted 1 month ago


5 - 8 years

0 Lacs

Pune, Maharashtra, India

On-site


PharmaACE is a growing Global Healthcare Consulting Firm, headquartered in Princeton, New Jersey. Our expert teams of Business Analysts, based across the US, Canada, Europe, and India, provide Analytics and Business Solutions using our worldwide delivery models for a wide range of clients. Our clients include established, multinational BioPharma leaders and innovators, as well as entrepreneurial firms on the cutting edge of science. We have deep expertise in Forecasting, Business Analytics, Competitive Intelligence, Sales Analytics, and the Analytics Centre of Excellence Model. Our wealth of therapeutic area experience cuts across Oncology, Immunoscience, CNS, CV-Met, and Rare Diseases. We support our clients' needs in Primary Care, Specialty Care, and Hospital business units, and we have managed portfolios in the Biologics space, Branded Pharmaceuticals, Generics, APIs, Diagnostics, and Packaging & Delivery Systems.

Responsibilities:
  • Work closely with business teams and stakeholders across the pharmaceutical value chain to develop reports and dashboards that tell a "story".
  • Recommend KPIs and help generate custom analysis and insights.
  • Propose new visualization ideas for our customers, considering the audience type.
  • Design Tableau dashboards and reports that are self-explanatory.
  • Keep the user at the "center" while designing reports, thereby enhancing the user experience.
  • Gather requirements while working closely with our global clients.
  • Mentor other developers on the team on Tableau-related technical challenges.
  • Propagate Tableau best practices within and across the team.
  • Set up reports that can be maintained with ease and are scalable to other use cases.
  • Interact with the AI/ML team and incorporate new ideas into the final deliverables for the client.
  • Work closely with cross-functional teams such as Advanced Analytics, Competitive Intelligence, and Forecasting.
  • Develop and foster client relationships and serve as a point of contact for projects.

Qualifications and Areas of Expertise:
  • Educational Qualification: BE/BTech/MTech/MCA from a reputed institute.
  • Minimum 3-5 years of experience.
  • Proficient with tools including Tableau Desktop, Tableau Server, MySQL, MS Excel, and ETL tools (Alteryx, Tableau Prep, or Talend).
  • Knowledge of SQL.
  • Experience in advanced LOD (Level of Detail) calculations, custom visualizations, data cleaning, and restructuring.
  • Strong analytical and problem-solving skills with the ability to question facts.
  • Excellent written and oral communication skills.

Nice to have:
  • A valid U.S. business visa.
  • Hands-on experience in Tableau, Python, and R.
  • Hands-on experience with Qlik Sense and Power BI.
  • Experience with Pharma/Healthcare data.

Posted 1 month ago


Exploring Talend Jobs in India

Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

These cities have a high concentration of IT companies and organizations that frequently hire for Talend roles.

Average Salary Range

The average salary range for Talend professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-20 lakhs per annum

Career Path

A typical career progression in the field of Talend may follow this path:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.

Related Skills

In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:

  • Data Warehousing
  • ETL (Extract, Transform, Load) processes
  • SQL
  • Big Data technologies (e.g., Hadoop, Spark)
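The ETL processes mentioned above can be sketched minimally in plain Python. This is a conceptual illustration only; the function names and sample records are invented, and a real pipeline would typically be built in a tool such as Talend rather than hand-rolled.

```python
# Minimal extract-transform-load sketch; the three stages mirror what a
# Talend job does with input, processing, and output components.

def extract(source_rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Transform: clean and reshape (trim names, drop rows missing an id)."""
    return [
        {"id": r["id"], "name": r["name"].strip().title()}
        for r in rows
        if r.get("id") is not None
    ]

def load(rows, target):
    """Load: write the cleaned records into the target store."""
    target.extend(rows)
    return target

warehouse = []
raw = [{"id": 1, "name": "  alice "}, {"id": None, "name": "bob"}]
load(transform(extract(raw)), warehouse)
print(warehouse)  # [{'id': 1, 'name': 'Alice'}]
```

In Talend the same flow would be a source component, a tMap or tFilterRow for the transform, and an output component for the load.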

Interview Questions

  • What is Talend and how does it differ from traditional ETL tools? (basic)
  • Can you explain the difference between tMap and tJoin components in Talend? (medium)
  • How do you handle errors in Talend jobs? (medium)
  • What is the purpose of a context variable in Talend? (basic)
  • Explain the difference between incremental and full loading in Talend. (medium)
  • How do you optimize Talend jobs for better performance? (advanced)
  • What are the different deployment options available in Talend? (medium)
  • How do you schedule Talend jobs to run at specific times? (basic)
  • Can you explain the use of tFilterRow component in Talend? (basic)
  • What is metadata in Talend and how is it used? (medium)
  • How do you handle complex transformations in Talend? (advanced)
  • Explain the concept of schema in Talend. (basic)
  • How do you handle duplicate records in Talend? (medium)
  • What is the purpose of the tLogRow component in Talend? (basic)
  • How do you integrate Talend with other systems or applications? (medium)
  • Explain the use of tNormalize component in Talend. (medium)
  • How do you handle null values in Talend transformations? (basic)
  • What is the role of the tRunJob component in Talend? (medium)
  • How do you monitor and troubleshoot Talend jobs? (medium)
  • Can you explain how lookup data is handled in the tMap component (for example, the "Load once" versus "Reload at each row" lookup models)? (medium)
  • How do you handle changing business requirements in Talend jobs? (advanced)
  • What are the best practices for version controlling Talend jobs? (advanced)
  • How do you handle large volumes of data in Talend? (medium)
  • Explain the purpose of the tAggregateRow component in Talend. (basic)
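Several of the questions above, incremental versus full loading in particular, come down to one idea: a full load copies every source row, while an incremental load tracks a high-water mark and fetches only newer rows. A minimal Python sketch with invented sample data (in Talend itself this would be expressed with components and context variables rather than hand-written code):

```python
# Full vs. incremental load, illustrated with an integer id as the
# high-water mark. Sample rows are invented for the example.

source = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}, {"id": 3, "v": "c"}]

def full_load(src):
    """Full load: copy every source row into a fresh target."""
    return list(src)

def incremental_load(src, target):
    """Incremental load: append only rows newer than the high-water mark."""
    high_water = max((r["id"] for r in target), default=0)
    target.extend(r for r in src if r["id"] > high_water)
    return target

target = full_load(source[:2])     # initial full load of the first two rows
incremental_load(source, target)   # picks up only the new row, id=3
print([r["id"] for r in target])   # [1, 2, 3]
```

In practice the high-water mark is often a timestamp or sequence value persisted between job runs, for instance in a Talend context variable or a control table.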

Closing Remark

As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies