2471 Data Integration Jobs - Page 46

JobPe aggregates results for easy access, but you apply directly on each job portal.

5.0 - 10.0 years

4 - 8 Lacs

Kolkata, Hyderabad, Pune

Work from Office

IICS Developer 2

Job Overview:
We are seeking an experienced IICS (Informatica Intelligent Cloud Services) Developer with hands-on experience on the IICS platform. The ideal candidate must have strong knowledge of Snowflake and be proficient in building and managing integrations between different systems and databases. The role involves working with cloud-based integration solutions, ensuring data flows seamlessly across platforms, and optimizing performance for large-scale data processes.

Key Responsibilities:
- Design, develop, and implement data integration solutions using IICS (Informatica Intelligent Cloud Services).
- Work with Snowflake data warehouse solutions, including data loading, transformation, and querying.
- Build, monitor, and maintain efficient data pipelines between cloud-based systems and Snowflake.
- Troubleshoot and resolve integration issues within the IICS platform and Snowflake.
- Ensure optimal data processing performance and manage data flow between various cloud applications and databases.
- Collaborate with data architects, analysts, and stakeholders to gather requirements and design integration solutions.
- Implement best practices for data governance, security, and data quality within the integration solutions.
- Perform unit testing and debugging of IICS data integration tasks.
- Optimize integration workflows to meet performance and scalability needs.

Key Skills:
- Hands-on experience with IICS (Informatica Intelligent Cloud Services).
- Strong knowledge of and experience working with Snowflake as a cloud data warehouse.
- Proficiency in building ETL/ELT workflows, including integrating various data sources into Snowflake.
- Experience with SQL and writing complex queries for data transformation and manipulation.
- Familiarity with data integration techniques and best practices for cloud-based platforms.
- Experience with cloud integration platforms, RESTful APIs, and other integration protocols.
- Ability to troubleshoot, optimize, and maintain data pipelines effectively.
- Knowledge of data governance, security principles, and data quality standards.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Minimum of 5 years of experience in data integration development.
- Proficiency in Snowflake and cloud-based data solutions.
- Strong understanding of ETL/ELT processes and integration design principles.
- Experience working in Agile or similar development methodologies.

Location: Pune, Hyderabad, Kolkata, Jaipur, Chandigarh
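By way of illustration only, here is a minimal Python sketch of the kind of Snowflake upsert such an integration task performs. The connection parameters and the STG_ORDERS/DIM_ORDERS tables are hypothetical; a real IICS job would run an equivalent MERGE through a mapping task rather than this connector script.

```python
# Hedged sketch: incremental upsert into Snowflake via the Python connector.
# All names (account, tables, columns) are hypothetical placeholders.
import snowflake.connector

MERGE_SQL = """
MERGE INTO DIM_ORDERS AS tgt
USING STG_ORDERS AS src
  ON tgt.ORDER_ID = src.ORDER_ID
WHEN MATCHED THEN UPDATE SET
  tgt.STATUS = src.STATUS,
  tgt.UPDATED_AT = src.UPDATED_AT
WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, UPDATED_AT)
  VALUES (src.ORDER_ID, src.STATUS, src.UPDATED_AT);
"""

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    cur.execute(MERGE_SQL)
    print(cur.fetchone())   # Snowflake returns inserted/updated row counts
finally:
    cur.close()
    conn.close()
```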

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Role: Senior Functional Consultant (Odoo ERP)
Job Location: Bangalore, India (must be willing to travel across Asia, the Middle East, and Africa for project implementation)
CV Submission email:

Job Summary:
We are seeking an experienced Senior Functional Consultant, Odoo (Finance & Accounting), to lead and support end-to-end ERP implementations, covering requirement gathering and study, design, development, setup and configuration, testing, and optimization of Odoo ERP financial modules. The ideal candidate should have a strong background in public-sector finance and accounting processes, hands-on expertise in Odoo, and experience working with clients across various industries.

Key Responsibilities:
- Gather and analyse client requirements related to finance, accounting, and reporting processes.
- Design, develop, configure, and implement Odoo Finance modules and related areas such as:
  - General Ledger and consolidation
  - Accounts Payable & Receivable
  - Bank Reconciliation
  - Cash Management
  - Commitment management
  - Fixed Assets
  - Budgeting & Cost Center Accounting
  - Chart of Accounts structure
  - Taxation (VAT, GST, etc.)
- Map client business processes and provide best-practice solutions leveraging Odoo functionality.
- Prepare functional documentation, including BRDs, FRDs, and test cases.
- Prepare training materials and user manuals for the modules as per the process.
- Conduct training for users at different levels, including advanced train-the-trainer (ToT) sessions.
- Conduct UAT (User Acceptance Testing), training, and go-live support.
- Collaborate closely with technical teams for custom development and integrations.
- Lead implementation projects or modules independently with minimal supervision.
- Ensure timely project delivery and adherence to quality standards.
- Support users post-deployment and enable other consultants and users on the system.
- Provide post-implementation support and continuous improvements for existing clients.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Accounting, Finance, or Information Systems preferred.
- 5-8 years of experience in ERP consulting, with a strong focus on Odoo Finance modules.
- Strong understanding of accounting principles and business processes.
- Experience with Odoo, including Community and Enterprise editions.
- Ability to translate business requirements into functional specifications.
- Proven experience with multi-company, multi-currency, and tax compliance setups.
- Strong client-facing and communication skills.
- Ability to manage multiple projects and stakeholders.
- Familiarity with SQL, reporting tools, and technical aspects of Odoo (basic level).
- Experience handling international clients and end-to-end implementations across global locations.
- Willingness to travel or be based abroad for long-term deployments and assignments.

Preferred:
- Odoo certification
- Experience in data migration and integration with third-party apps
- Exposure to other Odoo modules (e.g., Inventory, Sales, Purchasing)
- Experience in international project delivery or cross-border finance operations

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

We are seeking a skilled and motivated Data Engineer with hands-on experience in Snowflake, Azure Data Factory (ADF), and Fivetran. The ideal candidate will be responsible for building and optimizing data pipelines, ensuring efficient data integration and transformation to support analytics and business intelligence initiatives.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines using Fivetran, ADF, and other ETL tools.
- Build and manage scalable data models and data warehouses on Snowflake.
- Integrate data from various sources into Snowflake using automated workflows.
- Implement data transformation and cleansing processes to ensure data quality and integrity.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Monitor pipeline performance, troubleshoot issues, and optimize for efficiency.
- Maintain documentation related to data architecture, processes, and workflows.
- Ensure data security and compliance with company policies and industry standards.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 3+ years of experience in data engineering or a similar role.
- Proficiency with Snowflake, including architecture, SQL scripting, and performance tuning.
- Hands-on experience with Azure Data Factory (ADF) for pipeline orchestration and data integration.
- Experience with Fivetran or similar ELT/ETL automation tools.
- Strong SQL skills and familiarity with data warehousing best practices.
- Knowledge of cloud platforms, preferably Microsoft Azure.
- Familiarity with version control tools (e.g., Git) and CI/CD practices.
- Excellent communication and problem-solving skills.

Preferred Qualifications:
- Experience with Python, dbt, or other data transformation tools.
- Understanding of data governance, data quality, and compliance frameworks.
- Knowledge of additional data tools (e.g., Power BI, Databricks, Kafka).
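A typical "ensure data quality and integrity" step in such a pipeline is a post-load reconciliation gate. A minimal sketch follows; `conn` is assumed to be any DB-API connection (the Snowflake connector qualifies), and the table names are hypothetical.

```python
# Hedged sketch of a post-load data-quality gate: compare source and target
# row counts and fail loudly on drift beyond a tolerance.
def rowcount(conn, table: str) -> int:
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        cur.close()

def check_load(conn, source: str, target: str, tolerance: float = 0.0) -> None:
    src, tgt = rowcount(conn, source), rowcount(conn, target)
    drift = abs(src - tgt) / max(src, 1)
    if drift > tolerance:
        raise RuntimeError(
            f"Row-count drift {drift:.1%}: {source}={src}, {target}={tgt}"
        )
    print(f"OK: {source}={src}, {target}={tgt}")

# Example (hypothetical tables):
# check_load(conn, "STG_CUSTOMERS", "DIM_CUSTOMERS")
```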

Posted 1 month ago

Apply

9.0 - 14.0 years

25 - 30 Lacs

Gurugram

Work from Office

Reports To: Associate Director, Risk Data Analytics
Level: Level 5

About your team
The Global Risk team at Fidelity covers management oversight of Fidelity's risk profile, including key risk frameworks, policies and procedures, and oversight and challenge processes. The team partners with the businesses to ensure Fidelity manages its risk profile within the defined risk appetite, and comprises risk specialists covering all facets of risk management, including investment, financial, non-financial, and strategic risk. As part of the broader General Counsel team, the Risk team collaborates closely with Compliance, Legal, Tax, and Corporate Sustainability colleagues. You will develop efficient, data-driven solutions to help SMEs make key decisions for oversight and monitoring, keep pace with the rapid change in data analytics using a cloud-driven technology stack, and work on diverse risk subject areas.

About your role
The successful candidate will be responsible for data analysis, visualisation, and reporting for the Global Risk business. This role encompasses the full spectrum of data analysis, data modelling, technical design, and the development of enterprise-level analytics and insights using tools such as Power BI, along with operational support. Strong relationship and stakeholder management skills are essential to maintain superior service for our various business contacts and clients. This role is for a visualisation and reporting expert who can understand various risk domains (Investment Risk, Non-Financial Risk, Enterprise Risk, and Strategic Risk) as well as complex risk frameworks and business issues. The candidate must understand the functional and technical implications of delivering analytics capabilities using various data sources and the Power Platform. The role demands strong hands-on skills in data modelling and transformation using SQL queries and Power Query/DAX, along with expert data visualisation and reporting abilities. The successful candidate should be able to handle complex project requirements within agreed timelines while maintaining a high standard of deliverable quality, and will be expected to interact with stakeholders at all levels of the business, seeking approval and sign-off on project deliverables.

Key Responsibilities
- Understand the scope of business requirements and translate them into stories; define the data ingestion approach, data transformation strategy, data model, and front-end design (UI/UX) for the required product.
- Create working prototypes in tools like Excel or Power BI and reach agreement with business stakeholders before commencing development to ensure engagement.
- Drive data modelling and data visualisation development from start to finish, keeping stakeholders informed and obtaining approvals/sign-offs on known issues, solution design, and risks.
- Work closely with Python developers to develop data adaptors for ingesting, transforming, and retaining time-series data as required for the front end.
- Demonstrate a high degree of proficiency in Power Query, Power BI, advanced DAX calculations and modelling techniques, and developing intuitive visualisation solutions.
- Possess strong experience in developing and managing dimensional data models in Power BI or within a data warehouse environment.
- Show proficiency in data integration and architecture, including dimensional data modelling, database design, data warehousing, ETL development, and query performance tuning.
- Advanced data modelling and testing skills using various RDBMSs (SQL Server 2017+, Oracle 12c+) and the Snowflake data warehouse are an added advantage.
- Assess and ensure that the solution being delivered is fit for purpose, efficient, and scalable, refining iteratively if required.
- Collaborate with global teams and stakeholders to deliver the scope of the project.
- Obtain agreement on delivered visuals and solutions, ensuring they meet all business requirements.
- Work collaboratively with the project manager to identify, define, and clarify the scope and terms of complex data visualisation requirements.
- Convert raw data into meaningful insights through interactive, easy-to-understand dashboards and reports.
- Coordinate across multiple project teams delivering common, reusable functionality using service-oriented patterns.
- Drive user acceptance testing with the product owner, addressing defects and improving solutions based on observations.
- Work with third-party vendors and suppliers on vendor products and market data integration.
- Build and contribute towards professional data visualisation capabilities within risk teams and at the organisation level.
- Stay abreast of emerging products and industry standards in data visualisation and advanced analytics.
- Work with other team members on both relationship management and fund promotion.

About you

Experience
- 9+ years of experience in developing and implementing advanced analytics solutions.

Competencies
- Ability to identify and self-manage analysis work for the allocated workstream with minimal or no assistance.
- Ability to develop and maintain strong relationships with stakeholders within the project working group, ensuring continual and effective communication.
- Ability to translate business requirements into technical requirements (internal and external) in support of the project.
- Excellent interpersonal, communication, documentation, facilitation, and presentation skills.
- Good grasp of Agile methodology and familiarity with the story artefacts used in Agile.
- Excellent written and verbal communication skills; a strong team player.
- Good communication, influencing, and negotiation skills.
- Proven ability to work well under pressure and in a team environment.
- Self-motivated, flexible, responsible, with a penchant for quality.
- Experience-based domain knowledge of risk management, regulatory compliance, or operational compliance functions would be an advantage.
- Basic knowledge of Data Science and Artificial Intelligence/GenAI.

Qualifications
- Preferred academic qualification: BE / B.Tech / MCA / any graduate.

Posted 1 month ago

Apply

4.0 - 6.0 years

8 - 12 Lacs

Bengaluru

Work from Office

In this position you will be responsible for providing insights to clients. You will first meet with our clients to uncover their business needs and challenges, then use your strong analytical skills to perform quantitative and observational data analyses, and finally form and present your recommendations to our clients.

Title: Kinaxis Consultant
Location: Bangalore, India
Experience: 4-6 years

- Should have a good understanding of integration with other systems (SAP preferred).
- Should have worked on 1-2 implementation projects, or possess expertise handling minor projects/enhancements.
- Should have expertise in developing workbooks using advanced features like composite workbooks, scorecards, and dashboards.
- Should have worked on defining alerts, automation chains, and scheduled tasks.
- Knowledge of scripting and interfacing with external applications using the Web API.
- Should be able to monitor and handle data integration issues and debug issues caused by configurations.
- Should be a team player, open to working with the support team occasionally to resolve complex/ageing tickets in addition to project/enhancement work.
- Knowledge of and experience in the SAP PP module is an advantage.
- Should have excellent written and oral communication skills, and be able to present ideas or proofs of concept to a large audience.
- Should be a self-driven, task-oriented individual capable of completing tasks with minimal or no supervision.

Posted 1 month ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 15 to 30 LPA
Experience: 3 to 8 years
Location: Gurgaon/Bangalore/Pune/Chennai
Notice: Immediate to 30 days

Key Responsibilities & Skillsets:

Common skillsets:
- 3+ years of experience in analytics, PySpark, Python, Spark, SQL, and associated data engineering work.
- Must have experience managing and transforming big data sets using PySpark, Spark-Scala, NumPy, and pandas.
- Excellent communication and presentation skills.
- Experience managing Python codebases and collaborating with customers on model evolution.
- Good knowledge of database management and Hadoop/Spark, SQL, Hive, and Python (expertise).
- Superior analytical and problem-solving skills.
- Should be able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision.
- Good communication skills for client interaction.

Data management skillsets:
- Ability to understand data models and identify ETL optimization opportunities; exposure to ETL tools is preferred.
- Strong grasp of advanced SQL functionality (joins, nested queries, and procedures).
- Strong ability to translate functional specifications and requirements into technical requirements.
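For a sense of the big-data transformation work this role describes, here is a small, hedged PySpark sketch. The input path, column names, and output location are all hypothetical.

```python
# Hedged sketch: filter, derive, and aggregate a large orders dataset with
# PySpark, then write a partitioned rollup. Paths and columns are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_rollup").getOrCreate()

orders = spark.read.parquet("s3://bucket/orders/")   # hypothetical path

daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "region")
    .agg(
        F.count("*").alias("orders"),
        F.sum("amount").alias("revenue"),
    )
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://bucket/daily_rollup/"   # hypothetical output
)
```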

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Pune

Work from Office

We are looking for a highly skilled and experienced Data Engineer with over 5 years of experience to join our growing data team. The ideal candidate will be proficient in Databricks, Python, PySpark, and Azure, and have hands-on experience with Delta Live Tables. In this role, you will be responsible for developing, maintaining, and optimizing data pipelines and architectures to support advanced analytics and business intelligence initiatives. You will collaborate with cross-functional teams to build robust data infrastructure and enable data-driven decision-making.

Key Responsibilities:
- Design, develop, and manage scalable and efficient data pipelines using PySpark and Databricks.
- Build and optimize Spark jobs for processing large volumes of structured and unstructured data.
- Integrate data from multiple sources into data lakes and data warehouses on the Azure cloud.
- Develop and manage Delta Live Tables for real-time and batch data processing.
- Collaborate with data scientists, analysts, and business teams to ensure data availability and quality.
- Ensure adherence to best practices in data governance, security, and compliance.
- Monitor, troubleshoot, and optimize data workflows and ETL processes.
- Maintain up-to-date technical documentation for data pipelines and infrastructure components.

Qualifications:
- 5+ years of hands-on experience in Databricks platform development.
- Proven expertise in Delta Lake and Delta Live Tables.
- Strong SQL and Python/Scala programming skills.
- Experience with cloud platforms such as Azure, AWS, or GCP (preferably Azure).
- Familiarity with data modeling and data warehousing concepts.
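As a rough illustration of Delta Live Tables work, here is a minimal Python sketch. It only runs inside a Databricks DLT pipeline (the `dlt` module and the `spark` session are provided by that runtime), and the source path, table names, and expectation rule are hypothetical.

```python
# Hedged sketch of a two-stage Delta Live Tables pipeline: raw ingest plus a
# cleansed table with a data-quality expectation. Names are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested from cloud storage")
def raw_events():
    # `spark` is supplied by the DLT runtime
    return spark.read.format("json").load("/mnt/landing/events/")

@dlt.table(comment="Cleansed events for analytics")
@dlt.expect_or_drop("valid_user", "user_id IS NOT NULL")  # drop bad rows
def clean_events():
    return (
        dlt.read("raw_events")
        .withColumn("event_date", F.to_date("event_ts"))
        .dropDuplicates(["event_id"])
    )
```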

Posted 1 month ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Pune

Work from Office

Role Overview:
As a Senior Data Analyst at Hevo, you will leverage your SQL skills and analytical expertise to manage, process, and report data, driving insights across the organization. You will focus on reporting, forecasting, and presenting key metrics to business leaders while collaborating with stakeholders to support strategic decision-making.

Key Responsibilities:
- Query large datasets using SQL to extract and manipulate data.
- Maintain and optimize databases on the data warehouse.
- Prepare and present weekly business reviews (WBRs), forecasts, and track key metrics.
- Drive analytics projects related to customer funnels and lead acquisition, uncover insights, and report findings to leadership.
- Collaborate with cross-functional teams to execute WBRs and track follow-up actions.
- Lead and manage end-to-end analytics projects with minimal oversight and mentor junior team members.
- Continuously challenge and improve metrics by aligning them with industry standards.

What we are looking for:
- 3-6 years of experience in a quantitative analyst role (preferably in B2B SaaS, growth analytics, or revenue operations).
- Proficiency in SQL and experience working with large datasets.
- Experience using Tableau, Looker, or similar tools to create dashboards and report insights.
- Strong communication skills, with the ability to present data to both technical and non-technical audiences.
- Bonus: experience with executive or rev-ops reporting.
- Ability to manage multiple projects simultaneously and drive deliverables with minimal oversight.

Key elements needed to succeed in this role:
- Attention to detail
- Diagnosing the problem
- Continuous learning mindset
- Ability to solve complex, open-ended problems
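As an illustration of the funnel-analysis SQL this role centers on, here is a self-contained sketch using Python's built-in sqlite3 so it runs anywhere; the events table and stage names are invented for the example.

```python
# Hedged sketch of a funnel conversion query: distinct users per stage as a
# percentage of the top of the funnel. Data is fabricated for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id INT, stage TEXT);
INSERT INTO events VALUES
 (1,'visit'),(1,'signup'),(1,'paid'),
 (2,'visit'),(2,'signup'),
 (3,'visit');
""")

sql = """
SELECT stage,
       COUNT(DISTINCT user_id) AS users,
       ROUND(100.0 * COUNT(DISTINCT user_id) /
             (SELECT COUNT(DISTINCT user_id) FROM events), 1) AS pct_of_top
FROM events
GROUP BY stage
ORDER BY users DESC;
"""
for row in conn.execute(sql):
    print(row)
# ('visit', 3, 100.0)
# ('signup', 2, 66.7)
# ('paid', 1, 33.3)
```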

Posted 1 month ago

Apply

6.0 - 10.0 years

4 - 8 Lacs

Noida

Work from Office

We are looking for a skilled Boomi Data Modeller with 6 to 10 years of experience to join our team in [location to be specified]. The ideal candidate will have expertise in data modeling and integration, with hands-on experience designing and implementing data models using Boomi.

Roles and Responsibilities:
- Design and develop data models using Boomi, ensuring data integrity and consistency.
- Collaborate with cross-functional teams to identify business requirements and design solutions.
- Develop and maintain technical documentation for data models and integrations.
- Troubleshoot and resolve issues related to data modeling and integration.
- Ensure compliance with industry standards and best practices for data management.
- Participate in code reviews and contribute to improving overall code quality.

Job Requirements:
- Strong understanding of data modeling concepts and principles.
- Proficiency in designing and implementing data models using Boomi.
- Experience with data integration and migration projects.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.

Posted 1 month ago

Apply

12.0 - 17.0 years

9 - 13 Lacs

Noida

Work from Office

We are looking for a skilled Technical Lead with 12 to 17 years of experience to lead our team in designing and delivering innovative solutions for our manufacturing clients. The ideal candidate will have a strong background in data integration and cloud platforms, with expertise in Cloud Data Quality, Data Integration, and Cloud Data Console.

Roles and Responsibilities:
- Design and deliver high-quality solutions for manufacturing clients using cloud-based technologies.
- Lead a team of 15 members, providing guidance on solution design and delivery governance.
- Collaborate with stakeholders to align technical solutions with business objectives.
- Develop and maintain the architecture, ensuring scalability and reliability.
- Oversee the delivery process, identifying and mitigating potential risks.
- Ensure compliance with industry standards and best practices.

Job Requirements:
- Strong expertise in Cloud Data Quality, Data Integration, and Cloud Data Console.
- Experience leading a team of engineers, focusing on solution design and delivery governance.
- Hands-on role with responsibility for architecture, delivery governance, and stakeholder alignment.
- Strong understanding of cloud-based technologies and their applications in manufacturing.
- Excellent communication and leadership skills, with the ability to motivate and guide a team.
- Ability to work in a fast-paced environment, prioritizing multiple tasks and meeting deadlines.
- Preference for candidates based in Tier 1 cities.

Contract duration: 6-12 months (extendable).

Posted 1 month ago

Apply

1.0 - 3.0 years

5 - 9 Lacs

Noida

Work from Office

We are looking for a skilled SAS DI Support Analyst with strong experience in SAS Data Integration (DI) Studio, Base SAS, and SQL to join our data engineering and support team. The ideal candidate will have 1 to 3 years of experience and be based in Mumbai.

Roles and Responsibilities:
- Provide production support for existing SAS DI jobs and data workflows.
- Perform root cause analysis of job failures and resolve performance or data quality issues.
- Maintain and enhance Base SAS and SQL code to ensure optimal data processing and reporting.
- Monitor data pipelines and ensure timely and accurate data loads.
- Collaborate with data analysts, developers, and business stakeholders to gather requirements and resolve support issues.
- Document technical processes, job flows, and solutions for support continuity.

Job Requirements:
- Strong hands-on experience with SAS DI Studio.
- Proficiency in Base SAS programming and SQL (including complex joins, subqueries, and performance tuning).
- Experience supporting and maintaining ETL workflows and resolving job failures.
- Knowledge of data warehousing concepts and data integration best practices.
- Working knowledge of Linux/Unix, including basic commands and shell scripting.
- Familiarity with job scheduling tools like Control-M, Autosys, or similar.
- Exposure to version control systems (e.g., Git, SVN).

Posted 1 month ago

Apply

3.0 - 4.0 years

4 - 8 Lacs

Noida

Work from Office

We are looking for a skilled Informatica IDMC professional with 3 to 4 years of experience. The ideal candidate will have a strong background in data integration and management.

Roles and Responsibilities:
- Define and develop data pipelines using Informatica IDMC.
- Configure mappings and transformations using CDI and CDQ.
- Design and develop CAI processes to publish data from source to target.
- Apply business logic to transform data and configure Service and Application connectors.
- Support and maintain MDM applications.
- Collaborate with cross-functional teams to ensure seamless data flow.

Job Requirements:
- Minimum 3 years of experience with Informatica IDMC.
- Strong knowledge of data integration and management concepts.
- Experience with CDI, CDQ, and CAI technologies.
- Ability to apply business logic to transform data.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- A graduate degree is required for this position.

Posted 1 month ago

Apply

8.0 - 12.0 years

11 - 15 Lacs

Noida

Work from Office

We are looking for a skilled Reltio Architect with 8 to 12 years of experience to lead the design and implementation of enterprise-level MDM solutions using the Reltio Cloud platform. This position is based in Ranchi and Noida.

Roles and Responsibilities:
- Lead the design and architecture of Reltio-based MDM solutions for large-scale enterprise systems.
- Collaborate with data governance, analytics, and business teams to define data domains and governance policies.
- Define data models, match rules, survivorship, hierarchies, and integration strategies.
- Provide technical leadership for Reltio implementations, including upgrades, optimizations, and scaling.
- Conduct solution reviews and troubleshoot complex data integration or performance issues.
- Mentor developers and ensure technical deliverables meet architectural standards.

Job Requirements:
- Minimum 8 years of experience in MDM, with at least 3 years in Reltio Cloud MDM.
- Expertise in Reltio data modeling, workflow design, integration strategy, match/merge, and hierarchy management.
- Experience designing large-scale Reltio implementations across multiple domains.
- Hands-on experience with Reltio APIs, Reltio Integration Hub, and Informatica/IICS.
- Strong background in enterprise architecture, data strategy, and cloud platforms (AWS/GCP/Azure).
- Strong problem-solving, leadership, and communication skills.

Posted 1 month ago

Apply

10.0 - 15.0 years

3 - 7 Lacs

Bengaluru

Work from Office

We are looking for a skilled MDM Engineer with extensive experience in Informatica MDM to join our team. The ideal candidate will be responsible for designing, developing, and maintaining our Master Data Management (MDM) solutions to ensure data accuracy, consistency, and reliability across the organization. This role requires 10-15 years of experience.

Roles and Responsibilities:
- Design and implement MDM solutions using Informatica MDM, ensuring alignment with business requirements and data governance standards.
- Develop and manage ETL processes to integrate data from various sources into the MDM system.
- Implement data quality rules and processes to ensure the accuracy and consistency of master data.
- Configure the Informatica MDM Hub, including data modeling, data mappings, match and merge rules, and user exits.
- Monitor and optimize the performance of MDM solutions, ensuring high availability and reliability.
- Collaborate with data stewards, business analysts, and other stakeholders to gather requirements and ensure the MDM solution meets their needs.
- Create and maintain comprehensive documentation for MDM processes, configurations, and best practices.
- Troubleshoot issues related to MDM processes and systems.

Job Requirements:
- Minimum 10 years of hands-on experience in MDM design, development, and support using Informatica MDM.
- Proficiency in Informatica MDM ETL processes and data integration technologies.
- Strong understanding of data governance, data quality, and master data management principles.
- Excellent problem-solving and analytical skills, with the ability to troubleshoot complex data issues.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
- Experience in the Employment Firms/Recruitment Services Firms industry is preferred.

Posted 1 month ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd.
Industry: Employment Firms/Recruitment Services Firms
Experience: 4 to 7 years
Role: IDMC CAI Developer
Job Location: Remote
Job Type: FTE
Title: IDMC CDI Developer (Ref: 6566238)

JD:
We are seeking a highly skilled and client-oriented Senior IICS Developer with strong hands-on experience in Informatica Intelligent Cloud Services (IDMC CDI), advanced SQL, and ideally Power BI. The ideal candidate will have prior experience working with US-based clients, strong production support exposure, and excellent presentation and client interaction skills. Candidates with a background in Big 5 consulting environments are highly preferred.

Key Responsibilities:
- Design, develop, and implement robust data integration solutions using Informatica IICS CDI.
- Write and optimize complex SQL queries for data extraction, transformation, and analysis.
- Work closely with business stakeholders and technical teams to gather requirements and deliver client-focused solutions.
- Provide production support for data integration pipelines and address incidents promptly.
- Collaborate with teams to ensure adherence to best practices, performance optimization, and code quality.
- (Nice to have) Develop insightful dashboards and visualizations using Power BI.
- Interact directly with US-based clients; participate in meetings and ensure professional, timely communication.
- Document solutions and provide knowledge transfer as needed.

Requirements:
- 4+ years of experience in ETL/data integration, with at least 3 years on Informatica IICS (IDMC CDI).
- Strong proficiency in advanced SQL and relational database design.
- Experience in production support environments and troubleshooting data integration jobs.
- Prior experience working with US clients and navigating stakeholder interactions effectively.
- Excellent communication and presentation skills.
- Highly client-focused and adaptable in dynamic environments.
- Big 5 consulting experience (e.g., Deloitte, PwC, EY, KPMG, Accenture) is a strong plus.
- (Preferred) Hands-on experience with Power BI or other data visualization tools.
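As a concrete taste of the "advanced SQL" this posting emphasizes, here is a self-contained sketch of deduplicating staging rows with ROW_NUMBER(), keeping the latest record per key. It uses Python's built-in sqlite3 (3.25+ for window functions); the table and columns are hypothetical.

```python
# Hedged sketch: keep only the newest row per customer_id using a window
# function. Data and schema are fabricated for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_customers (customer_id INT, email TEXT, loaded_at TEXT);
INSERT INTO stg_customers VALUES
 (1,'old@x.com','2024-01-01'),
 (1,'new@x.com','2024-02-01'),
 (2,'b@x.com','2024-01-15');
""")

sql = """
SELECT customer_id, email, loaded_at
FROM (
  SELECT *,
         ROW_NUMBER() OVER (
           PARTITION BY customer_id ORDER BY loaded_at DESC
         ) AS rn
  FROM stg_customers
)
WHERE rn = 1
ORDER BY customer_id;
"""
print(conn.execute(sql).fetchall())
# [(1, 'new@x.com', '2024-02-01'), (2, 'b@x.com', '2024-01-15')]
```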

Posted 1 month ago

Apply

8.0 - 13.0 years

13 - 17 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd.
Industry: Employment Firms/Recruitment Services Firms
Experience: 8 to 18 years
Job Title: IICS CAI Sr Technical Architect (Ref: 6566399)
Job Location: Pune
Job Type: Full Time

JD:
- Experience with IICS Application Integration components such as Processes, Service Connectors, and Process Objects.
- Ability to integrate diverse cloud applications seamlessly and efficiently, and to build high-volume, mission-critical, cloud-native applications.
- Strong understanding across cloud and infrastructure components (server, storage, network, data, and applications) to deliver end-to-end cloud infrastructure architectures and designs.
- Build Service Connectors for real-time integration with third-party applications.
- Experience integrating Informatica Cloud with other applications such as SAP, Workday, and ServiceNow.
- Experience installing add-on connectors and drivers for IICS.
- Expertise in methodologies for data extraction, transformation, and loading using transformations such as Expression, Router, Filter, Lookup, Update Strategy, Union, and Aggregator.
- Strong technical experience building data integration processes by constructing mappings, tasks, taskflows, schedules, and parameter files.

Mandatory Skills: Integration Architecture, Integration Patterns

Posted 1 month ago

Apply

4.0 - 6.0 years

3 - 6 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd.
Industry: Employment Firms/Recruitment Services Firms
Experience: 4 to 6 years
Title: Data Building Tool (Ref: 6566428)

Summary:
As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to design and implement data platform solutions.
- Develop and maintain data pipelines for efficient data processing.
- Optimize data storage and retrieval processes for improved performance.
- Implement data governance policies and ensure data quality standards are met.
- Stay updated on industry trends and best practices in data engineering.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Building Tool (dbt).
- Strong understanding of data modeling and database design principles.
- Experience with ETL processes and data integration techniques.
- Knowledge of cloud platforms and services for data storage and processing.
- Hands-on experience with data visualization tools for reporting and analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience with Data Building Tool.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
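Assuming "Data Building Tool" refers to dbt, here is a hedged sketch of driving it programmatically from Python. The `dbtRunner` entry point exists in dbt-core 1.5+; the project and model selector are hypothetical.

```python
# Hedged sketch: invoke dbt from Python instead of the CLI. Assumes dbt-core
# >= 1.5 is installed and run from within a dbt project directory; the model
# selector "staging.stg_orders" is a hypothetical example.
from dbt.cli.main import dbtRunner

runner = dbtRunner()
result = runner.invoke(["run", "--select", "staging.stg_orders"])

if not result.success:
    raise SystemExit(f"dbt run failed: {result.exception}")
print("dbt run completed")
```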

Posted 1 month ago

Apply

6.0 - 8.0 years

5 - 8 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd.
Industry: Employment Firms/Recruitment Services Firms
Experience: 6 to 8 years
Title: SAP BusinessObjects Data Services (Ref: 6566344)

Summary:
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in decision-making. The role requires a balance of technical expertise and leadership skills to drive successful project outcomes and foster a collaborative team environment.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.

Professional & Technical Skills:
- Must-have skills: proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and transformation processes.
- Experience with ETL (Extract, Transform, Load) processes and data warehousing concepts.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve technical issues related to data services.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

0.0 - 2.0 years

1 - 3 Lacs

Ahmedabad

Work from Office

MIS Design & System Management
- Maintain and enhance spreadsheets and digital MIS tools aligned with project indicators and outcomes.
- Collaborate with program teams to ensure system design aligns with log frames and donor requirements.
- Create dashboards and trackers using Excel, Google Sheets, or Google Looker Studio.

Data Collection & Entry
- Coordinate and monitor data collection processes using digital platforms.
- Validate and clean data sets to ensure consistency and reliability.
- Support digitizing data formats and improving collection tools.

Reporting & Documentation
- Generate periodic (weekly/monthly/quarterly) reports for internal teams and external partners.
- Summarize data through charts, tables, and presentations for program reviews and strategic decisions.
- Contribute to documentation, including donor reports, case studies, and visual reports.

Data Quality & Monitoring Support
- Conduct data audits and validations, and troubleshoot discrepancies.
- Use MIS tools to track project KPIs, outputs, and outcomes.
- Support baseline, midline, and endline surveys with structured MIS inputs.

Training & Capacity Building
- Train staff and partners on MIS tools, data formats, and standard operating procedures.
- Provide troubleshooting support and create and maintain user guides and manuals.

Coordination & Collaboration
- Work closely with cross-functional teams to ensure accurate and timely data submissions.
- Support dashboard development for project performance reviews.
- Collaborate with M&E and IT teams to improve MIS effectiveness and data integration.

Mandatory Qualifications and Experience
- Bachelor's degree in Computer Science, Information Technology, Statistics, Data Science, or a related field.
- 1-3 years of experience in MIS, data management, or M&E roles, preferably in the development/CSR sector.

Technical Skills
- Proficiency in advanced Excel (pivot tables, formulas, data validation, dashboards).
- Familiarity with Google Looker Studio, Google Sheets, and basic data visualization.
- Hands-on experience with mobile data collection platforms like KoboToolbox, ODK, or Google Forms.
- Understanding of MIS design principles aligned with M&E frameworks.

Soft Skills
- Strong analytical skills with attention to detail.
- Excellent communication and presentation abilities.
- Ability to multitask, prioritize responsibilities, and meet deadlines.
- Team-oriented with a proactive, problem-solving mindset.

Why Join Us
- Work with passionate teams driving change at scale.
- Enhance your skills in data systems and social impact measurement.
- Be part of a dynamic work environment that values innovation, ownership, and collaboration.

How to Apply
Email your CV and a brief cover letter to career@csrbox.org with the subject line "Application for Sr Associate MIS Coordinator". Please include:
- Current location
- Years of relevant experience
- Current and expected CTC
- Notice period
- A brief (150-200 word) summary of your experience in CSR-health partnerships or donor-led projects

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About Boomi and What Makes Us Special
Are you ready to work at a fast-growing company where you can make a difference? Boomi aims to make the world a better place by connecting everyone to everything, anywhere. Our award-winning, intelligent integration and automation platform helps organizations power the future of business. At Boomi, you'll work with world-class people and industry-leading technology. We hire trailblazers with an entrepreneurial spirit who can solve challenging problems, make a real impact, and want to be part of building something big. If this sounds like a good fit for you, check out boomi.com or visit our Boomi Careers page to learn more.

Essential Requirements
- 1+ years' experience in the software engineering industry, with experience supporting large-scale software systems in production.
- Working experience with AI technologies.
- Strong understanding of and working experience with GCP/Azure/AWS.
- Experience with Ansible/Terraform and Python.
- Operations and incident management.

Desirable Requirements
- Experience developing automation for infrastructure as code using Terraform and CloudFormation templates.
- Basic understanding of application integration and/or data integration (ETL).

Be Bold. Be You. Be Boomi.
We take pride in our culture and core values and are committed to being a place where everyone can be their true, authentic self. Our team members are our most valuable resources, and we look for and encourage diversity in backgrounds, thoughts, life experiences, knowledge, and capabilities. All employment decisions are based on business needs, job requirements, and individual qualifications. Boomi strives to create an inclusive and accessible environment for candidates and employees. If you need accommodation during the application or interview process, please submit a request to talent@boomi.com. This inbox is strictly for accommodations; please do not send resumes or general inquiries.
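For a flavor of the "operations and incident management" Python work such a role involves, here is a generic, hedged sketch of a service health probe with bounded retries. The endpoint URL is hypothetical and not a Boomi API.

```python
# Hedged sketch: probe a service health endpoint with linear backoff, the
# sort of small automation an SRE might wire into monitoring. URL is invented.
import time
import requests

def probe(url: str, attempts: int = 3, backoff: float = 2.0) -> bool:
    for attempt in range(1, attempts + 1):
        try:
            resp = requests.get(url, timeout=5)
            if resp.status_code == 200:
                return True
            print(f"attempt {attempt}: HTTP {resp.status_code}")
        except requests.RequestException as exc:
            print(f"attempt {attempt}: {exc}")
        time.sleep(backoff * attempt)   # wait longer after each failure
    return False

if not probe("https://api.example.internal/healthz"):   # hypothetical URL
    raise SystemExit("service unhealthy; escalate to on-call")
```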

Posted 1 month ago

Apply

2.0 - 6.0 years

8 - 12 Lacs

Pune

Work from Office

Job Summary
- Proficiency with major search engines and platforms such as Coveo, Elasticsearch, Solr, MongoDB Atlas, or similar technologies.
- Experience with Natural Language Processing (NLP) and machine learning techniques for search relevance and personalization.
- Ability to design and implement ranking algorithms and relevance tuning.
- Experience with A/B testing and other methods for optimizing search results.
- Experience analyzing search logs and metrics to understand user behavior and improve search performance.
- Deep understanding of indexing, data storage, and retrieval mechanisms (RAG).
- Experience with data integration, ETL processes, and data normalization.
- Knowledge of scaling search solutions to handle large volumes of data and high query loads.
- Strong knowledge of programming languages like C#/.NET, Python, or JavaScript for developing and customizing search functionality.
- Experience integrating search solutions with various APIs and third-party systems.
- Understanding of how search interfaces impact user experience and ways to improve search usability and efficiency.
- Experience with enterprise-level systems and an understanding of how search integrates with broader IT infrastructure and business processes.
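As a small illustration of the relevance-tuning work listed above, here is a hedged sketch using the official Python Elasticsearch client (8.x keyword-argument style). The index and field names are hypothetical, and in practice the field boosts would come from A/B-tested tuning rather than being hard-coded.

```python
# Hedged sketch: a boosted multi_match query against a hypothetical product
# index, printing relevance scores for the top hits.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")   # local dev endpoint

resp = es.search(
    index="products",                          # hypothetical index
    query={
        "multi_match": {
            "query": "wireless headphones",
            "fields": ["title^3", "description"],   # boost title matches 3x
        }
    },
    size=10,
)

for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```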

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Pune

Work from Office

About the Role
We're looking for a Data Engineer to help build reliable and scalable data pipelines that power reports, dashboards, and business decisions at Hevo. You'll work closely with engineering, product, and business teams to make sure data is accurate, available, and easy to use.

Key Responsibilities
- Independently design and implement scalable ELT workflows using tools like Hevo, dbt, Airflow, and Fivetran.
- Ensure the availability, accuracy, and timeliness of datasets powering analytics, dashboards, and operations.
- Collaborate with Platform and Engineering teams to address issues related to ingestion, schema design, and transformation logic.
- Escalate blockers and upstream issues proactively to minimize delays for stakeholders.
- Maintain strong documentation and ensure discoverability of all models, tables, and dashboards.
- Own end-to-end pipeline quality, minimizing escalations or errors in models and dashboards.
- Implement data observability practices such as freshness checks, lineage tracking, and incident alerts.
- Regularly audit and improve accuracy across business domains.
- Identify gaps in instrumentation, schema evolution, and transformation logic.
- Ensure high availability and data freshness through monitoring, alerting, and incident resolution processes.
- Set up internal SLAs, runbooks, and knowledge bases (data catalog, transformation logic, FAQs).
- Improve onboarding material and templates for future engineers and analysts.

Required Skills & Experience
- 3-5 years of experience in Data Engineering, Analytics Engineering, or related roles.
- Proficient in SQL and Python for data manipulation, automation, and pipeline creation.
- Strong understanding of ELT pipelines, schema management, and data transformation concepts.
- Experience with the modern data stack: dbt, Airflow, Hevo, Fivetran, Snowflake, Redshift, or BigQuery.
- Solid grasp of data warehousing concepts: OLAP/OLTP, star/snowflake schemas, relational and columnar databases.
- Understanding of REST APIs, webhooks, and event-based data ingestion.
- Strong debugging skills and ability to troubleshoot issues across systems.

Preferred Background
- Experience in high-growth industries such as eCommerce, FinTech, or hyper-commerce environments.
- Experience working with or contributing to a data platform (ELT/ETL tools, observability, lineage, etc.).

Core Competencies
- Excellent communication and problem-solving skills
- Attention to detail and a self-starter mindset
- High ownership and urgency in execution
- Collaborative and coachable team player
- Strong prioritization and resilience under pressure
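One of the observability practices named above, a freshness check, can be sketched in a few lines. This is a minimal, hedged version: `conn` is assumed to be any DB-API connection whose driver returns timezone-aware datetimes, and the table, column, and SLA are hypothetical.

```python
# Hedged sketch of a table-freshness SLA check: alert when the newest row in
# a table is older than the agreed SLA. Names are hypothetical.
from datetime import datetime, timedelta, timezone

def check_freshness(conn, table: str, ts_column: str, sla: timedelta) -> None:
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT MAX({ts_column}) FROM {table}")
        latest = cur.fetchone()[0]   # assumed tz-aware datetime
    finally:
        cur.close()
    if latest is None:
        raise RuntimeError(f"{table} is empty")
    age = datetime.now(timezone.utc) - latest
    if age > sla:
        raise RuntimeError(f"{table} is stale: last row is {age} old (SLA {sla})")
    print(f"{table} fresh: last row is {age} old")

# Example (hypothetical table and SLA):
# check_freshness(conn, "fact_orders", "loaded_at", sla=timedelta(hours=1))
```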

Posted 1 month ago

Apply

11.0 - 16.0 years

40 - 45 Lacs

Pune

Work from Office

Role Description
This role is for a Senior Business Functional Analyst in Group Architecture. The role will be instrumental in establishing and maintaining bank-wide data policies, principles, standards, and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers, aligning the target data architecture with the enterprise data architecture principles and applying agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards.

Your key responsibilities
- Data architecture: Work closely with stakeholders to understand their data needs, break business requirements into implementable building blocks, and design the solution's target architecture.
- AI/ML: Identify and support the creation of AI use cases focused on delivering the data architecture strategy and data governance tooling. Identify AI/ML use cases and architect pipelines that integrate data flows, data lineage, and data quality. Embed AI-powered data quality, detection, and metadata enrichment to accelerate data discoverability. Assist in defining and driving the data architecture standards and requirements for AI that need to be enabled and used.
- GCP data architecture and migration: Strong working experience with GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), along with an appropriate GCP architect-level certification. Experience handling hybrid architectures and patterns addressing non-functional requirements such as data residency, compliance (e.g., GDPR), and security and access control. Experience developing reusable components and reference architectures using IaC (Infrastructure as Code) platforms such as Terraform.
- Data Mesh: Proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership, with good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value.
- Data management tooling: Assess various tools and solutions comprising data governance capabilities such as data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in developing the medium- to long-term target state of the technologies within the data governance domain.
- Collaboration: Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions.

Your skills and experience
- Demonstrable experience in designing and deploying AI tooling architectures and use cases.
- Extensive experience in data architecture within financial services.
- Strong technical knowledge of data integration patterns, batch and stream processing, data lake / lakehouse / data warehouse / data mart, caching patterns, and policy-based fine-grained data access.
- Proven experience with data management principles, data governance, data quality, data lineage, and data integration, with a focus on Data Mesh.
- Knowledge of data modelling concepts such as dimensional modelling and 3NF, and experience with systematic, structured review of data models to enforce conformance to standards.
- High-level understanding of data management solutions (e.g., Collibra, Informatica Data Governance).
- Proficiency in data modelling and experience with different data modelling tools.
- Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingestion.
- Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions.
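To ground the GCP/BigQuery side of this role, here is a hedged sketch of running a query with the official Python client library. It requires GCP application-default credentials, and the project, dataset, and table names are hypothetical.

```python
# Hedged sketch: run an aggregate query in BigQuery with the official client.
# Project/dataset/table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()   # uses application-default credentials

sql = """
SELECT region, COUNT(*) AS orders
FROM `my-project.analytics.orders`   -- hypothetical table
GROUP BY region
ORDER BY orders DESC
"""

for row in client.query(sql).result():
    print(row.region, row.orders)
```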

Posted 1 month ago

Apply

1.0 - 3.0 years

10 - 12 Lacs

Bengaluru

Remote

About the Role:
We are looking for a talented Anaplan Modeler to join our team on a contract basis. You will be responsible for designing, developing, and maintaining Anaplan models to support business planning, forecasting, and other analytical processes. This role involves translating business requirements into efficient and scalable Anaplan solutions, collaborating with stakeholders, and ensuring the models meet performance and accuracy standards.

Roles and Responsibilities:
- Must have modelling experience in Anaplan projects, including implementations, upgrades, rollouts, and/or support.
- Comfortable creating models, modules, lists, line items, subsets, and line-item subsets, using calculation functions, and building dashboards using best practices.
- Familiar with Anaplan Optimizer, integration methods, and ALM within Anaplan.
- Ability to hold direct discussions with clients to understand their needs and then design, develop, maintain, and elaborate planning models.
- Anaplan Certified Model Builder certification is a plus.
- Assist in conducting, documenting, and signing off business requirements with clients.
- Assign user stories and assist in sprint planning.
- Hands-on modelling experience in Anaplan implementations focused on, but not limited to, financial forecasting, supply chain planning, and HR/sales/incentive compensation management or similar use cases.
- Strong background and experience in consulting roles focused on sales performance planning, supply chain, or financial planning.
- Familiarity with Scrum/Agile.
- Hands-on with MS Excel, using advanced formulae to develop mock-ups for clients.
- Ability to communicate effectively with client teams and in client-facing roles.

Qualifications:
Any Bachelor's degree in Finance, Accounting, Business, Computer Science, or a related field, or an MBA in Finance.

How DataGrokr will support your growth:
You will be actively encouraged to attain certifications, lead technical workshops, and conduct meetups to grow your technology acumen and personal brand. You will work in an open culture that promotes commitment over compliance, individual responsibility over rules, and bringing out the best in everyone.

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Pune

Remote

Design databases and data warehouses, build Power BI solutions, support enterprise business intelligence, be a strong team player and contributor, and drive continuous improvement. Experience in SQL, Oracle, SSIS/SSRS, Azure, ADF, CI/CD, Power BI, DAX, and Microsoft Fabric.

Required candidate profile: source system data structures, data extraction, data transformation, warehousing, DB administration, query development, and Power BI development. Work from home.

Posted 1 month ago

Apply