4.0 - 6.0 years
7 - 12 Lacs
Gurugram
Work from Office
Role Description:
As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and for troubleshooting any issues that arise.
Roles & Responsibilities:
- Designing, developing, and maintaining large-scale big data platforms using technologies like Hadoop, Spark, and Kafka
- Creating and managing data warehouses, data lakes, and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data
Technical Skills:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring successful completion of tasks and promoting a positive, collaborative, and productive work environment while taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
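For illustration, the kind of pipeline this role describes (Kafka ingestion landing on HDFS via Spark) might look like the minimal PySpark sketch below. Broker address, topic name, and paths are placeholder values, not part of the posting.

```python
# Minimal sketch: read events from a Kafka topic with Spark Structured Streaming
# and land them on HDFS as Parquet. All connection details are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-to-hdfs").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "clickstream")                  # placeholder topic
    .load()
    .select(col("value").cast("string").alias("payload"), col("timestamp"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "hdfs:///data/raw/clickstream")               # placeholder path
    .option("checkpointLocation", "hdfs:///checkpoints/clickstream")
    .start()
)
query.awaitTermination()
```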
Posted 3 weeks ago
9.0 - 14.0 years
40 - 80 Lacs
Bengaluru
Work from Office
About the Role: We are seeking a highly skilled Data Solutions Architect - Business Intelligence & AI to lead the design and delivery of advanced data solutions. This role requires a seasoned professional with deep technical expertise, consulting experience, and leadership capabilities to drive data transformation initiatives. The ideal candidate will play a pivotal role in architecting scalable data platforms, enabling AI-driven automation, and mentoring a team of data engineers and analysts.
Posted 3 weeks ago
5.0 - 7.0 years
10 - 14 Lacs
Pune
Work from Office
Project description:
A Tagetik Developer is responsible for designing, developing, and maintaining financial performance management solutions using the Tagetik platform. This role involves working closely with finance and IT teams to ensure the effective implementation and support of Tagetik applications.
Responsibilities:
- Development and Customization: Develop and customize Tagetik applications, including creating processes, workflows, and ETL (Extract, Transform, Load) solutions.
- Reporting and Analytics: Design and develop financial reports and dashboards using Tagetik's reporting tools and SQL queries.
- Data Integration: Manage data integration processes, ensuring accurate data flow between Tagetik and other systems.
- System Configuration: Configure Tagetik applications to meet business requirements, including setting up financial models, budgeting, and forecasting modules.
- Support and Maintenance: Provide ongoing support and maintenance for Tagetik applications, including troubleshooting issues and implementing enhancements.
- User Training: Conduct training sessions for end users to ensure they are proficient in using Tagetik applications.
- Collaboration: Work closely with functional heads and stakeholders to understand requirements and deliver solutions.
Skills (Must Have):
- Overall IT experience of 5-7 years, with a minimum of 3 years in Tagetik application development.
- Hands-on experience with Analytical Information Hub (AIH).
- Expertise in ETL and DTPs.
- Experience in developing Forms and Reports.
- Proficiency in Tagetik Workflow, Data Processing, and Process Cockpit.
- Hands-on experience with JBoss and Control-M is an added advantage.
- Problem-Solving: Ability to investigate and resolve complex problems.
- Documentation: Experience in creating technical design documents, unit test scripts, and code migration documents.
- Implementation: Minimum of 2 implementation experiences.
- Support: Deliver functional and technical Tagetik consolidation to support client needs.
Preferred Qualifications:
- Experience with financial modeling and forecasting within the Tagetik platform.
- Knowledge of budgeting, planning, and consolidation processes.
- Familiarity with other financial performance management tools and technologies.
Nice to have: N/A
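As a rough, platform-agnostic illustration (not Tagetik-specific code), the SQL behind a typical financial report compares actuals to budget by entity and period. The schema and values below are invented for the example.

```python
# Minimal sketch: an actuals-vs-budget variance query of the kind a financial
# report or dashboard is built on, run here against an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fact_finance (entity TEXT, period TEXT, scenario TEXT, amount REAL);
INSERT INTO fact_finance VALUES
  ('IN01', '2024-01', 'ACTUAL', 120.0),
  ('IN01', '2024-01', 'BUDGET', 100.0),
  ('IN02', '2024-01', 'ACTUAL',  80.0),
  ('IN02', '2024-01', 'BUDGET',  90.0);
""")

report = conn.execute("""
SELECT entity, period,
       SUM(CASE WHEN scenario = 'ACTUAL' THEN amount END) AS actual,
       SUM(CASE WHEN scenario = 'BUDGET' THEN amount END) AS budget,
       SUM(CASE WHEN scenario = 'ACTUAL' THEN amount END)
         - SUM(CASE WHEN scenario = 'BUDGET' THEN amount END) AS variance
FROM fact_finance
GROUP BY entity, period
ORDER BY entity
""").fetchall()

for row in report:
    print(row)
```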
Posted 3 weeks ago
5.0 - 7.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project description:
A Tagetik Developer is responsible for designing, developing, and maintaining financial performance management solutions using the Tagetik platform. This role involves working closely with finance and IT teams to ensure the effective implementation and support of Tagetik applications.
Responsibilities:
- Development and Customization: Develop and customize Tagetik applications, including creating processes, workflows, and ETL (Extract, Transform, Load) solutions.
- Reporting and Analytics: Design and develop financial reports and dashboards using Tagetik's reporting tools and SQL queries.
- Data Integration: Manage data integration processes, ensuring accurate data flow between Tagetik and other systems.
- System Configuration: Configure Tagetik applications to meet business requirements, including setting up financial models, budgeting, and forecasting modules.
- Support and Maintenance: Provide ongoing support and maintenance for Tagetik applications, including troubleshooting issues and implementing enhancements.
- User Training: Conduct training sessions for end users to ensure they are proficient in using Tagetik applications.
- Collaboration: Work closely with functional heads and stakeholders to understand requirements and deliver solutions.
Skills (Must Have):
- Overall IT experience of 5-7 years, with a minimum of 3 years in Tagetik application development.
- Hands-on experience with Analytical Information Hub (AIH).
- Expertise in ETL and DTPs.
- Experience in developing Forms and Reports.
- Proficiency in Tagetik Workflow, Data Processing, and Process Cockpit.
- Hands-on experience with JBoss and Control-M is an added advantage.
- Problem-Solving: Ability to investigate and resolve complex problems.
- Documentation: Experience in creating technical design documents, unit test scripts, and code migration documents.
- Implementation: Minimum of 2 implementation experiences.
- Support: Deliver functional and technical Tagetik consolidation to support client needs.
Preferred Qualifications:
- Experience with financial modeling and forecasting within the Tagetik platform.
- Knowledge of budgeting, planning, and consolidation processes.
- Familiarity with other financial performance management tools and technologies.
Nice to have: N/A
Posted 3 weeks ago
4.0 - 6.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Role Description:
As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and for troubleshooting any issues that arise.
Roles & Responsibilities:
- Designing, developing, and maintaining large-scale big data platforms using technologies like Hadoop, Spark, and Kafka
- Creating and managing data warehouses, data lakes, and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data
Technical Skills:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring successful completion of tasks and promoting a positive, collaborative, and productive work environment while taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
Role Description:
As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and for troubleshooting any issues that arise.
Roles & Responsibilities:
- Designing, developing, and maintaining large-scale big data platforms using technologies like Hadoop, Spark, and Kafka
- Creating and managing data warehouses, data lakes, and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data
Technical Skills:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring successful completion of tasks and promoting a positive, collaborative, and productive work environment while taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
Gurugram
Work from Office
Role Description:
As a Software Engineer - Data Reporting Services at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.
Roles & Responsibilities:
- Design and develop reports and dashboards that help businesses make data-driven decisions.
- Develop data models and perform data analysis to identify trends and insights.
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs.
- Demonstrate proficiency in data visualization tools like Tableau, Power BI, and QlikView.
Technical Skills:
- Strong knowledge of SQL and of data querying and visualization tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
Qualifications:
- 3-5 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
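As a small illustration of the data-preparation work that typically sits behind a Tableau or Power BI dashboard, the sketch below aggregates raw records into a monthly trend table for the BI tool to consume. Column names and the output path are assumptions for the example.

```python
# Minimal sketch: shape raw order records into a monthly revenue trend that a
# dashboard can visualize. Columns and file names are placeholders.
import pandas as pd

orders = pd.DataFrame({
    "order_date": pd.to_datetime(["2024-01-05", "2024-01-20", "2024-02-03", "2024-02-15"]),
    "region": ["North", "South", "North", "South"],
    "revenue": [1200.0, 800.0, 1500.0, 950.0],
})

monthly_trend = (
    orders
    .assign(month=orders["order_date"].dt.to_period("M").astype(str))
    .groupby(["month", "region"], as_index=False)["revenue"]
    .sum()
    .sort_values(["month", "region"])
)

# Export for the BI tool to pick up (path is a placeholder).
monthly_trend.to_csv("monthly_revenue_trend.csv", index=False)
print(monthly_trend)
```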
Posted 3 weeks ago
4.0 - 6.0 years
7 - 12 Lacs
Gurugram
Work from Office
Role Description:
As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and for troubleshooting any issues that arise.
Roles & Responsibilities:
- Designing, developing, and maintaining large-scale big data platforms using technologies like Hadoop, Spark, and Kafka
- Creating and managing data warehouses, data lakes, and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data
Technical Skills:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring successful completion of tasks and promoting a positive, collaborative, and productive work environment while taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
4.0 - 6.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Role Description:
As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.
Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices
Technical Skills:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring successful completion of tasks and promoting a positive, collaborative, and productive work environment while taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
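For illustration, the transformation techniques named above (filtering, aggregation, joining) look roughly like the pandas sketch below. File names and columns are placeholders, not part of the posting.

```python
# Minimal ETL sketch: filter, aggregate, join, then write the transformed set.
import pandas as pd

customers = pd.read_csv("customers.csv")        # placeholder source extract
transactions = pd.read_csv("transactions.csv")  # placeholder source extract

# Filter: keep only completed transactions.
completed = transactions[transactions["status"] == "COMPLETED"]

# Aggregate: total spend per customer.
spend = completed.groupby("customer_id", as_index=False)["amount"].sum()

# Join: enrich the aggregate with customer attributes before loading.
enriched = spend.merge(customers, on="customer_id", how="left")

# Load: write the transformed data set for the downstream/target system.
enriched.to_parquet("customer_spend.parquet", index=False)
```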
Posted 3 weeks ago
7.0 - 9.0 years
7 - 11 Lacs
Gurugram
Work from Office
Role Description:
As a Technical Lead - Data Reporting Services at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.
Roles & Responsibilities:
- Design and develop reports and dashboards that help businesses make data-driven decisions.
- Develop data models and perform data analysis to identify trends and insights.
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs.
- Demonstrate proficiency in data visualization tools like Tableau, Power BI, and QlikView.
Technical Skills:
- Strong knowledge of SQL and of data querying and visualization tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Openness to new ideas and a willingness to learn and develop new skills; able to work well under pressure and manage multiple tasks and priorities
Qualifications:
- 7-9 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
4.0 - 8.0 years
3 - 7 Lacs
Jharkhand
Remote
Job Summary:
We are looking for a skilled Data Engineer with hands-on experience in SAP Data Services (SAP DS) and Snowflake to join our growing data engineering team. In this role, you will be responsible for designing, building, and maintaining data integration pipelines and ETL processes that move and transform data from SAP and other source systems into our Snowflake data warehouse.
Key Responsibilities:
- Design, develop, and manage ETL workflows and jobs using SAP Data Services to extract, transform, and load data from various source systems (especially SAP ERP/SAP BW) into Snowflake.
- Implement data ingestion, transformation, and load strategies into Snowflake, ensuring high performance and scalability.
- Create and maintain Snowflake objects (e.g., tables, views, stages, file formats, procedures).
- Monitor and optimize ETL job performance and troubleshoot data pipeline issues.
- Ensure data quality, consistency, and reliability throughout the pipeline.
- Collaborate with business and analytics teams to understand data needs and deliver solutions.
- Maintain documentation related to data mappings, workflows, job designs, and data dictionaries.
- Support data governance, compliance, and security initiatives.
Required Skills and Qualifications:
- 3+ years of experience working with SAP Data Services (BODS) for ETL development.
- 2+ years of hands-on experience with Snowflake SQL development, performance tuning, and architecture.
- Strong experience with data modeling, especially in a cloud data warehouse environment.
- Solid understanding of ETL best practices, error handling, and performance optimization.
- Experience in integrating data from SAP ECC, SAP BW, or other enterprise systems.
- Strong SQL skills and experience working with structured and semi-structured data (e.g., JSON, XML).
- Knowledge of data warehousing principles and methodologies.
- Strong analytical and problem-solving skills.
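As a rough illustration of the Snowflake side of such a pipeline, the sketch below creates a staging table and stage and loads a file with the snowflake-connector-python package. Account, credentials, object names, and the local file are placeholders; in the role described here the extract would normally arrive from SAP Data Services rather than a local CSV.

```python
# Minimal sketch: create Snowflake objects and load a staged file.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # placeholder
    user="etl_user",          # placeholder
    password="***",           # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

cur.execute("CREATE TABLE IF NOT EXISTS stg_sales (id NUMBER, amount NUMBER(18,2), sold_at TIMESTAMP_NTZ)")
cur.execute("CREATE STAGE IF NOT EXISTS sales_stage FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
cur.execute("PUT file:///tmp/sales.csv @sales_stage")   # upload a local extract to the stage
cur.execute("COPY INTO stg_sales FROM @sales_stage")    # load the staged file into the table

cur.close()
conn.close()
```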
Posted 3 weeks ago
5.0 - 10.0 years
22 - 30 Lacs
Pune
Hybrid
Job Description:
We are looking for a skilled Oracle EPM Cloud Consultant with a minimum of 8 years of experience to join our dynamic team. As a valuable team member, you will contribute your knowledge and domain experience within EPM Cloud products. This is an individual contributor role that involves collaboration with a global team of functional leaders, business partners, and technical experts to integrate Oracle technologies effectively, delivering superior business impact.
Responsibilities:
- Gather requirements, design, develop, document, and support financial planning, budgeting, and reporting solutions using Oracle Cloud Planning modules, custom designs, or Free Form.
- Construct robust integrations with Oracle Fusion GL, FCCS, and EDMCS, with automation via Groovy, REST API, and EPM Automate.
- Hands-on experience in Tax Reporting Cloud Services (TRCS) and Environmental, Social and Governance (ESG) modules.
- Data integration is a must-have skill, along with pipeline design and mapping.
- Attend client meetings and translate requirements into design solutions.
- Support customers during the month-end close to meet deadlines for external reporting; must have worked through month-end close (MEC) challenges and understand their criticality.
- Act as a subject matter expert, guiding moderately complex activities for successful project implementation.
- Develop technical and functional Oracle EPM Cloud admin and user documentation.
- Provide coaching/training to junior staff and actively manage personal and professional development.
Essential Skills:
- Minimum 5 years of experience in the Oracle EPM and data integration space.
- Hands-on experience implementing EPM Cloud applications, specifically using PBCS/Freeform (Enterprise Planning and Budgeting Cloud Services).
- At least one project implementation experience in the TRCS or ESG module.
- Solid experience with integrations (Data Integration / Data Management) of EPM Cloud applications.
- Experience with business rules, EPM Automate, Groovy scripting, and REST APIs.
- Strong command of Essbase ASO and BSO concepts.
- Experience troubleshooting SmartView-related issues.
- Ability to work independently and to identify, troubleshoot, and resolve issues proactively.
- Openness to adopting new technologies.
- Oracle EPM certification is an added advantage.
Preferred Skills:
- Experience with other Oracle EPM Cloud applications (preferably FCCS and ARCS) is a plus.
- A strong finance domain background is a plus.
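The REST-driven automation mentioned above can be sketched in Python with the requests library. The URL path, job type, and payload below are illustrative placeholders only, not documented endpoints; the actual routes and parameters come from the Oracle EPM REST API reference for the specific pod.

```python
# Hedged sketch: trigger an EPM Cloud job over REST. Endpoint and payload are
# hypothetical placeholders, not verified Oracle API routes.
import requests

BASE_URL = "https://example-epm.ocs.oraclecloud.com"   # placeholder pod URL
AUTH = ("service_admin@example.com", "***")            # use a vault in practice

payload = {
    "jobType": "RULES",        # hypothetical job definition
    "jobName": "AggregatePlan",
}

resp = requests.post(
    f"{BASE_URL}/HyperionPlanning/rest/v3/applications/PLAN/jobs",  # illustrative path
    json=payload,
    auth=AUTH,
    timeout=60,
)
resp.raise_for_status()
print(resp.json())   # job id / status returned by the service
```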
Posted 3 weeks ago
2.0 - 7.0 years
7 - 11 Lacs
Noida
Work from Office
Key Skills: Critical Thinking, Testing Concepts, ETL Testing, Python Experience
Nice to Have: API Understanding & Testing (Manual/Automation), UI Automation (able to identify UI elements programmatically, e.g., for Selenium)
Detailed Description:
- Critical Thinking - 5/5: High in logical reasoning and proactiveness; should come up with diverse test cases against requirements.
- Testing Concepts - 5/5: Practices various test design techniques; clear on priority vs. severity; understands the testing life cycle and defect management; understands regression vs. functional testing.
- SQL/ETL/Batch - 4/5: Able to write SQL statements with aggregate functions and joins; understands data transformation; familiar with data loads and related validations.
- Automation - 3/5: Should be able to solve a given problem programmatically; familiar with coding standards, version control, and pipelines; able to identify UI elements programmatically.
- API - 2/5: Understands how APIs work, various authorization mechanisms, and validation of responses.
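For illustration, one common ETL-testing pattern implied by the SQL/ETL/Batch item is comparing counts, aggregates, and row-level presence between source and target. The sketch below uses an in-memory SQLite database with invented table names.

```python
# Minimal sketch: validate an ETL load by comparing source and target with SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (id INTEGER, amount REAL);
CREATE TABLE tgt_orders (id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

src_count, src_sum = conn.execute("SELECT COUNT(*), SUM(amount) FROM src_orders").fetchone()
tgt_count, tgt_sum = conn.execute("SELECT COUNT(*), SUM(amount) FROM tgt_orders").fetchone()
assert (src_count, src_sum) == (tgt_count, tgt_sum), "Load mismatch between source and target"

# Row-level check: source rows missing from the target (anti-join).
missing = conn.execute("""
SELECT s.id FROM src_orders s
LEFT JOIN tgt_orders t ON t.id = s.id
WHERE t.id IS NULL
""").fetchall()
assert not missing, f"Rows missing in target: {missing}"
print("ETL validation passed")
```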
Posted 3 weeks ago
4.0 - 7.0 years
8 - 12 Lacs
Noida
Work from Office
Responsibilities:
- Database designing, data modelling, and core component implementation
- Data integration and relational data modelling
- Optimization and performance tuning
- Automating backup and purging processes
- Reverse engineering of existing applications
- Basic SQL
- Requirement gathering and technical documentation
- Data analysis for migration of database objects and application data
- Data analysis for Postgres or Oracle DB
Good to Have:
- Basic knowledge of cloud architecture, Snowflake, and Python
- Development/modification of major database components
- Data transformation using SSIS
Mandatory Competencies:
- Database - Oracle - Data Modelling
- Database - Database Programming - SQL
- Beh - Communication
- Database - SQL Server - DBA
- Database - Other Databases - PostgreSQL
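As one illustration of the "automating backup and purging processes" item, a retention-based purge job can be scripted against PostgreSQL. The DSN, table, column, and retention period below are assumptions for the sketch.

```python
# Minimal sketch: purge rows older than a retention window in small batches
# so the job stays light on locks. All names and values are placeholders.
import psycopg2

RETENTION_DAYS = 180      # assumed retention policy
BATCH_SIZE = 10_000

conn = psycopg2.connect("dbname=appdb user=maint host=localhost")  # placeholder DSN
conn.autocommit = True

with conn.cursor() as cur:
    while True:
        cur.execute(
            """
            DELETE FROM audit_log
            WHERE ctid IN (
                SELECT ctid FROM audit_log
                WHERE created_at < now() - %s * interval '1 day'
                LIMIT %s
            )
            """,
            (RETENTION_DAYS, BATCH_SIZE),
        )
        if cur.rowcount == 0:   # nothing left to purge
            break
conn.close()
```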
Posted 3 weeks ago
4.0 - 6.0 years
6 - 10 Lacs
Hyderabad
Work from Office
As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and for troubleshooting any issues that arise.
Roles & Responsibilities:
- Designing, developing, and maintaining large-scale big data platforms using technologies like Hadoop, Spark, and Kafka
- Creating and managing data warehouses, data lakes, and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data
Technical Skills:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring successful completion of tasks and promoting a positive, collaborative, and productive work environment while taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
4.0 - 6.0 years
6 - 10 Lacs
Chennai
Work from Office
As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.
Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices
Technical Skills:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring successful completion of tasks and promoting a positive, collaborative, and productive work environment while taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
Role Description:
As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.
Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices
Technical Skills:
- Proficiency in programming languages such as Python for writing ETL scripts
- Knowledge of data transformation techniques such as filtering, aggregation, and joining
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica
- Understanding of data profiling, data quality, and data validation techniques
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring successful completion of tasks and promoting a positive, collaborative, and productive work environment while taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
2.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
You will be joining Birlasoft, a global leader in Cloud, AI, and Digital technologies, known for seamlessly blending domain expertise with enterprise solutions. As part of the diversified CKA Birla Group, with over 12,000 professionals, you will contribute to the Group's 170-year heritage of building sustainable communities.
We are currently seeking a Matillion Lead with 6-10 years of experience to join our team in PAN India. The ideal candidate should hold a B.E/B.Tech degree and have the following skills and qualifications:
- Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud.
- 10+ years of experience in DWH, with 2-4 years of experience implementing DWH on Snowflake using Matillion.
- Design, develop, and maintain ETL processes using Matillion for data extraction, transformation, and loading into Snowflake.
- Collaborate with data architects and business stakeholders to understand data requirements and provide technical solutions.
- Lead the end-to-end system and architecture design for applications and infrastructure.
- Conduct data validation, end-to-end testing of ETL objects, source data analysis, and data profiling.
- Troubleshoot and resolve issues related to Matillion development and data integration.
- Work with business users to create an architecture aligned with business needs.
- Develop project requirements for end-to-end data integration processes using ETL for structured, semi-structured, and unstructured data.
- Strong understanding of ELT/ETL and integration concepts, as well as design best practices.
- Experience in performance tuning of Matillion Cloud data pipelines and quick issue resolution.
- Knowledge of SnowSQL and Snowpipe will be an added advantage.
- Follow best practices for Matillion development to ensure maintainability, scalability, and reusability of ETL processes.
In addition to the technical skills, the ideal candidate should possess excellent presentation and communication skills to interact with both technical and non-technical stakeholders. Resilience and the ability to thrive in a dynamic, fast-paced work environment are also essential for this role.
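For illustration of the Snowpipe knowledge mentioned above (Matillion itself is configured in its own UI rather than in code), a continuous-ingestion pipe can be created through snowflake-connector-python. Account details and all object names are placeholders; the target table is assumed to accept the loaded JSON (e.g., a VARIANT column), and AUTO_INGEST relies on an external stage with cloud event notifications configured.

```python
# Minimal sketch: create a Snowpipe that auto-loads files from an external stage.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",   # placeholders
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()
cur.execute("""
    CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw_events
    FROM @raw_events_stage
    FILE_FORMAT = (TYPE = JSON)
""")
cur.execute("SHOW PIPES LIKE 'RAW_EVENTS_PIPE'")
print(cur.fetchall())
cur.close()
conn.close()
```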
Posted 3 weeks ago
10.0 - 15.0 years
0 Lacs
Maharashtra
On-site
You should have hands-on experience in setting up questionnaires and job requisitions, and in configuring job posting templates, career sites, and the Job Applications business process.
- Identify opportunities for automation and process improvements.
- Perform HRIS operational duties for Workday HCM, the Absence module, and Recruiting.
- Ensure data integrity, security, and compliance within Workday applications.
- Collaborate with HR and IT teams to align on business requirements and system enhancements.
- Manage and mentor a team of Workday specialists.
- Develop and maintain comprehensive documentation and training materials.
- Support business development activities and client engagements.
- Complete knowledge of using Workday Community is required.
- Ability to work with clients and drive design sessions for various HCM/Recruiting areas.
- Hands-on experience in business process configuration and building validation/condition rules.
- Understanding of HR programs and policies, along with a commitment to HR principles of confidentiality.
- Data analysis and report building skills are essential.
- Experience in creating and troubleshooting EIBs (Enterprise Interface Builder).
- Ability to help customers resolve functional issues.
- In-depth knowledge of the Workday security framework, calculated fields, custom reports, and setting up notifications.
- Understanding of various data sources and how to use them.
- Analyze Workday release updates to understand the impact of feature changes.
- Maintain the highest regulatory and compliance standards in handling employee records.
- Comprehensive experience in managing the full recruitment lifecycle within Workday.
- Must have created inbound and outbound integrations using Workday Studio, Core/Cloud Connectors, EIBs, and Document Transformation processes.
- Prior experience in Compensation & Benefits, Performance Management, and the Annual Compensation Review cycle is preferred.
Qualifications:
- Bachelor's degree in a relevant field.
- Total 10-15 years of work experience with a background in team handling.
- Excellent project management and leadership skills.
- Excellent communication skills.
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
Haryana
On-site
You will play a crucial role as a Data Modeler at Sun Life, where you will be responsible for designing and implementing new data structures to support project teams working on ETL, data warehouse design, managing the enterprise data model, data maintenance, and enterprise data integration approaches. Your technical responsibilities will include building and maintaining data models to report disparate data sets reliably, consistently, and in an interpretable manner. You will gather, distil, and harmonize data requirements to design conceptual, logical, and physical data models, as well as develop source-to-target mappings with complex ETL transformations. In this role, you will contribute to requirement analysis and database design, both in transactional and dimensional data modeling. You will work independently on data warehouse projects, collaborate with data consumers and suppliers to understand detailed requirements, and propose standardized data models. Additionally, you will help improve Data Management data models and facilitate discussions to understand business requirements and develop dimension data models based on industry best practices. To be successful in this position, you should have extensive practical experience in Information Technology and software development projects, with a minimum of 8 years of experience in designing operational data stores and data warehouses. Proficiency in data modeling tools such as Erwin or SAP Power Designer, a strong understanding of ETL and data warehouse concepts, and the ability to write complex SQL for data transformations and profiling are essential. Furthermore, you should possess a combination of solid business knowledge, technical expertise, excellent analytical and logical thinking, and strong communication skills. It would be advantageous if you have an understanding of the Insurance Domain, basic knowledge of AWS cloud services, experience with Master Data Management, Data Quality, Data Governance, and data visualization tools like SAS VA and Tableau. Familiarity with implementing and architecting data solutions using tools like Informatica, SQL Server, or Oracle is also beneficial. Join Sun Life's Advanced Analytics team and embark on a rewarding journey where you can contribute to making a positive impact on individuals, families, and communities worldwide.,
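As a small illustration of the source-to-target mappings and dimensional models this role produces (tables and columns invented, with an insurance flavor matching the posting), the sketch below expresses a conformed dimension plus a fact keyed on it in plain SQL and runs it against an in-memory SQLite database.

```python
# Minimal sketch: a source table mapped into a dimension and fact (star schema).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Source (operational) table
CREATE TABLE src_policy (policy_no TEXT, holder_name TEXT, premium REAL, start_dt TEXT);
INSERT INTO src_policy VALUES ('P001', 'A. Rao', 5000.0, '2024-01-01');

-- Target dimensional model
CREATE TABLE dim_policy (policy_key INTEGER PRIMARY KEY AUTOINCREMENT,
                         policy_no TEXT UNIQUE, holder_name TEXT);
CREATE TABLE fact_premium (policy_key INTEGER, premium REAL, start_dt TEXT);

-- Source-to-target transformation
INSERT INTO dim_policy (policy_no, holder_name)
  SELECT DISTINCT policy_no, holder_name FROM src_policy;

INSERT INTO fact_premium (policy_key, premium, start_dt)
  SELECT d.policy_key, s.premium, s.start_dt
  FROM src_policy s JOIN dim_policy d ON d.policy_no = s.policy_no;
""")
print(conn.execute("SELECT * FROM fact_premium").fetchall())
```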
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
Your role includes configuring CJA setups, managing data integrity in Adobe Experience Platform (AEP), and developing actionable insights via dashboards and reports. Additionally, you will integrate CJA with other Adobe Experience Cloud solutions and ensure best practices for data governance. Data integration and management involve pulling data from various sources and preparing it for analysis within Adobe CJA. Customer journey mapping includes visualizing customer interactions across touchpoints to identify key moments and potential pain points. Advanced analytics will require utilizing CJA features like path analysis, cohort analysis, and attribution modeling to understand customer behavior and campaign effectiveness. You will also be responsible for dashboard development, creating interactive dashboards to present key customer journey insights to stakeholders, and reporting and analysis, interpreting data to generate actionable insights and recommendations for improving customer experience.
Your profile should demonstrate technical expertise with a strong understanding of the Adobe Analytics platform, including CJA features, data models, and analysis techniques. Data analysis skills are essential, requiring proficiency in data manipulation, statistical analysis, and data visualization. Dashboard development skills are also necessary for creating interactive dashboards to present key customer journey insights to stakeholders.
What you'll love about working here is the opportunity to shape your career with a range of career paths and internal opportunities within the Capgemini group, as well as personalized career guidance from leaders. You will receive comprehensive wellness benefits, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, or new parent support via flexible work. You will also have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. Capgemini is committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging, where you are valued for who you are and can bring your original self to work. Every Monday, you can kick off the week with a musical performance by the in-house band, The Rubber Band, and participate in internal sports events, yoga challenges, or marathons. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges.
About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With a responsible and diverse group of 340,000 team members in more than 50 countries, Capgemini has a strong over 55-year heritage and is trusted by clients to unlock the value of technology to address the entire breadth of their business needs. Capgemini delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with deep industry expertise and partner ecosystem.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
You are a highly motivated CDP Analyst with over 3 years of experience in Customer Data Platforms (CDP) and Marketing Automation tools. Your expertise lies in Treasure Data CDP, where you have hands-on experience in data integration, audience segmentation, and activation across various marketing channels. Your role will involve managing data pipelines, constructing customer profiles, and assisting marketing teams in delivering personalized customer experiences.
Your primary responsibilities will include:
- Implementing, configuring, and managing Treasure Data CDP for collecting, unifying, and activating customer data.
- Developing and maintaining data pipelines to ingest data from multiple sources into the CDP.
- Creating and managing audience segments based on behavioral, transactional, and demographic data.
- Collaborating closely with Marketing and CRM teams to integrate CDP data with various marketing automation platforms.
- Setting up and monitoring data activation workflows to ensure accurate targeting across paid media, email, SMS, and push notifications.
- Leveraging CDP data to generate actionable customer insights and support campaign personalization.
- Monitoring data quality, creating reports, and optimizing customer journeys based on performance data.
- Partnering with Data Engineering, Marketing, and IT teams to enhance data strategies and address data challenges.
To excel in this role, you should possess:
- 3+ years of experience working with CDP platforms, preferably Treasure Data, and Marketing Automation tools.
- Strong SQL skills for querying and data manipulation.
- Experience in integrating CDP with marketing channels such as Google Ads, Meta, Salesforce, Braze, etc.
- Familiarity with APIs, ETL processes, and data pipelines.
- Knowledge of customer journey orchestration and data privacy regulations like GDPR and CCPA.
- Strong analytical and problem-solving abilities.
- Excellent communication skills and the ability to collaborate effectively with cross-functional teams.
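As a rough illustration of the segmentation SQL such a role writes (schema and values invented; on a CDP like Treasure Data the same idea would run as a Presto-style query rather than against SQLite), the sketch below selects customers with recent purchase behavior for activation.

```python
# Minimal sketch: behavioral audience segmentation with SQL.
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (customer_id TEXT, event_type TEXT, event_time TEXT, amount REAL);
INSERT INTO events VALUES
  ('c1', 'purchase', '2024-06-01', 120.0),
  ('c1', 'purchase', '2024-06-20', 80.0),
  ('c2', 'page_view', '2024-06-22', NULL),
  ('c3', 'purchase', '2024-03-10', 40.0);
""")

# Segment: customers with at least one purchase in the last 30 days.
cutoff = (datetime(2024, 6, 30) - timedelta(days=30)).strftime("%Y-%m-%d")
segment = conn.execute("""
SELECT customer_id, COUNT(*) AS purchases, SUM(amount) AS spend
FROM events
WHERE event_type = 'purchase' AND event_time >= ?
GROUP BY customer_id
HAVING COUNT(*) >= 1
""", (cutoff,)).fetchall()

print(segment)   # e.g. [('c1', 2, 200.0)] -> activate via email / paid-media connectors
```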
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
The position of Visualization & Interactions Science Consultant at S&C Global Network - AI - CG&S requires a skilled individual with a background in data visualization, human-computer interaction, and user experience design. As part of the team, you will be responsible for collaborating with CG&S clients and stakeholders to understand business requirements and translate them into impactful visualization and interaction design solutions. Your role will also involve providing consultation on visualization and interaction design principles, running codesign sessions, developing interactive data visualizations and dashboards, and conducting user research and usability testing to enhance the usability and effectiveness of visualization products. In this role, you will be expected to apply principles of data storytelling and narrative visualization to create compelling and informative data-driven stories. You will also be required to stay updated on emerging trends and best practices in data visualization, human-computer interaction, and user experience design, and share your insights with the team. Collaboration with data scientists, analysts, and engineers to integrate visualizations into data-driven applications and products is also a key aspect of the position. To excel in this role, you should possess relevant experience in the required domain, strong analytical and problem-solving skills, as well as effective communication skills. The ability to thrive in a fast-paced, dynamic environment is essential for success in this position. This opportunity offers the chance to work on innovative projects and provides avenues for career growth and leadership exposure. If you are passionate about leveraging data-driven insights to create impactful visualizations and interactive experiences for businesses, this role presents an exciting opportunity to contribute to strategic initiatives and business transformations with your expertise in visualization and interaction design.,
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
As a Python API Developer specializing in Product Development, you will leverage your 4+ years of experience to design, develop, and maintain high-performance, scalable APIs that drive our Generative AI products. Your role will involve close collaboration with data scientists, machine learning engineers, and product teams to seamlessly integrate Generative AI models (e.g., GPT, GANs, DALL-E) into production-ready applications. Your expertise in backend development, Python programming, and API design will be crucial in ensuring the successful deployment and execution of AI-driven features. You should hold a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field. Your professional experience should demonstrate hands-on involvement in designing and developing APIs, particularly with Generative AI models or machine learning models in a production environment. Proficiency in cloud-based infrastructures (AWS, Google Cloud, Azure) and services for deploying backend systems and AI models is essential. Additionally, you should have a strong background in working with backend frameworks and languages like Python, Django, Flask, or FastAPI. Your core technical skills include expertise in Python for backend development using frameworks such as Flask, Django, or FastAPI. You should possess a strong understanding of building and consuming RESTful APIs or GraphQL APIs, along with experience in designing and implementing API architectures. Familiarity with database management systems (SQL/NoSQL) like PostgreSQL, MySQL, MongoDB, Redis, and knowledge of cloud infrastructure (e.g., AWS, Google Cloud, Azure) are required. Experience with CI/CD pipelines, version control tools like Git, and Agile development methodologies is crucial for automating deployments and ensuring efficient backend operations. Key responsibilities will involve closely collaborating with AI/ML engineers to integrate Generative AI models into backend services, handling data pipelines for real-time or batch processing, and engaging in design discussions to ensure technical feasibility and scalability of features. Implementing caching mechanisms, rate-limiting, and queueing systems to manage AI-related API requests, as well as ensuring backend services can handle high concurrency during resource-intensive generative AI processes, will be essential. Your problem-solving skills, excellent communication abilities for interacting with cross-functional teams, and adaptability to stay updated on the latest technologies and trends in generative AI will be critical for success in this role.,
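For illustration, the backend pattern this posting describes can be sketched as a FastAPI endpoint fronting a generative model. The model call is stubbed; in a real service it would invoke the team's deployed model or an external inference API, and the route names are assumptions for the example.

```python
# Minimal sketch: a FastAPI endpoint wrapping a (stubbed) generative model call.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class GenerateRequest(BaseModel):
    prompt: str
    max_tokens: int = 128

class GenerateResponse(BaseModel):
    text: str

def run_model(prompt: str, max_tokens: int) -> str:
    # Placeholder for the actual model invocation (e.g., a hosted LLM endpoint).
    return f"[generated continuation of: {prompt[:40]}...]"

@app.post("/v1/generate", response_model=GenerateResponse)
def generate(req: GenerateRequest) -> GenerateResponse:
    return GenerateResponse(text=run_model(req.prompt, req.max_tokens))

# Run locally with:  uvicorn app:app --reload   (assuming this file is app.py)
```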
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
You will play a critical role as a CRM Administrator at NxtWave, where you will be responsible for managing and optimizing the CRM system to support the sales, marketing, and customer service teams effectively. Your main focus will be on maintaining data integrity, user management, system integrations, and driving continuous improvements to enhance CRM performance and user experience. Collaborating closely with key stakeholders, you will align the CRM system with business goals to ensure it delivers value to the organization. Your responsibilities will include customizing and configuring the CRM system, setting up workflows and automations, managing data accuracy and integrity, providing user support and training, integrating the CRM system with other tools, and ensuring compliance with data security measures and industry regulations. To excel in this role, you should be detail-oriented, a problem solver, collaborative, adaptable, proactive, an effective communicator, and a continuous learner. A Bachelor's degree is required, along with proven experience as a CRM Administrator or in a similar role. Strong communication and interpersonal skills are essential, and proficiency in CRM configuration, data management, user management, and technical skills such as data integration and CRM tools will be advantageous. Joining NxtWave will provide you with opportunities for professional development, a supportive and innovative environment that values continuous improvement, and a chance to contribute to the success of the organization by ensuring the CRM system delivers value to teams and customers.,
Posted 3 weeks ago