3.0 - 6.0 years
4 - 8 Lacs
Ghaziabad
Work from Office
The resource should have at least 4 to 5 years of hands-on Alteryx development experience, creating workflows and scheduling them. They will be responsible for designing, developing, validating, and troubleshooting ETL workflows that draw data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation will also be part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.
Posted 3 weeks ago
3.0 - 6.0 years
4 - 8 Lacs
Greater Noida
Work from Office
The resource should have at least 4 to 5 years of hands-on Alteryx development experience, creating workflows and scheduling them. They will be responsible for designing, developing, validating, and troubleshooting ETL workflows that draw data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation will also be part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.
Posted 3 weeks ago
3.0 - 6.0 years
4 - 8 Lacs
Noida
Work from Office
The resource should have at least 4 to 5 years of hands-on Alteryx development experience, creating workflows and scheduling them. They will be responsible for designing, developing, validating, and troubleshooting ETL workflows that draw data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation will also be part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.
Posted 3 weeks ago
6.0 - 10.0 years
16 - 25 Lacs
Jaipur
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.
Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
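To make the Snowflake/DBT responsibilities concrete, here is a minimal Python sketch of the incremental-load pattern that a DBT incremental model compiles down to on Snowflake: upsert only rows newer than the target's current watermark via a MERGE. The table and column names (analytics.orders, order_id, updated_at) are purely illustrative, not tied to this role.

```python
def render_incremental_merge(target, source, unique_key, watermark_col):
    """Build a Snowflake-style MERGE that upserts only rows newer than the
    target's current watermark -- the core of DBT's incremental strategy."""
    return (
        f"MERGE INTO {target} AS t\n"
        f"USING (SELECT * FROM {source}\n"
        f"       WHERE {watermark_col} > (SELECT MAX({watermark_col}) FROM {target})) AS s\n"
        f"ON t.{unique_key} = s.{unique_key}\n"
        f"WHEN MATCHED THEN UPDATE SET t.{watermark_col} = s.{watermark_col}\n"
        f"WHEN NOT MATCHED THEN INSERT ({unique_key}, {watermark_col}) "
        f"VALUES (s.{unique_key}, s.{watermark_col});"
    )

sql = render_incremental_merge("analytics.orders", "raw.orders", "order_id", "updated_at")
print(sql)
```

In practice DBT generates this statement itself from an `is_incremental()` filter in the model's SQL; the sketch only shows the shape of the resulting upsert.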
Posted 3 weeks ago
6.0 - 10.0 years
16 - 25 Lacs
Faridabad
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.
Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
Posted 3 weeks ago
8.0 - 12.0 years
27 - 42 Lacs
Chennai
Work from Office
Job Description: Experience using ETL tools, database management, scripting (primarily Python), API consumption, source-to-target mapping, and advanced SQL queries. Designs, builds, and maintains scalable data pipelines and architectures on the Microsoft Azure cloud platform; cloud experience is preferred. In addition to technical skills, the candidate should possess excellent communication skills and the ability to work autonomously with minimal direction. Develops and optimizes complex ETL processes, monitors system performance, and troubleshoots data-related issues in production environments.
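The API-consumption work mentioned above typically means walking a paginated endpoint to completion. A minimal cursor-pagination sketch in Python follows; `fetch_page` and the `items`/`next_cursor` field names are assumptions standing in for a real HTTP client and API, not a specific service.

```python
def fetch_all(fetch_page):
    """Follow next-page cursors until the API reports no more data."""
    cursor, records = None, []
    while True:
        page = fetch_page(cursor)          # stands in for e.g. GET /items?cursor=...
        records.extend(page["items"])
        cursor = page.get("next_cursor")
        if cursor is None:
            return records

# Simulated three-page API response for demonstration.
pages = {None: {"items": [1, 2], "next_cursor": "a"},
         "a": {"items": [3], "next_cursor": "b"},
         "b": {"items": [4], "next_cursor": None}}
result = fetch_all(lambda c: pages[c])
print(result)  # [1, 2, 3, 4]
```

The same loop structure applies whether the pages come from `urllib`, `requests`, or an SDK; only `fetch_page` changes.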
Posted 3 weeks ago
5.0 - 7.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Roles & Responsibilities: We are seeking a motivated Data Integration Engineer to support the development and operation of reliable, efficient, and secure data integration workflows across various systems. The candidate will work closely with senior engineers and cross-functional teams to understand integration requirements and deliver robust data pipelines.
Work experience:
- 3-5 years of experience in data integration, ETL/ELT workflows, or API-based integration
- Exposure to integration tools and platforms such as Azure Data Factory, MuleSoft, Boomi, Talend, or Informatica
- Familiarity with REST/SOAP APIs and data formats including JSON, XML, and CSV
- Experience with SQL and NoSQL databases such as PostgreSQL, SQL Server, and MongoDB
- Exposure to scripting languages such as Python, shell scripting, or JavaScript
- Understanding of message queuing technologies (e.g., Kafka, RabbitMQ, Azure Service Bus)
- Basic understanding of cloud platforms (Azure, AWS, or GCP)
- Exposure to version control systems (e.g., Git), CI/CD tools, and Agile methodologies
- Basic understanding of data warehousing concepts and dimensional modelling
- Exposure to monitoring tools such as Grafana, Prometheus, or the ELK stack
Required Skills & Qualifications:
- Good analytical and debugging skills
- Strong written and verbal communication
- Willingness to learn and adopt new technologies
- Ability to collaborate effectively within a team environment
- Understanding of secure data handling and governance principles
- Ability to create technical documentation and flow diagrams
- Enthusiastic attitude toward continuous improvement
- Awareness of data security, compliance (e.g., GDPR), and API rate-limiting practices
- Knowledge of error handling, retry policies, and scheduling mechanisms
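The retry policies named in this listing are most often implemented as exponential backoff: wait a little, then progressively longer, between attempts at a flaky call. A minimal sketch in Python, not any particular platform's implementation:

```python
import time

def with_retries(func, max_attempts=3, base_delay=0.01):
    """Retry func with exponential backoff; re-raise after the last attempt."""
    for attempt in range(max_attempts):
        try:
            return func()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 1x, 2x, 4x, ... the base delay

# Simulated transient failure: succeeds on the third call.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(with_retries(flaky))  # ok
```

Production versions usually add jitter to the delay and retry only on specific exception types, so that permanent errors fail fast.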
Posted 3 weeks ago
6.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Roles & Responsibilities: We are looking for a seasoned Senior Data Integration Engineer to lead and govern the architecture and implementation of scalable, resilient, and high-performance integration solutions. The candidate will drive design strategies, provide technical leadership, and coordinate with cross-functional teams to enhance enterprise data integration capabilities.
Work experience:
- 7-10+ years of hands-on experience in data integration, middleware, and enterprise data engineering
- Strong expertise in ETL/ELT tools such as Azure Data Factory, Informatica, Talend, or SAP BODS
- Experience with iPaaS platforms (e.g., MuleSoft, Boomi, Celigo)
- Deep understanding of API design (REST, SOAP, GraphQL) and lifecycle management
- Proven experience in designing integration solutions on Azure/AWS/GCP
- Skilled in working with structured and semi-structured data (e.g., XML, JSON, Parquet)
- Expertise in messaging and event-driven architectures using Kafka, Azure Event Hub, or RabbitMQ
- Proficient in DevOps, CI/CD, and containerization (Docker, Kubernetes)
- Experience with monitoring tools (Splunk, Kibana, Azure Monitor, Dynatrace)
- Experience in designing resilient systems with retry, dead-letter queues, and failover strategies
- Understanding of metadata management, data catalogs, and data lineage tools
Required Skills & Qualifications:
- In-depth knowledge of data integration patterns (batch, real-time, streaming)
- Ability to architect scalable and secure data workflows
- Strong understanding of data governance, lineage, and cataloging
- Experience with version control, testing frameworks, and Agile methodologies
- Excellent stakeholder management and leadership skills
- Capability to mentor junior engineers and conduct technical reviews
- Familiarity with data quality frameworks and master data management
- Strong documentation and presentation skills
- Adept at translating business requirements into technical solutions
- Passionate about innovation and digital transformation through data
- Knowledge of performance tuning, memory optimization, and parallel processing
- Experience integrating with third-party platforms, SaaS tools, and legacy systems
- Strong command of orchestration tools and workflow automation platforms
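The dead-letter-queue pattern this listing asks for can be sketched in a few lines: messages that still fail after the allowed retries are parked on a separate queue for inspection instead of being dropped. The handler and message values below are illustrative, not a Kafka/RabbitMQ API:

```python
from collections import deque

def process_with_dlq(messages, handler, max_attempts=2):
    """Deliver each message, retrying up to max_attempts; messages that
    exhaust their retries land on a dead-letter queue instead of being lost."""
    dead_letter = deque()
    for msg in messages:
        for attempt in range(1, max_attempts + 1):
            try:
                handler(msg)
                break                      # delivered successfully
            except Exception as exc:
                if attempt == max_attempts:
                    dead_letter.append((msg, str(exc)))
    return dead_letter

def handler(msg):
    if msg == "bad":
        raise ValueError("unprocessable payload")

dlq = process_with_dlq(["ok1", "bad", "ok2"], handler)
print(list(dlq))  # [('bad', 'unprocessable payload')]
```

Brokers such as RabbitMQ and Azure Service Bus provide this routing natively; the sketch only shows the control flow an engineer designs around it.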
Posted 3 weeks ago
7.0 - 8.0 years
15 - 16 Lacs
Hyderabad
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.
In this role, you will:
- Design and Develop Scalable Data Pipelines: Architect and implement end-to-end data workflows using Apache Airflow for orchestration, integrating multiple data sources and sinks across cloud and on-prem environments.
- BigQuery Data Modeling and Optimization: Build and optimize data models in Google BigQuery for performance and cost-efficiency, including partitioning, clustering, and materialized views to support analytics and reporting use cases.
- ETL/ELT Development and Maintenance: Design robust ETL/ELT pipelines to extract, transform, and load structured and semi-structured data, ensuring data quality, reliability, and availability.
- Cloud-Native Engineering on GCP: Leverage GCP services like Cloud Storage, Pub/Sub, Dataflow, and Cloud Functions to build resilient, event-driven data workflows.
- CI/CD and Automation: Implement CI/CD for data pipelines using tools like Cloud Composer (managed Airflow), Git, and Terraform, ensuring automated deployment and versioning of workflows.
- Data Governance and Security: Ensure proper data classification, access control, and audit logging within GCP, adhering to data governance and compliance standards.
- Monitoring and Troubleshooting: Build proactive monitoring for pipeline health and data quality using tools such as Cloud Monitoring (formerly Stackdriver) and custom Airflow alerting mechanisms.
- Collaboration and Stakeholder Engagement: Work closely with data analysts, data scientists, and business teams to understand requirements and deliver high-quality, timely data products.
Requirements:
- 5+ years of overall experience
- 2+ years of hands-on working experience with GCP BigQuery (mandatory)
- 2+ years of hands-on working experience with Apache Airflow (mandatory)
- 2+ years of hands-on working experience with Python (mandatory)
- 2+ years of hands-on working experience with Linux/Unix (mandatory)
- 2+ years of hands-on working experience with PL/SQL scripting (mandatory)
- 2+ years of hands-on working experience with ETL tools (mandatory): DataStage, Informatica, or Prophecy
- GCP Associate Cloud Engineer (ACE) certification is an added advantage
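The partitioning and clustering levers named in the BigQuery bullet are declared in the table's DDL. Below is a small Python helper that renders such DDL; the table and columns (analytics.events, event_ts, user_id) are hypothetical examples, not part of this role:

```python
def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Render BigQuery DDL with time partitioning and clustering -- the two
    schema-level levers for cutting scanned bytes and query cost."""
    return (
        f"CREATE TABLE {table} (\n"
        f"  event_ts TIMESTAMP,\n"
        f"  user_id STRING,\n"
        f"  payload JSON\n"
        f")\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)};"
    )

print(partitioned_table_ddl("analytics.events", "event_ts", ["user_id"]))
```

Queries that filter on `DATE(event_ts)` then prune whole partitions, and filters on the clustering column skip blocks within each partition.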
Posted 3 weeks ago
7.0 - 12.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Key Responsibilities:
Data Engineering & Architecture:
- Design, develop, and maintain high-performance data pipelines for structured and unstructured data using Azure Databricks and Apache Spark.
- Build and manage scalable data ingestion frameworks for batch and real-time data processing.
- Implement and optimize data lake architecture in Azure Data Lake to support analytics and reporting workloads.
- Develop and optimize data models and queries in Azure Synapse Analytics to power BI and analytics use cases.
Cloud-Based Data Solutions:
- Architect and implement modern data lakehouses combining the best of data lakes and data warehouses.
- Leverage Azure services such as Data Factory, Event Hub, and Blob Storage for end-to-end data workflows.
- Ensure security, compliance, and governance of data through Azure Role-Based Access Control (RBAC) and Data Lake ACLs.
ETL/ELT Development:
- Develop robust ETL/ELT pipelines using Azure Data Factory, Databricks notebooks, and PySpark.
- Perform data transformations, cleansing, and validation to prepare datasets for analysis.
- Manage and monitor job orchestration, ensuring pipelines run efficiently and reliably.
Performance Optimization:
- Optimize Spark jobs and SQL queries for large-scale data processing.
- Implement partitioning, caching, and indexing strategies to improve performance and scalability of big data workloads.
- Conduct capacity planning and recommend infrastructure optimizations for cost-effectiveness.
Collaboration & Stakeholder Management:
- Work closely with business analysts, data scientists, and product teams to understand data requirements and deliver solutions.
- Participate in cross-functional design sessions to translate business needs into technical specifications.
- Provide thought leadership on best practices in data engineering and cloud computing.
Documentation & Knowledge Sharing:
- Create detailed documentation for data workflows, pipelines, and architectural decisions.
- Mentor junior team members and promote a culture of learning and innovation.
Required Qualifications:
Experience:
- 7+ years of experience in data engineering, big data, or cloud-based data solutions.
- Proven expertise with Azure Databricks, Azure Data Lake, and Azure Synapse Analytics.
Technical Skills:
- Strong hands-on experience with Apache Spark and distributed data processing frameworks.
- Advanced proficiency in Python and SQL for data manipulation and pipeline development.
- Deep understanding of data modeling for OLAP, OLTP, and dimensional data models.
- Experience with ETL/ELT tools such as Azure Data Factory or Informatica.
- Familiarity with Azure DevOps for CI/CD pipelines and version control.
Big Data Ecosystem:
- Familiarity with Delta Lake for managing big data in Azure.
- Experience with streaming data frameworks such as Kafka, Event Hub, or Spark Streaming.
Cloud Expertise:
- Strong understanding of Azure cloud architecture, including storage, compute, and networking.
- Knowledge of Azure security best practices, such as encryption and key management.
Preferred Skills (Nice to Have):
- Experience with machine learning pipelines and frameworks such as MLflow or Azure Machine Learning.
- Knowledge of data visualization tools such as Power BI for creating dashboards and reports.
- Familiarity with Terraform or ARM templates for infrastructure as code (IaC).
- Exposure to NoSQL databases such as Cosmos DB or MongoDB.
- Experience with data governance.
Job ID: R-72547. Date posted: 07/17/2025
Posted 3 weeks ago
5.0 years
0 Lacs
West Bengal, India
On-site
Job Title: Master Data Management (MDM) Developer
Location: Kolkata
Mode: Work from Office
Type: Full-time
Key Responsibilities:
- Proven experience (5-7+ years) in Master Data Management (MDM), with a strong preference for Reltio.
- Deep domain knowledge within the Life Sciences, Pharmaceutical, or Healthcare sectors.
- Strong understanding of core MDM concepts, serving as a subject matter expert to ensure successful platform deployments, particularly with Reltio or Informatica MDM.
- Experience integrating MDM solutions with major CRM and ERP systems, such as Salesforce, Veeva, or OneKey.
- Proficiency in key MDM functions, including data modeling, data ingestion, match and survivorship rules, data cleansing, metadata analysis, and outbound configurations.
- Ability to analyze a client's customer lifecycle and business processes to understand how customer data is captured, analyzed, and used for sales, marketing, and customer service.
- A strong grasp of Data Governance and Stewardship processes, including hierarchy and workflow management, manual merges, and survivorship.
- Skilled in defining requirements for data quality rules, matching and merging criteria, and data stewardship workflows.
- A continuous learner who stays current with new technologies and their strategic applications.
- Capable of independent development and leading code reviews to ensure quality throughout the MDM project lifecycle.
Nice to Have:
- Experience with Java and Python
- Knowledge of SQL and Snowflake
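The "match and survivorship rules" at the core of MDM can be illustrated with a minimal sketch: records already matched as the same entity are merged attribute by attribute, and for each attribute the value from the most trusted source survives into the golden record. The source rankings and field names below are illustrative, not Reltio or Informatica configuration:

```python
# Lower rank = more trusted source (an assumed, illustrative ranking).
SOURCE_TRUST = {"crm": 1, "erp": 2, "web_form": 3}

def survivorship_merge(records):
    """Merge matched records into a golden record by source trust:
    iterate from least to most trusted so trusted values overwrite last."""
    golden = {}
    for rec in sorted(records, key=lambda r: SOURCE_TRUST[r["source"]], reverse=True):
        for field, value in rec.items():
            if field != "source" and value:   # empty values never survive
                golden[field] = value
    return golden

matched = [
    {"source": "web_form", "name": "J. Smith", "email": "js@example.com"},
    {"source": "crm", "name": "Jane Smith", "email": ""},
]
print(survivorship_merge(matched))  # {'name': 'Jane Smith', 'email': 'js@example.com'}
```

Note that the CRM's name wins over the web form's, but the web form's email survives because the CRM record has none; real platforms express the same logic as per-attribute survivorship rules.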
Posted 3 weeks ago
4.0 - 9.0 years
6 - 11 Lacs
Warangal, Hyderabad, Nizamabad
Work from Office
Bachelor's degree in computer science or a similar field, or equivalent work experience.
- 4+ years of hands-on software/application development experience with Informatica PowerCenter and Power BI (visualization and modeling)
- Extensive working knowledge of Power BI large datasets and modeling
- Extensive knowledge of DAX coding
- Experience in performance analysis and tuning, and knowledge of troubleshooting tools such as Tabular Editor and DAX Studio
- Experience with incremental and hybrid data refresh methods
- Knowledge of the Power BI service and capacity management
- In-depth knowledge of data warehousing, with experience in star schema concepts such as fact/dimension tables
- Strong Power BI modeling and data visualization experience in delivering projects
- Strong in SQL and PL/SQL
- Strong data warehousing and database fundamentals in MS SQL Server
- Strong in performance testing and troubleshooting of application issues using Informatica logs
Posted 3 weeks ago
2.0 - 4.0 years
6 - 9 Lacs
Chennai
Work from Office
Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate
Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
Requirements:
- Hands-on experience in IICS / Informatica PowerCenter
- Demonstrated involvement in an end-to-end IICS/IDMC project
- Structured Query Language (SQL)
- Data warehouse expertise
- Experience in Extract, Transform, Load (ETL) testing
- Effective communication skills
Key Responsibilities:
- Design and develop ETL processes using Informatica IICS / Informatica PowerCenter.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
- Demonstrate good expertise in IICS Data Integration / Informatica PowerCenter and Application Integration, and Oracle SQL.
- Implement data integration solutions that ensure data accuracy and consistency.
- Monitor and optimize existing workflows for performance and efficiency.
- Troubleshoot and resolve any issues related to data integration and ETL processes.
- Maintain documentation for data processes and integration workflows.
Mandatory skill sets: ETL, Informatica
Preferred skill sets: ETL, Informatica
Years of experience required: 2-4 yrs
Education qualification: BTech/MBA/MCA
Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology
Degrees/Field of Study preferred:
Required Skills: ETL (Informatica)
Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Posted 3 weeks ago
2.0 - 4.0 years
5 - 9 Lacs
Hyderabad
Work from Office
About the role
You'll be at the heart of developing and maintaining our sophisticated in-house insurance products built on relational or document databases. You will have the opportunity to join one of our product teams and contribute to the development of functionality which generates real business impact.
About the team
We are a team that believes in engineering excellence and that our leaders should also be engineers themselves. We build applications that are carefully designed, thoughtfully implemented, and surpass the expectations of our users by working together with product owners. Quality and stability are first-class deliverables in everything we do, and we lead by example by embedding high standards into our processes.
Your responsibilities include
- Design, develop, deploy, and support sustainable data/solution architectures such as design patterns, reference data architecture, and conceptual, logical, and physical data models for both relational and NoSQL databases
- Data migration / ingestion / transfer from and to heterogeneous databases and file types
- Performance optimization (query fine tuning, indexing strategy, etc.)
- Support the project team in conducting public cloud data growth and data service consumption assessments and forecasts
- Collaborate effectively within a cross-functional team including requirements engineers, QA specialists, and other application engineers
- Stay current with emerging technologies and Generative AI developments to continuously improve our solutions
About you
You're a naturally curious and thoughtful professional who thrives in a high-performance engineering environment. Your passion for coding is matched by your commitment to delivering business value. You believe in continuous learning, through self-improvement or by absorbing knowledge from those around you, and you're excited to contribute to a team that values technical excellence.
You should bring the following skills and experiences:
- Proficient in relational and NoSQL databases
- Proficient in PL/SQL programming
- Strong data model and database design skills for both relational and NoSQL databases
- Experience with seamless data integration using Informatica and Azure Data Factory
- Previous public cloud experience, particularly with Microsoft Azure, is a must
Posted 3 weeks ago
2.0 - 5.0 years
5 - 9 Lacs
Pune
Work from Office
For over 30 years, Beghou Consulting has been a trusted adviser to life science firms. We combine our strategic consulting services with proprietary technology to develop custom, data-driven solutions that allow life sciences companies to take their commercial operations to new heights. We are dedicated to client service and offer a full suite of consulting and technology services, all rooted in advanced analytics, to enhance commercial operations and boost sales performance.
Purpose of Job: As an Associate Consultant on a Data Warehousing project, your responsibilities encompass a range of technical, analytical, and collaborative tasks essential for effective data management. You will play a pivotal role in ensuring that the organization can effectively leverage its data assets for informed decision-making.
We'll trust you to:
- Take primary ownership in driving your own and team members' efforts across all phases of a project lifecycle
- Oversee and develop Data Warehouse projects to ensure methodological soundness, and deliver quality client deliverables within the expected timeframe
- Assist in designing data models that reflect business needs and support analytical objectives
- Partner with the project lead / program lead in delivering projects and assist in project management responsibilities such as project plans and risk mitigation
- Develop and maintain comprehensive documentation for data warehouse architecture, ETL processes, and system configurations to ensure clarity and compliance
- Participate in design reviews and code reviews to ensure adherence to quality standards and best practices in data warehousing
- Foster collaboration with onshore and offshore team members to ensure seamless communication and effective task execution
- Lead task planning and distribution across team members, ensure timely completion with high quality, and report accurate status to the project lead
- Mentor/coach the junior members of the team
- Oversee direct and indirect client communications based on assigned project responsibilities
- Foster a culture of continuous improvement and innovation while demonstrating the ability to learn new technologies, business domains, and project management processes
- Analyze problem statements and client requirements to design and implement complex solutions using programming languages and ETL platforms
- Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management
- Provide project leadership for team members regarding process improvements, planned approaches for client requests, or transition of new deliverables
You'll need to have:
- 2-5 years of hands-on experience in Master Data Management (MDM), with relevant consulting-industry experience
- Strong understanding of core MDM concepts, including data stewardship, data governance, and data quality management
- Proven experience in customer mastering (HCP/HCO) and address mastering and standardization using third-party tools
- Proficiency in data profiling, applying data quality rules, data standardization, and reference data management
- Familiarity with key pharma data sources such as IQVIA, MedPro, Symphony, etc.
- Ability to translate both functional and non-functional business requirements into scalable, efficient MDM solutions
- Hands-on experience with leading MDM platforms such as Reltio or Informatica MDM is a plus
- Strong logical thinking and problem-solving skills, with a collaborative mindset
- Proficiency in Python for scripting and data manipulation
- Strong SQL skills for data extraction and analysis
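The address standardization this role mentions usually starts with simple rule-based normalization before any matching runs, so that downstream comparisons see like with like. A hedged Python sketch; the abbreviation map is a tiny illustrative sample, not a production reference-data set:

```python
import re

# Illustrative abbreviation expansions; real reference data is far larger.
ABBREVIATIONS = {"st": "Street", "rd": "Road", "ave": "Avenue"}

def standardize_address(raw):
    """Normalize punctuation, whitespace, and case, and expand common
    abbreviations so that matching compares canonical forms."""
    tokens = re.sub(r"[.,]", "", raw).split()
    out = [ABBREVIATIONS.get(t.lower(), t.title()) for t in tokens]
    return " ".join(out)

print(standardize_address("12  baker st."))  # 12 Baker Street
```

Commercial tools layer postal-reference validation and geocoding on top, but the normalize-then-match order is the same.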
Posted 3 weeks ago
7.0 - 9.0 years
11 - 16 Lacs
Gurugram
Work from Office
Role Description: As a Technical Lead - Data Warehousing Development at Incedo, you will be responsible for designing and developing data warehousing solutions. You should have experience with ETL tools such as Informatica, Talend, or DataStage, and be proficient in SQL.
Roles & Responsibilities:
- Design and develop data warehousing solutions using tools like Hadoop, Spark, or Snowflake
- Write efficient and optimized ETL scripts
- Collaborate with cross-functional teams to develop and implement data warehousing features and enhancements
- Debug and troubleshoot complex data warehousing issues
- Ensure data security, availability, and scalability of production systems
Technical Skills Requirements:
- Proficiency in ETL (Extract, Transform, Load) processes and tools such as Informatica, Talend, or DataStage
- Experience with data modeling and schema design for data warehousing applications
- Knowledge of data warehouse technologies such as Amazon Redshift, Snowflake, or Oracle Exadata
- Familiarity with business intelligence (BI) tools such as Tableau, Power BI, or QlikView
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of and alignment with the company's long-term vision
- Openness to new ideas and willingness to learn and develop new skills; able to work well under pressure and manage multiple tasks and priorities
Qualifications:
- 7-9 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
Gurugram
Work from Office
Role Description: As a Software Engineer - Data Reporting Services at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.
Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions
- Develop data models and perform data analysis to identify trends and insights
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView
Technical Skills Requirements:
- Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of and alignment with the company's long-term vision
Qualifications:
- 3-5 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
Role Description: As a Senior Data Reporting Services Specialist at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.
Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions
- Develop data models and perform data analysis to identify trends and insights
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView
Technical Skills Requirements:
- Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of and alignment with the company's long-term vision
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks, and promote a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview

Job Title: Full Stack Developer (mainly DB/ETL experience)
Location: Pune, India

Role Description
Currently DWS sources technology infrastructure, corporate functions systems (Finance, Risk, Legal, Compliance, AFC, Audit, Corporate Services, etc.) and other key services from DB. Project Proteus aims to strategically transform DWS into a standalone Asset Management operating platform: an ambitious and ground-breaking project that delivers separated DWS infrastructure and Corporate Functions in the cloud with essential new capabilities, further enhancing DWS' highly competitive and agile Asset Management capability. This role offers a unique opportunity to be part of a high-performing team implementing a strategic future-state technology landscape for all of DWS Corporate Functions globally.

We are seeking a highly skilled and experienced ETL developer with Informatica experience, strong development experience on various RDBMS, and exposure to cloud-based platforms. The ideal candidate will be responsible for designing, developing, and implementing robust and scalable custom solutions, extensions, and integrations in a cloud-first environment. This role requires a deep understanding of data migration, system integration and optimization, and cloud-native development principles, as well as the ability to work collaboratively with functional teams and business stakeholders. The role also provides support to US business stakeholders and regulatory reporting processes during morning US hours.
What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your Key Responsibilities
- Create good-quality software designs; a strong sense of software design principles is therefore required
- Get involved in hands-on code development and thorough testing of developed software
- Mentor junior team members on both the technical and functional fronts
- Review other team members' code
- Participate in and manage daily stand-up meetings
- Articulate issues and risks to management in a timely manner
- Split time roughly 80% on technical work and 20% on other activities such as team handling, mentoring, status reporting, and year-end appraisals
- Analyse software defects and fix them in a timely manner
- Work closely with stakeholders and other teams such as Functional Analysis and Quality Assurance
- Support testing on behalf of users, operations, and testing teams, potentially including test plans, test cases, test data, and review of interface testing between different applications, when required
- Work with application developers to resolve functional issues from UATs and help find solutions for difficult functional areas
- Work closely with business analysts to detail proposed solutions and solution maintenance
- Work with the Application Management area on functional troubleshooting and resolution of reported bugs/issues in applications
Your Skills And Experience
- Bachelor's degree from an accredited college or university with a concentration in Science or an IT-related discipline (or equivalent)
- Hands-on in technology, with a minimum of 10 years of IT industry experience
- Proficient in Informatica or any ETL tool
- Hands-on experience with Oracle SQL/PL-SQL
- Exposure to PostgreSQL
- Exposure to Cloud/Big Data technology
- CI/CD experience (TeamCity or Jenkins) and GitHub usage
- Basic UNIX commands
- Exposure to the Control-M scheduling tool
- Experience in an Agile/Scrum software development environment
- High analytical capability and proven communication skills
- An effective problem solver, able to multi-task and work under tight deadlines
- Identifies and escalates problems at an early stage
- Flexibility and willingness to work autonomously; self-motivated within set competencies in team-based, fast-paced environments
- High degree of accuracy and attention to detail

Nice to have:
- Exposure to PySpark
- Exposure to React JS or Angular JS
- Experience architecting and automating the build process for production using scripts

How We’ll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview

Job Title: Oracle Engineer
Location: Pune, India
Corporate Title: Associate

Role Description
You will maintain, enhance, and optimize an Oracle-based price valuation application (G-VOR), with a focus on delivering robust solutions that meet business needs.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your Key Responsibilities
- Collaborating with business stakeholders to design and implement new features, primarily through Oracle PL/SQL development
- Ensuring application stability by analyzing and resolving data-related inquiries from the business, performing performance tuning, and optimizing processes
- Maintaining and enhancing data load, processing, and storage
- Supporting the team in migrating the application's front-end to a modern ReactJS/Spring Boot technology stack, leveraging a Microservices-oriented architecture hosted on the Google Cloud Platform

Your Skills And Experience
- Master's degree (or equivalent) in Computer Science, Business Information Technology, or a related field
- Demonstrated expertise in Oracle PL/SQL development, with significant professional experience working on relational databases; this is a critical requirement for the role
- Strong analytical and problem-solving skills
- Familiarity with an ETL tool (e.g., Informatica) and/or a reporting tool (e.g., Cognos or Tableau) is highly desirable
- Experience in one or more of the following areas is advantageous but not a must: batch programming, Java/JavaScript programming (including ReactJS), or Microservices architecture
- Fluency in written and spoken English

How We’ll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview

Job Title: Database Engineer - CF, AS
Location: Pune, India
Corporate Title: AS

Role Description
DWS Technology is a global team of technology specialists, spread across multiple trading hubs and tech centres. We have a strong focus on promoting technical excellence: our engineers work at the forefront of financial services innovation using cutting-edge technologies. Our India location is our most recent addition to our global network of tech centres and is growing strongly. We are committed to building a diverse workforce and to creating excellent opportunities for talented engineers and technologists. Our tech teams and business units use agile ways of working to create #GlobalHausbank solutions from our home market.

DWS Corporate Function Technology
The DWS Corporate Function Technology team covers technology for corporate functions such as finance, risk, ALM, and AFC. This position is specifically for the SIMS application, a financial data warehouse that provides KPIs for management, quarterly, and annual reporting, among other things. The application consists of an Oracle database and a Java web front-end. As a database engineer, you will be responsible for maintaining, enhancing, and optimizing the application in collaboration with the engineering team and the business.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your Key Responsibilities
You will support the application team by maintaining, enhancing, and optimizing an Oracle-based financial data warehouse application (SIMS) as well as a web application with an MSSQL database backend (FBRAE), with a focus on delivering robust solutions that meet business needs. Your responsibilities include:
- Collaborating with business stakeholders to design and implement new features, primarily through database development (PL/SQL, T-SQL)
- Ensuring application stability by analyzing and resolving data-related inquiries from the business, performing performance tuning, and optimizing processes
- Maintaining and enhancing reporting data marts built on a Data Vault architecture
- Supporting the team in migrating the application's front-end to a modern ReactJS/Spring Boot technology stack, leveraging a Microservices-oriented architecture hosted on the Google Cloud Platform

Your Skills And Experience
- Master's degree (or equivalent) in Computer Science, Business Information Technology, or a related field
- Demonstrated expertise in Oracle PL/SQL or MS T-SQL development, with significant professional experience working on relational databases; this is a critical requirement for the role
- Strong analytical and problem-solving skills
- Familiarity with an ETL tool (e.g., Informatica) and/or a reporting tool (e.g., Cognos) is desirable
- Experience in one or more of the following areas is advantageous: batch programming, Java/JavaScript programming (including ReactJS), or Microservices architecture
- Fluency in written and spoken English

How We’ll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
9.0 - 14.0 years
40 - 80 Lacs
Bengaluru
Work from Office
About the Role: We are seeking a highly skilled Data Solutions Architect - Business Intelligence & AI to lead the design and delivery of advanced data solutions. This role requires a seasoned professional with deep technical expertise, consulting experience, and leadership capabilities to drive data transformation initiatives. The ideal candidate will play a pivotal role in architecting scalable data platforms, enabling AI-driven automation, and mentoring a team of data engineers and analysts.
Posted 3 weeks ago
5.0 - 7.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Novo Nordisk Global Business Services (GBS) India

The position
As an Associate Business Analyst at Novo Nordisk, you will be responsible for creating and maintaining master data in SAP and Winshuttle in alignment with established business processes and rules. You will also handle Change Requests (CR cases) and Development Requests (DV) related to master data creation.

You will be entrusted with the responsibilities below:
- Create and maintain master data for raw, semi-finished, and finished goods in SAP ECC and Winshuttle, including Bills of Materials (BOM) individually and in mass, and manage change-controlled objects using the Engineering Change Master.
- Support global serialization master data maintenance by updating specification sheets, addressing ad hoc requests, and executing related support tasks.
- Manage material master data across the product life cycle, including creation, maintenance, and deactivation, by leading and documenting product change control, analyzing material usage and inventory trends, delivering actionable reports, and supporting cross-functional projects to ensure timely execution.
- Perform data cleansing to ensure accuracy and integrity, and create and maintain Standard Operating Procedures (SOPs) and work instructions in compliance with Novo Nordisk standards.
- Identify process improvement opportunities within master data management and standardization.
- Manage stakeholder relationships effectively, support new process implementations, ensure adherence to defined KPIs, and contribute to training and onboarding of new joiners.

Qualifications
- Bachelor's degree in supply chain management, production, mechanical engineering, or equivalent from a well-recognised institute.
- 5 to 7 years of experience with SAP master data, preferably within pharma or supply chain.
- Ability to analyse and process data.
- Must have experience in Product Life Cycle Management, S/4HANA, Winshuttle, SAP ECC, and Master Data Management.
- Good understanding of supply chain concepts (Plan, Make, Source, Deliver, and Return) and the supporting master data.
- Proficient user of Microsoft Office (Excel, PowerPoint).
- Experience in automation with advanced Excel and building macros, or ETL knowledge with Informatica/Winshuttle.
- Experience in conducting meetings with peers, including preparation and facilitation.
- Knowledge of business rules for processes and attributes within SAP.
- Excellent communication skills in English, both written and oral.
Posted 3 weeks ago
6.0 - 10.0 years
4 - 8 Lacs
Hyderabad, Bengaluru
Work from Office
Overview: RSM is seeking a highly skilled and experienced Informatica MDM Development Supervisor to lead our master data management initiatives. The ideal candidate will have a strong background in Informatica SaaS MDM, including IDMC, Multidomain 360, Data Marketplace, and Catalogue; excellent leadership skills; and a proven track record of managing complex implementation projects. This role will involve overseeing a team of developers, collaborating with cross-functional teams, and ensuring the successful delivery of MDM solutions.

Essential Duties and Responsibilities:
- Implement an enterprise-wide data governance framework, with a focus on data quality, synchronization, and standardization through processes, data monitoring, data remediation, training, and documentation.
- Utilize deep knowledge of Informatica MDM tools and technologies to design, develop, and implement robust master data management solutions; stay up to date with the latest industry trends and best practices.
- Manage the end-to-end lifecycle of MDM projects, including planning, execution, monitoring, and delivery, ensuring projects are completed on time, within scope, and within budget.
- Serve as a liaison between the functional data owners and the IT data owners to understand needed enhancements to integrations or data transformations and to provide the necessary controls and oversight.
- Develop and maintain a data dictionary, glossary, and process documentation to be distributed or made available to all employees.
- Own and drive a detailed communication plan, including weekly, monthly, and quarterly meetings with key data stakeholders and leadership.
- Create or enhance visibility of data quality through analytics dashboards, key KPIs, and other metrics and measurements.
- Troubleshoot and resolve complex technical issues related to MDM processes.
- Develop and monitor processes to track data requests and reporting needs, and serve as a point of contact to ensure communication is targeted to the appropriate technical and functional resources.
- Work with IT to develop and maintain a data inventory in the analytics warehouse, including a full list of available data models, model attributes, the statistical reporting inventory, ad hoc data marts, and user access.
- Develop, maintain, and create go-to-market offerings utilizing the Informatica SaaS platforms.
- Work with IT leadership to ensure the proper controls are being managed and reviewed for all data integrations.
- Own the single-source-of-truth conceptual model in the data warehouse and ensure that all enrichments adhere to the model design principles.
- Collaborate with IT to ensure privacy/security and access to data are properly aligned to organizational standards.

Required qualifications:
- Bachelor's degree or higher in Computer Science, Information Technology, Business Administration, Engineering, Data Analytics, or Data Science.
- 6+ years of experience in the professional services industry.
- 4+ years of experience working with Informatica SaaS MDM, IDMC, and other governance projects.
- 2+ years of enterprise-level project management experience.
- Knowledge of data governance frameworks (such as DAMA-DMBOK or DCAM), or demonstrated ability to implement data strategies across disparate data systems.
- Hands-on experience with multi-domain SaaS Informatica platforms, CDI, and IDMC.
- Experience with at least three end-to-end MDM implementations (from scope definition to implementation and training) for a client or organization.
- Excellent communication, presentation, and interpersonal skills, with the ability to articulate technical concepts to both technical and non-technical audiences.
- Hands-on experience leveraging data governance tools and technologies for activities pertaining to data quality, data cataloging, lineage, etc.
- Experience implementing data modeling concepts: semantic layer, star schema, and data normalization.
- Proven record of managing relationships with senior client stakeholders and technology partners.

Preferred qualifications:
- Professional training, certifications, or qualifications in data governance tools, such as: Informatica SaaS MDM Implementation Specialist, Informatica CDI, Informatica IDMC, Informatica PowerCenter, or Informatica Data Governance.
- Active participation in local industry associations in the domain of data management or governance.
- Strong understanding of regulatory compliance standards (e.g., GDPR, HIPAA, PIPEDA, PHIPA, HIA, BCBS-239).
- Strong understanding of data warehousing principles, data modeling techniques, and ETL processes.
Posted 3 weeks ago
2.0 - 4.0 years
4 - 8 Lacs
Hyderabad
Work from Office
CDP ETL & Database Engineer

The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, have an analytical mindset, and have a background in relational modeling in a hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams. The candidate will join a team of developers across the US, India, and Costa Rica.

Responsibilities:
- ETL Development: Build pipelines that feed downstream data processes; analyze data, interpret business requirements, and establish relationships between data sets. Familiarity with different encoding formats and file layouts such as JSON and XML is expected.
- Implementations & Onboarding: Work with the team to onboard new clients onto the ZMP/CDP+ platform; solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. Take a test-driven approach to development and document processes and workflows.
- Incremental Change Requests: Analyze change requests and determine the best approach to implementing and executing each request, which requires a deep understanding of the platform's overall architecture. Implement and test change requests in a development environment to ensure their introduction will not negatively impact downstream processes.
- Change Data Management: Adhere to change data management procedures and actively participate in CAB meetings where change requests are presented and approved. Before introducing a change, ensure that processes are running in a development environment. Perform peer-to-peer code reviews and solution reviews before production code deployment.
- Collaboration & Process Improvement: Participate in knowledge-share sessions with peers to discuss solutions, best practices, overall approach, and process. Look for opportunities to streamline processes, with an eye towards building a repeatable model that reduces implementation duration.

Job Requirements:
- Well versed in relational data modeling, ETL and FTP concepts, advanced analytics using SQL functions, and cloud technologies (AWS, Snowflake).
- Able to decipher requirements, provide recommendations, and implement solutions within predefined timeframes.
- Able to work independently while also contributing in a team setting; confidently communicates status, raises exceptions, and voices concerns to their direct manager.
- Participates in internal client project status meetings with the Solution/Delivery management teams; when required, collaborates with the Business Solutions Analyst (BSA) to solidify requirements.
- Able to work in a fast-paced, agile environment, with a sense of urgency when escalated issues arise.
- Strong communication and interpersonal skills; able to multitask and prioritize workload based on client demand.
- Familiarity with Jira for workflow management and time allocation.
- Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives, etc.

Required Skills:
- ETL: tools such as Talend (preferred, not required); DMExpress (nice to have); Informatica (nice to have).
- Databases: hands-on experience with Snowflake (required); MySQL/PostgreSQL (nice to have); familiarity with NoSQL DB methodologies (nice to have).
- Programming languages (knowledge of any of the following): PL/SQL; JavaScript (strong plus); Python (strong plus); Scala (nice to have).
- AWS: knowledge of S3, EMR (concepts), EC2 (concepts), and Systems Manager/Parameter Store.
- Understanding of JSON data structures and key-value pairs.
- Working knowledge of code repositories such as Git, WinCVS, and SVN.
- Workflow management tools such as Apache Airflow, Kafka, and Automic/Appworx.
- Jira.

Minimum Qualifications:
- Bachelor's degree or equivalent.
- 2-4 years' experience.
- Excellent verbal and written communication skills.
- Self-starter, highly motivated, with an analytical mindset.
Posted 3 weeks ago