
329 ETL Tools Jobs - Page 13


4.0 - 7.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Notice Period: Immediate joiners only
Mandatory Skills: ETL Tester, ETL Tools, SQL, Data Warehouse, Automic, Control-M, test scenarios, test case preparation, test execution, Agile team
Interview Availability: 6 & 7 Feb 2025, face to face

Job Description

Primary Skills:
- Strong understanding of data warehouse concepts and ETL testing
- Experience in automating ETL testing and data testing using any ETL tool and scripting
- Expertise in building configurable ETL test automation suites to increase efficiency
- Expertise in building granular and reusable test cases to be used across different modules in a software line
- Plans and defines the testing approach and prioritizes testing activities to support risks in project timelines or test scenarios
- Excellent analytical and problem-solving skills
- Excellent requirement gathering and analysis skills to identify gaps
- Responsible for validating data sources, extracting data, applying transformation logic, and loading data into the target tables
- Experience creating test scenarios, preparing test cases, executing tests, and reporting defects and status
- Responsible for writing complex SQL queries, stored procedures, and functions
- Responsible for troubleshooting defects, logging defects, retesting, and bug closure
- Experience working in an Agile team
- Experience with scheduling tools like Automic and Control-M
- Works with multiple teams to define stories and acceptance criteria
- Experience with JIRA and qTest

Secondary Skills:
- Conceptual knowledge of cloud computing services like AWS
- Conceptual knowledge of using APIs such as REST
- Unix scripting knowledge is an added advantage
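As an illustration of the SQL-based source/target validation this posting describes, here is a minimal sketch of an ETL reconciliation check using Python's stdlib `sqlite3`. All table and column names (`src_orders`, `tgt_orders`, `amount`) are invented for the example; a real ETL test suite would point the same queries at the actual warehouse.

```python
import sqlite3

# Hypothetical reconciliation check: compare row counts and a numeric
# checksum between a "source" staging table and the loaded "target" table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5), (3, 7.25);
""")

def reconcile(table_a, table_b):
    """Return (counts_match, sums_match) between two tables."""
    count_a = cur.execute(f"SELECT COUNT(*) FROM {table_a}").fetchone()[0]
    count_b = cur.execute(f"SELECT COUNT(*) FROM {table_b}").fetchone()[0]
    sum_a = cur.execute(f"SELECT ROUND(SUM(amount), 2) FROM {table_a}").fetchone()[0]
    sum_b = cur.execute(f"SELECT ROUND(SUM(amount), 2) FROM {table_b}").fetchone()[0]
    return count_a == count_b, sum_a == sum_b

counts_ok, sums_ok = reconcile("src_orders", "tgt_orders")
print(counts_ok, sums_ok)  # -> True True when the load preserved the data
```

A dropped or duplicated row makes either check fail, which is the kind of granular, reusable test case the posting asks candidates to build.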

Posted 2 months ago

Apply

6.0 - 10.0 years

20 - 30 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Lead o9 platform implementation, configure solutions, build Python plugins, integrate via SSIS/T-SQL, support demand/supply planning modules, perform fit-gap analysis, and drive end-to-end delivery in a hybrid work setup.

Required Candidate Profile: 6-8 years total experience with 4+ years in o9/SCM; strong in configuration, Python plugins, SSIS, T-SQL, IBPL, and SCM modules. Excellent communication and client interaction skills.

Posted 2 months ago

Apply

7.0 - 12.0 years

15 - 19 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Title: Salesforce (SF) Lead Developer
Experience: 7-15 Years
Location: Remote, Chennai, Hyderabad, Kolkata, Pune, Ahmedabad

iSource Services is hiring for one of their clients for the position of Salesforce Lead Developer.

Job Description: We are looking for a skilled and experienced Salesforce Lead Developer to join our team. The ideal candidate will be proficient in Salesforce platform setup, configuration, and customization. The role involves working with Salesforce declarative tools, data management, automation, and scripting, alongside Salesforce integration and data migration.

Desired Candidate Profile:
- 7-15 years of experience in Salesforce development and platform expertise.
- Strong knowledge of Salesforce declarative tools and development skills in Apex and JavaScript.
- Strong understanding of data migration and integration processes.
- Excellent problem-solving, communication, and collaboration skills.

Key Responsibilities:
1. Salesforce Platform Expertise: Proficient in Salesforce setup, configuration, and declarative customization. Strong knowledge of Salesforce declarative tools, including UI configuration, Flow, and validation rules. Experience with Salesforce data modeling, object relationships, custom metadata types, and field types. Understanding of Salesforce security concepts such as the data visibility hierarchy, profiles, roles, and sharing rules.
2. Data Management: Knowledgeable in data migration and integration tools (Data Loader, ETL tools). Ability to understand and apply data cleansing, de-duplication, and validation processes. Proficient in creating and managing reports and dashboards in Salesforce.
3. Automation and Scripting (for the Developer Consultant role): Experience with Salesforce automation tools (Apex triggers, Lightning Web Components, Batch Apex, Apex classes, etc.). Familiarity with scripting languages such as Apex and JavaScript.

Preferred Certifications: Salesforce Certified Administrator, Salesforce Certified Platform App Builder, Salesforce Certified Platform Developer I

Posted 2 months ago

Apply

10.0 - 15.0 years

25 - 30 Lacs

Hyderabad

Hybrid

We are seeking a Senior Manager - Pricing Analytics for the pricing team at Thomson Reuters. The Central Pricing Team works with Pricing Managers, Business Units, Product Marketing Managers, Finance, and Sales on price execution for new product launches, maintenance of existing products, and the creation and maintenance of data products for reporting and analytics. The team is responsible for providing product and pricing information globally to all internal stakeholders and collaborating with upstream and downstream teams to ensure offer pricing readiness. Beyond BAU, the team works on various automation, pricing transformation, and pricing analytics initiatives.

About the Role
In this role as a Senior Manager - Pricing Analytics, you will:
- Lead and mentor a team of pricing analysts, data engineers, and BI developers.
- Drive operational excellence by fostering a culture of data quality, accountability, and continuous improvement.
- Manage team capacity, project prioritization, and cross-functional coordination with Segment Pricing, Finance, Sales, and Analytics teams.
- Partner closely with the Pricing team to translate business objectives into actionable analytics deliverables.
- Drive insights on pricing performance, discounting trends, segmentation, and monetization opportunities.
- Oversee the design and execution of robust ETL pipelines that consolidate data from multiple sources (e.g., Salesforce, EMS, UNISON, SAP, Pendo, product usage platforms).
- Ensure delivery of intuitive, self-service dashboards and reports that track key pricing KPIs, sales performance, and customer behaviour.
- Strategize, deploy, and promote scalable analytics architecture and best practices in data governance, modelling, and visualization.
- Act as a trusted advisor to Pricing leadership by delivering timely, relevant, and accurate data insights.
- Collaborate with analytics, finance, segment pricing, and data platform teams to align on data availability, definitions, and architecture.

Shift Timings: 2 PM to 11 PM (IST); work from office 2 days a week (mandatory).

About You
You're a fit for the role of Senior Manager - Pricing Analytics if your background includes:
- 10+ years of experience in analytics, data science, or business intelligence, with 3+ years in a people leadership or managerial role.
- Proficiency in SQL, ETL tools (e.g., Alteryx, dbt, Airflow), and BI platforms (e.g., Tableau, Power BI, Looker).
- Knowledge of Python, R, or other statistical tools (a plus).
- Experience with data from Salesforce, SAP, and other CRM, ERP, or CPQ tools.
- Ability to translate complex data into actionable insights and communicate effectively with senior stakeholders.
- Strong understanding of data analytics, monetization metrics, and SaaS pricing practices.
- Proven experience working in a B2B SaaS or software product company (preferred).
- MBA or Master's in Analytics, Engineering, or a quantitative field (preferred).

Posted 2 months ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Chennai

Hybrid

Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)
Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark and Python, and experience with modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools like Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms like Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: PySpark, Apache Spark, Python, SQL, ETL pipelines, ETL tools, AWS (S3), Airflow, Control-M, CDC, Hive, Hadoop, Cloudera, Hortonworks, Unix/Linux, shell scripting, data modeling, data validation, performance tuning, unit test cases, Informatica, Tableau, QlikView, Jasper, Jupyter Notebook, Zeppelin, PyCharm, API integration, batch and real-time integration, AI/ML model development, Agile methodologies, CI/CD, Jenkins, Git
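The "CDC (Change Data Capture)" skill above can be illustrated with a minimal, dependency-free sketch: applying a stream of insert/update/delete events to a keyed target. The event shape and field names (`op`, `key`, `row`) are invented for the example; in the stack this posting describes, the same merge would typically run as a PySpark job over much larger datasets.

```python
# Toy CDC merge: fold an ordered event stream into a keyed target held
# in memory. Inserts and updates overwrite the row; deletes remove it.
def apply_cdc(target, events):
    """Apply CDC events (dicts with 'op', 'key', 'row') to a target dict."""
    for ev in events:
        key, op = ev["key"], ev["op"]
        if op in ("insert", "update"):
            target[key] = ev["row"]
        elif op == "delete":
            target.pop(key, None)  # deleting a missing key is a no-op
    return target

state = {1: {"name": "alice"}}
events = [
    {"op": "insert", "key": 2, "row": {"name": "bob"}},
    {"op": "update", "key": 1, "row": {"name": "alicia"}},
    {"op": "delete", "key": 2, "row": None},
]
print(apply_cdc(state, events))  # -> {1: {'name': 'alicia'}}
```

Because later events for a key overwrite earlier ones, replaying the same ordered stream is idempotent, which is what makes this pattern safe to re-run after a failed batch.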

Posted 3 months ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune, Gurugram

Work from Office

In one sentence
We are seeking a skilled Database Migration Specialist with deep expertise in mainframe modernization and data migration to cloud platforms such as AWS, Azure, or GCP. The ideal candidate will have hands-on experience migrating legacy systems (COBOL, DB2, IMS, VSAM, etc.) to modern cloud-native databases like PostgreSQL, Oracle, or NoSQL.

What will your job look like?
- Lead and execute end-to-end mainframe-to-cloud database migration projects.
- Analyze legacy systems (z/OS, Unisys) and design modern data architectures.
- Extract, transform, and load (ETL) complex datasets, ensuring data integrity and taxonomy alignment.
- Collaborate with cloud architects and application teams to ensure seamless integration.
- Optimize the performance and scalability of migrated databases.
- Document migration processes, tools, and best practices.

Required Skills & Experience
- 5+ years in mainframe systems (COBOL, CICS, DB2, IMS, JCL, VSAM, Datacom).
- Proven experience in cloud migration (AWS DMS, Azure Data Factory, GCP Dataflow, etc.).
- Strong knowledge of ETL tools, data modeling, and schema conversion.
- Experience with PostgreSQL, Oracle, or other cloud-native databases.
- Familiarity with data governance, security, and compliance in cloud environments.
- Excellent problem-solving and communication skills.

Posted 3 months ago

Apply

6.0 - 8.0 years

5 - 8 Lacs

Mumbai

Hybrid

Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)
Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark and Python, and experience with modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools like Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms like Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: PySpark, Apache Spark, Python, SQL, ETL pipelines, ETL tools, AWS (S3), Airflow, Control-M, CDC, Hive, Hadoop, Cloudera, Hortonworks, Unix/Linux, shell scripting, data modeling, data validation, performance tuning, unit test cases, Informatica, Tableau, QlikView, Jasper, Jupyter Notebook, Zeppelin, PyCharm, API integration, batch and real-time integration, AI/ML model development, Agile methodologies, CI/CD, Jenkins, Git

Posted 3 months ago

Apply

4.0 - 9.0 years

14 - 24 Lacs

Chennai, Bengaluru, Delhi / NCR

Work from Office

ETL + GCP or SQL + GCP
- Minimum 3+ years of application development experience.
- Should be strong in any ETL tool.
- Should be strong in shell scripting.
- Should be very strong in SQL.
- Should have good hands-on experience in GCP (Google Cloud Platform).
- Exposure to an ELT framework is an added advantage.
- Healthcare knowledge preferred.
- Should be flexible to work during overlap hours up to 8:30 PM IST based on business requirements.

Posted 3 months ago

Apply

7.0 - 12.0 years

7 - 17 Lacs

Bengaluru

Work from Office

In this role, you will:
- Act as an advisor to leadership to develop or influence applications, network, information security, database, operating systems, or web technologies for highly complex business and technical needs across multiple groups.
- Lead the strategy and resolution of highly complex and unique challenges requiring in-depth evaluation across multiple areas or the enterprise, delivering solutions that are long-term and large-scale and that require vision, creativity, innovation, and advanced analytical and inductive thinking.
- Translate advanced technology experience, an in-depth knowledge of the organization's tactical and strategic business objectives, the enterprise technological environment, the organization structure, and strategic technological opportunities and requirements into technical engineering solutions.
- Provide vision, direction, and expertise to leadership on implementing innovative and significant business solutions.
- Maintain knowledge of industry best practices and new technologies, and recommend innovations that enhance operations or provide a competitive advantage to the organization.
- Strategically engage with all levels of professionals and managers across the enterprise and serve as an expert advisor to leadership.

Required Qualifications:
- 7+ years of engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- Strong experience working in GCP cloud transformation and implementation.
- Experience working with Ab Initio, Informatica ETL tools or equivalent, Teradata, and Google BigQuery.
- Well versed in data warehousing methodologies, dimensional modelling, and tuning.
- Strong communication and interpersonal skills.

Cloud Strategy/Transition:
- Lead the Google Cloud Platform (GCP) transformation and implementation initiative for Commercial Bank.
- Partner with architects and data leaders on the cloud migration path.
- Design and run POCs on cloud platforms for various use cases.
- Evaluate and share progress on the cloud transition on a regular periodic basis.
- Stay on top of the enterprise cloud strategy.

Data Environment Transformation/Simplification:
- Lead application transformation and rationalization initiatives.
- Analyze performance trends and recommend process improvements; assess changes for risk to production systems and assure quality, security, and compliance requirements.
- Evaluate, incubate, and adopt modern technologies and engineering practices, and recommend innovations that provide a competitive advantage to the organization.
- Develop strategies to improve developer productivity and reduce technology debt.
- Develop a strategy to improve data quality and the maintenance of data lineage.

Architecture Oversight:
- Develop a consistent architecture strategy and deliver safe, secure, and consistent architecture solutions.
- Reduce technology risk by working closely with architects and designing solutions that align with the architecture roadmap, enterprise principles, policies, and standards.
- Partner with the enterprise cloud platform, CI/CD pipelines, platform teams, architects, engineering managers, and the developer community.

Job Expectations:
- Act as a liaison between the business and technical organization by planning, conducting, and directing the analysis of highly complex business problems to be solved with automated systems. Provide technical assistance in identifying, evaluating, and developing systems and procedures that are cost effective and meet business requirements.
- Act as an internal consultant within technology and business groups by using quality tools and process definition/improvement to re-engineer technical processes for greater efficiencies.
- Candidates must be based out of the posted location (Hyderabad/Bangalore) and will be required to work in the office as per the organization's In Office Adherence / Return to Office (RTO) policy.

Posted 3 months ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Hyderabad

Work from Office

What you will do
In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and performing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has deep technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a crucial team member that assists in the design and development of the data pipeline.
- Build data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks.
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions.
- Identify and resolve complex data-related challenges.
- Adhere to best practices for coding, testing, and designing reusable code/components.
- Explore new tools and technologies that will help to improve ETL platform performance.
- Participate in sprint planning meetings and provide estimates for technical implementation.

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience, OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience, OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Preferred Qualifications:

Must-Have Skills:
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, Spark SQL), workflow orchestration, and performance tuning of big data processing.
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools.
- Excellent problem-solving skills and the ability to work with large, complex datasets.
- Solid understanding of data governance frameworks, tools, and best practices; knowledge of data protection regulations and compliance requirements.

Good-to-Have Skills:
- Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development.
- Good understanding of data modeling, data warehousing, and data integration concepts.
- Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms.

Professional Certifications:
- Certified Data Engineer / Data Analyst (preferably on Databricks or cloud environments).

Soft Skills:
- Excellent critical-thinking and problem-solving skills.
- Good communication and collaboration skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated presentation skills.
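The responsibilities above mention implementing data quality checks inside ETL processes. A minimal, stdlib-only sketch of such a gate is shown below; the field names (`id`, `sku`, `qty`) and the idea of rejecting a batch on null or duplicate-key counts are illustrative assumptions, not details from the posting.

```python
# Toy data-quality gate for an ETL step: summarize null values in
# required fields and duplicate primary keys before loading a batch.
def quality_report(rows, key_field, required_fields):
    """Return counts a pipeline could compare against thresholds."""
    total = len(rows)
    nulls = sum(1 for r in rows for f in required_fields if r.get(f) is None)
    keys = [r.get(key_field) for r in rows]
    dupes = total - len(set(keys))  # rows sharing a key with an earlier row
    return {"rows": total, "null_values": nulls, "duplicate_keys": dupes}

batch = [
    {"id": 1, "sku": "A", "qty": 2},
    {"id": 2, "sku": None, "qty": 1},
    {"id": 2, "sku": "B", "qty": 5},
]
report = quality_report(batch, "id", ["sku", "qty"])
print(report)  # -> {'rows': 3, 'null_values': 1, 'duplicate_keys': 1}
```

In a real pipeline the same counts would typically be computed with Spark aggregations and the load step skipped or quarantined when a threshold is exceeded.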

Posted 3 months ago

Apply

7.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Job Location: Bangalore
Experience: 7-10 Years

Job Description
- Must have hands-on experience (minimum 6-8 years) in SnapLogic pipeline development, with good debugging skills.
- Experience migrating ETL jobs into SnapLogic, platform moderation, and cloud exposure on AWS.
- Good to have: SnapLogic developer certification and hands-on experience in Snowflake.
- Should be strong in SQL, PL/SQL, and RDBMS.
- Should be strong in ETL tools like DataStage and Informatica, with data quality.
- Proficiency in configuring SnapLogic components, including snaps, pipelines, and transformations.
- Designing and developing data integration pipelines using the SnapLogic platform to connect various systems, applications, and data sources.
- Building and configuring SnapLogic components such as snaps, pipelines, and transformations to handle data transformation, cleansing, and mapping.
- Experience in designing, developing, and deploying reliable solutions.
- Ability to work with business partners and provide long-lasting solutions.
- SnapLogic integration - pipeline development.
- Staying updated with the latest SnapLogic features, enhancements, and best practices to leverage the platform effectively.

Posted 3 months ago

Apply

4.0 - 5.0 years

8 - 12 Lacs

Hyderabad

Hybrid

Working as a developer, creating and testing applications on the OpenText InfoArchive tool; working with SQL Server Management Studio.

Posted 3 months ago

Apply

2.0 - 7.0 years

4 - 6 Lacs

Pune, Chennai

Work from Office

- Perform ETL testing to verify data extraction, transformation, and loading.
- Validate data movement across systems, ensuring data consistency and quality.
- Write complex SQL queries to validate source and target data.
- Conduct data reconciliation and data profiling.

Required Candidate Profile
- Collaborate with all stakeholders.
- Create and maintain test cases, test plans, and test reports.
- Identify, log, and track defects and data issues.
- Work with various DB systems (Oracle, SQL Server, PostgreSQL, etc.).
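The data profiling mentioned above usually means summarizing each column's distinct values, nulls, and range before and after a load. A small sketch using Python's stdlib `sqlite3` is below; the `customers` table and its columns are invented for the example, and in practice the same queries would run against Oracle, SQL Server, or PostgreSQL.

```python
import sqlite3

# Hypothetical column-profiling query of the kind an ETL tester might run:
# distinct count, null count, and min/max per column.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, city TEXT, age INTEGER);
    INSERT INTO customers VALUES (1,'Pune',30),(2,'Chennai',NULL),(3,'Pune',45);
""")

def profile_column(table, col):
    row = conn.execute(
        f"SELECT COUNT(DISTINCT {col}), SUM({col} IS NULL), MIN({col}), MAX({col}) "
        f"FROM {table}"
    ).fetchone()
    return {"distinct": row[0], "nulls": row[1], "min": row[2], "max": row[3]}

print(profile_column("customers", "city"))
print(profile_column("customers", "age"))  # -> {'distinct': 2, 'nulls': 1, 'min': 30, 'max': 45}
```

Comparing these per-column profiles between source and target is a cheap first-pass reconciliation before row-level comparison.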

Posted 3 months ago

Apply

5.0 - 10.0 years

8 - 14 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, and maintain scalable and efficient ETL/ELT pipelines using appropriate tools and technologies.
- Develop and optimize complex SQL queries for data extraction, transformation, and loading.
- Implement data quality checks and validation processes to ensure data integrity.
- Automate data pipelines and workflows for efficient data processing.
- Integrate data from diverse sources, including databases, APIs, and flat files.
- Manage and maintain data warehouses and data lakes.
- Implement data modeling and schema design.
- Ensure data security and compliance with relevant regulations.
- Provide data support for BI and reporting tools (MicroStrategy, Power BI, Tableau, Jaspersoft, etc.).
- Collaborate with BI developers to ensure data availability and accuracy.
- Optimize data queries and performance for reporting applications.
- Provide technical guidance and mentorship to junior data engineers.
- Lead code reviews and ensure adherence to coding standards and best practices.
- Contribute to the development of technical documentation and knowledge sharing.
- Design and implement data solutions on cloud platforms (AWS preferred).
- Utilize AWS data integration technologies such as Airflow and Glue.
- Manage and optimize cloud-based data infrastructure.
- Develop data processing applications using Python, Java, or Scala.
- Implement data transformations and algorithms using programming languages.
- Identify and resolve complex data-related issues.
- Proactively seek opportunities to improve data processes and technologies.
- Stay up to date with the latest data engineering trends and technologies.

Requirements:

Experience:
- 5 to 10 years of experience in Business Intelligence and Data Engineering.
- Proven experience in designing and implementing ETL/ELT processes.
- Expert-level proficiency in SQL (advanced/complex queries).
- Strong understanding of ETL concepts and experience with ETL/data integration tools (Informatica, ODI, Pentaho, etc.).
- Familiarity with one or more reporting tools (MicroStrategy, Power BI, Tableau, Jaspersoft, etc.).
- Knowledge of Python and cloud infrastructure (AWS preferred).
- Experience with AWS data integration technologies (Airflow, Glue).
- Programming experience in Java or Scala.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Proven ability to take initiative and be innovative.
- Ability to work independently and as part of a team.

Education:
- B.Tech / M.Tech / MCA (must have).
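A recurring pattern behind the ETL/ELT pipeline work described above is the incremental "upsert" merge of a staging batch into a target table. Here is a minimal sketch using stdlib `sqlite3` and its `ON CONFLICT` clause; the `dim_product` schema is invented, and a production pipeline would use the equivalent MERGE/UPSERT of its actual warehouse.

```python
import sqlite3

# Illustrative incremental load: merge a staging batch into a small
# dimension table, updating existing keys and inserting new ones.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_product (sku TEXT PRIMARY KEY, price REAL)")
conn.execute("INSERT INTO dim_product VALUES ('A', 9.99), ('B', 5.00)")

def merge_batch(rows):
    """Upsert (sku, price) pairs; requires SQLite >= 3.24 for ON CONFLICT."""
    conn.executemany(
        "INSERT INTO dim_product (sku, price) VALUES (?, ?) "
        "ON CONFLICT(sku) DO UPDATE SET price = excluded.price",
        rows,
    )

merge_batch([("B", 6.00), ("C", 1.25)])  # B updated, C inserted
print(sorted(conn.execute("SELECT * FROM dim_product")))
# -> [('A', 9.99), ('B', 6.0), ('C', 1.25)]
```

Because the merge is keyed, re-running the same batch leaves the table unchanged, which is what makes incremental loads safe to retry.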

Posted 3 months ago

Apply

9.0 - 14.0 years

20 - 30 Lacs

Kochi, Bengaluru

Work from Office

Senior Data Engineer - AWS (Glue, Data Warehousing, Optimization & Security)

Experienced Senior Data Engineer (6+ yrs) with deep expertise in AWS cloud data services, particularly AWS Glue, to design, build, and optimize scalable data solutions. The ideal candidate will drive end-to-end data engineering initiatives, from ingestion to consumption, with a strong focus on data warehousing, performance optimization, self-service enablement, and data security. The candidate needs experience in consulting and troubleshooting engagements to design best-fit solutions.

Key Responsibilities
- Consult with business and technology stakeholders to understand data requirements; troubleshoot and advise on best-fit AWS data solutions.
- Design and implement scalable ETL pipelines using AWS Glue, handling structured and semi-structured data.
- Architect and manage modern cloud data warehouses (e.g., Amazon Redshift, Snowflake, or equivalent).
- Optimize data pipelines and queries for performance, cost-efficiency, and scalability.
- Develop solutions that enable self-service analytics for business and data science teams.
- Implement data security, governance, and access controls.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
- Monitor, troubleshoot, and improve existing data solutions, ensuring high availability and reliability.

Required Skills & Experience
- 8+ years of experience in data engineering on the AWS platform.
- Strong hands-on experience with AWS Glue, Lambda, S3, Athena, Redshift, and IAM.
- Proven expertise in data modelling, data warehousing concepts, and SQL optimization.
- Experience designing self-service data platforms for business users.
- Solid understanding of data security, encryption, and access management.
- Proficiency in Python.
- Familiarity with DevOps practices and CI/CD.
- Strong problem-solving skills.
- Exposure to BI tools (e.g., QuickSight, Power BI, Tableau) for self-service enablement.

Preferred Qualifications
- AWS Certified Data Analytics - Specialty or Solutions Architect - Associate.

Posted 3 months ago

Apply

8 - 12 years

10 - 14 Lacs

Mumbai, Delhi / NCR

Work from Office

Job Description: Senior Data Architect (Contract)
Company: Emperen Technologies
Location: India (Remote) - Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune
Type: Contract (8-12 Months)

Role Overview: We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS, etc.).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted 4 months ago

Apply

12 - 17 years

17 - 22 Lacs

Mumbai

Work from Office

Role Description: The Solution Architect - SAP QM is responsible for applying business and technical expertise to design, implement, and support current business capabilities in SAP QM and related SAP modules. In this role, the candidate shall support the global supply chain business team in areas such as Supply Chain, Logistics, Shipping, Inventory Management, Production Supply, and related processes. The applicant should have broad knowledge of SAP, along with the ability to support advanced business functions, translate complex business requirements into solution designs, and build and implement systems and technical solutions for operations. Job Requirements: 12+ years of experience in the SAP supply chain space focused on SAP ERP design, implementation (at least 4 end-to-end implementation projects), and support. Business process knowledge of Quality Planning, Quality Inspection, Quality Control, Quality Certificates, Quality Notifications, Stability Studies, Batch Management, etc. Experience integrating SAP QM with external systems/3PLs such as MES, TrackWise, LIMS, Labware, and LabVantage, with other SAP modules, and with GxP and SOX requirements. Experience as a Solution Architect in multiple programs of global scale. Experience in Business Blueprinting, Design, Prototyping, Functional Analysis, Configuration, Gap Analysis, Conversion, Migration, Testing, Training, Cutover, Go-Live, and Post Go-Live Support activities, driving business process workshops and Fit/Gap analysis. Deep knowledge of and expertise in the Quality Management BPML. Good experience with data migration processes and ETL tools. Domain experience in Quality Management with S/4HANA certification (preferred). Experience in Deal Pricing & Transformation Deal Estimations. Experience in Change Management, Task Planning, Project Reporting, Resource Management, Process Improvement, and Supervision.
Strong relationship-building skills. Job Responsibilities : Establish relationships with Supply Chain (primarily Logistics, Warehouse, and Shipping, but also other Operations functions and Quality) and super users globally. Lead engagement efforts at different stages, from problem definition to diagnosis to solution design, development, and deployment, and contribute to unit-level and organizational initiatives. Design, build, and optimize end-to-end SAP Quality Management processes for the customer's enterprise. Collaborate with business users in the Supply Chain department and other departments when needed to gain a deep understanding of their business processes and requirements. Design, develop, and test system solutions to address business requirements, in alignment with the global solution template. Collaborate with business teams globally to understand business processes and requirements, and develop and test the processes that address them. Develop functional specifications for custom developments and collaborate with the development team to get those built and tested.

Posted 4 months ago

Apply

4 - 8 years

15 - 25 Lacs

Gurugram

Hybrid

Submit your application here : https://saarthee.keka.com/careers/jobdetails/8544 At Saarthee, we are on a mission to drive actionable insights and real-world impact through data. As a fast-growing, women-owned analytics consulting firm, we work with some of the most respected global clients, helping them solve high-impact problems through a combination of cutting-edge analytics, industry experience, and a culture of ownership and excellence. We are looking for Analytics Leads who can combine deep analytical thinking with strong project and people management skills. This is a client-facing role where you'll lead complex, cross-functional data projects, work closely with global stakeholders, and act as the bridge between business strategy and data execution. The ideal candidate is hands-on, analytical, solutions-oriented, and equally comfortable managing technical teams, engaging with clients, and driving results. You will be working closely with the Founders and leadership across Saarthee's Onsite and Offshore teams, ensuring high-quality delivery and client success. Key Responsibilities Independently lead client engagements and manage global analytics delivery teams across time zones. Build and maintain strong client relationships, acting as a strategic advisor to institutionalize data-driven decision-making. Identify opportunities to create business impact through data and drive end-to-end project execution. Use project management tools to lead stakeholder discussions, track deliverables, manage risks, and ensure timely outcomes. Deliver high-quality analytics solutions including data modeling, insights generation, and actionable recommendations aligned to client KPIs. Design and implement robust analytics and reporting platforms, including tool selection (ETL, database, visualization). Consult on advanced statistical models and machine learning strategies where applicable.
Guide, mentor, and grow a team of consultants; lead knowledge sharing and capability-building initiatives. Collaborate closely with internal teams for smooth project ramp-ups, analyst deployment, and quality assurance. Required Skills & Qualifications 3+ years of hands-on experience in data analytics, project management, or client engagement roles. Strong analytical background with proficiency in SQL, Tableau, Python, or similar tools (experience with Knime is a plus). Proven experience leading pilot projects, managing multiple workstreams, and solving open-ended business problems. Excellent communication, stakeholder management, and critical thinking skills. Ability to operate in fast-paced, ambiguous environments and deliver high-quality outcomes independently and as a team leader. What We Offer Competitive compensation packages that reward performance and ownership. Accelerated career growth in a startup environment backed by structured mentorship and leadership exposure. A collaborative and high-performance culture built on trust, empathy, and results. Exposure to global clients and strategic business challenges across industries. Comprehensive health insurance and wellness benefits. An inclusive, people-first workplace that values continuous learning, innovation, and transparency. About Saarthee: Saarthee is a global analytics consulting firm unlike any other, where our passion for helping others fuels our approach and our products and solutions. We are a one-stop shop for all things data and analytics. Unlike other analytics consulting firms that are technology or platform specific, Saarthee's holistic, tool-agnostic approach, along with its ability to deliver strategic, actionable insights, is unique in the marketplace. Our Analytics Value Chain framework meets our customers where they are in their data journey. Our diverse and global team of skilled data engineers, data analysts, and data scientists work with one objective in mind: Our Customers' Success.
At Saarthee, we are passionate about guiding organizations towards insights-fueled success. That's why we call ourselves Saarthee, inspired by the Sanskrit word "Saarthi", which means charioteer, trusted guide, or companion. Co-founded in 2015 by Mrinal Prasad and Shikha Miglani, Saarthee already encompasses all the components of Data Analytics consulting. Saarthee is based out of Philadelphia, USA, with offices in the UK and India. At Saarthee, we don't just solve data problems; we shape careers, unlock business value, and build future-ready leaders. If you are ready to lead from the front and make an impact, we'd love to hear from you.

Posted 4 months ago

Apply

8.0 - 12.0 years

16 - 20 Lacs

Mumbai

Work from Office

Job Description: Senior Data Architect (Contract) Company : Emperen Technologies Location: India (Remote) Type: Contract (8-12 Months) Experience: 8-12 Years Role Overview : We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities : Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. - MS Dynamics and Data Lake Integration : - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration : - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance : - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. 
- Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. Issue Resolution and Troubleshooting : - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication : - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work as a strong individual contributor and a good team player. Qualifications : Experience : - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills : - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills : - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity. - Open to continuous learning. - Self-confident and humble. - Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted Date not available

Apply

8.0 - 12.0 years

19 - 22 Lacs

Hyderabad

Remote

Type: Contract (8-12 Months) Role Overview : We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities : Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. - MS Dynamics and Data Lake Integration : - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration : - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance : - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. - Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. 
Issue Resolution and Troubleshooting : - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication : - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work as a strong individual contributor and a good team player. Qualifications : Experience : - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills : - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills : - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity. - Open to continuous learning. - Self-confident and humble. - Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted Date not available

Apply

8.0 - 12.0 years

19 - 22 Lacs

Gurugram

Remote

Location: India (Remote) Type: Contract (8-12 Months) Role Overview : We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities : Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. - MS Dynamics and Data Lake Integration : - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration : - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance : - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. - Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. 
Issue Resolution and Troubleshooting : - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication : - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work as a strong individual contributor and a good team player. Qualifications : Experience : - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills : - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills : - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity. - Open to continuous learning. - Self-confident and humble. - Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted Date not available

Apply

8.0 - 12.0 years

19 - 22 Lacs

Bengaluru

Remote

Job Description: Senior Data Architect (Contract) Company : Emperen Technologies Location: India (Remote) Type: Contract (8-12 Months) Experience: 8-12 Years Role Overview : We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities : Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. - MS Dynamics and Data Lake Integration : - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration : - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance : - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. 
- Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. Issue Resolution and Troubleshooting : - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication : - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work as a strong individual contributor and a good team player. Qualifications : Experience : - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills : - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills : - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity. - Open to continuous learning. - Self-confident and humble. - Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted Date not available

Apply

8.0 - 12.0 years

19 - 22 Lacs

Chennai

Remote

Type: Contract (8-12 Months) Role Overview : We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities : Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. - MS Dynamics and Data Lake Integration : - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration : - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance : - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. - Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. 
Issue Resolution and Troubleshooting : - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication : - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work as a strong individual contributor and a good team player. Qualifications : Experience : - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills : - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills : - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity. - Open to continuous learning. - Self-confident and humble. - Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted Date not available

Apply

5.0 - 10.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Responsibilities : - Design, develop, and maintain scalable and efficient ETL/ELT pipelines using appropriate tools and technologies. - Develop and optimize complex SQL queries for data extraction, transformation, and loading. - Implement data quality checks and validation processes to ensure data integrity. - Automate data pipelines and workflows for efficient data processing. - Integrate data from diverse sources, including databases, APIs, and flat files. - Manage and maintain data warehouses and data lakes. - Implement data modeling and schema design. - Ensure data security and compliance with relevant regulations. - Provide data support for BI and reporting tools (Microstrategy, PowerBI, Tableau, Jaspersoft, etc.). - Collaborate with BI developers to ensure data availability and accuracy. - Optimize data queries and performance for reporting applications. - Provide technical guidance and mentorship to junior data engineers. - Lead code reviews and ensure adherence to coding standards and best practices. - Contribute to the development of technical documentation and knowledge sharing. - Design and implement data solutions on cloud platforms (AWS preferred). - Utilize AWS data integration technologies such as Airflow and Glue. - Manage and optimize cloud-based data infrastructure. - Develop data processing applications using Python, Java, or Scala. - Implement data transformations and algorithms using programming languages. - Identify and resolve complex data-related issues. - Proactively seek opportunities to improve data processes and technologies. - Stay up-to-date with the latest data engineering trends and technologies. Requirements : Experience : - 5 to 10 years of experience in Business Intelligence and Data Engineering. - Proven experience in designing and implementing ETL/ELT processes. - Expert-level proficiency in SQL (advanced/complex queries).
- Strong understanding of ETL concepts and experience with ETL/Data Integration tools (Informatica, ODI, Pentaho, etc.). - Familiarity with one or more reporting tools (Microstrategy, PowerBI, Tableau, Jaspersoft, etc.). - Knowledge of Python and cloud infrastructure (AWS preferred). - Experience with AWS data integration technologies (Airflow, Glue). - Programming experience in Java or Scala. - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Proven ability to take initiative and be innovative. - Ability to work independently and as part of a team. Education : - B.Tech / M.Tech / MCA (Must-Have).
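One pattern behind the pipeline responsibilities listed above is making load steps idempotent, so a retried or re-run job does not duplicate rows in the target. The sketch below illustrates the idea with Python's built-in sqlite3; the dim_customer table and its columns are hypothetical, and a production pipeline would use the warehouse's own upsert/merge syntax instead.

```python
import sqlite3

def load_customers(conn, rows):
    """Idempotent load step: a keyed upsert, so re-running the job
    does not create duplicate rows in the target dimension table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS dim_customer "
        "(customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO dim_customer (customer_id, name, city) "
        "VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
batch = [(1, "Asha", "Pune"), (2, "Ravi", "Chennai")]
load_customers(conn, batch)
load_customers(conn, batch)  # a retry/re-run leaves the row count unchanged
print(conn.execute("SELECT COUNT(*) FROM dim_customer").fetchone()[0])  # → 2
```

The same property is what scheduler-driven pipelines (e.g., Airflow retries) rely on: any task can be re-executed safely without corrupting the target.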

Posted Date not available

Apply

8.0 - 13.0 years

10 - 14 Lacs

bengaluru

Work from Office

Role Description A Change Analyst within the Change & Transformation team plays a significant role in ensuring projects (change initiatives) meet objectives on time. This person will focus on business process changes, systems, and technology. The primary responsibility will be creating and implementing change management strategies and plans that maximize achievement of organisational goals and minimize resistance. The Change Analyst will work to drive faster adoption, higher ultimate utilization of, and proficiency with the changes that impact processes, implementing within the timelines set. These improvements will increase benefit realization, value creation, ROI, and the achievement of results and outcomes. About the organisation Deutsche Bank's Operations group provides support for all of DB's businesses to enable them to deliver operational transactions and processes to clients. Our people work in established global financial centres such as London, New York, Frankfurt and Singapore, as well as specialist development and operations centres in locations including Birmingham, Jacksonville, Bangalore, Jaipur, Pune, Dublin, Bucharest, Moscow, and Cary. We move over EUR 1.6 trillion across the Bank's platforms, support thousands of trading desks and enable millions of banking transactions, share trades and emails every day. Our goal is to deliver world-class client service at exceptional value to internal partners and clients. A dynamic and diverse division, our objective is to make sure that all our services are executed in a timely and professional manner, that risk is minimised and that the client experience is positive. We are proud of the professionalism of our people, and the service they deliver. In return, we offer career development opportunities to foster skills and talent.
We work across a wide range of product groups, including derivatives, securities, global finance and foreign exchange, cash and trade loans, and trust and securities services, as well as cross-product functions. The Operations interface with Regulatory and Tax is a growing area of interest and helps Deutsche Bank remain compliant at all times. About Client Data Management (CDM) The Client Data Management (CDM) function comprises the Client Data, Tax & Regulatory teams (including Instrument Reference Data). The group provides operational services across Global Markets and Corporate Investment Banking (CIB) clients globally, which enable client business, regulatory and tax compliance, protect against client lifecycle risk, and drive up data standards within the firm. The CDM function is focused on driving compliance within operations. The primary focus is client data, which has a significant impact on how we perform onboarding and KYC of our customers, maintenance of client accounts, and downstream operations. About the Team The Client Data Management & Transformation team supports the Operations Ref Data services for change management and delivers transformation-related initiatives. You will be the interface between senior stakeholders, RTB SMEs, IT developers, and the analytics team to analyse and implement system changes, monitor JIRA/incident management, and implement transformation initiatives. You will be part of the team that specializes in providing solutions to complex process/application problems of the division and helps extract business intelligence. Our CDM Change & Transformation team is working with cutting-edge technology to transform the way that we work. You'll be working on innovative projects involving transformation techniques.
Your key responsibilities

- Capture and refine business and/or system requirements
- Work with stakeholders to understand their needs, analyse problems and capture their requirements, then work closely with the development team to refine the requirements into specifications that can be executed by the team
- Possess a working knowledge of the business and/or technical domain in reference data
- Leverage experience and understanding of stakeholder needs to help create a solution; envision the solution to solve a problem (application/tool based)
- Gather and catalogue functional, non-functional and technical requirements for stakeholder requests
- Determine the impact of modifications and enhancements on the application
- Specify the workflow and systems enhancements to satisfy business and reporting needs
- Perform data analysis, design of data architecture requirements and mapping
- Act as the product subject matter expert to support scope and requirement decisions
- Ensure changes to the application are compliant with bank standards and policies
- Assist users and the development team with application testing and troubleshooting, and help configure test solutions to validate functional and system needs
- Identify, document and troubleshoot application-related problems
- Document processes, procedures and workflows associated with applications
- Participate in continuous improvement efforts, building expertise in creating, analysing and improving processes
- Create, maintain and present training materials for end-users
- Lead the model definition process and serve as a business analyst to collect and analyse data for use in our analytical models
- Effectively implement all projects assigned and take complete ownership of the deliverables
- Manage projects in JIRA and enable efficient project management
- Solicit feedback from stakeholders throughout the development cycle
- Present visualisations to stakeholders and senior management
- Work with RTB SMEs and technology resources to design systematic change
- Manage stakeholders
- Track and report issues
- Define and measure success metrics and monitor change progress
- Support change management at the organisational level
- Manage the change portfolio

Your skills and experience

- 8+ years of overall experience, with relevant experience in project management and business analysis
- Good knowledge of Python, Alteryx and Tableau
- Reference Data domain expertise is mandatory
- Experience with JIRA and other project management tools; ability to create JIRA dashboards and run sprints
- Experience in data quality projects using any ETL tool
- Knowledge of SQL is required
- Strong communication and stakeholder management skills
- Ability to translate business requirements into technical requirements and elaborate them to data analysts
- Experience in creating business requirement documents, solution design documents, etc.
- Experience running Steering Committee calls with senior management and helping with prioritisation
- Exposure to data analytics requirements; Big Data exposure will be helpful
- Strong Excel skills for creating reports, and an understanding of basic statistical and data mining approaches and terms
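To illustrate the kind of data quality work the skills above refer to, here is a minimal Python sketch of a source-vs-target reconciliation check of the sort used in ETL/data quality projects. This is purely illustrative and not part of the role description; the DataFrame contents, column names and function name are hypothetical, and it assumes pandas is available.

```python
# Illustrative sketch only: reconcile a source extract against a loaded
# target table by row count and key coverage. All names are hypothetical.
import pandas as pd


def reconcile(source: pd.DataFrame, target: pd.DataFrame, key: str) -> dict:
    """Summarise mismatches between source and target on a key column."""
    missing_in_target = set(source[key]) - set(target[key])
    extra_in_target = set(target[key]) - set(source[key])
    return {
        "source_rows": len(source),
        "target_rows": len(target),
        "missing_in_target": sorted(missing_in_target),
        "extra_in_target": sorted(extra_in_target),
    }


if __name__ == "__main__":
    # Hypothetical example data: one client record failed to load.
    src = pd.DataFrame({"client_id": [1, 2, 3], "name": ["A", "B", "C"]})
    tgt = pd.DataFrame({"client_id": [1, 2], "name": ["A", "B"]})
    print(reconcile(src, tgt, "client_id"))
```

In practice, checks like this would be parameterised and scheduled via tools such as Control-M or Automic, with failures logged as defects in JIRA.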

Posted Date not available

Apply
