4.0 - 7.0 years
6 - 9 Lacs
Bengaluru
Work from Office
Number of Openings: 2
ECMS ID (in sourcing stage): 533329
Assignment Duration: 6 months
Total Yrs. of Experience: 5+
Relevant Yrs. of Experience: 5+
Detailed JD (Roles and Responsibilities): Strong expertise in data warehouse and ETL testing concepts. Expertise in automation testing, with knowledge of Python. Strong experience in requirement analysis, test strategy, test planning, test case preparation, reviews, test execution, and test result reporting. Strong SQL skills.
Mandatory skills: Python, SQL, DWH/ETL Testing
Desired/Secondary skills: Cucumber, BDD framework
Max Vendor Rate Per Day (currency in relevance to work location): INR 6500/day
Work Location given in ECMS ID: Bangalore, Chennai, Gurgaon
BG Check (before or after onboarding): Before onboarding
Working in shifts from standard daylight hours (to avoid confusion post onboarding): No
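The automated DWH/ETL testing this role describes often boils down to reconciling a target table against its source. A minimal sketch of such a check in Python, using an in-memory SQLite database; the table and column names (src_orders, tgt_orders, amount) are invented for illustration and not taken from the posting:

```python
import sqlite3

# Hypothetical source and target tables standing in for a real DWH load.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def reconcile(cur, src, tgt):
    """Compare row counts and summed amounts between source and target."""
    src_count, src_sum = cur.execute(
        f"SELECT COUNT(*), TOTAL(amount) FROM {src}").fetchone()
    tgt_count, tgt_sum = cur.execute(
        f"SELECT COUNT(*), TOTAL(amount) FROM {tgt}").fetchone()
    return src_count == tgt_count and abs(src_sum - tgt_sum) < 1e-9

print(reconcile(cur, "src_orders", "tgt_orders"))  # True when counts and sums match
```

In practice the same count/sum checks would run against the real source system and warehouse connections, with the results fed into a test report.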
Posted 3 weeks ago
3.0 - 6.0 years
4 - 8 Lacs
Mumbai
Work from Office
The resource shall have at least 4 to 5 years of hands-on development experience with Alteryx, creating workflows and scheduling them. They will be responsible for designing, developing, validating, and troubleshooting ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation is also part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions against performance parameters
Review the performance dashboard and the scores for the team
Support the team in improving performance parameters by providing technical support and process guidance
Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
Ensure standard processes and procedures are followed to resolve all client queries
Resolve client queries as per the SLAs defined in the contract
Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
Document and analyze call logs to spot the most frequent trends and prevent future problems
Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
Ensure all product information and disclosures are given to clients before and after the call/email requests
Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations through effective diagnosis and troubleshooting of client queries
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
If unable to resolve an issue, escalate it to TA & SES in a timely manner
Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
Troubleshoot all client queries in a user-friendly, courteous, and professional manner
Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
Organize ideas and effectively communicate oral messages appropriate to listeners and situations
Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
Mentor and guide Production Specialists on improving technical knowledge
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
Develop and conduct trainings (triages) within products for Production Specialists as per target
Inform the client about the triages being conducted
Undertake product trainings to stay current with product features, changes, and updates
Enroll in product-specific and any other trainings per client requirements/recommendations
Identify and document the most common problems and recommend appropriate resolutions to the team
Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: Snowflake
Experience: 5-8 Years
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions against performance parameters
Review the performance dashboard and the scores for the team
Support the team in improving performance parameters by providing technical support and process guidance
Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
Ensure standard processes and procedures are followed to resolve all client queries
Resolve client queries as per the SLAs defined in the contract
Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
Document and analyze call logs to spot the most frequent trends and prevent future problems
Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
Ensure all product information and disclosures are given to clients before and after the call/email requests
Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations through effective diagnosis and troubleshooting of client queries
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
If unable to resolve an issue, escalate it to TA & SES in a timely manner
Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
Troubleshoot all client queries in a user-friendly, courteous, and professional manner
Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
Organize ideas and effectively communicate oral messages appropriate to listeners and situations
Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
Mentor and guide Production Specialists on improving technical knowledge
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
Develop and conduct trainings (triages) within products for Production Specialists as per target
Inform the client about the triages being conducted
Undertake product trainings to stay current with product features, changes, and updates
Enroll in product-specific and any other trainings per client requirements/recommendations
Identify and document the most common problems and recommend appropriate resolutions to the team
Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: ETL Testing
Experience: 5-8 Years
Posted 3 weeks ago
10.0 - 12.0 years
35 - 40 Lacs
Pune
Work from Office
Seeking a Data Architect with strong ETL, data modeling, data warehouse, SQL, PySpark, and cloud experience. Architect experience is mandatory. Only candidates who are immediately available or currently serving notice will be considered.
Posted 3 weeks ago
1.0 - 3.0 years
2 - 6 Lacs
Chennai
Work from Office
Develop and execute test plans and cases to ensure software quality, identifying and reporting defects. Collaborate with developers to resolve issues, participate in code reviews, and maintain test documentation. Contribute to improving the QA process by applying testing best practices and utilizing bug-tracking systems within the SDLC.

Key Responsibilities
Develop and execute test cases and test plans.
Identify and report software defects.
Perform functional, regression, and performance testing.
Collaborate with developers to resolve issues.
Participate in code reviews and provide feedback on testability.
Document test results and maintain test documentation.
Learn and apply software testing best practices.
Work with bug-tracking systems.
Understand the software development lifecycle (SDLC).
Assist in creating and maintaining automated test scripts.
Familiarity with testing tools and frameworks.
Ability to analyze and interpret test results.
Basic understanding of different testing methodologies.
Contribute to improving the QA process.
Follow project testing standards.

Qualifications
Extensive experience in ETL, data warehousing, and BI reporting testing.
Proficiency in SQL, Python for automation, and Azure Databricks.
Strong understanding of relational databases and XML.
Experience with test automation, Agile/Waterfall methodologies, and Atlassian tools.
Excellent communication and problem-solving skills.
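The automated test scripts this listing mentions are frequently table-driven: a list of (input, expected) cases run against a transformation. A minimal sketch in Python; the function under test (normalize_phone) and the cases are hypothetical examples, not from the posting:

```python
# Hypothetical transformation under test: keep the last 10 digits of a
# phone number, dropping punctuation and country codes.
def normalize_phone(raw: str) -> str:
    digits = "".join(ch for ch in raw if ch.isdigit())
    return digits[-10:]

# Table-driven regression cases: (input, expected output).
CASES = [
    ("+91-98765-43210", "9876543210"),
    ("044 2345 6789", "4423456789"),
]

def run_cases():
    """Return the list of failing cases; empty means all passed."""
    return [(raw, expected, normalize_phone(raw))
            for raw, expected in CASES
            if normalize_phone(raw) != expected]

print(run_cases())  # [] when every case passes
```

Adding a new regression case is then a one-line change to CASES, which keeps test maintenance cheap as defects are found and fixed.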
Posted 3 weeks ago
2.0 - 6.0 years
5 - 9 Lacs
Ahmedabad
Work from Office
Primary Skills: Ab Initio, Oracle PL/SQL, Index Management
Secondary Skills: Unix commands and shell scripting

Hands-on experience in Ab Initio graph/plan development, parallel processing, debugging, air commands, and the Ab Initio file system
Hands-on experience in Oracle PL/SQL and index management
Unix commands, file management, process monitoring, network interfaces, and shell scripting
Should be knowledgeable in Unix commands and shell scripting
Understanding of QA within a software development environment
Logical analysis and problem-solving skills
Proven ability to work to deadlines
Consistently demonstrates clear and concise written and verbal communication skills
Ability to work on own initiative or as part of a team
Experience in designing and executing test cases
Selenium automation is good to have
Posted 3 weeks ago
2.0 - 7.0 years
7 - 17 Lacs
Hyderabad
Work from Office
About the Role: Wells Fargo is seeking a Data Management Analyst. This is an offshore IC role within FTO and is part of the eIDF Instrument Build Team. The build involves techno-functional activities revolving around data modeling and custom designs, integration of different source systems, ETL optimizations, etc.

In this role, you will:
Participate in less complex analysis to identify and remediate data quality or integrity issues and to identify and remediate process or control gaps
Adhere to data governance standards and procedures
Identify data quality metrics and execute data quality audits to benchmark the state of data quality
Design and monitor data governance, data quality, and metadata policies, standards, tools, processes, or procedures to ensure data control and remediation for companywide data management functions
Support communications with basic documentation related to requirements, design decisions, issue closure, or remediation updates
Support issue remediation by performing medium-risk data profiling, data or business analysis, and data mapping as part of root cause or impact analysis
Provide support to regulatory analysis and reporting requirements
Recommend plans for the development and implementation of initiatives that assess the quality of new data sources
Work with business and technology partners or subject matter professionals to document or maintain business or technical metadata about systems, business or data elements, or data-related controls
Consult with clients to assess the current state of data quality within the area of assigned responsibility

Required Qualifications:
2+ years of Data Management, Business Analysis, Analytics, or Project Management experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
2+ years of experience in financial/risk/regulatory data handling and reporting
2+ years of experience in financial data mapping and modeling
Qualification: MBA/B.Tech/B.E/MCA
Good understanding of data warehousing fundamentals and experience in source-target mapping, lineage tracing, and data governance
Sound SQL and big data query techniques
Experience working in Agile methodology and JIRA
Knowledge of and good working exposure to one or more of the following: Regulatory Reporting, Credit & Market Risks, Funds Transfer Pricing, Asset-Liability Management
Understanding of banking products

Job Expectations:
Good communication and presentation skills
Traits of ownership and the ability to drive deliverables with minimal guidance
A result-oriented mindset, demonstrating thoughtfulness and innovation
Ability to inculcate and follow good team dynamics/team hygiene
Posted 3 weeks ago
2.0 - 5.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Detailed JD (Roles and Responsibilities): Strong expertise in data warehouse and ETL testing concepts. Expertise in automation testing, with knowledge of Python. Strong experience in requirement analysis, test strategy, test planning, test case preparation, reviews, test execution, and test result reporting. Strong SQL skills.
Mandatory skills: Python, SQL, DWH/ETL Testing
Desired/Secondary skills: Cucumber, BDD framework
Posted 3 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Ajmera Infotech builds planet-scale software for NYSE-listed clients, driving decisions that can't afford to fail. Our 120-engineer team specializes in highly regulated domains (HIPAA, FDA, SOC 2) and delivers production-grade systems that turn data into strategic advantage.

Why You'll Love It
End-to-end impact: build full-stack analytics from lakehouse pipelines to real-time dashboards.
Fail-safe engineering: TDD, CI/CD, DAX optimization, Unity Catalog, cluster tuning.
Modern stack: Databricks, PySpark, Delta Lake, Power BI, Airflow.
Mentorship culture: lead code reviews, share best practices, grow as a domain expert.
Mission-critical context: help enterprises migrate legacy analytics into cloud-native, governed platforms.
Compliance-first mindset: work in HIPAA-aligned environments where precision matters.

Requirements
Key Responsibilities
Build scalable pipelines using SQL, PySpark, and Delta Live Tables on Databricks.
Orchestrate workflows with Databricks Workflows or Airflow; implement SLA-backed retries and alerting.
Design dimensional models (star/snowflake) with Unity Catalog and Great Expectations validation.
Deliver robust Power BI solutions: dashboards, semantic layers, paginated reports, DAX.
Migrate legacy SSRS reports to Power BI with zero loss of logic or governance.
Optimize compute and cost through cache tuning, partitioning, and capacity monitoring.
Document everything, from pipeline logic to RLS rules, in Git-controlled formats.
Collaborate cross-functionally to convert product analytics needs into resilient BI assets.
Champion mentorship by reviewing notebooks and dashboards and sharing platform standards.

Must-Have Skills
5+ years in analytics engineering, with 3+ in production Databricks/Spark contexts.
Advanced SQL (incl. windowing); expert PySpark, Delta Lake, Unity Catalog.
Power BI mastery: DAX optimization, security rules, paginated reports.
SSRS-to-Power BI migration experience (RDL logic replication).
Strong Git and CI/CD familiarity, and cloud platform know-how (Azure/AWS).
Communication skills to bridge technical and business audiences.

Nice-to-Have Skills
Databricks Data Engineer Associate certification.
Streaming pipeline experience (Kafka, Structured Streaming).
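The dimensional (star/snowflake) modeling this listing asks for centers on a fact table keyed to dimension tables. A minimal star-schema sketch in Python with SQLite standing in for the warehouse; the fact and dimension tables (fact_sales, dim_date, dim_product) and their contents are invented for illustration:

```python
import sqlite3

# One fact table plus two dimensions; surrogate keys link them.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        revenue REAL
    );
    INSERT INTO dim_date VALUES (20240101, 2024), (20250101, 2025);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales VALUES
        (20240101, 1, 100.0), (20240101, 2, 50.0), (20250101, 1, 75.0);
""")

# A typical BI rollup: revenue by year and product via the dimension keys.
rows = conn.execute("""
    SELECT d.year, p.name, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.name
    ORDER BY d.year, p.name
""").fetchall()
print(rows)
```

On Databricks the same shape would be expressed as Delta tables registered in Unity Catalog, with the rollup feeding a Power BI semantic layer.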
Posted 3 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Hyderabad
Work from Office
End-to-end impact: build full-stack analytics from lakehouse pipelines to real-time dashboards.
Fail-safe engineering: TDD, CI/CD, DAX optimization, Unity Catalog, cluster tuning.
Modern stack: Databricks, PySpark, Delta Lake, Power BI, Airflow.
Mentorship culture: lead code reviews, share best practices, grow as a domain expert.
Mission-critical context: help enterprises migrate legacy analytics into cloud-native, governed platforms.
Compliance-first mindset: work in HIPAA-aligned environments where precision matters.

Requirements
Key Responsibilities
Build scalable pipelines using SQL, PySpark, and Delta Live Tables on Databricks.
Orchestrate workflows with Databricks Workflows or Airflow; implement SLA-backed retries and alerting.
Design dimensional models (star/snowflake) with Unity Catalog and Great Expectations validation.
Deliver robust Power BI solutions: dashboards, semantic layers, paginated reports, DAX.
Migrate legacy SSRS reports to Power BI with zero loss of logic or governance.
Optimize compute and cost through cache tuning, partitioning, and capacity monitoring.
Document everything, from pipeline logic to RLS rules, in Git-controlled formats.
Collaborate cross-functionally to convert product analytics needs into resilient BI assets.
Champion mentorship by reviewing notebooks and dashboards and sharing platform standards.

Must-Have Skills
5+ years in analytics engineering, with 3+ in production Databricks/Spark contexts.
Advanced SQL (incl. windowing); expert PySpark, Delta Lake, Unity Catalog.
Power BI mastery: DAX optimization, security rules, paginated reports.
SSRS-to-Power BI migration experience (RDL logic replication).
Strong Git and CI/CD familiarity, and cloud platform know-how (Azure/AWS).
Communication skills to bridge technical and business audiences.

Nice-to-Have Skills
Databricks Data Engineer Associate certification.
Streaming pipeline experience (Kafka, Structured Streaming).
dbt, Great Expectations, or similar data quality frameworks.
BI diversity: experience with Tableau, Looker, or similar platforms.
Posted 3 weeks ago
15.0 - 20.0 years
25 - 30 Lacs
Chennai, Bengaluru
Work from Office
Our client is a global banking firm which provides industry-focused services for clients across geographies. We are currently looking for a seasoned finance leader to manage the capital requirements of the Corporate Investment Banking (CIB) division in Bangalore. Please contact Apoorva Sharma or email your CV directly in Word format with Job ID: 15199 to .

Key responsibilities
Reviewing, analyzing, and finalizing monthly CIB RWA (Risk-Weighted Assets) and leverage reporting, ensuring accuracy and completeness.
Identifying gaps in capital processes and driving corrective actions to enhance data quality and integrity.
Designing and maintaining high-quality MIS reports for RWA and leverage on a monthly and daily basis.
Engaging with Risk, Operations, Technology, and Finance teams to track, remediate, and log RWA issues.
Providing subject matter expertise on PRA regulatory guidelines and Basel 3.0/3.1 capital requirements for CIB products.

Role requirements
15+ years of experience in Business Finance and Risk Management within the banking or financial services sector.
Deep understanding of CIB products across Trade, Markets, and Banking, along with Credit Risk RWA calculations.
Proven expertise in UK regulatory (PRA) guidelines and Basel capital frameworks.
Excellent communication, presentation, and stakeholder management skills across senior business and risk partners.
High proficiency in MS Office, data warehouse tools, and financial reporting systems.
Posted 3 weeks ago
2.0 - 4.0 years
6 - 7 Lacs
Mumbai
Work from Office
CRISIL is looking for a Database Developer to join our dynamic team and embark on a rewarding career journey. The developer should be proficient in database design, programming languages, and SQL.

The key responsibilities of a Database Developer may include:
1. Developing database solutions to store and manage large amounts of data.
2. Creating database schemas that represent and support business processes.
3. Optimizing database performance by identifying and resolving issues with indexing, query design, and other performance-related factors.
4. Developing and maintaining database applications and interfaces that allow users to access and manipulate data.

A successful Database Developer should have strong technical skills, including proficiency in database design, programming languages, and SQL. They should have experience working with large and complex data sets and knowledge of relational databases and SQL. The developer should also have experience with database management systems such as Oracle, MySQL, or SQL Server.
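The indexing and query-design tuning named in responsibility 3 can be demonstrated in miniature: inspect a query plan, add an index on the filtered column, and confirm the plan switches from a full scan to an index search. A sketch using Python's built-in SQLite; the schema (a customers table filtered by city) is hypothetical:

```python
import sqlite3

# Hypothetical table with no index on the column we filter by.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, city TEXT, name TEXT);
    INSERT INTO customers VALUES (1, 'Mumbai', 'A'), (2, 'Pune', 'B');
""")

def plan(conn, sql):
    """Return SQLite's query-plan description for a statement."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT name FROM customers WHERE city = 'Mumbai'"
before = plan(conn, query)   # full table scan of customers
conn.execute("CREATE INDEX idx_customers_city ON customers(city)")
after = plan(conn, query)    # search via idx_customers_city
print(before)
print(after)
```

The same workflow (examine the plan, add or adjust an index, re-examine) applies to Oracle, MySQL, or SQL Server, each with its own EXPLAIN facility.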
Posted 3 weeks ago
1.0 - 8.0 years
14 - 18 Lacs
Hyderabad
Work from Office
Description Position: Oracle Analytics Cloud (FDI/OAC) Technical Lead Overview: Recognized on the Inc. 5000 list of fastest-growing companies in the US, Peloton is one of the largest and fastest-growing professional services firms specializing in Integrated Cloud Solutions for Enterprise Resource Planning, Enterprise Performance Management, Supply Chain Management, Human Capital Management, and Big Data and Analytics. Peloton has the vision and the capabilities to help CFOs, CIOs, and business leaders envision, implement, and realize the benefits of digital transformation. Companies that are equipped with the right information, the know-how, and the enabling technology to consistently leverage analytics will gain a competitive advantage. Our people are recognized as some of the best minds and most committed people in the industry. We believe in quality. We appreciate creativity. We recognize individual contributions, and we place trust in our team members. And we love what we do. Peloton provides Advisory, Consulting, and Managed services with deep functional and technical expertise, specializing in serving clients in the Life Sciences, Retail, Manufacturing, Insurance, Aerospace and Defense, and Financial Services industries. Our business and technology professionals provide a unique perspective and proven experience, with an innovative and collaborative approach to achieve results for clients. Our Technical Services practice is growing, and we are looking to add an additional leader to join the team. The ideal candidate possesses a solid foundation of core consulting skills, strong Oracle Analytics skills, functional knowledge, and technical expertise in designing and developing Analytics applications.

Responsibilities: Responsibilities will vary depending on the level and experience of the individual. The consultant will work as part of a project team to deliver analytical, solution-oriented services to Fortune 1000 clients. Based upon experience, specific responsibilities may include:
Developing an understanding of a client's current-state process and developing future-state recommendations
Recommending road maps to close performance gaps and developing high-level implementation plans
Gathering and analyzing business requirements
Aligning business requirements and best practices to implement a technical solution
Defining new and refining existing business processes
Contributing to continuous improvement and development of Peloton processes and intellectual property

The Analytics Technical Lead will be responsible for working with clients to define and develop compelling Oracle Analytics applications (visualizations, dashboards, data models)
Designs and develops data sources to be used by multiple users in an enterprise
Architects and designs data sources, dashboards, ETL processes, and security models to be deployed in Oracle Analytics and Oracle Autonomous Data Warehouse environments
Develops best practices and standards for Oracle Analytics, Fusion Data Intelligence, and Oracle Autonomous Data Warehouse (ADW)
Stays current and advises clients on trends in analytics and data architectures, with Oracle as well as other technologies and vendors

Required Experience and Skills:
Qualified candidates must have a BS or BA degree in Computer Science, Mathematics, Management Information Systems (MIS), or equivalent
Experience performing complex data analysis and presenting findings to client stakeholders and executives
4-8 years of experience leading implementations of Oracle Analytics Cloud and/or Autonomous Data Warehouse, Oracle Data Integrator
1-2 years of implementation experience with Fusion Data Intelligence
1-2 years of experience with Fusion, BI Publisher, and OTBI Reporting
Excellent analytical and problem-solving skills
Strong written and verbal communication skills
Eagerness to work in a team-oriented environment

Compensation: Competitive salary, performance bonus, medical and dental insurance, vision insurance, discounts on pet insurance, group accident and life insurance, paid holidays and vacation days.

Peloton Group is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) which can offer ways to improve a business, thus affecting business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with clients as per their requirements

Mandatory Skills: Database Architecting
Experience: 5-8 Years
Posted 3 weeks ago
8.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to define and develop the Enterprise Data Structure, along with Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening the modelling standards and business information.

Do
1. Define and develop Data Architecture that aids the organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create data strategy and road maps for the Reference Data Architecture as required by the clients
d. Engage all stakeholders to implement data governance models and ensure that the implementation is done based on every change request
e. Ensure that the data storage and database technologies are supported by the data management and infrastructure of the enterprise
f. Develop, communicate, support, and monitor compliance with Data Modelling standards
g. Oversee and monitor all frameworks to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and least manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present data repositories, objects, and source systems along with data scenarios for front-end and back-end usage
l. Define high-level data migration plans to transition the data from source to target system/application, addressing the gaps between the current and future state, typically in sync with the IT budgeting or other capital planning processes
m. Knowledge of all the data service provider platforms, ensuring an end-to-end view
n. Oversee all the data standards/references/papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance
i. Develop a direction to manage the portfolio of all the databases, including systems and shared infrastructure services, in order to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership to the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions to integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from a technology, cost structure, and customer differentiation point of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs
2. Build the enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hubs and lakes, and data management processes
b. Evaluate all the implemented systems to determine their viability in terms of cost effectiveness
c. Collect all the structural and non-structural data from different places and integrate all the data in one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement the best security practices across all the databases based on accessibility and technology
g. Strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration
3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, and back-up and recovery specifications
c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all the projects
h. Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration teams achieve better efficiency and client experience, including ease of use through AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all the projects
m. Ensure optimal client engagement
i. Support the pre-sales team while presenting the entire solution design and its principles to the client
ii. Negotiate, manage, and coordinate with the client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win their confidence and act as a trusted advisor

Mandatory Skills: Data Governance
Experience: 8-10 Years
Posted 3 weeks ago
5.0 - 10.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Manage and support the Delivery Operations Team by implementing and supporting ETL and automation procedures. Schedule and perform delivery operations functions to complete tasks and ensure client satisfaction. ESSENTIAL FUNCTIONS: Process data conversions on multiple platforms. Perform address standardization, merge/purge, database updates, client mailings, and postal presort. Automate scripts to transfer and manipulate internal and external data feeds. Multitask: manage multiple jobs to ensure timely client deliverability. Work with technical staff to maintain and support an ETL environment. Work in a team environment with database/CRM specialists, modelers, analysts and application programmers to deliver results for clients. REQUIRED SKILLS: Experience in database marketing with the ability to transform and manipulate data. Experience with Oracle and SQL to automate scripts that process and manipulate marketing data. Experience with tools such as DMExpress, Talend, Snowflake, the SAP DQM suite of tools, and Excel. Experience with SQL Server: data exports and imports, and the ability to run SQL Server Agent jobs and SSIS packages. Experience with editors such as Notepad++ or UltraEdit. Experience with SFTP and PGP to ensure data security and protection of client data. Experience working with large-scale customer databases in a relational database environment. Proven ability to work on multiple tasks at a given time. Ability to communicate and work in a team environment to ensure tasks are completed in a timely manner. MINIMUM QUALIFICATIONS: Bachelor's degree or equivalent; 5+ years of experience in database marketing; excellent oral and written communication skills.
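The merge/purge step mentioned above (deduplicating mailing records before a client mailing) can be sketched in a few lines of Python. This is a minimal illustration only; the `name`/`address` field names and the normalization rules are hypothetical, and production tools such as DMExpress do far more sophisticated fuzzy matching:

```python
# Minimal merge/purge sketch: deduplicate mailing records by a
# normalized match key built from name + address (hypothetical fields).
def match_key(record):
    # Normalize case and whitespace so near-duplicates collide on the same key.
    name = " ".join(record["name"].lower().split())
    addr = " ".join(record["address"].lower().replace(".", "").split())
    return (name, addr)

def merge_purge(records):
    seen = {}
    for rec in records:
        # Keep the first occurrence of each key; later duplicates are purged.
        seen.setdefault(match_key(rec), rec)
    return list(seen.values())

records = [
    {"name": "Jane Doe", "address": "12 Main St."},
    {"name": "jane doe", "address": "12 main st"},   # duplicate of the first
    {"name": "John Roe", "address": "99 Oak Ave."},
]
print(len(merge_purge(records)))  # 2
```

Real merge/purge jobs typically layer address standardization (e.g., USPS CASS-style normalization) under the match key so that "12 Main Street" and "12 Main St." also collide.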
Posted 3 weeks ago
2.0 - 4.0 years
4 - 5 Lacs
Hyderabad
Work from Office
Job description: As a key member of the data team, you will be responsible for developing and managing business intelligence solutions, creating data visualizations, and providing actionable insights to support decision-making processes. You will work closely with stakeholders to understand business requirements, translate them into technical tasks, and develop robust data analytics and BI solutions while ensuring high-quality deliverables and client satisfaction. Desired Skills and Experience - Essential skills: 3-4 years of experience in the development and deployment of high-performance, complex Tableau dashboards. Experienced in writing complex Tableau calculations and using Alteryx for the transformations needed to arrive at the desired solution. SQL skills and solid expertise in database design principles and data warehousing concepts. In-depth knowledge of data platforms such as SQL Server and similar platforms. Strong understanding of data models in business intelligence (BI) tools such as Tableau and other analytics tools. Problem-solving skills with a track record of resolving complex technical issues. Exposure to Tableau Server and ETL concepts. Tableau and Alteryx certifications in data-related fields are preferred. Strong communication skills, both written and oral, with business and financial aptitude. Design, develop, and maintain Alteryx workflows for data processing and reporting. Extract data from SAP systems using DVW Alteryx Connectors and other integration tools. Perform ETL operations using Alteryx and manage datasets within MS SQL Server. Collaborate with stakeholders to understand data requirements and deliver actionable insights. Create and maintain documentation for workflows, processes, and data pipelines. Independently manage assigned tasks and ensure timely delivery. Education: Bachelor's or Master's in science or engineering disciplines (Computer Science, Engineering, Maths, Physics, etc.)
Key Responsibilities: Develop impactful, self-service Tableau dashboards and visualizations to support business functions. Conduct in-depth data analysis, extracting insights and trends from the data warehouse using SQL. Fulfill ad-hoc data requests and create insightful reports for business stakeholders. Contribute to data documentation and data quality efforts from a reporting perspective. Translate business questions into data-driven answers and actionable recommendations. Optimize BI solutions for performance and scalability. Incorporate feedback from clients and continuously improve models, dashboards, and processes. Create and maintain comprehensive documentation covering Data Fabric architecture, processes, and procedures. Strategize and ideate solution designs; develop visual mockups, storyboards, flow diagrams, wireframes and interactive prototypes. Comfortable working with large, complex datasets and conducting data quality checks, validations, and reconciliations. Key Skills: Tableau Desktop, Tableau Server, Alteryx Designer, Alteryx Server, Tableau reporting, dashboard visualization, data warehousing, data pipelines, MS SQL Server, ETL, business analysis, Agile methodology, data management, troubleshooting, data modelling, digital transformation, data integration. Key Metrics: Strong hands-on experience in solutioning and deploying analytics visualization solutions using Tableau. Strong hands-on knowledge of MS SQL Server, data warehousing and data pipelines. Exposure to Tableau Server and Alteryx Server. Tableau Desktop or Alteryx Designer certification is a plus.
Posted 3 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Chennai, Coimbatore, Bengaluru
Work from Office
Primary Skill: ETL Testing. Secondary Skill: Azure. 5+ years of data warehouse testing experience, including 2+ years of Azure Cloud experience. Strong understanding of data marts and data warehouse concepts. Expert in SQL, with the ability to create source-to-target comparison test cases in SQL. Creation of test plans, test cases, traceability matrices, and closure reports. Proficient with PySpark, Python, Git, Jira, JTM. Location: Pune, Chennai, Coimbatore, Bangalore. Band: B2 and B3. Mandatory Skills: ETL Testing. Experience: 3-5 Years.
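A source-to-target comparison test case of the kind described above is commonly written as a pair of EXCEPT queries: rows present in the source but missing from the target, and rows present in the target with no source counterpart. The sketch below uses in-memory SQLite tables as stand-ins for the real source and target systems; the `src_orders`/`tgt_orders` names and columns are illustrative only:

```python
import sqlite3

# Source-to-target ETL comparison sketch: both difference sets
# should come back empty for a passing test case.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
""")

missing_in_target = conn.execute(
    "SELECT order_id, amount FROM src_orders "
    "EXCEPT SELECT order_id, amount FROM tgt_orders"
).fetchall()
extra_in_target = conn.execute(
    "SELECT order_id, amount FROM tgt_orders "
    "EXCEPT SELECT order_id, amount FROM src_orders"
).fetchall()

print(missing_in_target, extra_in_target)  # [] []
```

On large warehouse tables the same idea is usually run with aggregate checksums or row counts first, falling back to EXCEPT (or MINUS on Oracle) only when the cheap checks disagree.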
Posted 3 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About the Opportunity. Job Type: Application | 31 July 2025. Title: Technical Specialist. Department: Technology - Corporate Enablers (CFO Technology). Location: Gurgaon / Bengaluru (Bangalore), India. Reports To: Senior Manager. Level: 4. About your team: The Corporate Enablers technology function provides IT services to multiple business functions, such as Finance, HR and General Counsel, globally. CFO Technology collaborates with Finance and Procurement stakeholders globally to develop and support business applications that underpin all core finance processes across FIL. This includes on-premises and SaaS solutions, both in-house built and vendor provided. There is a strong focus on data analytics, workflow and automation tools to bring greater efficiency to these functions, together with a continued move towards greater use of Agile and DevOps practices. The CFO Technology team is a global team with people based in the UK, China, and India. About your role: Join our team of enthusiastic technologists as we build the future of our cloud-based integration platform. We are seeking a skilled and experienced full stack developer to join our team. The ideal candidate will have a strong background in API development and PL/SQL stored procedures, along with a good understanding of Kubernetes, AWS, and SnapLogic cloud-native technologies. This role requires deep technical expertise and the ability to work in a dynamic and fast-paced environment. About you - Essential Skills: Minimum 7 years of overall full stack (Python, Oracle/PL/SQL) hands-on experience in system software development, testing and maintenance. Knowledge of current Python frameworks and technologies (e.g., Django, Flask, FastAPI). Experience with Python libraries and tools (e.g., Pandas, NumPy, SQLAlchemy). Strong experience in designing, developing, and maintaining RESTful APIs.
Familiarity with API security, authentication, and authorization mechanisms (e.g., OAuth, JWT). Good experience and hands-on knowledge of PL/SQL (packages/functions/ref cursors). Experience in development and low-level design of warehouse solutions. Familiarity with data warehouse, data mart and ODS concepts. Knowledge of data normalisation and Oracle performance optimisation techniques. Hands-on development experience with AWS (S3, Lambda, API Gateway, EC2, CloudFront, Route 53, DynamoDB, VPC, subnets). Hands-on experience with Kubernetes for container orchestration. Knowledge of deploying, managing, and scaling applications on Kubernetes clusters. Should be able to provide technical design and architecture independently for business solutions. Experience with cloud architecture and design principles, including microservices. Good understanding of the infrastructure aspects of technical solutions, such as storage, platform, and middleware. Should have a clear understanding of continuous integration, build, release, and code quality. Good understanding of the load balancing and disaster recovery aspects of solutions. Good knowledge of security aspects such as authentication and authorization using open standards like OAuth. Hands-on with coding and debugging; should be able to write high-quality code optimized for performance and scale. Good analytical and problem-solving skills; should be good with algorithms. Skills nice to have: Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation). Experience with the SnapLogic cloud-native integration platform. Ability to design and implement integration pipelines using SnapLogic.
Experience in AI prompt engineering, Gen AI, LLMs, and agents. Experience in CI/CD, TDD, DevOps and CI/CD tools: Jenkins/UrbanCode/SonarQube/Bamboo. Key Responsibilities: Lead and guide a team of developers/senior developers. Architect the technical design of the application; document and present it to senior stakeholders. Interact with senior architects and other consultants to understand and review the technical solution and direction. Communicate with business analysts to discuss various business requirements. Proactively refactor code/solutions; be aggressive about tech-debt identification and reduction. Develop, maintain and troubleshoot issues, and take a leading role in the ongoing support and enhancement of the applications. Help maintain the standards, procedures and best practices in the team, and help the team follow them. Prioritise requirements in the pipeline with stakeholders. Experience and Qualification: B.E./B.Tech. or M.C.A. in Computer Science from a reputed university. Total 8-10 years of experience in application development with Python and API development, along with Oracle RDBMS, SQL, and PL/SQL. Must have led a team of developers. Feel rewarded: For starters, we will offer you a comprehensive benefits package. We will value your wellbeing and support your development. And we will be as flexible as we can about where and when you work, finding a balance that works for all of us. It is all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com. Our Values: Integrity - doing the right thing, every time, and putting the client first. Trust - empowering each other to take the initiative and make good decisions. Our Behaviours: Brave - challenging the status quo, being accountable and speaking up.
Bold - Acting with conviction, encouraging diverse thinking, and keeping things simple. Curious - Learning to do new things in better ways and encouraging fresh thinking. Compassionate - Having empathy, caring for colleagues, clients
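For the JWT-based authorization the role above calls out, the core of HS256 token verification is an HMAC comparison over the token's header and payload. The stdlib-only sketch below shows the idea; a real service would use a maintained library such as PyJWT and would also validate claims like `exp` and `aud` rather than just the signature:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256(header: dict, payload: dict, secret: bytes) -> str:
    # Build "<header>.<payload>" and append the HMAC-SHA256 signature.
    signing_input = (b64url(json.dumps(header).encode())
                     + "." + b64url(json.dumps(payload).encode()))
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

def verify_hs256(token: str, secret: bytes) -> bool:
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    # Constant-time comparison guards against timing attacks on the signature.
    return hmac.compare_digest(b64url(expected), sig)

secret = b"demo-secret"  # hypothetical key for illustration only
token = sign_hs256({"alg": "HS256", "typ": "JWT"}, {"sub": "user-1"}, secret)
print(verify_hs256(token, secret))        # True
print(verify_hs256(token, b"wrong-key"))  # False
```

The same shape extends to OAuth bearer tokens: the resource server verifies the signature (HS256 shared secret or RS256 public key) before trusting any claim inside the token.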
Posted 3 weeks ago
2.0 - 4.0 years
2 - 6 Lacs
Gurugram
Work from Office
About the Opportunity. Job Type: Application | 29 July 2025. Title: Analyst Programmer. Department: WPFH. Location: Gurgaon. Level: 2. Intro: We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger. About your team: The successful candidate would join the Data team and would be responsible for building data integration and distribution solutions within the Distribution Data and Reporting team and for its consumers. The team is responsible for developing new, and supporting existing, middle-tier integration services and business services, and is committed to driving forward the development of leading-edge solutions. About your role: This role is responsible for liaising with technical leads, business analysts, and various product teams to design, develop and troubleshoot the ETL jobs for various operational data stores. The role will involve understanding the technical design, development and implementation of ETL and EAI architecture using Informatica / ETL tools. The successful candidate will be able to demonstrate an innovative and enthusiastic approach to technology and problem solving, will display good interpersonal skills, and will show confidence and the ability to interact professionally with people at all levels, exhibiting a high level of ownership within a demanding working environment. Key Responsibilities: Work with technical leads, business analysts and other subject matter experts.
Understand the data model/design and develop the ETL jobs. Sound technical knowledge of Informatica, to take ownership of allocated development activities and work independently. Working knowledge of Oracle databases, to take ownership of the underlying SQL for the ETL jobs (under guidance of the technical leads). Provide development estimates. Implement standards, procedures and best practices for data maintenance, reconciliation and exception management. Interact with cross-functional teams to coordinate dependencies and deliverables. Essential Skills - Technical: Deep knowledge and experience of the Informatica PowerCenter tool set (minimum 3 years). Experience in Snowflake. Experience with source control tools. Experience with job scheduling tools such as Control-M. Experience in UNIX scripting. Strong SQL or PL/SQL experience (minimum 2 years). Experience with data warehouse, data mart and ODS concepts. Knowledge of data normalisation/OLAP and Oracle performance optimisation techniques. 3+ years of experience with either Oracle or SQL Server and its utilities, coupled with experience of UNIX/Windows. Functional: 3+ years of experience working within financial organisations, with broad-based business process, application and technology architecture experience. Experience with data distribution and access concepts, with the ability to use these concepts to realise a proper physical model from a conceptual one. Business-facing, with the ability to work alongside data stewards in systems and the business. Strong interpersonal, communication and client-facing skills. Ability to work closely with cross-functional teams. About you: B.E./B.Tech/MBA/M.C.A or any other bachelor's degree. At least 3+ years of experience in data integration and distribution. Experience in building web services and APIs. Knowledge of Agile software development life-cycle methodologies.
Posted 3 weeks ago
5.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Database administration skills; knowledge of data warehouses / data lakes; good SQL development skills. Working knowledge of tables, views, schemas, procedures, functions, triggers, CTEs, cursors, security (encryption/decryption), logs, data structures, database sizing, data integration and migration strategies, and import/export. Managing and tracking successful database software installation, patch management and maintenance. Developing, managing, monitoring and maintaining ETL and ELT jobs on Microsoft SQL Server and Oracle databases. Working knowledge of Azure Data Factory and Synapse. Working knowledge of data warehousing and data mining. Should be able to initiate, execute and monitor database administration activities in coordination with different software vendors. Managing regulation-defined data handling based on the nature of the data and its security requirements, for example PII data. Developing, managing, monitoring and tracking database backups and recovery. Managing and administering both on-premises and cloud databases and/or the integrations between them. Devising, monitoring and tracking database and data-access security mechanisms for the company's business software databases. Managing and controlling the authentication and authorization of data from company databases. Responsible for capacity planning for any data-related requirements. Continuously monitoring database server performance and security, and finding opportunities to improve implementations and performance. Troubleshooting any database-related issues and solving them if possible, otherwise engaging the vendor to resolve them. Preferred candidate profile: Relevant years of experience: 5-8 yrs.
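The backup-and-verify duties listed above can be illustrated with SQLite's online backup API, which ships in the Python standard library. This is a minimal sketch of the backup-then-verify pattern, not a substitute for vendor tooling (SQL Server Agent backup jobs, Oracle RMAN); the table and data are made up:

```python
import sqlite3

# Minimal backup/verify sketch using sqlite3's online backup API.
source = sqlite3.connect(":memory:")
source.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
    INSERT INTO accounts VALUES (1, 100.0), (2, 250.0);
""")

backup = sqlite3.connect(":memory:")
source.backup(backup)  # copies the whole database while the source stays usable

# Verify the backup with a crude integrity check: matching row counts.
src_rows = source.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
bak_rows = backup.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
print(src_rows, bak_rows)  # 2 2
```

In a real recovery drill the verification step goes further: restore to a scratch instance, run consistency checks, and compare checksums against the live database rather than bare row counts.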
Posted 3 weeks ago
4.0 - 9.0 years
20 - 27 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Inviting applications for the role of Lead Consultant - Power BI Developer! Responsibilities: • Working within a team to identify, design and implement a reporting/dashboarding user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices • Gathering query data from tables of the industry cognitive model/data lake and building data models with BI tools • Applying requisite business logic using data transformations and DAX • Understanding of Power BI data modelling and various built-in functions • Knowledge of report sharing through Workspace/App, access management, dataset scheduling and Enterprise Gateway • Understanding of static and dynamic row-level security • Ability to create wireframes based on user stories and business requirements • Basic understanding of ETL and data warehousing concepts • Conceptualizing and developing industry-specific insights in the form of dashboards, reports and analytical web applications to deliver pilots/solutions following best practices
Posted 3 weeks ago
1.0 - 3.0 years
10 - 14 Lacs
Gurugram
Work from Office
About the Role: Grade Level (for internal use): 07. The Team: S&P Global Mobility is seeking someone who is self-motivated, passionate about data and automobiles, and willing to work with a geographically dispersed Operations team. The North American VIN team, part of the Global Data Operations (GDO) team, is responsible for researching, analysing, and maintaining over 200 vehicle attributes in the North American market. Our team serves as the first source of truth for data, integrating into multiple S&P Global products. We value collaboration and maintain excellent relationships with OEMs worldwide, striving to provide actionable intelligence to our customers. Responsibilities and Impact: As a member of our team you will get an opportunity to work on various data tools and applications. The team member will be responsible for: Research, Process, and Maintain Vehicle Data - conduct thorough research and capture, validate and manage data related to over 200 vehicle attributes in the North American market. Data Transformation - run SQL queries to extract data and transform results into Excel files for analysis and coding into our tools and applications. Client Queries and Case Investigation - investigate client queries and Salesforce cases, identify discrepancies in existing coded data, and provide findings to vertical leads. Provide Actionable Intelligence - strive to deliver actionable insights to customers, enhancing the value of S&P Global products. Identify Process Improvements - identify process improvements within products and work to automate existing processes. Gen AI Innovation - work on Gen AI ideas and S&P Global's internal Gen AI tool (Spark) to make existing processes more efficient. What We're Looking For - Required Qualifications: Educational Background - B.Tech (Mechanical with specialization in Automobile Engineering preferred) or other similar bachelor's/master's degrees. Experience: 1-3 years of experience in data management.
Technical Skills - strong knowledge of MS Excel, Word, and PowerPoint. Scripting and Database Knowledge - proficiency in scripting languages such as Python and SQL/PL/SQL, along with a solid understanding of relational databases. Attention to Detail - high attention to detail and accuracy in data management tasks. Course Knowledge - familiarity with data management and operations, analytics and business intelligence. Problem-Solving Skills - strong analytical thought process and a drive for learning. Communication Skills - strong written and verbal communication skills. Team Collaboration - ability to work effectively with a geographically dispersed Operations team. Preferred Qualifications: SQL, Python and Gen AI expertise; automobile sector background. About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility. What's In It For You - Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People, Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets.
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Health & Wellness: health care coverage designed for the mind and body. Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: it's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit https://spgbenefits.com/benefit-summaries. Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com.
S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. ---- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ---- 20 - Professional (EEO-2 Job Categories-United States of America), DTMGOP203 - Entry Professional (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
Posted 3 weeks ago
10.0 - 15.0 years
25 - 40 Lacs
Bengaluru
Work from Office
Data Architect. Req number: R5758. Employment type: Full time. Worksite flexibility: Remote. Who we are: CAI is a global technology services firm with over 8,500 associates worldwide and yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right, whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise. Job Summary: We are seeking a highly skilled and experienced Data Architect with a strong background in Big Data technologies, Databricks solutioning, and SAP integration within the manufacturing industry. The ideal candidate will have a proven track record of leading data teams, architecting scalable data platforms, and optimizing cloud infrastructure costs. This role requires deep hands-on expertise in Apache Spark, Python, SQL, and cloud platforms (Azure/AWS/GCP). This is a full-time, remote position. Job Description - What You'll Do: Design and implement scalable, secure, and high-performance Big Data architectures using Databricks, Apache Spark, and cloud-native services. Lead the end-to-end data architecture lifecycle, from requirements gathering to deployment and optimization. Design repeatable and reusable data ingestion pipelines for bringing in data from ERP source systems like SAP, as well as Salesforce, HR, factory, and marketing systems. Collaborate with cross-functional teams to integrate SAP data sources into modern data platforms. Drive cloud cost optimization strategies and ensure efficient resource utilization. Provide technical leadership and mentorship to a team of data engineers and developers. Develop and enforce data governance, data quality, and security standards. Translate complex business requirements into technical solutions and data models.
Stay current with emerging technologies and industry trends in data architecture and analytics. What You'll Need: 6+ years of experience in Big Data architecture, data engineering and AI-assisted BI solutions using Databricks and AWS technologies. 3+ years of experience with AWS data services such as S3, Glue, Lake Formation, EMR, Kinesis, RDS, DMS and others. 3+ years of experience in building delta lakes and open formats using technologies such as Databricks and AWS analytics services. Bachelor's degree in computer science, information technology, data science, data analytics or a related field. Proven expertise in Databricks, Apache Spark, Delta Lake, and MLflow. Strong programming skills in Python, SQL, and PySpark. Experience with SAP data extraction and integration (e.g., SAP BW, S/4HANA, BODS). Hands-on experience with cloud platforms (Azure, AWS, or GCP), especially in cost optimization and data lakehouse architectures. Solid understanding of data modeling, ETL/ELT pipelines, and data warehousing. Demonstrated team leadership and project management capabilities. Excellent communication, problem-solving and stakeholder management skills. Experience in the manufacturing domain, with knowledge of production, supply chain, and quality data. Certifications in Databricks, cloud platforms, or data architecture. Familiarity with CI/CD pipelines, DevOps practices, and infrastructure as code (e.g., Terraform). Physical Demands: This role involves mostly sedentary work, with occasional movement around the office to attend meetings, etc. Ability to perform repetitive tasks on a computer, using a mouse, keyboard, and monitor. Reasonable accommodation statement: If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to application.accommodations@cai.io or (888) 824 - 8111.
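The "repeatable and reusable ingestion pipeline" idea in this posting can be sketched outside Databricks as a config-driven loader: one generic function parameterized per source, so adding a new feed means adding configuration rather than code. Everything here is illustrative; the source names, columns, and SQLite target are hypothetical stand-ins for parameterized Databricks jobs landing data in a lakehouse:

```python
import csv
import io
import sqlite3

# Config-driven ingestion sketch: one generic loader reused across sources.
# Source names and schemas below are made up for illustration.
SOURCES = {
    "sap_orders": {"table": "orders", "columns": ["order_id", "amount"]},
    "crm_leads":  {"table": "leads",  "columns": ["lead_id", "stage"]},
}

def ingest(conn, source_name, csv_text):
    cfg = SOURCES[source_name]
    cols = ", ".join(cfg["columns"])
    placeholders = ", ".join("?" for _ in cfg["columns"])
    # Idempotent table creation, then a bulk insert of the parsed feed.
    conn.execute(f"CREATE TABLE IF NOT EXISTS {cfg['table']} ({cols})")
    rows = list(csv.reader(io.StringIO(csv_text)))
    conn.executemany(
        f"INSERT INTO {cfg['table']} ({cols}) VALUES ({placeholders})", rows
    )
    return len(rows)

conn = sqlite3.connect(":memory:")
loaded = ingest(conn, "sap_orders", "1,10.5\n2,20.0\n")
print(loaded)  # 2
```

The design choice being illustrated is separation of pipeline mechanics from per-source configuration; in Databricks the same split typically appears as a shared notebook or job invoked with source-specific parameters.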
Posted 3 weeks ago