
436 Data Modelling Jobs - Page 9

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

7.0 - 11.0 years

0 Lacs

Haryana

On-site

As a Dashboard Developer - Manager at Research Partnership in Gurgaon, India, you will be responsible for designing, building, and maintaining high-impact dashboards and data visualizations that translate raw market research data into actionable insights. Collaborating with researchers, analysts, and engineers, you will ensure seamless data flow and visual storytelling. Your primary role will involve developing and maintaining interactive dashboards using Power BI, Tableau, or similar BI tools. You will need to translate project requirements into intuitive visual stories, collaborate with scripting and data processing teams to streamline workflows, ensure data accuracy and security adherence, automate reporting processes, and stay updated on BI trends.

For this role, you should have at least 6 years of hands-on experience in BI/dashboard development and a proven track record across the data-to-dashboard lifecycle. A background in healthcare or market research is preferred. Technical expertise required includes backend development skills in PHP (6+ years) with frameworks like Laravel or CodeIgniter, REST and SOAP API design, proficiency in databases like PostgreSQL, MySQL, and MS SQL, and experience with big data engines such as Google BigQuery and AWS Athena. Frontend/visualization skills in HTML, CSS, JavaScript, React, Vue.js, jQuery, and visual libraries like Chart.js, D3.js, Highcharts, and Google Charts are necessary. Experience with cloud deployment (AWS and Google Cloud), containers (Docker, Vagrant, VirtualBox), CI/CD pipelines (Jenkins, CircleCI, GitHub Actions), caching technologies (Redis, Memcached), and security protocols is also essential. You should be familiar with data access control, role-based permissions, PHP unit testing, version control, and Agile collaboration tools.

The ideal candidate is a detail-oriented visual storyteller with a problem-solving mindset, strong communication skills, and a collaborative approach to work. Research Partnership offers a supportive environment with comprehensive training programs, international travel opportunities, and a relaxed working atmosphere. Inizio Advisory, of which Research Partnership is a part, is dedicated to providing strategic support to pharmaceutical and life science companies, helping them navigate the product and patient journey. The company values diversity, inclusivity, and authenticity in the workplace, and encourages candidates to apply even if they do not meet all qualifications.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

30 - 37 Lacs

Chennai

Remote

Role & responsibilities:
- Expert in Azure Data Factory
- Proven experience in data modelling for manufacturing data sources
- Proficient in SQL design
- 5+ years of experience in data engineering roles
- Proven experience in Power BI: dashboarding, DAX calculations, star schema development, and semantic model building
- Manufacturing knowledge
- Experience with GE PPA as a data source is desirable
- API development knowledge
- Python skills

Location: nearshore or offshore, with 3 to 5 hours of overlap with the CST time zone.

Preferred candidate profile

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Solution Design Business Analyst - Vice President at our organization, you will be at the forefront of driving strategic change initiatives related to regulatory deliverables across Risk, Finance, and Treasury. Your role will involve utilizing your expertise in business and data analysis to present complex data issues in a simplified and engaging manner. You will be responsible for front-to-back system design, solving complex business problems, and employing skills such as data gathering, data cleansing, and data validation. Analyzing large volumes of data to identify patterns and potential data quality issues, along with metrics analysis, will be a key aspect of your role. You will also play a crucial role in translating business requirements into technical data requirements and collaborating with stakeholders to ensure that proposed solutions align with their needs and expectations. Additionally, you will be involved in creating operational and process designs to ensure the successful delivery of proposed solutions within the agreed scope. Supporting change management activities and developing traceability matrices to facilitate the implementation and integration of proposed solutions within the organization will also be part of your responsibilities.

Ideal candidates for this role will have experience in the financial services industry, particularly within the banking sector in a Risk/Finance/Treasury role. Proficiency in data analysis tools such as SQL, Hypercube, and Python, and in data visualization/reporting tools like Tableau, QlikView, Power BI, and Advanced Excel, will be highly valued. Experience in data analysis, data modeling, and data architecture will also be advantageous. Your success in this role will be evaluated based on critical skills such as risk management, change and transformation, business acumen, strategic thinking, and digital and technology proficiency. This position is based in Pune and Chennai and entails working as an Individual Contributor.

The primary purpose of this role is to support the organization in achieving its strategic objectives by identifying business requirements and proposing solutions to address business problems and opportunities. Key responsibilities include identifying and analyzing business problems, developing business requirements, collaborating with stakeholders, creating business cases, conducting feasibility studies, and supporting change management activities. As a Vice President, you will be expected to contribute to setting strategies, driving requirements, and making recommendations for change. Additionally, you will be responsible for managing policies, resources, and budgets, and for delivering continuous improvements while ensuring adherence to policies and procedures. If you have leadership responsibilities, you are expected to demonstrate leadership behaviors that foster a thriving environment for colleagues to excel.

Overall, your role as a Solution Design Business Analyst - Vice President will involve leveraging your expertise in data analysis, business requirements, and solution design to drive strategic initiatives and support the organization in achieving its goals. Your contributions will be instrumental in shaping the future success of the organization.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Data Analyst in the Solution Design team at Barclays, your primary responsibility will be to support in defining and designing technology and business solutions that align with organizational goals. This includes conducting requirements gathering, data analysis, data architecture, system integration, and delivering scalable, high-quality designs that cater to both business and technical needs.

To excel in this role, you must have experience in delivering large-scale changes in complex environments, leading requirements documentation, and facilitating workshops to gather, clarify, and communicate business needs effectively. Your strong data analysis and data modeling skills will be crucial for performing data validations, anomaly detection, and deriving insights from large volumes of data to support decision-making. Proficiency in advanced SQL for querying, joining, and transforming data, along with experience in data visualization tools such as Tableau, Qlik, or Business Objects, is essential.

Furthermore, you should be an effective communicator capable of translating complex technical concepts into clear language for diverse audiences. Your ability to liaise between business stakeholders and technical teams to ensure a mutual understanding of data interpretations, requirements definition, and solution designs will be key. Previous experience in banking and financial services, particularly in wholesale credit risk, as well as knowledge of implementing data governance standards, will be advantageous. Additional skills highly valued for this role include experience with Python data analysis and visualization tools, familiarity with external data vendors for integrating financials and third-party datasets, and exposure to wholesale credit risk IRB models and regulatory frameworks.

Your responsibilities will include investigating and analyzing data quality issues, executing data cleansing and transformation tasks, designing and building data pipelines, applying advanced analytical techniques like machine learning and AI, and documenting data quality findings for improvement. It will also be essential to contribute to strategy, drive requirements, manage resources, and deliver continuous improvements in alignment with organizational goals.

As a Senior Data Analyst, you will be expected to demonstrate leadership behaviors that create an environment for colleagues to thrive and deliver excellence. If the position includes leadership responsibilities, you will be required to set strategic direction, manage policies and processes, and drive continuous improvement. Additionally, you will advise key stakeholders, manage and mitigate risks, and collaborate with other areas of the organization to achieve business goals. All colleagues at Barclays are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset of Empower, Challenge, and Drive.

Posted 2 weeks ago

Apply

15.0 - 19.0 years

0 Lacs

Karnataka

On-site

The SF Data Cloud Architect plays a critical role within Salesforce's Professional Services team, assisting in pre-sales and leading the design and implementation of enterprise-grade Data Management solutions. As the SF Data Cloud Architect, you will be responsible for architecting scalable solutions across enterprise landscapes using Data Cloud. Your role involves ensuring that data is ready for enterprise AI, applying data governance guardrails, and supporting enterprise analytics and automation. This position covers the ANZ, ASEAN, and India markets.

The ideal candidate brings deep expertise in data architecture, the project lifecycle, and the Salesforce ecosystem, along with strong soft skills, stakeholder engagement capabilities, and technical writing ability. You will collaborate with cross-functional teams to shape the future of the customer's data ecosystem and enable data excellence at scale.

Key Responsibilities:
- Serve as a Salesforce Data Cloud Trusted Advisor, supporting and leading project delivery and customer engagements during the pre-sales cycle. Provide insights on how Data Cloud contributes to the success of AI projects.
- Offer architecture support by providing data and system architecture guidance to Salesforce account teams and customers, including reviewing proposed architectures and peer-reviewing project effort estimates, scope, and delivery considerations.
- Lead project delivery by working on cross-cloud projects and spearheading Data Cloud design and delivery, collaborating with cross-functional teams from developers to executives.
- Design and guide the customer's enterprise data architecture aligned with their business goals. Emphasize data ethics and privacy by ensuring that customer solutions adhere to relevant regulations and best practices in data security and privacy.
- Lead Data Cloud architecture enablement for key domains and cross-cloud teams.
- Collaborate with analytics and AI teams to ensure data readiness for advanced analytics, reporting, and AI/ML initiatives.
- Engage with stakeholders across multiple Salesforce teams and projects to deliver aligned and trusted data solutions. Influence executive customer stakeholders while aligning technology strategy with business value and ROI. Build strong relationships with internal and external teams to contribute to broader goals and growth.
- Create and maintain high-quality architecture blueprints, design documents, standards, and technical guidelines.

Technical Skills:
- Over 15 years of experience in data architecture or consulting, with expertise in solution design and project delivery.
- Deep knowledge of MDM, data distribution, and data modelling concepts.
- Expertise in data modelling with a strong understanding of metadata and lineage.
- Experience in executing data strategies, landscape architecture assessments, and proofs of concept.
- Excellent communication, stakeholder management, and presentation skills.
- Strong technical writing and documentation abilities.
- A basic understanding of Hadoop/Spark fundamentals is an advantage.
- Understanding of data platforms such as Snowflake, Databricks, AWS, GCP, and MS Azure.
- Experience with tools like Salesforce Data Cloud or similar enterprise data platforms; hands-on Data Cloud experience is a strong plus.
- Working knowledge of enterprise data warehouse, data lake, and data hub concepts.
- Strong understanding of Salesforce products and functional domains such as Technology, Finance, Telco, Manufacturing, and Retail is beneficial.

Expected Qualifications:
- Salesforce Certified Data Cloud Consultant - highly preferred.
- Salesforce Data Architect - preferred.
- Salesforce Application Architect - preferred.
- AWS Spark/DL, Azure Databricks, Fabric, Google Cloud, Snowflake, or similar - preferred.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

At EY, you have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. Your unique voice and perspective are essential to help EY become even better. Join us to build an exceptional experience for yourself and contribute to creating a better working world for all.

As a Senior Aera Developer, you will be part of the Supply Chain Tech group of the EY GDS Consulting team. Your role involves translating business needs into technical specifications, performing data analysis and manipulation, and simplifying business concepts through data modeling. You will be responsible for developing reporting systems, writing and customizing code in various Aera modules, and evaluating and improving Aera Skills. Additionally, you will generate quality reports, develop data visualizations, and work with clients throughout the implementation lifecycle.

To succeed in this role, you must have experience as an Aera Skill Builder, expertise in BI reporting and data warehouse concepts, strong data modeling skills, and proficiency in Aera Skill Builder modules. You should be skilled in creating dynamic visualizations, configuring Aera Skills, applying security concepts, and handling report performance and administration. Aera Skill Builder and Aera Architect certification is required. Ideal candidates will have strong knowledge of Aera Skill Build concepts, expertise in data handling, experience in SQL tuning and optimization, and the ability to interact with customers to understand business requirements. Good communication skills, problem-solving abilities, and a proactive approach to learning new technologies are also important.

In this role, you will drive Aera Skill development tasks and have the opportunity to work with a market-leading, multi-disciplinary team. EY offers a supportive environment, coaching and feedback from engaging colleagues, opportunities for skills development and career progression, and the freedom to handle your role in a way that suits you. EY is committed to building a better working world by creating long-term value for clients, people, and society, and by fostering trust in the capital markets. Through the expertise of diverse teams worldwide, EY provides trust, assurance, and support for clients to grow, transform, and operate effectively across various industries.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Are you a skilled Data Architect with a passion for tackling intricate data challenges from various structured and unstructured sources? Do you excel in crafting micro data lakes and spearheading data strategies at an enterprise level? If this sounds like you, we are eager to learn more about your expertise.

In this role, you will be responsible for designing and constructing tailored micro data lakes specifically catered to the lending domain. Your tasks will include defining and executing enterprise data strategies encompassing modeling, lineage, and governance. You will play a crucial role in architecting robust data pipelines for both batch and real-time data ingestion, as well as devising strategies for extracting, transforming, and storing data from diverse sources like APIs, PDFs, logs, and databases. Furthermore, you will be instrumental in establishing best practices related to data quality, metadata management, and data lifecycle control. Your hands-on involvement in implementing processes, strategies, and tools will be pivotal in creating innovative products. Collaboration with engineering and product teams to align data architecture with overarching business objectives will be a key aspect of your role.

To excel in this position, you should bring over 10 years of experience in data architecture and engineering. A deep understanding of both structured and unstructured data ecosystems is essential, along with practical experience in ETL, ELT, stream processing, querying, and data modeling. Proficiency in tools and languages such as Spark, Kafka, Airflow, SQL, Amundsen, Glue Catalog, and Python is a must. Additionally, expertise in cloud-native data platforms like AWS, Azure, or GCP is highly desirable, along with a solid foundation in data governance, privacy, and compliance standards. Exposure to the lending domain, ML pipelines, or AI integrations is considered advantageous, and a background in fintech, lending, or regulatory data environments is also beneficial.

This role offers you the chance to lead data-first transformation, develop products that drive AI adoption, and the autonomy to design, build, and scale modern data architecture. You will be part of a forward-thinking, collaborative, and tech-driven culture with access to cutting-edge tools and technologies in the data ecosystem. If you are ready to shape the future of data with us, we encourage you to apply for this exciting opportunity based in Chennai. Join us in redefining data architecture and driving innovation in the realm of structured and unstructured data sources.
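For readers unfamiliar with the orchestration side of such a role, below is a minimal, illustrative sketch of the kind of batch ingestion pipeline this listing alludes to, written as an Airflow 2.x DAG; the API URL, lake path, and DAG name are hypothetical placeholders, not details taken from the posting.

```python
"""Minimal batch-ingestion sketch, assuming Airflow 2.x is installed."""
from datetime import datetime
import json
import urllib.request

from airflow import DAG
from airflow.operators.python import PythonOperator

SOURCE_URL = "https://example.com/api/loans"   # hypothetical API source
TARGET_PATH = "/data/lake/raw/loans"           # hypothetical lake landing area


def extract_and_land(ds: str, **_) -> None:
    """Pull one day's records from the API and land them as raw JSON."""
    with urllib.request.urlopen(f"{SOURCE_URL}?date={ds}") as resp:
        records = json.load(resp)
    with open(f"{TARGET_PATH}/{ds}.json", "w") as fh:
        json.dump(records, fh)


with DAG(
    dag_id="lending_api_batch_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    land_raw = PythonOperator(
        task_id="extract_and_land",
        python_callable=extract_and_land,
    )
```

A production pipeline of the kind described would normally land to object storage and add retries, alerting, and schema checks rather than writing local JSON; this sketch only shows the shape of the scheduled extract-and-land step.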

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Join us as a Data Engineer, VP with a leading MNC in the banking domain. You will be the voice of our customers, using data to tell their stories and put them at the heart of all decision-making. We will look to you to drive the build of effortless, digital-first customer experiences. If you are ready for a new challenge and want to make a far-reaching impact through your work, this could be the opportunity you are looking for.

As a Data Engineer, you will simplify our organization by developing innovative data-driven solutions through data pipelines, modeling, and ETL design, aspiring to be commercially successful while keeping our customers' and the bank's data safe and secure. Your role will involve driving customer value by understanding complex business problems and requirements and correctly applying the most appropriate and reusable tools to gather and build data solutions. You will support our strategic direction by engaging with the data engineering community to deliver opportunities and by carrying out complex data engineering tasks to build a scalable data architecture.

Your responsibilities will include building advanced automation of data engineering pipelines through the removal of manual stages; embedding new data techniques into the business through role modeling, training, and experiment design oversight; delivering a clear understanding of data platform costs to meet your department's cost-saving and income targets; sourcing new data using the most appropriate tooling for the situation; and developing solutions for streaming data ingestion and transformations in line with our streaming strategy.

To thrive in this role, you will need a strong understanding of data usage and dependencies and experience of extracting value and features from large-scale data, along with practical experience of programming languages and knowledge of data and software engineering fundamentals. You will also need experience of ETL technical design; data quality testing, cleansing, and monitoring; data sourcing, exploration, and analysis; and data warehousing and data modeling. A good understanding of modern code development practices, experience of working in a governed and regulatory environment, and strong communication skills with the ability to proactively engage and manage a wide range of stakeholders complete the profile.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Artificial Intelligence Specialist at Gen AI, you will be responsible for driving customer conversations, understanding customer requirements, creating Gen AI solution architectures, and developing customer proposals and RFP responses. You will guide solution engineers in creating Gen AI POCs and solutions for various industry verticals. Your role will involve staying updated on the latest technology developments and industry best practices and incorporating them into Gen AI applications. Additionally, you will design and deploy Proof of Concepts (POCs) and Points of View (POVs) across different industry verticals to showcase the potential of Generative AI applications.

To qualify for this role, you should have at least 8 years of experience in software development, with a minimum of 3 years of experience in Generative AI solution development. A bachelor's degree or higher in Computer Science, Software Engineering, or related fields is required. You should be adept at critical thinking and logical reasoning, have a strong ability to learn new industry domains quickly, and be a team player who can deliver under pressure. Furthermore, you should have experience with cloud technologies such as Azure, AWS, or GCP, as well as a good understanding of NVIDIA or similar technologies. A solid appreciation of AI/ML concepts and sound design principles is necessary for this role.

In terms of required skills, you should be extremely dynamic and enthusiastic about technology. Development experience with languages like C++, Java, JavaScript, HTML, C#, Python, or Node.js is preferred. You should be able to adapt quickly to new challenges and evolving technology stacks. Excellent written and verbal communication skills in English are essential, along with strong analytical and critical thinking abilities. A customer-focused attitude, initiative, a self-driven nature, and the ability to learn quickly are also important qualities for this role. Knowledge of Python, ML algorithms, statistics, source code maintenance and versioning tools, object-oriented programming concepts, and debugging and analytical skills is required.

Preferred skills for this position include at least 5 years of experience in ML development and MLOps. Strong programming skills in Python, knowledge of ML, data, and API libraries, and expertise in creating end-to-end data pipelines are advantageous. Experience with ML models, ModelOps/MLOps, AutoML, AI ethics, trust, and explainable AI, and with popular ML frameworks like SparkML, TensorFlow, scikit-learn, XGBoost, H2O, etc., is beneficial. Familiarity with working in cloud environments (AWS, Azure, GCP) or containerized environments (Mesos, Kubernetes), interest in understanding functional and industry business challenges, and knowledge of the IT industry and GenAI use cases in insurance processes are preferred. Expertise in Big Data and data modeling is also desirable for this role.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Data Engineer specializing in Snowflake, you will leverage your 10+ years of experience to design, build, optimize, and maintain robust and scalable data solutions on the Snowflake platform. Your expertise in cloud data warehouse principles will be utilized to collaborate with stakeholders in translating business requirements into efficient data pipelines and models. Passionate about unlocking data-driven insights, you will work with a team to drive business value through Snowflake's capabilities.

Key Skills:
- Proficient in Snowflake architecture and features such as virtual warehouses, storage, data sharing, and data governance.
- Advanced SQL knowledge for complex queries, stored procedures, and performance optimization in Snowflake.
- Experience in ETL/ELT development using Snowflake tools, third-party ETL tools, and scripting languages.
- Skilled in data modelling methodologies and performance tuning specific to Snowflake.
- Deep understanding of Snowflake security features and data governance frameworks.
- Proficient in scripting languages like Python for automation and integration.
- Familiarity with cloud platforms like Azure and with data analysis tools for visualization.
- Experience in version control using Git and working in Agile methodologies.

Responsibilities:
- Collaborate with the Data and ETL team to review, improve, and maintain data pipelines and models on Snowflake.
- Optimize SQL queries for data extraction, transformation, and loading within Snowflake.
- Ensure data quality, integrity, and security in the Snowflake environment.
- Participate in code reviews and contribute to development standards.

Education:
- Bachelor's degree in Computer Science, Data Science, Information Technology, or equivalent.
- Relevant Snowflake certifications (e.g., Snowflake Certified Pro / Architecture / Advanced) are a plus.

If you are a proactive Senior Data Engineer with a strong background in Snowflake, eager to drive business value through data-driven insights, this full-time opportunity in Pune awaits you. Join us at Arthur Grand Technologies Inc and be a part of our dynamic team.
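As an illustration of the ELT pattern this listing emphasizes, here is a minimal sketch using the snowflake-connector-python package; the account, warehouse, stage, and table names are hypothetical placeholders and not part of the posting.

```python
"""Minimal ELT sketch: land staged files, then upsert into a curated table."""
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",   # hypothetical warehouse
    database="ANALYTICS",       # hypothetical database
)

try:
    cur = conn.cursor()
    # Load raw Parquet files already landed in an external stage into staging.
    cur.execute("""
        COPY INTO ANALYTICS.STAGING.ORDERS_RAW
        FROM @ANALYTICS.STAGING.RAW_STAGE/orders/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    # Transform in-warehouse: upsert cleaned rows into the curated layer.
    cur.execute("""
        MERGE INTO ANALYTICS.CURATED.ORDERS AS tgt
        USING (SELECT * FROM ANALYTICS.STAGING.ORDERS_RAW
               WHERE ORDER_ID IS NOT NULL) AS src
        ON tgt.ORDER_ID = src.ORDER_ID
        WHEN MATCHED THEN UPDATE SET STATUS = src.STATUS
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS)
                              VALUES (src.ORDER_ID, src.STATUS)
    """)
finally:
    conn.close()
```

In practice such statements are usually wrapped in orchestrated, version-controlled tasks rather than run from a standalone script; the sketch only shows the COPY-then-MERGE shape of in-warehouse ELT.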

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Haryana

On-site

As a Data Engineer at Srijan, a Material company, you will play a crucial role in designing and developing scalable data pipelines within Microsoft Fabric. Your primary responsibilities will include optimizing data pipelines, collaborating with cross-functional teams, and ensuring documentation and knowledge sharing. You will work closely with the Data Architecture team to implement scalable and governed data architectures within OneLake and Microsoft Fabric's unified compute and storage platform. Your expertise in Microsoft Fabric will be utilized to build robust pipelines using both batch and real-time processing techniques, integrating with Azure Data Factory for seamless data movement.

Continuous monitoring, enhancement, and optimization of Fabric pipelines, notebooks, and lakehouse artifacts will be essential to ensure performance, reliability, and cost-efficiency. You will collaborate with analysts, BI developers, and data scientists to deliver high-quality datasets and enable self-service analytics via Power BI datasets connected to Fabric Lakehouses. Maintaining up-to-date documentation for all data pipelines, semantic models, and data products, as well as sharing knowledge of Fabric best practices with junior team members, will be an integral part of your role. Your expertise in SQL, data modeling, and cloud architecture design will be crucial in designing modern data platforms using Microsoft Fabric, OneLake, and Synapse.

To excel in this role, you should have at least 7+ years of experience in the Azure ecosystem, with relevant experience in Microsoft Fabric, Data Engineering, and Data Pipelines components. Proficiency in Azure Data Factory, advanced data engineering skills, and strong collaboration and communication abilities are also required. Additionally, knowledge of Azure Databricks, Power BI integration, DevOps practices, and familiarity with OneLake, Delta Lake, and Lakehouse architecture will be advantageous. Join our awesome tribe at Srijan and leverage your expertise in Microsoft Fabric to build scalable solutions integrated with Business Intelligence layers, Azure Synapse, and other Microsoft data services.
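To give a concrete flavour of the lakehouse work described here, below is a minimal PySpark sketch of a batch step as it might appear in a Fabric (or Synapse/Databricks) notebook where a Spark session is available; the file path and table name are hypothetical placeholders, not details from the posting.

```python
"""Minimal batch cleanse-and-publish sketch for a lakehouse notebook."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw files landed in the lakehouse's file area (placeholder path).
raw = (
    spark.read
    .option("header", "true")
    .csv("Files/raw/sales/*.csv")
)

# Light cleansing/conformance before publishing to the curated layer.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("order_id").isNotNull())
)

# Write as a Delta table so Power BI datasets can connect to it directly.
cleaned.write.format("delta").mode("overwrite").saveAsTable("sales_curated")
```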

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Applications Development Senior Programmer Analyst position is an intermediate-level role in which you will participate in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities.

You must possess expert-level knowledge and understanding of MS SQL Server and ETL (Extract, Transform, Load) processes. This includes developing data extraction, transformation, and loading processes using SQL Server, stored procedures, and SSIS. You should be well versed in data modeling for dimensional/reporting, data warehousing, and transactional use cases. Performance optimization on SQL Server, SQL tuning, and investigation of bottlenecks will be part of your responsibilities. Additionally, you should be able to translate architecture and low-level requirements into design and code using SQL, and to design logical and physical data models aligned with business use cases. Hands-on experience with SQL Server features like indexes, partitioning, bulk loads, DB configuration, AlwaysOn, and security/roles is required. Scaling and optimizing performance through schema design, query tuning, and index creation are key aspects of the role. Knowledge of reporting tools such as SSRS and Tableau, as well as hands-on experience with Python scripting or similar skills, would be considered a plus.

To be eligible for this position, you should have a minimum of 7 years of extensive experience in SQL Server and a deep understanding of SQL engine architecture and the Extraction-Transformation-Loading process (SSIS). Familiarity with SQL Management Studio, SQL Server Analysis Services, SQL Server Reporting Services, and Integration Services is required. A Bachelor's degree or equivalent experience is necessary for this role.

Please note that this job description offers a high-level overview of the responsibilities associated with the position; other job-related duties may be assigned as needed. This position falls under the Technology job family group, specifically within the Applications Development job family. It is a full-time role that requires dedication and expertise in the mentioned technology skills. For further details on complementary skills, or if you have any specific requirements, please refer to the information provided above or reach out to the recruiter for more information.
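For illustration of the index-tuning tasks mentioned above, here is a minimal sketch that runs T-SQL through pyodbc; the server, database, table, and index names are hypothetical placeholders, not details from the posting.

```python
"""Minimal SQL Server tuning sketch: create a covering index, then test a filter."""
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost;DATABASE=ReportingDW;Trusted_Connection=yes;"  # placeholders
)
conn.autocommit = True
cur = conn.cursor()

# Covering index to support a frequent reporting query pattern.
cur.execute("""
IF NOT EXISTS (SELECT 1 FROM sys.indexes WHERE name = 'IX_FactSales_DateCustomer')
CREATE NONCLUSTERED INDEX IX_FactSales_DateCustomer
    ON dbo.FactSales (OrderDateKey, CustomerKey)
    INCLUDE (SalesAmount);
""")

# Quick sanity check of the reporting filter the index is meant to serve.
cur.execute("SELECT COUNT(*) FROM dbo.FactSales WHERE OrderDateKey = ?", 20240101)
print(cur.fetchone()[0])
conn.close()
```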

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

At PwC, we focus on a variety of outsourced solutions and support clients across numerous functions. Our team of managed services professionals helps organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. We are skilled in project management, technology, and process optimization to deliver high-quality services to our clients. If you join our managed service management and strategy team at PwC, you will be involved in transitioning and running services, managing delivery teams, programmes, commercials, performance, and delivery risk. Your work will revolve around continuous improvement and optimizing the managed services process, tools, and services.

Position Requirements:

**Required Skills:**
- Minimum 2 years of relevant experience
- Experience in modelling BW data flows using ADSOs and Transformations
- Good knowledge of ECC extraction, data modelling, BW-ABAP, and BEx
- Hands-on ABAP experience in BI, writing ABAP code in DataSource enhancements, function module extractors, routines, and BEx exits
- Good knowledge of BEx and experience with the various options in the BEx Query Designer, such as CKFs, RKFs, and cell definitions
- Experience with the ECC table data model and analysis
- Experience with SAP HANA modelling using Calculation Views
- Experience with SAP HANA programming, including SQL, SQLScript, and CE Script
- Experience handling integration between multiple systems for reporting: SAP ECC, SAP BW, BOBJ integration
- Experience using mixed scenarios with SAP BW on HANA, such as Composite Providers and HANA models
- Experience with new-dimension reporting tools such as Webi, Dashboards 4.1, Analysis for Office, and BO Explorer
- Good to have: knowledge of SAC
- Good written and oral communication skills
- Must be a good team player

**Preferred Skills:**
- Experience with ETL using SAP BOBJ Data Services 4.0
- Good knowledge of working with different source system extractions
- Functional knowledge of, or familiarity with, the basic business processes in the following SAP functional areas: SAP FI/CO, SAP MM, SAP SD, SAP HR

**Job Summary:**
A career in our Managed Services team offers you an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Data, Testing & Analytics as a Service team brings a unique combination of industry expertise, technology, data management, and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while building your skills in exciting new directions. Join us to help design, build, and operate the next generation of software and services that manage interactions across all aspects of the value chain.

**Minimum Degree Required:** Bachelor's degree

**Preferred Field(s) Of Study:** Computer and Information Science, Management Information Systems

**Minimum Year(s) of Experience:** 1 year of experience

**Certification(s) Preferred:**
- More than 2 years of hands-on experience in SAP Datasphere / DWC, with at least 1 full life cycle project implementation
- Skills in developing and maintaining DWC models, CDS Views, SQL scripts, and SAC stories
- Experience in building complex models in SAP Datasphere/DWC
- Ability to design and build data flows and develop chains to load and monitor data loading
- Knowledge of setting up connections to and from Datasphere
- Unit testing dataflows and reconciling data to source systems
- Exposure to troubleshooting data issues and providing workarounds
- Proficiency in Datasphere security setup and currency conversion
- Writing CDS analytical queries and S/4HANA Embedded Analytics
- Performance tuning of models in Datasphere
- Knowledge of Datasphere and Data Lake integration
- Using the Database Explorer and SAP HANA Cockpit through Datasphere

**Nice To Have:**
- Good knowledge of either BW modeling or HANA modeling
- BW/4HANA and/or native HANA (or HANA Cloud) modeling, including SQL scripting, graphical view modelling, and SDA extraction

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

At Goldman Sachs, our Engineers are dedicated to making the impossible possible. We are committed to changing the world by bridging the gap between people and capital with innovative ideas. Our mission is to tackle the most complex engineering challenges for our clients: crafting massively scalable software and systems, designing low-latency infrastructure solutions, proactively safeguarding against cyber threats, and harnessing the power of machine learning in conjunction with financial engineering to transform data into actionable insights. Join our engineering teams to pioneer new businesses, revolutionize finance, and seize opportunities in the fast-paced world of global markets.

Engineering at Goldman Sachs, consisting of our Technology Division and global strategists groups, stands at the heart of our business. Our dynamic environment demands creative thinking and prompt, practical solutions. If you are eager to explore the limits of digital possibilities, your journey starts here. Goldman Sachs Engineers embody innovation and problem-solving skills, developing solutions in domains such as risk management, big data, and mobile technology. We seek imaginative collaborators who can adapt to change and thrive in a high-energy, global setting.

The Data Engineering group at Goldman Sachs plays a pivotal role across all aspects of our business. Focused on offering a platform, processes, and governance to ensure the availability of clean, organized, and impactful data, Data Engineering aims to scale, streamline, and empower our core businesses. As a Site Reliability Engineer (SRE) on the Data Engineering team, you will oversee observability, cost, and capacity, with operational responsibility for some of our largest data platforms. We are actively involved in the entire lifecycle of platforms, from design to decommissioning, employing an SRE strategy tailored to this lifecycle.

We are looking for individuals who have a development background and are proficient in code. Candidates should prioritize reliability, observability, capacity management, DevOps, and the SDLC (Software Development Lifecycle). As a self-driven leader, you should be comfortable tackling problems with varying degrees of complexity and translating them into data-driven outcomes. You should be actively engaged in strategy development, participate in team activities, conduct postmortems, and possess a problem-solving mindset.

Your responsibilities as a Site Reliability Engineer (SRE) will include driving the adoption of cloud technology for data processing and warehousing, formulating SRE strategies for major platforms like Lakehouse and Data Lake, collaborating with data consumers and producers to align reliability and cost objectives, and devising strategies with data using relevant technologies such as Snowflake, AWS, Grafana, PromQL, Python, Java, OpenTelemetry, and GitLab.

Basic qualifications for this role include a Bachelor's or Master's degree in a computational field, 1-4+ years of relevant work experience in a team-oriented environment, at least 1-2 years of hands-on developer experience, familiarity with DevOps and SRE principles, experience with cloud infrastructure (AWS, Azure, or GCP), a proven track record of driving data-oriented strategies, and a deep understanding of data multi-dimensionality, curation, and quality.

Preferred qualifications include familiarity with Data Lake / Lakehouse technologies, experience with cloud databases like Snowflake and BigQuery, an understanding of data modeling concepts, working knowledge of open-source tools such as AWS Lambda and Prometheus, and proficiency in coding with Java or Python. Strong analytical skills, excellent communication abilities, a commercial mindset, and a proactive approach to problem-solving are essential traits for success in this role.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

Haryana

On-site

You are a Technical Consultant with 10 years of experience in Oracle or Informatica; knowledge of Fund Accounting, Fund Reporting, and Derivatives would be beneficial for this role. Your responsibilities include developing architecture design principles, reviewing and shaping technical solutions, defining governing principles for platform solutions, and acting as the sign-off authority on technical solutions and artifacts. You will also need to make practical trade-offs in design, engage with third-party vendors, drive process improvements such as Continuous Integration tooling, and provide input on security design requirements and performance tuning.

Your essential skills should include a B.E./B.Tech. or M.C.A. in Computer Science with at least 10 years of industry experience; technical leadership in Informatica, Oracle, Unix scripting, Perl, and scheduling tools; strong knowledge of database design, data warehouse, data mart, enterprise reporting, and ODS concepts; and very strong Oracle PL/SQL and T-SQL experience. You should be able to guide juniors, propose solutions, estimate new projects, work with various teams to design solutions, and develop and implement standards and best practices for data management.

Your personal characteristics should include enthusiasm for agile development, a focus on doing things correctly, innovation in technology usage, excellent communication and presentation skills, the ability to work well in a matrix-based organization, eagerness to learn and apply new skills, the ability to interact with business users and prioritize activities, and the ability to work effectively in a team. Motivation, flexibility, commitment, discipline, and a desire to learn and grow are essential traits for this role.

If you possess the required skills and experience in the banking/financial industry, specifically in Oracle, Informatica, solution architecture, technical architecture, DWH, Fund Accounting, Fund Reporting, and Derivative concepts, and have an educational background of B.Sc/B.Com/M.Sc/MCA/B.E/B.Tech, you are encouraged to apply for this Technical Consultant position in Gurgaon.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As part of our GDS Consulting team, you will be part of the NCLC team delivering specifically for the Microsoft account. You will work on the latest Microsoft BI technologies and collaborate with other teams within Consulting services.

The opportunity
We're looking for resources with expertise in Microsoft BI, Power BI, Azure Data Factory, and Databricks to join our Data Insights team. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of our service offering.

Your Key Responsibilities
- Manage multiple client engagements.
- Understand and analyse business requirements by working with various stakeholders and create the appropriate information architecture, taxonomy, and solution approach.
- Work independently to gather requirements and perform cleansing, extraction, and loading of data.
- Translate business and analyst requirements into technical code.
- Create interactive and insightful dashboards and reports using Power BI, connecting to various data sources and implementing DAX calculations.
- Design and build complete ETL/Azure Data Factory processes moving and transforming data for ODS, staging, and data warehousing.
- Design and develop solutions in Databricks, Scala, Spark, and SQL to process and analyze large datasets, perform data transformations, and build data models.
- Design SQL schemas, database schemas, stored procedures, functions, and T-SQL queries.

Skills And Attributes For Success
- Collaborate with other members of the engagement team to plan the engagement and develop work program timelines, risk assessments, and other documents/templates.
- Able to manage senior stakeholders.
- Experience in leading teams to execute high-quality deliverables within stipulated timelines.
- Skills in Power BI, Azure Data Factory, Databricks, Azure Synapse, data modelling, DAX, Power Query, and Microsoft Fabric.
- Strong proficiency in Power BI, including data modelling, DAX, and creating interactive visualizations.
- Solid experience with Azure Databricks, including working with Spark, PySpark (or Scala), and optimizing big data processing.
- Good understanding of various Azure services relevant to data engineering, such as Azure Blob Storage, ADLS Gen2, and Azure SQL Database/Synapse Analytics.
- Strong SQL skills and experience with one of the following: Oracle, SQL, Azure SQL.
- Good to have experience in SSAS or Azure SSAS and Agile project management.
- Basic knowledge of Azure Machine Learning services.
- Excellent written and verbal communication skills and the ability to deliver technical demonstrations.
- Quick learner with a can-do attitude.
- Demonstrating and applying strong project management skills, inspiring teamwork and responsibility with engagement team members.

To qualify for the role, you must have
- A bachelor's or master's degree.
- A minimum of 4-7 years of experience, preferably with a background in a professional services firm.
- Excellent communication skills; consulting experience preferred.

Ideally, you'll also have
- Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

The Lead Data Modeler role involves developing high-performance, scalable enterprise data models on a cloud platform. You must possess strong SQL skills along with excellent data modeling expertise, and be well versed in the Kimball methodology. Your responsibilities include participating in various activities throughout the systems development lifecycle, supporting activities, engaging in POCs, and presenting outcomes effectively. Additionally, you will be responsible for analyzing, architecting, designing, programming, and debugging both existing and new products, as well as mentoring team members. It is crucial to take ownership and demonstrate high professional and technical ethics, with a consistent focus on emerging technologies beneficial to the organization. You should have over 10 years of work experience in data modeling or engineering.

Your duties will involve defining, designing, and implementing enterprise data models, building Kimball-compliant data models in the analytic layer of the data warehouse, and constructing 3rd-normal-form-compliant data models in the hub layer of the data warehouse. You must translate tactical/strategic requirements into effective solutions that align with business needs. The role also requires participation in complex initiatives, seeking help when necessary, reviewing specifications, coaching team members, and researching improvements to coding standards.

Technical skills include hands-on experience in SQL, query optimization, RDBMS, data warehousing (ER and dimensional modeling), modeling data into star schemas using the Kimball methodology, Agile methodology, CI/CD frameworks, DevOps practices, and working in an onsite-offshore model. Soft skills such as leadership, analytical thinking, problem-solving, communication, and presentation skills are essential. You should be able to work with a diverse team, make decisions, guide team members through complex problems, and communicate effectively with leadership and business teams.

A Bachelor's degree in Computer Science, Information Systems, or a related technical area is required, preferably a B.E. in Computer Science or Information Technology. Nice-to-have skills include experience with Apache Spark (Python), graph databases, data identification, ingestion, transformation, and consumption, data visualization, familiarity with SAP Enterprise S/4HANA, programming language skills (Python, Node.js, Unix scripting), and experience with the GCP cloud ecosystem. Experience in software engineering across all deliverables, including defining, architecting, building, testing, and deploying, is preferred. The Lead Data Modeler role does not offer relocation assistance and does not specify a particular work shift.
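As a concrete illustration of the Kimball-style dimensional modeling this role centres on, the following minimal sketch builds a tiny star schema in SQLite purely for demonstration; in practice the same DDL pattern would target the cloud warehouse, and all table and column names are hypothetical.

```python
"""Minimal star-schema sketch: two dimensions, one fact, and a star-join query."""
import sqlite3

ddl = """
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240101
    full_date    TEXT NOT NULL,
    fiscal_month TEXT NOT NULL
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  TEXT NOT NULL,         -- natural/business key
    segment      TEXT
);
-- One row per order line, foreign-keyed to each dimension.
CREATE TABLE fact_sales (
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    quantity     INTEGER NOT NULL,
    sales_amount REAL    NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
conn.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                 [(20240101, "2024-01-01", "FY24-M10")])
conn.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                 [(1, "C-001", "Enterprise")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(20240101, 1, 3, 299.97)])

# Typical star-join query: measures from the fact, labels from the dimensions.
query = """
SELECT d.fiscal_month, c.segment, SUM(f.sales_amount) AS revenue
FROM fact_sales f
JOIN dim_date d     ON d.date_key = f.date_key
JOIN dim_customer c ON c.customer_key = f.customer_key
GROUP BY d.fiscal_month, c.segment;
"""
print(conn.execute(query).fetchall())
```

The design choice the Kimball approach makes here is to keep the fact table narrow (keys plus additive measures) and push descriptive attributes into conformed dimensions that can be reused across facts.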

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The company is seeking highly motivated and passionate individuals to join the team, with tremendous growth potential. If you are a motivated individual with a minimum of 2 years of experience in database systems design, this opportunity might be for you. Your primary responsibilities will include preparing designs for database systems, recommending improvements for performance, developing physical and logical data models, designing ETL processes, creating data models, and performing tests on data.

To excel in this role, you should have experience in data modeling with enterprise applications and a good understanding of user requirements, relational databases, JSON data models, data warehouses, star schemas, and basic fact table and dimension table techniques. Additionally, hands-on skills in ETL processes, effective troubleshooting, and handling high-volume data loading processes are essential. If you possess good analytical, planning, and implementation skills along with expertise in MS SQL Server (SSIS, SSMS, SSAS, SSRS), we encourage you to apply for this position.

Please submit your updated resume in MS Word format to hr.snss@southnests.com. All personal information collected from unsuccessful applicants will be retained for one year for potential future opportunities and then removed. If you wish to withdraw your consent before the specified timeframe, please contact hr.snss@southnests.com.

Location: India, Chennai.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an Operations Analyst at Lam Research, you will have the opportunity to play a crucial role in enhancing the operational effectiveness and efficiency of the Global Operations team. Your primary responsibility will be to utilize analytical methodologies to guide decision-makers towards achieving operational excellence. Your contributions will be instrumental in driving improvements and optimizing processes within the organization.

Your main responsibilities will include developing, automating, and maintaining comprehensive reports and dashboards using tools such as Excel and Power BI. You will analyze datasets to provide valuable insights and create visualizations that effectively communicate data stories. It will be essential to ensure compliance with analytical standards and data governance policies to uphold data integrity and accuracy. Additionally, you will be expected to challenge stakeholders to prioritize long-term, data-driven decisions over quick fixes and to identify process gaps, offering data-driven recommendations to leadership. You will also play a key role in facilitating change management for data and process changes, ensuring smooth implementation and seamless rollout. Monitoring and publishing operational performance against established metrics and targets will be crucial to track progress and make informed decisions.

The ideal candidate for this role will hold a Bachelor's degree in business administration, operations management, supply chain, project management, finance, engineering, or a related field. You should have a minimum of 5+ years of experience in operations, with a focus on extracting and analyzing operational data to derive meaningful insights. Proficiency in data analysis tools and software, particularly Excel and Power BI, is required. Strong self-learning ability, excellent written and verbal communication skills, effective task management, and innovative problem-solving skills are essential qualities for this position. Preferred qualifications include experience with Alteryx for data preparation, modelling, and advanced analytics, as well as expertise in analyzing and optimizing complex operational processes. Demonstrated experience in process mapping, workflow analysis, root cause analysis, and corrective action planning will be advantageous.

At Lam Research, we are committed to creating an inclusive environment where every individual is valued, included, and empowered to achieve their full potential. Our work location models offer flexibility based on role requirements, with hybrid roles combining on-site collaboration with remote work options to support a balanced approach to work-life integration.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Faridabad, Haryana

On-site

As a Data Governance Specialist, you will be responsible for developing and executing data governance strategies and roadmaps to ensure the integrity, accuracy, and efficiency of master data across the organization. This includes leading the implementation and enhancement of SAP MDG solutions, covering data modeling, data stewardship, and workflow management. Designing and enforcing data governance policies, procedures, and standards to maintain data quality and consistency will be a crucial part of your role.

You will collaborate closely with cross-functional teams, including business stakeholders, IT teams, and external vendors, to gather requirements, design solutions, and ensure successful project delivery. Managing the integration of SAP MDG with other SAP modules and third-party applications is essential to ensure seamless data flow and consistency. Additionally, you will implement and manage data quality checks, validations, and cleansing processes to uphold high standards of data accuracy and reliability.

Facilitating change management processes, including training and support for end users, is a key aspect of this role to ensure effective adoption of MDG solutions. You will also be responsible for identifying opportunities for process improvements and automation in master data management practices and recommending enhancements to existing systems and processes. Providing advanced support for troubleshooting and resolving complex issues related to SAP MDG and master data management will be part of your responsibilities to ensure smooth operations and functionality.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are a skilled Data Engineer with a strong background in Python, Snowflake, and AWS. Your primary responsibility will involve constructing and refining scalable data pipelines, integrating various data sources, and supporting analytics and business intelligence solutions within a cloud-based setting. An essential aspect of your role will entail the design and supervision of AWS Glue Jobs to facilitate efficient, serverless ETL workflows.

Your key duties will revolve around designing and executing robust data pipelines using AWS Glue, Lambda, and Python. You will work extensively with Snowflake for data warehousing, modeling, and analytics assistance. Managing ETL/ELT jobs using AWS Glue to ensure consistent data reliability will be a crucial part of your responsibilities. Furthermore, you will be tasked with migrating data between CRM systems, particularly from Snowflake to Salesforce, adhering to defined business protocols and ensuring data precision. It will also be your responsibility to optimize SQL/SOQL queries, manage large data volumes, and sustain high performance levels. Additionally, implementing data normalization and quality checks will be essential to guarantee accurate, consistent, and deduplicated records.

Your required skills include strong proficiency in Python, hands-on experience with the Snowflake data warehouse, and familiarity with AWS services such as Glue, S3, Lambda, Redshift, and CloudWatch. You should have experience in ETL/ELT pipelines and data integration using AWS Glue Jobs, along with expertise in SQL and SOQL for data extraction and transformation. Moreover, an understanding of data modeling, normalization, and performance optimization is essential for this role. It would be advantageous if you have experience with Salesforce Data Loader, ETL mapping, and metadata-driven migration, as well as exposure to CI/CD tools, DevOps, and version control systems like Git. Previous work experience in Agile/Scrum environments will also be beneficial for this position.
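To illustrate the kind of Glue job this listing describes, here is a minimal sketch using the awsglue libraries available inside the Glue runtime; the catalog database, table, and S3 bucket names are hypothetical placeholders, not details from the posting.

```python
"""Minimal AWS Glue ETL job sketch: catalog read, cleanse, Parquet write."""
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the source table registered in the Glue Data Catalog (placeholder names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="crm_raw", table_name="accounts"
)

# Normalise and deduplicate before the downstream warehouse/CRM load.
df = (
    source.toDF()
    .withColumn("email", F.lower(F.trim("email")))
    .dropDuplicates(["email"])
)

# Land the cleaned data as Parquet for the next stage of the migration.
df.write.mode("overwrite").parquet("s3://example-bucket/clean/accounts/")

job.commit()
```

A real job would typically add job bookmarks, data-quality assertions, and CloudWatch metrics; the sketch only shows the read-cleanse-write skeleton of a serverless Glue ETL step.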

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

bhubaneswar

On-site

The Apache Superset Data Engineer plays a key role in designing, developing, and maintaining scalable data pipelines and analytics infrastructure, with a primary emphasis on data visualization and dashboarding using Apache Superset. This role sits at the intersection of data engineering and business intelligence, enabling stakeholders to access accurate, actionable insights through intuitive dashboards and reports.

Core Responsibilities:
- Create, customize, and maintain interactive dashboards in Apache Superset to support KPIs, experimentation, and business insights.
- Work closely with analysts, BI teams, and business users to gather requirements and deliver effective Superset-based visualizations.
- Perform data validation, feature engineering, and exploratory data analysis to ensure data accuracy and integrity.
- Analyze A/B test results and deliver insights that inform business strategies.
- Establish and maintain standards for statistical testing, data validation, and analytical workflows.
- Integrate Superset with various database systems (e.g., MySQL, PostgreSQL) and manage the associated drivers and connections (see the connection sketch after this list).
- Ensure Superset deployments are secure, scalable, and high-performing.
- Clearly communicate findings and recommendations to both technical and non-technical stakeholders.

Required Skills:
- Proven expertise in building dashboards and visualizations using Apache Superset.
- Strong command of SQL and experience working with relational databases like MySQL or PostgreSQL.
- Proficiency in Python (or Java) for data manipulation and workflow automation.
- Solid understanding of data modeling, ETL/ELT pipelines, and data warehousing principles.
- Excellent problem-solving skills and a keen eye for data quality and detail.
- Strong communication skills, with the ability to simplify complex technical concepts for non-technical audiences.
- Nice to have: familiarity with cloud platforms (AWS, ECS).

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of relevant experience.
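Superset connects to backing databases through SQLAlchemy URIs, so a quick way to vet a new connection before registering it in the Superset UI is to exercise the same URI from Python. This is a minimal sketch assuming a PostgreSQL target with the psycopg2 driver installed; the host, credentials, and database name are placeholders.

```python
from sqlalchemy import create_engine, text

# The same SQLAlchemy URI format that Superset expects for a PostgreSQL database
# (all connection details below are placeholders).
uri = "postgresql+psycopg2://superset_ro:secret@db.example.internal:5432/analytics"

engine = create_engine(uri, pool_pre_ping=True)

# Smoke-test the connection and driver before adding the database in Superset.
with engine.connect() as conn:
    result = conn.execute(text("SELECT version()"))
    print("Connected:", result.scalar())
```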

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 12 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office

Process Mining - Data Engineering Consulting Practitioner

Find endless opportunities to solve our clients' toughest challenges, as you work with exceptional people, the latest tech and leading companies across industries.

Practice: Operations & Process Transformation | Function: Supply Chain and Operations | Business Unit: Strategy & Consulting, Global Network | Areas of Work: Process Mining | Level: Associate / Analyst / Specialist | Location: Gurugram, Mumbai, Pune, Bengaluru, Chennai, Hyderabad, Kolkata | Overall Relevant Exp: 4-10 years+

Explore an Exciting Career at Accenture

Are you an outcome-oriented problem solver? Do you enjoy working on transformation strategies for global clients? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you to explore limitless possibilities. As a part of our practice, you will help organizations reimagine and transform their supply chains for tomorrow, with a positive impact on the business, society and the planet. Together, let's innovate, build competitive advantage, and improve business and societal outcomes in an ever-changing, ever-challenging world.

Help us make supply chains work better, faster, and more resilient, with the following initiatives:
- Be the process architect who leads process discovery and whiteboarding sessions with business stakeholders.
- Deliver process discovery or improvement projects using process mining tools.
- Work on process mining market leaders such as Celonis, Signavio, Power Automate Process Mining, and so on.
- Develop business requirements for the implementation of technology solutions for the client.
- Demonstrate in-depth knowledge of industry trends, the SAP transformation journey, new technologies, and tools.
- Aid in asset, accelerator, and use case creation and enhancement.
- Contribute to business development initiatives and demonstrate the ability to solve complex business problems.

Bring your best skills forward to excel in the role:
- Strong analytical skills to reach clear-cut, methodical solutions.
- Ability to solve complex business problems and deliver client delight.
- Excellent communication, interpersonal and presentation skills.
- Cross-cultural competence with an ability to thrive in a dynamic environment.
- Strong team-management skills.

Your experience counts!
- MBA from a Tier 1 B-school.
- 4+ years of experience with an understanding of process mining.
- Hands-on experience identifying value opportunities using a process mining tool such as Celonis or Signavio.
- Certified expertise as a functional value architect for process discovery and mining tools such as Celonis, Signavio, or Power Automate Process Mining.
- Conceptual understanding of as-is processes in supply chain and the ability to design to-be processes.
- Good understanding or experience of process mining in SAP transformations, or experience supporting mining, process design, or journey definition initiatives in SAP projects.
- Experience with automation solutions is a plus.
- Knowledge of data collection approaches, data cleansing, data modelling, process discovery, process analysis and insights (see the event-log sketch after this list).
- Strong communication skills, especially around breaking down complex structures into digestible and relevant points for a diverse set of clients and colleagues at all levels.
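To make the data side of process mining concrete, here is a small, tool-agnostic sketch in Python that computes case throughput times and the most frequent activity variants from a raw event log. The file and column names are hypothetical; in practice a tool such as Celonis or Signavio would sit on top of data prepared in roughly this shape.

```python
import pandas as pd

# Hypothetical event log: one row per activity execution.
events = pd.read_csv(
    "purchase_to_pay_events.csv",
    parse_dates=["timestamp"],
    usecols=["case_id", "activity", "timestamp"],
)

events = events.sort_values(["case_id", "timestamp"])

# Throughput time per case: first to last event.
durations = events.groupby("case_id")["timestamp"].agg(["min", "max"])
durations["throughput_days"] = (durations["max"] - durations["min"]).dt.total_seconds() / 86400
print("Median throughput (days):", durations["throughput_days"].median().round(2))

# Variant analysis: the ordered sequence of activities per case.
variants = (
    events.groupby("case_id")["activity"]
    .apply(lambda acts: " -> ".join(acts))
    .value_counts()
)
print("Top 5 process variants:")
print(variants.head(5))
```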

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

vadodara, gujarat

On-site

The purpose of this role is to define and develop the enterprise data structure, data warehouse, master data, integration, and transaction processing landscape while maintaining and strengthening modeling standards and business information.

You will define and develop data architecture that supports the organization and its clients in new and existing deals. This includes partnering with business leadership to provide strategic recommendations, assessing data benefits and risks, creating the data strategy and roadmaps, engaging stakeholders for data governance, ensuring data storage and database technologies are supported, monitoring compliance with data modeling standards, overseeing frameworks for data management, and collaborating with vendors and clients to maximize the value of information.

You will also build enterprise technology environments for data architecture management. This involves developing, maintaining, and implementing standard patterns for data layers, data stores, and the data hub and lake; evaluating implemented systems; collecting and integrating data; creating data models; applying security best practices; and demonstrating strong experience in database architectures and design patterns.

You will enable delivery teams by providing optimal delivery solutions and frameworks: building relationships with delivery and practice leadership, defining database structures and specifications, establishing relevant metrics, monitoring system capabilities and performance, integrating new solutions, managing projects, identifying risks, ensuring quality assurance, recommending tools for reuse and automation, and supporting the integration team for better efficiency. You will also ensure optimal client engagement by supporting pre-sales teams, negotiating and coordinating with client teams, demonstrating thought leadership, and acting as a trusted advisor.

Join Wipro to reinvent your world and be a part of an end-to-end digital transformation partner with bold ambitions. Realize your ambitions in a business powered by purpose and empowered to design your own reinvention. Applications from people with disabilities are explicitly welcome.
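The database design work described above often comes down to dimensional modelling. Purely as a rough illustration (the tables and columns are invented, not drawn from any engagement), here is a minimal star schema expressed with SQLAlchemy Core, which keeps the model definition portable across the database technologies such a role has to support.

```python
from sqlalchemy import (
    Column, Date, ForeignKey, Integer, MetaData, Numeric, String, Table, create_engine,
)

metadata = MetaData()

# Dimension: one row per customer, with descriptive attributes.
dim_customer = Table(
    "dim_customer", metadata,
    Column("customer_key", Integer, primary_key=True),
    Column("customer_id", String(32), nullable=False),
    Column("segment", String(64)),
    Column("country", String(64)),
)

# Dimension: calendar date.
dim_date = Table(
    "dim_date", metadata,
    Column("date_key", Integer, primary_key=True),
    Column("calendar_date", Date, nullable=False),
)

# Fact: one row per order line, foreign-keyed to the dimensions.
fact_sales = Table(
    "fact_sales", metadata,
    Column("sales_key", Integer, primary_key=True),
    Column("customer_key", Integer, ForeignKey("dim_customer.customer_key"), nullable=False),
    Column("date_key", Integer, ForeignKey("dim_date.date_key"), nullable=False),
    Column("quantity", Integer, nullable=False),
    Column("net_amount", Numeric(12, 2), nullable=False),
)

# Emit the DDL against an in-memory SQLite engine just to inspect the generated schema.
engine = create_engine("sqlite:///:memory:")
metadata.create_all(engine)
print(sorted(metadata.tables))
```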

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

Embark on your transformative journey as a Solution Design Business Analyst - Vice President. You will be responsible for driving key strategic change initiatives for regulatory deliverables across Risk, Finance, and Treasury. To excel in this role, you should have at least 10 years of experience in business/data analysis, enabling you to present complex data issues in a simple and engaging manner. Your expertise should extend to front-to-back system design, solving complex business problems, data gathering, data cleansing, and data validation. You will be expected to analyze large volumes of data, identify patterns, address data quality issues, conduct metrics analysis, and translate your analysis into valuable insights.

Additionally, you will play a crucial role in capturing business requirements and translating them into technical data requirements. Collaboration with stakeholders to ensure proposed solutions meet their needs and expectations is a key aspect of this role. You will also be involved in creating operational and process designs to ensure the successful delivery of proposed solutions within the agreed scope, as well as supporting change management activities.

Experience within the financial services industry, particularly in the banking sector within a Risk/Finance/Treasury role, will be highly valued. Proficiency in data analysis tools such as SQL, Hypercube, and Python, and data visualization/reporting tools like Tableau, Qlikview, Power BI, and Advanced Excel will be beneficial. Familiarity with data modeling and data architecture is also desirable.

The primary purpose of this role is to support the organization in achieving its strategic objectives by identifying business requirements and proposing solutions to address business problems and opportunities. Key accountabilities include identifying and analyzing business problems and client requirements that necessitate change within the organization, developing business requirements to address these challenges, collaborating with stakeholders to ensure proposed solutions align with their needs, creating business cases that justify investment in solutions, conducting feasibility studies to assess the viability of proposed solutions, reporting on project progress to ensure timely and budget-compliant delivery, and supporting change management activities.

As a Vice President, you are expected to contribute to strategic planning, resource allocation, policy management, continuous improvement initiatives, and policy enforcement. Your leadership responsibilities may involve demonstrating a set of leadership behaviors focused on creating an environment in which colleagues can excel. For individual contributors, being a subject matter expert within your discipline, guiding technical direction, leading collaborative assignments, and coaching team members are essential, as is providing guidance on functional and cross-functional areas of impact and alignment, risk management, and organizational strategies. Demonstrating a comprehensive understanding of the organization's functions, collaborating with various work areas, creating solutions based on analytical thought, building trusting relationships with stakeholders, and upholding Barclays Values and Mindset are crucial aspects of this role.
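The data gathering and quality analysis mentioned above is typically done in SQL or Python. Purely as an illustrative sketch (the dataset, columns, and date range are invented, not specifics of any bank's data), this is the kind of quick profiling pass an analyst might run before deeper metrics work:

```python
import pandas as pd

# Hypothetical regulatory exposure extract.
df = pd.read_csv("exposures_extract.csv", parse_dates=["reporting_date"])

# Completeness: share of missing values per column.
completeness = df.isna().mean().sort_values(ascending=False)
print("Columns with missing data:")
print(completeness[completeness > 0])

# Consistency: exposures should never be negative.
negative = df[df["exposure_amount"] < 0]
print(f"{len(negative)} records with negative exposure amounts")

# Timeliness: all records should fall within the expected reporting period.
in_period = df["reporting_date"].between("2024-01-01", "2024-03-31")
print(f"{(~in_period).sum()} records outside the expected reporting period")

# Simple metric for trend analysis: total exposure per counterparty segment.
summary = df.groupby("counterparty_segment")["exposure_amount"].agg(["count", "sum", "mean"])
print(summary.sort_values("sum", ascending=False).head(10))
```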

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies