0 years
3 - 6 Lacs
Bengaluru
On-site
JOB DESCRIPTION
KPMG in India, a professional services firm, is the Indian member firm affiliated with KPMG International and was established in September 1993. Our professionals leverage the global network of firms, providing detailed knowledge of local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, and Vadodara. KPMG in India offers services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused, and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.
KPMG Advisory professionals provide advice and assistance to enable companies, intermediaries, and public sector bodies to mitigate risk, improve performance, and create value. KPMG firms provide a wide range of Risk Advisory and Financial Advisory Services that can help clients respond to immediate needs as well as put in place strategies for the longer term. Projects in IT Advisory focus on the assessment and/or evaluation of IT systems and the mitigation of IT-related business risks. They are IS audit, SOX review, internal audit, IT infrastructure review, and/or risk advisory engagements, including but not limited to IT audit support.
QUALIFICATIONS
• IT Audit + SAP experience with knowledge of IT governance practices
• Prior IT audit knowledge in the areas of ITGC, ITAC (application/automated controls), SOX 404, and SOC-1 and SOC-2 audits
• Good to have: knowledge of other IT regulations, standards, and benchmarks used by the IT industry (e.g. NIST, PCI-DSS, ITIL, OWASP, SOX, COBIT, SSAE18/ISAE 3402)
• Technical knowledge of IT audit tools with excellent knowledge of IT audit process and methodology
• Exposure to risk management and governance frameworks/systems will be an added advantage
• Exposure to ERP systems will be an added advantage
• Strong project management, communication (written and verbal), and presentation skills
• Knowledge of security measures and auditing practices within various applications, operating systems, and databases
• Strong self-directed work habits, exhibiting initiative, drive, creativity, maturity, self-assurance, and professionalism
• Preferred certifications: CISA/CISSP/CISM
• Exposure to automation and data analytics tools such as QlikView/Qlik Sense, ACL, and Power BI will be an advantage
• Proficiency with Microsoft Word, Excel, Visio, and other MS Office tools
Equal employment opportunity information: KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability, or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
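To make the ITGC analytics concrete, here is a minimal, hypothetical sketch of one common user-access test: flagging accounts that stay active after an employee's termination. The data, column names, and logic are invented for illustration and are not drawn from the posting.

```python
# Hypothetical ITGC user-access review test: flag accounts that remain
# active after the employee's termination date. All data is invented.
import pandas as pd

hr = pd.DataFrame({
    "employee_id": ["E1", "E2", "E3"],
    "termination_date": [pd.NaT, pd.Timestamp("2024-03-31"), pd.NaT],
})
access = pd.DataFrame({
    "employee_id": ["E1", "E2", "E3"],
    "account_status": ["ACTIVE", "ACTIVE", "DISABLED"],
})

review = access.merge(hr, on="employee_id", how="left")
exceptions = review[
    (review["account_status"] == "ACTIVE") & review["termination_date"].notna()
]
print(exceptions)  # E2: terminated but still holds an active account
```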
Posted 1 month ago
5.0 years
0 Lacs
Chennai
On-site
At Ford Motor Credit Company, we are going to support indirect lending for Ford Credit Bank through existing lending platforms and integrate new Ford Credit Bank data into the Enterprise Data Warehouse (GCP BigQuery) for data insights and analytics. This role is for an ETL/Data Engineer who can integrate Ford Credit Bank data from existing North America lending platforms into the Enterprise Data Warehouse (GCP BigQuery) to enable critical regulatory reporting, operational analytics, and risk analytics. You will be responsible for deep-dive analysis of current-state Receivables and Originations data in the data warehouse, as well as impact analysis related to Ford Credit Bank, and for providing solutions for implementation. You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications, integrating it into analytical domains, and building data marts and products in GCP. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform, Mainframe, and IBM DataStage is expected. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right data warehouse solutions.
Successfully designed and implemented data warehouses and ETL processes for over 5 years, delivering high-quality data solutions. Exposure to the Fiserv banking solution is desired.
8+ years of complex BigQuery SQL development experience, plus Mainframe (JCL, COBOL), gszutil, and DataStage job development.
Experienced with Mainframe, DataStage, and Autosys.
Experienced with Mainframe file formats, COBOL copybooks, ORC formats, JCL scripts, and related technologies to manage legacy data ingestion.
Design, develop, and maintain ETL processes using IBM DataStage to extract, transform, and load data from Mainframe systems and other sources such as SQL, Oracle, Postgres, AS400, and mainframe DB2 into the data warehouse.
Develop and modify batch scripts and workflows using Autosys to schedule and automate ETL jobs.
Experienced cloud engineer with 5+ years of GCP expertise, specializing in managing cloud infrastructure and applications into production-scale solutions: Cloud Build and App Engine, alongside storage services including Cloud Storage, and DevOps tools such as Tekton, GitHub, Terraform, and Docker.
Expert in designing, optimizing, and troubleshooting complex data pipelines.
Experience developing with microservice architecture on a container orchestration framework.
Experience in designing pipelines and architectures for data processing.
Passion and self-motivation to develop/experiment/implement state-of-the-art data engineering methods and techniques.
Self-directed; works independently with minimal supervision and adapts to ambiguous environments.
Evidence of a proactive problem-solving mindset and willingness to take the initiative.
Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas to cross-functional teams and all levels of management.
Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity.
Desired: Professional certification in GCP (e.g., Professional Data Engineer). Master’s degree in computer science, software engineering, information systems, data engineering, or a related field. Data engineering or development experience gained in a regulated financial environment. Experience with Teradata to GCP migrations is a plus.
Develop and modify existing data pipelines on Mainframe (JCL, COBOL), IBM DataStage, and BigQuery to integrate Ford Credit Bank data into the Enterprise Data Warehouse (GCP BigQuery) and support production deployment. Use APIs for data processing as required.
Implement the architecture provided by the data architecture team. You will use Fiserv bank features and mainframe data sets to enable the bank's data strategy. Be proactive and implement design plans. You will use DB2 for performing bank integrations.
Prepare test plans and execute them within EDW/Data Factory (end-to-end, from ingestion to integration to marts) to support use cases.
Design and build production data engineering solutions to deliver reusable patterns using Mainframe JCL, DataStage, and Autosys.
Design and build production data engineering solutions to deliver reusable patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Dataform, Astronomer, Data Fusion, Dataproc, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Artifact Registry, GCP APIs, Cloud Build, and App Engine, plus real-time data streaming platforms like Apache Kafka, GCP Pub/Sub, and Qlik Replicate (a hedged ingestion sketch follows below).
Collaborate with stakeholders and cross-functional teams to gather and define data requirements, ensuring alignment with business objectives.
Design and implement batch, real-time streaming, scalable, and fault-tolerant solutions for data ingestion, processing, and storage.
Perform necessary data mapping, impact analysis for changes, root cause analysis, and data lineage activities, and document information flows.
Develop and maintain documentation for data engineering processes, standards, and best practices, ensuring knowledge transfer and ease of system maintenance.
Implement an enterprise data governance model and actively promote the concepts of data protection, sharing, reuse, quality, and standards to ensure the integrity and confidentiality of data.
Work in an agile product team to deliver code frequently using Test Driven Development (TDD), continuous integration, and continuous deployment (CI/CD).
Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
Continuously enhance your FMCC domain knowledge, stay current on the latest data engineering practices, and contribute to the company's technical direction while maintaining a customer-centric approach.
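As an illustration of the batch ingestion described above, here is a hedged sketch using the google-cloud-bigquery client to load an ORC extract (such as one produced from mainframe data sets) from Cloud Storage into a staging table. The project, bucket, and table names are assumptions for the example, not actual Ford Credit resources.

```python
# Hypothetical batch load of an ORC extract into a BigQuery staging table.
# Project, bucket, and table names are invented for illustration.
from google.cloud import bigquery

client = bigquery.Client(project="my-edw-project")  # assumed project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.ORC,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(
    "gs://my-landing-bucket/receivables/2024-06-30/*.orc",  # assumed path
    "my-edw-project.staging.receivables",                   # assumed table
    job_config=job_config,
)
load_job.result()  # wait for completion; raises on failure

table = client.get_table("my-edw-project.staging.receivables")
print(f"Loaded {table.num_rows} rows into staging")
```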
Posted 1 month ago
0 years
0 Lacs
Tiruchchirāppalli
On-site
Job Description
Business Advisors shape the vision and strategy with the client, understand the needs of the users/stakeholders, carry out an elicitation of processes, data, and capabilities, and derive the target processes and the business requirements for the current and future solution.
Job Description - Grade Specific
Performs analysis of processes, systems, data, and business information and research, and builds up domain knowledge.
Skills (competencies): Abstract Thinking, Active Listening, Agile (Software Development Framework), Analytical Thinking, Backlog Grooming, Business Architecture Modeling, Business Process Modeling (e.g. BPMN), Change Management, Coaching, Collaboration, Commercial Acumen, Conceptual Data Modeling, Conflict Management, Confluence, Critical Thinking, CxO Conversations, Data Analysis, Data Requirements Management, Decision-Making, Emotional Intelligence, Enterprise Architecture Modelling, Facilitation, Functional IT Architecture Modelling, Giving Feedback, Google Cloud Platform (GCP) (Cloud Platform), Influencing, Innovation, Jira, Mediation, Mentoring, Microsoft Office, Motivation, Negotiation, Networking, Power BI, Presentation Skills, Prioritization, Problem Solving, Project Governance, Project Management, Project Planning, Qlik, Relationship-Building, Requirements Gathering, Risk Management, Scope Management, SQL, Stakeholder Management, Story Mapping, Storytelling, Strategic Management, Strategic Thinking, SWOT Analysis, Systems Requirement Analysis (or Management), Tableau, Trusted Advisor, UI-Design / Wireframing, UML, User Journey, User Research, Verbal Communication, Written Communication
Posted 1 month ago
0 years
4 - 9 Lacs
Calcutta
On-site
Job description
Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. HSBC is one of the largest banking and financial services organisations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Lead Assistant Vice President, Risk and Compliance AI and Analytics.
Principal responsibilities
The individual will be responsible for reporting the RC AI & Analytics scorecard and key performance indicators in a timely and accurate manner.
Promote a culture of data-driven decision making, aligning short-term decisions and investments with longer-term vision and objectives.
Help the business to manage regulatory risk in a more effective, efficient, and commercial way through the adoption of data science (AI/ML and advanced analytics); a brief illustrative sketch of this kind of analytics appears at the end of this posting.
Support communication and engagement with stakeholders and partners to increase understanding and adoption of data science products and services, and research opportunities.
Collaborate with other analytics teams across the bank to share insight and best practice.
Foster a collaborative, open, and agile delivery culture.
Build positive momentum for change across the organization with the active support and buy-in of all stakeholders.
Communicate often complex analytical solutions to the wider department, ensuring a strong transfer of key findings and intelligence.
Requirements
University degree in technology, data analytics, or a related discipline, or relevant work experience in computer or data science.
Understanding of regulatory compliance and its risks, and direct experience of deploying controls and analytics to manage those risks.
Experience in financial services (ideally within a tier one bank) or a related industry; knowledge of the HSBC Group structure, its business and personnel, and HSBC’s corporate culture.
Experience of agile development and active contribution to strategy and innovation.
Solid understanding of applied mathematics, statistics, data science principles, and advanced computing (machine learning, modelling, NLP, and generative AI).
Experience working within the Hadoop ecosystem, in addition to strong technical skills in analytical languages such as Python and SQL. Specific knowledge of GCP, AWS, Azure, Spark, and/or graph theory is an advantage.
Experience of visualization tools and techniques, including Qlik and Tableau.
Solid understanding of data and architecture concepts, including cloud, ETL, ontology, and data modelling.
Experience of using JIRA, Git, Confluence, Teams, Slack, and advanced Excel.
You’ll achieve more at HSBC
HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment.
We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, people with disabilities, color, national origin, veteran status, etc. We consider all applications based on merit and suitability to the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
***Issued By HSBC Electronic Data Processing (India) Private LTD***
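The sketch promised above: one simple way advanced analytics can support compliance monitoring is unsupervised anomaly scoring. This is an invented scikit-learn example; the features, data, and contamination threshold are illustrative assumptions only, not HSBC's methods.

```python
# Invented example: unsupervised anomaly scoring of transaction features,
# a common pattern for surfacing candidates for compliance review.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[100, 2], scale=[20, 1], size=(500, 2))  # amount, daily count
outliers = np.array([[5000.0, 40.0], [3000.0, 25.0]])            # unusual activity
X = np.vstack([normal, outliers])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)   # -1 = anomalous, 1 = normal
print(X[flags == -1])      # candidates for analyst review
```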
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Associate Director, Product Development.
In this role, you will:
Be responsible for both delivery and implementation of standards and best practice across FCDP. You will sit across all of the respective VS FCDP pods and ensure that the respective technical leads are implementing best practice and ways of working. You’ll also take responsibility for technical design and act as an interface between the build and architecture community.
Have a comprehensive working knowledge of the Finance Data Platform and its provided capabilities.
Have a comprehensive knowledge of ET and bank technical/architectural strategy.
Maintain a close working relationship with the Finance Data VS delivery technical lead to share the ET and bank technical/architectural strategy.
Proactively communicate and educate technology colleagues on higher-level technology/architecture strategy.
Actively contribute to external forums to update colleagues on technical initiatives being undertaken within FCDP.
Serve as an escalation point for complex technical queries coming from the Finance VS pods.
Maintain an ongoing relationship with the demand requestors (customers/stakeholders).
In collaboration with stakeholders, the FCD team, and VS IT counterparts, ensure there is a maintained and prioritized backlog of demand for delivery of adequate quality.
Where required, raise and escalate issues with the quality, timing, or other aspects of the requirements raised that may impact the effectiveness of delivery.
Set up and maintain a suitable repository (or repositories) for control of project artefacts.
Be accountable for the maintenance of project artefacts, ensuring they are reflective of the project at all times.
Proactively communicate project progress, ensuring stakeholders are suitably informed.
Engage and support the VS delivery pod, holding them to account for committed delivery, and motivating and supporting them through the removal of roadblocks as required.
Requirements
To be successful in this role you should meet the following requirements:
5+ years of experience working in Agile delivery.
Demonstrable experience of managing projects, including the management of engineering teams, project plans, budgets, risks, and issues.
Good communication and negotiation skills.
Detailed understanding of FCDP architecture and technical design.
Ability to build relationships with technology, change delivery, and other stakeholders.
Knowledge of the overall financial services industry, with specific functional experience in Finance.
Working knowledge of data delivery covering ingestion, curation, and egress.
Awareness of data management considerations such as data controls, data compliance, and privacy.
Exposure to Google Cloud, ETL, ingress/egress tooling, and business intelligence tooling such as Tableau and Qlik Sense will be an added advantage.
Familiarity with IT systems in the banking landscape, particularly Finance, would be an advantage.
Prior experience working within HSBC is desirable.
Highly proactive and able to work independently.
Excellent interpersonal communication skills to discuss technical and functional requirements and coordinate on various deliverables with senior business, change team, and Operations stakeholders.
Ability to present complex ideas succinctly, both verbally and in written form.
Takes ownership of deliverables and achieves desired outcomes within agreed timescales.
A motivated individual who likes to keep current and develop personally.
Flexibility to attend and be available for meetings during UK working hours.
You’ll achieve more when you join HSBC. www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSBC Software Development India
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Transport is at the core of modern society. Imagine using your expertise to shape sustainable transport and infrastructure solutions for the future? If you seek to make a difference on a global scale, working with next-gen technologies and the sharpest collaborative teams, then we could be a perfect match.
Introduction
Your new team belongs to Production Logistics, which is a part of Group Trucks Operations (GTO). We are an organization of approximately 650 employees, globally connected to deliver logistics solutions with world-class operational excellence. We ensure that transportation is purchased, packaging is made available at our suppliers, material is transported to our production facilities, and vehicles are distributed to our customers on time. We design and optimize the Production Logistics supply chain for the Volvo Group, prepare logistics for new products, and drive the Sales & Operations Planning process. We strive for an innovative and diverse workplace, based upon the values of the Volvo Group, always with a high focus on customer success.
What you will do
Join our team to design and deliver digital solutions using analytics and automation tools that empower logistics and supply chain operations — all while driving Agile IT change management for transportation systems.
Who’s your new team?
We are a tight-knit group with the sole mission of delivering the best supply chain solutions to our Volvo Group North American production plants and distribution centers. With our can-do attitude and critical-thinking team members, we continue to push the envelope of supply chain design, automation, visualization, and lean thinking. We work with a suite of software to create a logistics “digital twin” of our transport system. We handle everything from design (Oracle Transport Management, SAP, and Volvo IT systems) to data collection (SQL, Azure, etc.) to data visualization (Power BI, Qlik, etc.). Our goal is to ensure a smooth flow of information between these systems and our stakeholders so that quality, low-cost, and environmentally conscious transport is secured.
This position will work out of the Bengaluru, India location and will report to the Value Chain Development and Analytics Manager.
What’s your role in the team?
Do you dream big? We do too, and we are excited to grow together. You will be responsible for researching new technologies and methods across advanced data analytics and data visualization. Explore opportunities at the intersection of ML, AI, automation, and big data to drive innovation in logistics network design, visibility, and management. You will take ambiguous problems from our business and create integrated solutions using various structured and unstructured data (SQL, Azure DB, Volvo systems, etc.). The key objective is to proactively inform and manage change of the logistics network design using internal and external data. You will do this using automation (to retrieve and combine data sets) and advanced (predictive/prescriptive) analytics to find correlation or causality within millions of data entries and across several systems; a small sketch of this combine-and-correlate pattern follows this posting. You will collaborate with cross-functional teams to evaluate and implement change requests in IT systems. After deployment, you will lead regression testing efforts to verify that existing features continue to perform as expected and that no unintended issues arise. You will be the liaison for the adoption of these new technologies by the team. As a change leader, you will find or create education material and coach all stakeholders impacted by newfound technologies.
Who are you?
We are looking for candidates with the following skills, knowledge, and experience:
Bachelor’s degree in Computer Science, Information Technology, Data Science, Industrial Engineering, or a related field
Strong knowledge of relational databases (e.g. SQL Server, Azure SQL DB) and proficiency in writing complex SQL queries
Proven experience with ETL (Extract, Transform, Load) processes for data integration and transformation across multiple systems
Proficiency in data visualization tools (e.g. Qlik, Power BI) and Microsoft PowerApps products
Hands-on skill and experience in one programming language used in data science applications (such as Python or R)
Experience working with IT system change management processes and regression testing
Strong verbal and written communication skills, with the ability to clearly convey complex ideas to diverse stakeholders
Demonstrated ability to operate effectively across time zones and collaborate with globally distributed teams
Are you excited to bring your skills and disruptive ideas to the table? We can’t wait to hear from you. Apply today!
We value your data privacy and therefore do not accept applications via mail.
Who We Are And What We Believe In
We are committed to shaping the future landscape of efficient, safe, and sustainable transport solutions. Fulfilling our mission creates countless career opportunities for talents across the group’s leading brands and entities. Applying to this job offers you the opportunity to join Volvo Group. Every day, you will be working with some of the sharpest and most creative brains in our field to be able to leave our society in better shape for the next generation. We are passionate about what we do, and we thrive on teamwork. We are almost 100,000 people united around the world by a culture of care, inclusiveness, and empowerment.
Group Trucks Operations encompasses all production of the Group’s manufacturing of Volvo, Renault and Mack trucks, as well as engines and transmissions. We also orchestrate the spare parts distribution for Volvo Group’s customers globally and design, operate and optimize logistics and supply chains for all brands. We count 30,000 employees at 30 plants and 50 distribution centers across the globe. Our global footprint offers an opportunity for an international career in a state-of-the-art industrial environment, where continuous improvement is the foundation. As our planet is facing great challenges, we - one of the largest industrial organizations in the world - stand at the forefront of innovation. We are ready to rise to the challenge. Would you like to join us?
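The combine-and-correlate pattern referenced above, as a minimal sketch: merge two invented transport data sets on a shared key, then compute a correlation between the metrics. Real pipelines would pull from SQL/Azure sources, but the shape of the logic is the same; all data here is made up.

```python
# Invented example: combine two sources, then check how transport
# distance relates to freight cost across lanes.
import pandas as pd

shipments = pd.DataFrame({
    "lane_id": [1, 2, 3, 4],
    "distance_km": [120, 450, 800, 300],
})
costs = pd.DataFrame({
    "lane_id": [1, 2, 3, 4],
    "freight_cost": [350, 1200, 2100, 820],
})

combined = shipments.merge(costs, on="lane_id")  # retrieve & combine step
print(combined["distance_km"].corr(combined["freight_cost"]))  # Pearson r
```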
Posted 1 month ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position Title: Product Manager – Healthcare Data & Analytics
About The Role
As Product Manager, you will lead the strategy, execution, and commercialization of innovative data and analytics products for the U.S. healthcare market. This is a highly collaborative role where you'll work cross-functionally with Engineering, Sales, Design, and Delivery teams to build scalable, interoperable solutions that address core challenges across payers and providers. You’ll be responsible for partnering with the solution offering manager to deliver on the product vision and roadmap, conducting customer discovery, tracking success metrics, and ensuring timely delivery of high-impact features. This role carries revenue responsibilities and is key to EXL Health’s broader growth agenda.
Core Responsibilities
Product Strategy & Leadership
Develop and own the quarterly roadmap for the healthcare data and analytics solution.
Manage the product backlog and ensure alignment with evolving client needs, compliance mandates (e.g., CMS, FHIR), and company objectives.
Translate customer pain points and regulatory changes into innovative data-driven products and services.
Champion a customer-first approach while ensuring technical feasibility and commercial viability.
Stay ahead of technology and market trends, especially in AI, value-based care, and care management.
Collaborate closely with Engineering and Design teams to define and prioritize product requirements.
Client Engagement & Sales Support
Meet directly with clients to shape strategy, gather feedback, and build trusted relationships.
Serve as the bridge between client expectations and solution capabilities, ensuring alignment and delivery excellence.
Experience Qualifications
Minimum 5–8 years of experience in analytics, data platforms, or product management, preferably within the U.S. healthcare ecosystem.
At least 3 years in a leadership or client-facing product role, including experience managing end-to-end product development and revenue accountability.
Proven success in bringing data or analytics products to market—from ideation through launch and iteration.
Healthcare Domain Expertise
Deep familiarity with U.S. payer or provider environments, including claims, payments, risk adjustment, population health, or care management.
Working knowledge of regulatory and interoperability standards (e.g., CMS 0057, FHIR, TEFCA); a hedged FHIR sketch follows below.
Hands-on understanding of how data management, analytics, and AI/ML drive value in clinical or operational workflows.
Technical Skills
Practical experience with or exposure to:
Cloud Platforms: Snowflake, AWS, Azure, GCP
BI & Visualization Tools: Tableau, Power BI, Qlik
ETL/Data Integration: Informatica, Talend, SSIS, Erwin
Data Science/AI/ML: Experience collaborating with data science teams on AI initiatives
Agile/Tools: Jira, Confluence, Asana, Agile/Scrum methodologies
Personal Attributes
Strategic thinker who can dive deep into execution.
Exceptional written and verbal communication, with the ability to translate technical concepts for non-technical audiences.
Strong organizational, problem-solving, and analytical skills.
Passion for innovation, continuous improvement, and cross-functional collaboration.
A team-first leader with high emotional intelligence and the ability to mentor others.
Education
Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, Statistics, Business, or a related field from a top-tier institution.
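Since the role leans on FHIR interoperability, here is a hedged sketch of reading Patient resources over the standard FHIR REST API. The base URL below points at a commonly used public test server and is an assumption; any R4-conformant endpoint would follow the same pattern.

```python
# Hedged sketch: search for Patient resources over the FHIR REST API.
# The base URL is an assumed public test endpoint, not a client system.
import requests

base = "https://hapi.fhir.org/baseR4"  # assumed public test server
resp = requests.get(
    f"{base}/Patient?name=smith&_count=2",
    headers={"Accept": "application/fhir+json"},
    timeout=30,
)
resp.raise_for_status()

bundle = resp.json()  # a FHIR Bundle resource
for entry in bundle.get("entry", []):
    patient = entry["resource"]
    print(patient["resourceType"], patient.get("id"))
```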
Posted 1 month ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Skilled in SAS, SQL, and automation; experience with visualization tools like Power BI/Qlik/Tableau is preferred.
Good knowledge of the banking domain (specifically the credit risk analytics domain).
Excellent written and verbal communication skills and good stakeholder management skills.
Experience in marketing analytics, especially tracking.
Work experience on data science problems.
Experience of working in banking and credit risk, and with statistical modeling techniques like linear and logistic regression, decision trees, and random forests.
Exposure to IFRS 9 model development with parameters like PD, EAD, and LGD; understanding of application and behavioral scorecard development (a toy sketch of these ideas follows below).
Understanding of collections and recoveries business aspects.
Good knowledge of SAS and SQL.
Good communication skills.
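The toy sketch promised above: a logistic-regression PD model followed by an expected-credit-loss calculation of the form ECL = PD x LGD x EAD. All data and parameters are invented; real IFRS 9 models involve far more rigor (segmentation, staging, macroeconomic overlays).

```python
# Toy PD model and expected-credit-loss calculation. Everything here is
# invented for illustration: features, labels, LGD, and EAD.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))  # e.g., score, utilisation, delinquency signal
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 1.5).astype(int)

pd_model = LogisticRegression().fit(X, y)
pd_hat = pd_model.predict_proba(X[:5])[:, 1]  # default-probability estimates

lgd, ead = 0.45, 10_000        # assumed loss-given-default and exposure
print(pd_hat * lgd * ead)      # per-account expected credit loss
```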
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
At Ford Motor Credit Company, we are going to support indirect lending for Ford Credit Bank through existing lending platforms and integrate new Ford Credit Bank data into the Enterprise Data Warehouse (GCP BigQuery) for data insights and analytics. This role is for an ETL/Data Engineer who can integrate Ford Credit Bank data from existing North America lending platforms into the Enterprise Data Warehouse (GCP BigQuery) to enable critical regulatory reporting, operational analytics, and risk analytics. You will be responsible for deep-dive analysis of current-state Receivables and Originations data in the data warehouse, as well as impact analysis related to Ford Credit Bank, and for providing solutions for implementation. You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications, integrating it into analytical domains, and building data marts and products in GCP. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform, Mainframe, and IBM DataStage is expected. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right data warehouse solutions.
Responsibilities
Develop and modify existing data pipelines on Mainframe (JCL, COBOL), IBM DataStage, and BigQuery to integrate Ford Credit Bank data into the Enterprise Data Warehouse (GCP BigQuery) and support production deployment. Use APIs for data processing as required.
Implement the architecture provided by the data architecture team. You will use Fiserv bank features and mainframe data sets to enable the bank's data strategy. Be proactive and implement design plans. You will use DB2 for performing bank integrations.
Prepare test plans and execute them within EDW/Data Factory (end-to-end, from ingestion to integration to marts) to support use cases.
Design and build production data engineering solutions to deliver reusable patterns using Mainframe JCL, DataStage, and Autosys.
Design and build production data engineering solutions to deliver reusable patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Dataform, Astronomer, Data Fusion, Dataproc, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Artifact Registry, GCP APIs, Cloud Build, and App Engine, plus real-time data streaming platforms like Apache Kafka, GCP Pub/Sub, and Qlik Replicate.
Collaborate with stakeholders and cross-functional teams to gather and define data requirements, ensuring alignment with business objectives.
Design and implement batch, real-time streaming, scalable, and fault-tolerant solutions for data ingestion, processing, and storage.
Perform necessary data mapping, impact analysis for changes, root cause analysis, and data lineage activities, and document information flows.
Develop and maintain documentation for data engineering processes, standards, and best practices, ensuring knowledge transfer and ease of system maintenance.
Implement an enterprise data governance model and actively promote the concepts of data protection, sharing, reuse, quality, and standards to ensure the integrity and confidentiality of data.
Work in an agile product team to deliver code frequently using Test Driven Development (TDD), continuous integration, and continuous deployment (CI/CD).
Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
Continuously enhance your FMCC domain knowledge, stay current on the latest data engineering practices, and contribute to the company's technical direction while maintaining a customer-centric approach.
Qualifications
Successfully designed and implemented data warehouses and ETL processes for over 5 years, delivering high-quality data solutions. Exposure to the Fiserv banking solution is desired.
8+ years of complex BigQuery SQL development experience, plus Mainframe (JCL, COBOL), gszutil, and DataStage job development (an illustrative incremental-integration sketch follows below).
Experienced with Mainframe, DataStage, and Autosys.
Experienced with Mainframe file formats, COBOL copybooks, ORC formats, JCL scripts, and related technologies to manage legacy data ingestion.
Design, develop, and maintain ETL processes using IBM DataStage to extract, transform, and load data from Mainframe systems and other sources such as SQL, Oracle, Postgres, AS400, and mainframe DB2 into the data warehouse.
Develop and modify batch scripts and workflows using Autosys to schedule and automate ETL jobs.
Experienced cloud engineer with 5+ years of GCP expertise, specializing in managing cloud infrastructure and applications into production-scale solutions: Cloud Build and App Engine, alongside storage services including Cloud Storage, and DevOps tools such as Tekton, GitHub, Terraform, and Docker.
Expert in designing, optimizing, and troubleshooting complex data pipelines.
Experience developing with microservice architecture on a container orchestration framework.
Experience in designing pipelines and architectures for data processing.
Passion and self-motivation to develop/experiment/implement state-of-the-art data engineering methods and techniques.
Self-directed; works independently with minimal supervision and adapts to ambiguous environments.
Evidence of a proactive problem-solving mindset and willingness to take the initiative.
Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas to cross-functional teams and all levels of management.
Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity.
Desired: Professional certification in GCP (e.g., Professional Data Engineer). Master’s degree in computer science, software engineering, information systems, data engineering, or a related field. Data engineering or development experience gained in a regulated financial environment. Experience with Teradata to GCP migrations is a plus.
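The incremental-integration sketch noted above: an upsert from a staging table into an integrated table via a BigQuery MERGE, issued from Python. Table and column names are invented for illustration, not actual Ford Credit objects.

```python
# Hypothetical incremental upsert into an integrated BigQuery table.
# Project, table, and column names are illustrative only.
from google.cloud import bigquery

client = bigquery.Client(project="my-edw-project")  # assumed project

merge_sql = """
MERGE `my-edw-project.integrated.receivables` AS t
USING `my-edw-project.staging.receivables` AS s
ON t.contract_id = s.contract_id
WHEN MATCHED THEN
  UPDATE SET t.balance = s.balance, t.load_ts = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
  INSERT (contract_id, balance, load_ts)
  VALUES (s.contract_id, s.balance, CURRENT_TIMESTAMP())
"""
client.query(merge_sql).result()  # run the merge and wait for completion
```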
Posted 1 month ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. HSBC is one of the largest banking and financial services organisations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Lead Assistant Vice President, Risk and Compliance AI and Analytics.
Principal Responsibilities
The individual will be responsible for reporting the RC AI & Analytics scorecard and key performance indicators in a timely and accurate manner (a small scorecard-aggregation example appears at the end of this posting).
Promote a culture of data-driven decision making, aligning short-term decisions and investments with longer-term vision and objectives.
Help the business to manage regulatory risk in a more effective, efficient, and commercial way through the adoption of data science (AI/ML and advanced analytics).
Support communication and engagement with stakeholders and partners to increase understanding and adoption of data science products and services, and research opportunities.
Collaborate with other analytics teams across the bank to share insight and best practice.
Foster a collaborative, open, and agile delivery culture.
Build positive momentum for change across the organization with the active support and buy-in of all stakeholders.
Communicate often complex analytical solutions to the wider department, ensuring a strong transfer of key findings and intelligence.
Requirements
University degree in technology, data analytics, or a related discipline, or relevant work experience in computer or data science.
Understanding of regulatory compliance and its risks, and direct experience of deploying controls and analytics to manage those risks.
Experience in financial services (ideally within a tier one bank) or a related industry; knowledge of the HSBC Group structure, its business and personnel, and HSBC’s corporate culture.
Experience of agile development and active contribution to strategy and innovation.
Solid understanding of applied mathematics, statistics, data science principles, and advanced computing (machine learning, modelling, NLP, and generative AI).
Experience working within the Hadoop ecosystem, in addition to strong technical skills in analytical languages such as Python and SQL. Specific knowledge of GCP, AWS, Azure, Spark, and/or graph theory is an advantage.
Experience of visualization tools and techniques, including Qlik and Tableau.
Solid understanding of data and architecture concepts, including cloud, ETL, ontology, and data modelling.
Experience of using JIRA, Git, Confluence, Teams, Slack, and advanced Excel.
You’ll achieve more at HSBC
HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment.
We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, people with disabilities, color, national origin, veteran status, etc. We consider all applications based on merit and suitability to the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
***Issued By HSBC Electronic Data Processing (India) Private LTD***
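The scorecard-aggregation example noted above, kept deliberately small: roll invented per-alert records up into per-team KPIs of the sort a monthly scorecard pack might report. The metrics and data are made up for illustration.

```python
# Invented scorecard example: per-team alert-handling KPIs.
import pandas as pd

alerts = pd.DataFrame({
    "team": ["A", "A", "B", "B", "B"],
    "closed_on_time": [True, False, True, True, False],
    "days_to_close": [3, 12, 5, 2, 9],
})

kpis = alerts.groupby("team").agg(
    alerts_handled=("closed_on_time", "size"),
    pct_on_time=("closed_on_time", "mean"),
    avg_days_to_close=("days_to_close", "mean"),
)
print(kpis)  # one row per team, ready for a reporting layer
```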
Posted 1 month ago
3.0 - 4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary
We are seeking a highly skilled and experienced Marketing Cloud Testing professional to join our Marketing Automation team, which works closely with brand teams; understands various data sources; is adept in building data ingestion pipelines; and is skilled in testing end-to-end data ingestion layers, data models, and visualization dashboards based on previously built test scripts.
About The Role
Key Responsibilities:
Build end-to-end test scripts for each release based on user epics across the data value chain: ingestion, data model, and visualization.
Post development, run the test scripts using a testing platform such as Proton.
Document results, highlight any bugs/errors to the development team, and work closely with the development team to resolve the issues.
Audit technical developments and solutions and validate the matching of source data with MCI (a minimal reconciliation sketch follows this posting).
Additional responsibilities may include creating and updating knowledge documents in the repository as needed.
Work closely with the Technical Lead and Business Analysts to help design the testing strategy and testing design as part of pre-build activities.
Participate in data exploration and data mapping activities, along with the technical lead and business and DDIT architects, for any new data ingestion needs from the business, together with the development team.
Build and maintain standard SOPs to run smooth operations that enable proper upkeep of visualization data and insights.
Qualifications
Minimum of 3-4 years of hands-on development experience in Dataroma/MCI.
Prior experience as a core developer in a visualization platform such as Tableau, Qlik, or Power BI is a plus.
Experience of working on Data Cloud and other data platforms is a plus.
Hands-on experience using ETL tools such as Informatica, Alteryx, or Dataiku preferred.
Prior experience with testing automation platforms preferred.
Excellent written and verbal skills; strong interpersonal and analytical skills.
Ability to provide efficient, timely, reliable, and courteous service to customers, and to effectively present information.
Demonstrated knowledge of the Data Engineering & Business Intelligence ecosystem.
Salesforce MCI certification.
Familiarity with AppExchange deployment, Flow, Aura components, and Lightning Web Components will be a plus.
Commitment To Diversity And Inclusion
Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.
Accessibility And Accommodation
Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message.
Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture
Join our Novartis Network: Not the right Novartis role for you?
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
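The reconciliation sketch referenced in the responsibilities above, in pytest style: compare row counts and a control total between a source extract and the ingested layer. The data frames stand in for real source and MCI queries; all names and figures are invented.

```python
# Invented reconciliation checks: row counts and a control total must
# match between the source extract and the ingested target.
import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame, amount_col: str):
    assert len(source) == len(target), "row counts differ"
    assert abs(source[amount_col].sum() - target[amount_col].sum()) < 1e-6, \
        "control totals differ"

def test_ingestion_reconciles():
    source = pd.DataFrame({"id": [1, 2], "spend": [10.0, 20.0]})  # stand-in
    target = pd.DataFrame({"id": [1, 2], "spend": [10.0, 20.0]})  # stand-in
    reconcile(source, target, "spend")

if __name__ == "__main__":
    test_ingestion_reconciles()
    print("reconciliation checks passed")
```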
Posted 1 month ago
7.0 years
10 - 18 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We are looking for a highly skilled Senior Data Engineer / BI Consultant with 7+ years of experience in implementing commercial projects across data engineering, business intelligence, ETL, and data warehousing. The ideal candidate will have strong hands-on expertise in SQL, ETL tools, BI platforms, and modern database environments.
Responsibilities
Lead end-to-end implementation of data projects in enterprise environments.
Design and develop robust and scalable ETL pipelines using industry-leading ETL tools (a generic sketch follows below).
Analyze business requirements and translate them into technical solutions using data models and visualizations.
Develop and manage reporting dashboards using BI tools like SAP Analytics Cloud, Power BI, Tableau, or Qlik.
Work with relational and columnar databases (e.g., Oracle, SAP BW, Teradata, Exasol) for efficient data processing.
Write optimized SQL for data extraction, transformation, and reporting purposes.
Collaborate with stakeholders to ensure timely delivery and effective business insights.
Implement data governance, metadata management, and security protocols in reporting and data warehousing.
Perform unit testing, peer reviews, and performance optimization of ETL and BI solutions.
Required Experience And Qualifications
7+ years of hands-on experience in data engineering, BI, or data science projects.
Proficiency in SQL, including DML, DDL, DCL, and TCL operations.
Experience in BI tools such as SAP Analytics Cloud, Tableau, Power BI, Qlik, or MicroStrategy.
ETL/data integration tools expertise: Informatica, Dataiku, Oracle Data Integrator, Talend, Pentaho DI, or IBM DataStage.
Strong knowledge of data warehousing platforms: Oracle, Teradata, SAP BW, or Exasol.
Ability to work independently and manage multiple priorities in a fast-paced environment.
Solid understanding of data modeling, performance tuning, and data lifecycle management.
Preferred Skills
Knowledge of cloud platforms (AWS, Azure, GCP) and integration with data/BI solutions.
Familiarity with big data technologies (Spark, Hive, HDFS) is a plus.
Understanding of DevOps practices and CI/CD in data project delivery.
Experience with version control tools like Git.
Skills: ETL, Qlik, Talend, Oracle Data Integrator, BI platforms, Exasol, Hive, DevOps practices, Pentaho DI, IBM DataStage, CI/CD, AWS, performance tuning, Informatica, Tableau, Power BI, data lifecycle management, Git, data modeling, HDFS, Teradata, SQL, ETL tools, Dataiku, Oracle, GCP, Azure, data warehousing, Spark, SAP Analytics Cloud, SAP BW
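The generic ETL sketch promised above: extract from a CSV, apply simple transformations, and load into a target table, with SQLite standing in for the warehouse. This is not any specific client pipeline; every name and value is illustrative.

```python
# Compact, generic ETL sketch: extract -> transform -> load.
# SQLite stands in for the target warehouse; data is invented.
import sqlite3
from io import StringIO
import pandas as pd

raw = StringIO("order_id,amount,currency\n1,100,usd\n2,250,usd\n")

df = pd.read_csv(raw)                                   # extract
df["currency"] = df["currency"].str.upper()             # transform: normalise codes
df["amount_band"] = pd.cut(
    df["amount"], bins=[0, 150, 1e9], labels=["low", "high"]
)

with sqlite3.connect(":memory:") as conn:               # load
    df.astype({"amount_band": str}).to_sql("fact_orders", conn, index=False)
    print(pd.read_sql("SELECT * FROM fact_orders", conn))
```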
Posted 1 month ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact.
How You Will Contribute
As Manager, Data Analytics and Automation – Internal Audit, you will lead the data analytics and automation function within the Internal Audit department at Ciena. You will be responsible for managing a team of data analysts and driving the strategic use of data to enhance audit effectiveness, risk assessment, and business process improvement. Additionally, you will partner closely with audit leadership and business stakeholders to embed advanced analytics, automation, artificial intelligence, continuous auditing and monitoring, and digital tools into all phases of the audit lifecycle, supporting the company’s mission of operational excellence and continuous innovation.
Key Responsibilities
Lead, mentor, and develop a team of data analysts, fostering a culture of innovation, collaboration, and continuous learning, specifically building Artificial Intelligence (AI) and Machine Learning (ML) capabilities within the team.
Develop and champion the strategic roadmap for integrating AI/ML into internal audit processes, including the identification of high-impact use cases for data analytics, continuous auditing, and risk assessment.
Oversee the design and execution of data analytics strategies, incorporating AI/ML techniques and continuous auditing approaches, that support audit planning, execution, and reporting.
Collaborate with the internal audit team and business leaders to identify opportunities for data-driven insights, automation, and process improvements.
Ensure the integrity, quality, and security of data used in audit analytics, maintaining compliance with company policies and regulatory requirements.
Translate complex AI/ML findings, continuous auditing outputs, and data analytics observations into actionable insights and clear recommendations for audit leadership and business stakeholders, fostering data-driven decision-making.
Develop and maintain advanced dashboards, visualizations, and analytical models to communicate key findings and trends to stakeholders at all levels.
Drive the adoption of emerging technologies (AI/ML, automation) within the audit function, recommending and implementing innovative solutions.
Manage multiple concurrent projects, allocating resources and setting priorities to deliver high-impact results on time.
Stay current with industry trends, regulatory changes, and best practices in data analytics, automation, internal audit, risk management, and the application of AI in these fields.
Support the integration of data analytics, automation, and AI into the audit methodology, ensuring each is embedded in risk assessment, control testing, and reporting.
Build strong relationships with IT, Finance, and other business partners to facilitate access to data and alignment on analytics initiatives.
The Must Haves
Education: Bachelor’s degree in Data Science, Computer Science, Information Systems, Business Analytics, or a related field. Master’s degree preferred.
Experience: 7+ years of progressive experience in data analytics and automation, with at least 2 years in a leadership or managerial role, preferably within internal audit, risk management, or a high-tech environment.
Advanced proficiency with data analytics tools, including SQL (Snowflake experience preferred), Python, R, Alteryx, or similar tools, and data visualization platforms (e.g., Power BI, Tableau, Qlik); a small invented continuous-auditing example follows this posting.
Experience extracting and transforming data from large data lakes to derive actionable insights using advanced analytical techniques is essential.
Strong understanding of ERP systems (Oracle, SAP), cloud platforms, and business process data flows.
Demonstrated ability to lead teams, manage complex projects, and deliver data-driven insights that influence business decisions.
In-depth knowledge of internal controls, audit methodologies, and risk management frameworks is a strong asset.
Excellent communication, stakeholder management, and problem-solving skills.
High ethical standards, attention to detail, and commitment to confidentiality and data security.
Assets
Experience working in a global, high-tech, or rapidly evolving business environment.
Familiarity with regulatory requirements (SOX, GDPR, CCPA) and audit standards.
Innovative mindset and passion for driving digital transformation within the audit function.
Proven experience in designing and implementing automation and AI/ML solutions or continuous auditing programs within an internal audit, risk management, or compliance function.
This role is ideal for a dynamic leader who combines technical expertise with strategic vision, ready to elevate the impact of data analytics, AI, and automation in internal audit at a leading technology company.
Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox.
At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.
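The invented continuous-auditing example noted above: flag potential duplicate payments (same vendor, amount, and date), a classic audit analytic. In practice this would run against ERP data via SQL; the records here are made up for illustration.

```python
# Invented continuous-auditing analytic: surface potential duplicate
# payments sharing vendor, amount, and payment date.
import pandas as pd

payments = pd.DataFrame({
    "vendor": ["Acme", "Acme", "Beta", "Acme"],
    "invoice": ["A1", "A2", "B1", "A3"],
    "amount": [500.0, 500.0, 120.0, 75.0],
    "pay_date": ["2024-05-01", "2024-05-01", "2024-05-02", "2024-05-03"],
})

dupes = payments[payments.duplicated(["vendor", "amount", "pay_date"], keep=False)]
print(dupes)  # both Acme 500.00 payments surface for reviewer follow-up
```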
Posted 1 month ago
3.0 - 4.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
The purpose of this role is to assist in the implementation and support of our business intelligence solutions. The role will be responsible for understanding data, data structures, and BI reports, as well as general marketing technologies. The other part of the responsibilities is to undertake Tableau Server administration tasks.
Job Description:
Location: Mumbai
Shift timing: 1PM - 10PM / 2PM - 11PM
Experience: 3 to 4 years
Key responsibilities:
Build reports and dashboards independently.
Quality-check reports and dashboards.
Make enhancements and bug fixes to dashboards and manage the implementation.
Develop underlying data structures necessary to support dashboard development.
Maintain development and project-level documentation.
Tableau Server administration: user access management and involvement in server updates, upgrades, and migrations.
Manage permissions and security of reporting environments.
Proactive, regular communication with internal and external stakeholders.
Present dashboards to clients and train them on using the dashboards. Conduct User Acceptance Testing (UAT) and training sessions.
Must Haves:
Tableau dashboard development skills, including the use of parameterized dynamic metrics and dimensions, time intelligence calculations, and dynamic zone visibility, and the ability to understand the customers' business needs and design dashboards to satisfy those needs.
Tableau Server administration.
Professional, fluent communication skills - written and verbal.
Proficiency in business intelligence and data analytics concepts.
Good understanding of dashboard design best practices.
Intermediate to expert level proficiency in SQL and data modelling for BI.
Collaborative spirit: thrive in an interpersonal, collaborative, and communicative environment that values your unique perspective.
Critical thinking: bring your analytical thinking to the table, helping us solve complex problems and make data-driven decisions.
Good To Have:
Media knowledge – knowledge of the digital media marketing landscape.
Additional tools: knowledge or experience with other tools such as Power BI & Qlik is a bonus.
Embedding capabilities: understanding the embedding capabilities for Tableau.
Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 1 month ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
The Global Data Insight & Analytics organization is looking for a top-notch Software Engineer who also has machine learning knowledge and experience to join our team and drive the next generation of the AI/ML (Mach1ML) platform. In this role you will work in a small, cross-functional team. The position will collaborate directly and continuously with other engineers, business partners, product managers, and designers from distributed locations, and will release early and often.
The team you will be working on is focused on building the Mach1ML platform – an AI/ML enablement platform to democratize machine learning across the Ford enterprise (like OpenAI’s GPT or Facebook’s FBLearner) to deliver next-gen analytics innovation. We strongly believe that data has the power to help create great products and experiences which delight our customers. We believe that actionable and persistent insights, based on a high-quality data platform, help business and engineering make more impactful decisions. Our ambitions reach well beyond existing solutions, and we are in search of innovative individuals to join this Agile team. This is an exciting, fast-paced role which requires outstanding technical and organization skills combined with critical thinking, problem-solving, and agile management tools to support team success.
Responsibilities
What you'll be able to do:
As a Software Engineer, you will work on developing features for the Mach1ML platform and support customers in model deployment using the Mach1ML platform on GCP and on-prem. You will follow Rally to manage your work. You will incorporate an understanding of product functionality and the customer perspective for model deployment. You will work on cutting-edge technologies such as GCP, Kubernetes, Docker, Seldon, Tekton, Airflow, and Rally.
Position Responsibilities:
Work closely with the Tech Anchor, Product Manager, and Product Owner to deliver machine learning use cases using the Ford Agile Framework.
Work with Data Scientists and ML engineers to tackle challenging AI problems.
Work specifically on the Deploy team to drive model deployment and AI/ML adoption with other internal and external systems.
Help innovate by researching state-of-the-art deployment tools and share knowledge with the team.
Lead by example in the use of paired programming for cross-training/upskilling, problem solving, and speed to delivery.
Leverage the latest GCP, CI/CD, and ML technologies.
Critical thinking: influence the strategic direction of the company by finding opportunities in large, rich data sets and crafting and implementing data-driven strategies that fuel growth, including cost savings, revenue, and profit.
Modelling: assess and evaluate the impact of missing/unusable data; design and select features; develop and implement statistical/predictive models using advanced algorithms on diverse sources of data; and test and validate models for tasks such as forecasting, natural language processing, pattern recognition, machine vision, supervised and unsupervised classification, decision trees, neural networks, etc.
Analytics: leverage rigorous analytical and statistical techniques to identify trends and relationships between different components of data, draw appropriate conclusions, and translate analytical findings and recommendations into business strategies or engineering decisions with statistical confidence.
Data engineering: experience crafting ETL processes to source and link data in preparation for model/algorithm development.
This includes domain expertise of data sets in the environment, third-party data evaluations, data quality Visualization: Build visualizations to connect disparate data, find patterns and tell engaging stories. This includes both scientific visualization as well as geographic using applications such as Seaborn, Qlik Sense/PowerBI/Tableau/Looker Studio, etc. Qualifications Minimum Requirements we seek: Bachelor’s or master’s degree in computer science engineering or related field or a combination of education and equivalent experience. 3+ years of experience in full stack software development 3+ years’ experience in Cloud technologies & services, preferably GCP 3+ years of experience of practicing statistical methods and their accurate application e.g. ANOVA, principal component analysis, correspondence analysis, k-means clustering, factor analysis, multi-variate analysis, Neural Networks, causal inference, Gaussian regression, etc. 3+ years’ experience with Python, SQL, BQ. Experience in SonarQube, CICD, Tekton, terraform, GCS, GCP Looker, Google cloud build, cloud run, Vertex AI, Airflow, TensorFlow, etc., Experience in Train, Build and Deploy ML, DL Models Experience in HuggingFace, Chainlit, Streamlit, React Ability to understand technical, functional, non-functional, security aspects of business requirements and delivering them end-to-end. Ability to adapt quickly with opensource products & tools to integrate with ML Platforms Building and deploying Models (Scikit learn, DataRobots, TensorFlow PyTorch, etc.) Developing and deploying On-Prem & Cloud environments Kubernetes, Tekton, OpenShift, Terraform, Vertex AI Our Preferred Requirements: Master’s degree in computer science engineering, or related field or a combination of education and equivalent experience. Demonstrated successful application of analytical methods and machine learning techniques with measurable impact on product/design/business/strategy. Proficiency in programming languages such as Python with a strong emphasis on machine learning libraries, generative AI frameworks, and monitoring tools. Utilize tools and technologies such as TensorFlow, PyTorch, scikit-learn, and other machine learning libraries to build and deploy machine learning solutions on cloud platforms. Design and implement cloud infrastructure using technologies such as Kubernetes, Terraform, and Tekton to support scalable and reliable deployment of machine learning models, generative AI models, and applications. Integrate machine learning and generative AI models into production systems on cloud platforms such as Google Cloud Platform (GCP) and ensure scalability, performance, and proactive monitoring. Implement monitoring solutions to track the performance, health, and security of systems and applications, utilizing tools such as Prometheus, Grafana, and other relevant monitoring tools. Conduct code reviews and provide constructive feedback to team members on machine learning-related projects. Knowledge and experience in agentic workflow based application development and DevOps Stay up to date with the latest trends and advancements in machine learning and data science.
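Since the posting names Vertex AI among the deployment technologies, here is a minimal, hedged sketch of registering and deploying a trained model to a Vertex AI endpoint with the google-cloud-aiplatform Python SDK. The project ID, bucket path, model name, and serving container are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: deploy a trained model to a Vertex AI endpoint.
# Assumes the model artifact was already exported to Cloud Storage;
# "my-gcp-project" and "gs://my-bucket/model/" are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")

# Register the model artifact with a prebuilt serving container
# (illustrative image URI; pick one matching your framework/version).
model = aiplatform.Model.upload(
    display_name="demo-model",  # hypothetical name
    artifact_uri="gs://my-bucket/model/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-3:latest"
    ),
)

# Deploy to a managed endpoint for online prediction.
endpoint = model.deploy(machine_type="n1-standard-4")

# Call the endpoint with a single feature vector.
prediction = endpoint.predict(instances=[[1.0, 2.0, 3.0]])
print(prediction.predictions)
```

In practice a platform like the one described would wrap steps of this kind in CI/CD pipelines (e.g., Tekton) rather than run them by hand; the sketch only shows the core SDK calls.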
Posted 1 month ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Summary:
Position: Senior Power BI Developer
Experience: 5+ years
Location: Ahmedabad (work from office)

Key Responsibilities:
- Design, develop, and maintain interactive, user-friendly Power BI dashboards and reports.
- Translate business requirements into functional and technical specifications.
- Perform data modeling, DAX calculations, and Power Query transformations.
- Integrate data from multiple sources, including SQL Server, Excel, SharePoint, and APIs (see the shaping sketch after this posting).
- Optimize Power BI datasets, reports, and dashboards for performance and usability.
- Collaborate with business analysts, data engineers, and stakeholders to ensure data accuracy and relevance.
- Ensure security and governance best practices in Power BI workspaces and datasets.
- Provide ongoing support and troubleshooting for existing Power BI solutions.
- Stay current with Power BI updates, best practices, and industry trends.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 5+ years of professional experience in data analytics or business intelligence.
- 5+ years of hands-on experience with Power BI (Power BI Desktop, Power BI Service).
- Strong expertise in DAX, Power Query (M language), and data modeling (star/snowflake schema).
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Experience working with large and complex datasets.
- Experience with BigQuery, MySQL, or Looker Studio is a plus.
- E-commerce industry experience is an added advantage.
- Solid understanding of data warehousing concepts and ETL processes.
- Experience with Microsoft Power Platform tools such as Power Apps and Power Automate would be a plus.

Preferred Qualifications:
- Microsoft Power BI certification (PL-300 or equivalent is a plus).
- Experience with Azure Data Services (Azure Data Factory, Azure SQL, Synapse).
- Knowledge of other BI tools (Tableau, Qlik) is a plus.
- Familiarity with scripting languages (Python, R) for data analysis is a bonus.
- Experience integrating Power BI into web portals using Power BI Embedded.
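As an illustration of the star-schema modeling this role calls for, here is a minimal pandas sketch that shapes a flat extract into one dimension and one fact table before a BI tool consumes it. The file name and columns are hypothetical, not taken from the posting.

```python
# Sketch: shape a flat sales extract into a star schema (one fact,
# one dimension) ahead of loading into a Power BI source database.
import pandas as pd

sales = pd.read_csv("sales_extract.csv")  # placeholder extract

# Dimension: one row per product, with a surrogate key.
dim_product = (
    sales[["product_code", "product_name", "category"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_product["product_key"] = dim_product.index + 1

# Fact: measures plus foreign keys only.
fact_sales = sales.merge(
    dim_product[["product_code", "product_key"]], on="product_code"
)[["order_date", "product_key", "quantity", "net_amount"]]

print(fact_sales.head())
```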
Posted 1 month ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
Job Objective: To contribute strong problem-solving skills and process orientation to projects independently, to help the team with required trainings, and to mentor new joiners.
Designation: Senior Business Analyst
Job Location: Bangalore
Type of employment: Permanent

Roles & Responsibilities:
- Provide insights through data analysis and visualization for small to large datasets.
- Translate business questions into analytical problems and develop business rules, process flows, and methodology for analysis.
- Summarize analyses using basic statistical methods (see the short sketch after this posting).
- Work with the team to execute ad hoc and regular reporting projects.

Requirements:
- 2+ years of professional experience.
- Must be well versed with MS Excel, Word, and PowerPoint.
- Must have: experience with a database query language (SQL).
- Good to have: Python, VBA, any visualization tool (Tableau, Qlik, Power BI), etc.
- Hands-on experience in data analytics and data wrangling.

Good to have:
- Patient-level data analytics (RWD Data Lake, Optum/DRG Claims, IQVIA APLD, IQVIA LAAD).
- Experience analyzing IQVIA/IMS data (patient insights, MIDAS).
- Marketing analytics (market assessment, forecasting, competitive intelligence), sales analytics (sizing, structuring, segmentation), etc.

Additional Skills:
- Ability to work independently across teams.
- Passion for solving challenging analytical problems.
- Ability to assess a problem quickly, qualitatively, and quantitatively.
- Ability to work productively with team members and to identify and resolve tough issues collaboratively.
- Good communication skills.

Qualifications
B.E. graduates
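To make the "summarize using basic statistical methods" responsibility concrete, here is a minimal pandas sketch of the kind of ad hoc summary this role might produce. The file and column names are hypothetical placeholders.

```python
# Sketch: basic statistical summary of a dataset for an ad hoc report.
import pandas as pd

df = pd.read_csv("claims_sample.csv")  # hypothetical patient-level extract

# Central tendency and spread for numeric columns.
print(df.describe())

# Simple segment-level roll-up: mean and count per therapy area.
summary = df.groupby("therapy_area")["claim_amount"].agg(["mean", "count"])
print(summary.sort_values("mean", ascending=False))
```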
Posted 1 month ago
5.0 years
0 Lacs
Block 5, Karnataka, India
On-site
Overview
Qualification:
- BS/BA degree in Computer Science, Engineering, Mathematics, Data Analytics, Statistics, or a related field.
- 5+ years of relevant experience.
- Expertise with BI tools such as Qlik Sense, Power BI, or Tableau, and a solid understanding of visualization principles.
- Knowledge of SQL or Python for data analysis and automation.
- Familiarity with ETL processes and integrating APIs with BI tools.
- Strong Microsoft Office skills (Excel, PowerPoint, and Outlook).
- Excellent verbal and written communication skills, with the ability to engage technical and non-technical stakeholders.
- Highly self-motivated, with a proactive, independent work style and strong problem-solving ability.
- Strong attention to detail, with a focus on accuracy in data handling and reporting.

Total Experience: 5+ years

Role
As a Business Intelligence (BI) Developer at Terralogic, you will play a pivotal role in developing BI solutions for financial management. Your primary responsibility will be to design, develop, and maintain financial dashboards and data models that enable business leaders to make informed decisions through automated data consolidation and visualization. You will integrate multiple data sources (including ERP and CRM systems), ensuring seamless data flow, data integrity, and optimal performance of BI tools. This role offers the unique opportunity to work on a new project with minimal oversight; it requires strong self-motivation and the ability to take high-level direction and independently troubleshoot and deliver results.

Responsibilities
- Collaborate with global business teams to understand their needs and translate them into technical solutions.
- Work independently to execute deliverables and create impactful dashboards from scratch.
- Consolidate financial data into databases, ensuring seamless integration of multiple data sources, including CRM and ERP platforms (see the ingestion sketch after this posting).
- Develop interactive financial dashboards and reports that visualize key performance metrics.
- Deliver continuous enhancements to improve visualization techniques, integrate additional data, and improve dashboard usability.
- Maintain and optimize BI tool performance by troubleshooting issues and implementing solutions to ensure smooth operations.
- Ensure data accuracy and integrity, implementing best practices for data governance and performance optimization.
- Document BI development processes and maintain clear, comprehensive technical documentation for ongoing projects.
- Stay up to date with industry trends, tools, and technologies to continuously improve BI solutions.
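The "integrating APIs with BI tools" duty typically means pulling from a source system's API and landing the result in a table the BI layer can read. Below is a minimal sketch under that assumption; the URL, token, connection string, and table name are all hypothetical placeholders.

```python
# Sketch: pull records from a (hypothetical) CRM REST endpoint and land
# them in a SQL staging table that a BI tool can read.
import requests
import pandas as pd
from sqlalchemy import create_engine

API_URL = "https://crm.example.com/api/v1/invoices"  # hypothetical
TOKEN = "..."  # supplied via a secret store in practice

resp = requests.get(
    API_URL, headers={"Authorization": f"Bearer {TOKEN}"}, timeout=30
)
resp.raise_for_status()
records = resp.json()  # assumes a JSON array of invoice objects

df = pd.json_normalize(records)

# Land the raw pull in a staging table; dashboards read a modeled view.
engine = create_engine("sqlite:///finance_staging.db")  # placeholder DSN
df.to_sql("stg_crm_invoices", engine, if_exists="replace", index=False)
print(f"Loaded {len(df)} rows into stg_crm_invoices")
```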
Posted 1 month ago
4.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities
- Design, build, and maintain dynamic, user-friendly dashboards using Tableau and Looker.
- Develop and optimize LookML models in Looker and manage data exploration layers for reusable insights.
- Use Tableau Prep and Looker data modelling to cleanse, shape, and transform data.
- Collaborate with data engineers, analysts, and business stakeholders to understand reporting requirements and deliver relevant visualizations.
- Ensure data accuracy, performance tuning, and alignment with business KPIs and metrics.
- Maintain version control, content organization, and user access in Tableau Server and Looker environments (see the automation sketch after this posting).
- Implement best practices in visualization design, layout, interactivity, and storytelling.
- Support the migration and redesign of legacy dashboards (e.g., from Power BI or Qlik) into Tableau or Looker as required.
- Automate reporting workflows and enable self-service BI through guided dashboards.
- Collaborate with IT or data platform teams to troubleshoot connectivity, access, and integration issues.

Requirements
- 4-5 years of hands-on experience in data visualisation using Tableau and Looker.
- Strong experience with LookML, dashboard development, and Explore setup in Looker.
- Expertise in Tableau Desktop and Tableau Server/Public, with solid knowledge of calculated fields, parameters, filters, and Tableau dashboards.
- Good understanding of data modelling, SQL, and working with large, complex datasets (preferably in Snowflake, BigQuery, or SQL Server).
- Familiarity with data governance, permissions management, and user licensing in Tableau and Looker.
- Ability to apply design principles, storytelling techniques, and UX best practices in dashboards.
- Strong communication and stakeholder management skills.
- Bonus: experience with Power BI, Power Automate, or migration projects from other BI tools. (ref:hirist.tech)
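The Tableau Server content-management duties above can be scripted against Tableau's REST API via the official tableauserverclient library; here is a minimal sketch. The server URL, site, and token names are hypothetical placeholders.

```python
# Sketch: list workbooks per project on Tableau Server, a starting
# point for auditing content organization. Details are placeholders.
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth(
    "audit-script",   # hypothetical PAT name
    "...",            # token value; kept in a secret manager in practice
    site_id="analytics",  # hypothetical site
)
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Pager transparently walks the paginated workbook listing.
    for workbook in TSC.Pager(server.workbooks):
        print(f"{workbook.project_name} / {workbook.name}")
```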
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
- Very good understanding of US insurance and insurance finance.
- Experience across various lines of business (LOBs) such as Auto, GL, Property, WC, etc.
- Fair exposure to both commercial and personal insurance.
- Very good knowledge and understanding of insurance terminology.
- Work with stateside trainers for training, resolving more complex issues, seeking advice, or requesting reassignment to stateside staff when a request becomes out of scope.
- Interact with the onshore team for any new changes in regulation or circular updates.
- Advanced skills in MS Office, MS Excel, and MS Word; accounting systems or ERP; web-based applications.
- Exposure to BI platforms such as Tableau, Qlik, etc., and Oracle systems like Essbase and Smart View.
- Excellent oral/written communication and presentation skills (mandatory).
- Excellent organization and time management skills.
- Excellent analytical skills; competent at logical reasoning.
- Must be a self-starter, detail oriented, with the ability to meet deadlines under pressure.
- Able to prioritize multiple activities and projects.
- Self-disciplined and results oriented.
- Demonstrated attention to detail in a fast-paced work environment, especially during processing.
- Ability to multitask and to work effectively as part of a team.
- Commitment and drive for results; strong analytical skills.
- Ability to understand and question established process guidelines in order to bring about possible process improvements.
Posted 1 month ago
12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Project Manager – Data Analytics
Location: Noida/Bengaluru
Job Type: Full-time
Experience: 12+ years
Industry: IT/Analytics/Business Consulting Services

Overview: We are looking for an experienced Project Manager to lead end-to-end delivery of data analytics projects. The ideal candidate brings a strong mix of technical knowledge, stakeholder management, and delivery leadership to successfully manage multiple projects in a fast-paced consulting environment. You will be the primary point of contact between clients, technical teams, and business units, ensuring timelines, quality, and value creation.

Roles and Responsibilities:
- Own project delivery for data analytics engagements, from scoping, planning, and execution to closure.
- Drive stakeholder discussions and manage expectations across client and internal leadership.
- Collaborate with data engineers, BI developers, business analysts, and QA to ensure deliverables are aligned with business objectives.
- Manage project plans, resourcing, timelines, risk registers, and communication plans.
- Ensure adherence to delivery frameworks (Agile/Scrum/Waterfall) based on project context.
- Track KPIs around effort estimation, quality, timelines, and budget.
- Conduct regular sprint reviews, retrospectives, and status meetings.
- Help identify opportunities for value addition, change requests, and future enhancements.
- Contribute to pre-sales solutioning, project estimation, and proposal writing as needed.

Required Professional Expertise:
- 10-14 years of experience, with at least 4+ years in project management for data/BI/analytics solutions.
- Proven success managing end-to-end delivery of analytics projects in a consulting environment.
- Strong understanding of modern data platforms, ETL processes, data warehousing, and BI tools (e.g., Power BI, Tableau, Azure Data Factory, Snowflake).
- Excellent communication, documentation, and client-facing skills.
- Proficiency with project management tools such as JIRA, MS Project, Smartsheet, or equivalent.
- PMP, CSM, or SAFe certifications are a plus.
- Bachelor's degree in computer science, engineering, or a related discipline; MBA or equivalent business qualification preferred.

About Polestar
As a data analytics and enterprise planning powerhouse, Polestar Solutions helps its customers draw the most sophisticated insights from their data in a value-oriented manner. From analytics foundations to analytics innovation initiatives, we offer a comprehensive range of services that help businesses succeed with data. We have a geographic presence in the United States (Dallas, Manhattan, New York, Delaware), the UK (London), and India (Delhi-NCR, Mumbai, Bangalore, and Kolkata), with a world-class team of 600+ people. We are growing at a rapid pace and plan to double our growth each year, which provides immense growth and learning opportunities to those who choose to work with Polestar. Our expertise and deep passion for what we do have brought us many accolades, including:
- Recognized as a "Great Place to Work" in 2024.
- Recognized among the Top 50 Companies for Data Scientists in 2023 by AIM.
- Awarded by the Financial Times as one of the High-Growth Companies across Asia-Pacific for the fifth time in a row in 2023.
- Featured in the Economic Times India's Growth Champions for FY2023.
- Polestar Analytics selected as a 2022 Red Herring Global winner.
- Listed in the Top Data Science Providers in India 2022: Penetration and Maturity (PeMa) Quadrant.
- Included in the FT ranking of 500 of the Asia-Pacific region's high-growth companies.
- Named among India's most promising data science companies in 2022.
- Featured in Forrester's Now Tech: Customer Analytics Service Providers report, Q2 2021.
- Recognized as Anaplan's India RSI Partner of the Year for FY21.
- Elite Qlik Partner and member of the Qlik Partner Advisory Council; Microsoft Gold Partner for Data & Cloud Platforms.

Culture at Polestar: We have some of the most progressive people practices, all aimed at enabling fast-paced growth for those who deserve it.
Posted 1 month ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Reference ID: R181304 | Updated: 06/19/2025 | Health, Safety, Security, and Environment | India, Chennai

What's the Role
The SEAM organization integrates Safety, Environment & Asset Management activities, with a broad geographical footprint, supporting Shell's businesses and assets around the world. Technical Asset Services (TAS), which sits in SEAM, is a key enabler for the accelerated delivery of the Shell Performance Framework to reach Shell's ultimate potential in Chemicals and Products (C+P), Integrated Gas/Renewables and Energy Solutions (IGRES), and Upstream. TAS provides high-quality, cost-competitive technical resources who support our Shell sites remotely yet are an integral part of asset teams delivering value through end-to-end AMS work processes. The TAO teams are located across four locations: Chennai, Manila, KL, and Krakow.

Asset Safety Reporting is part of TAS, supporting the global safety reporting process: ensuring top-quartile, on-time delivery of safety reporting data and insights; ensuring process compliance; and leveraging, where applicable, the group and TAS-specific digitalization journeys. Global digitalization initiatives include, but are not limited to, FIM, AMDP, Sphera, Jarvis, and the HSSE Data & Analytics Platform (DAP), as well as transformation toward AI/ML technology to further optimize digital data diagnostics and integrated reporting process delivery.

What You Will Be Doing
The purpose of this role is to support safety reporting delivery at the asset level, with particular focus on timely, high-quality safety performance reporting submissions to the Central Reporting Team and Asset LT (monthly, quarterly) and effective execution of QA/QC processes.
- Support reports with asset data flowing from Sphera (incident data).
- Deliver data for external organizations.
- Complete the OSHA 300 for the asset, HWC, and RTC, and send signed copies to the focal points for the assets (regulatory submission of data to OSHA) - annual report.
- Complete BSEE0131 and COS data submittals - annual reports.
- Complete quarterly reports for JV connects.
- Complete the annual COS data submission from Power BI, Sphera, and other sources.
- Review and remind action item owners/responsible parties of their actions on a monthly basis.
- Create Sphera "how to" guidance tutorials for topics that need revising.
- Make improvements and updates to existing dashboards as needed.
- Incorporate process safety walk and process fundamental KPI data into the HSE dashboard.
- Support the Ops Safety Team with AI deployment and special application projects.
- Help develop Learnings from Incidents apps (Power Apps) to support distribution and visibility of learnings through the organization.
- Monitor events in Sphera and perform QA/QC checks.
- Assist with the continuous improvement and implementation of standardized work processes for efficiency in process development and delivery.

What You Bring
- Minimum of 4 to 5+ years of experience in an HSSE role.
- Bachelor's degree in Engineering or equivalent.
- Certified in NEBOSH IGC or an equivalent HSE certification.
- Strong aptitude for a learner mindset.
- Bird's-eye view of the HSSE reporting process, AIPSM, PMR, Responsible Care, HSSE & SP CFW, and the overlaps between them.
- Skilled in data and reporting systems, e.g., Sphera Cloud, Qlik, DAP, etc.
- Understanding of the Upstream/DS/IG businesses and how they work is beneficial.
- Ability to engage and communicate effectively at all levels inside and outside Shell.
- Proven track record of delivery in asset and/or project leadership.
- Experience in project management and facilitating continuous improvement.
- Strong analytical and problem-solving skills, with attention to detail.
- Strong English communication skills; the role requires interaction with senior leadership.
Posted 1 month ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Job Role: Power BI Developer
Job Location: Remote (India)
Experience: 6-7 years

Job Roles & Responsibilities:
- Design, build, and maintain enterprise-grade interactive Power BI dashboards and reports aligned with strategic goals.
- Partner with stakeholders to elicit business needs and translate them into data models and visual analytics.
- Create advanced DAX formulas and Power Query (M) scripts to define KPIs and support calculations.
- Integrate Power BI with SQL Server, Excel, SharePoint, Azure, and other data sources.
- Optimize model and report performance using best practices (e.g., query folding, incremental refresh).
- Implement row-level security and manage user access in Power BI Service.
- Coordinate with data engineering teams to ensure data integrity, cleanliness, and refresh pipelines.
- Publish reports to Power BI Service, configure data gateways, and schedule refreshes (see the REST API sketch after this posting).
- Guide and support business users in dashboard adoption and insight interpretation.
- Collaborate across product, analyst, and engineering teams in an Agile/Scrum environment.

Job Requirements & Skills:
- Bachelor's in Computer Science, Information Systems, or equivalent.
- 6+ years of Power BI report/dashboard development experience.
- Expert-level skills in DAX and Power Query (M).
- Strong SQL proficiency and relational database experience.
- Solid understanding of data modeling, ETL, and data warehousing concepts.
- Familiarity with Power BI Service, including gateway configuration and workspace management.
- Knowledge of REST APIs and R/Python integration (preferred).
- Proficiency in report performance optimization and BI best practices.
- Excellent analytical, problem-solving, communication, and stakeholder management skills.

Preferred Qualifications:
- Microsoft Power BI certification (PL-300 or DA-100).
- Experience with Azure Data Factory, Synapse, or equivalent cloud data platforms.
- Experience in Agile/Scrum project delivery.
- Familiarity with other BI suites such as Tableau, Qlik, or SAP BO.
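Scheduled refreshes can also be triggered programmatically through the Power BI REST API, which is one place the "REST APIs and Python integration" requirement shows up in practice. Below is a minimal, hedged sketch using a service principal; the tenant, client, workspace, and dataset IDs are placeholders, and error handling is omitted.

```python
# Sketch: trigger a Power BI dataset refresh through the REST API
# using a service principal. All IDs below are placeholders.
import msal
import requests

TENANT_ID = "..."      # hypothetical Azure AD tenant
CLIENT_ID = "..."      # app registration with Power BI API permissions
CLIENT_SECRET = "..."  # stored in a secret manager in practice
GROUP_ID = "..."       # workspace ID
DATASET_ID = "..."     # dataset ID

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=30,
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
print("Refresh queued:", resp.status_code)
```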
Posted 1 month ago
12.0 - 15.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
When you join Accurate Background, you're an integral part of making every hire the start of a success story. Your contributions will help us fulfill our mission of advancing the background screening experience through visibility and insights, empowering our clients to make smarter, unbiased decisions.

Accurate Background is a fast-growing organization focused on providing employment background screenings and building trustful relationships with our clients. Accurate Background continues to exceed expectations by offering an array of innovative, cutting-edge background check and credentialing products to meet the needs of human resource, loss prevention, and security/legal professionals in employment screening and vendor certification.

As Lead Data/BI Architect, you will take charge of developing and executing the enterprise-wide data and analytics strategy. As a senior leader within the organization, you will guide a small data team across integration and BI functions, shaping the data landscape and driving the company toward a single source of truth. This role demands a highly skilled and experienced individual who can deliver a comprehensive data strategy while ensuring robust governance, architecture, and integration across the Accurate enterprise. The Lead Data Architect will oversee data governance, security, integration, and data quality initiatives. You will be instrumental in defining architectural design patterns, data standards, and best practices, while leading the team in implementing scalable, optimized, and secure data solutions that support business intelligence objectives.

Responsibilities:
- Understand different cloud platforms and architectures, preferably AWS platform services for data and analytics.
- Architect end-to-end data solutions across data lakes, warehouses, and real-time analytics.
- Design and architect medium to large data warehouses.
- Create conceptual, logical, and physical data models, OLTP and OLAP/dimensional data models, and perform data analysis.
- Apply data architecture/design patterns across data ingestion, curation, consumption, and reporting semantic models.
- Participate in defining data strategy and roadmaps for data and analytics.
- Be hands-on in some areas of the required technology tools.
- Be conversant with data fabric, data ingestion tools, data quality management, metadata management, data lineage, and data security.
- Design reusable utilities.
- Lead data warehouse migrations.
- Apply Agile methodologies and data products, leading teams technically from design through development to deployment, via DevOps and DataOps.
- Automate data quality checks and validations to maintain high data integrity (see the sketch at the end of this posting).
- Monitor, troubleshoot, and resolve issues across data platforms.

Qualifications:
- At least 12-15 years of total IT experience in software development, with 5 years exclusively in designing and architecting data warehousing projects.
- Good hands-on experience with the tools and technologies below.

Must have:
- Data lake architecture / data fabric.
- Snowflake (Tasks, Streams, stored procedures, Snowpipe), SQL Server.
- Data modeling tools (such as Erwin).
- Data warehouse migrations.
- Agile methodologies.
- Replication tools (such as AWS DMS, Qlik Replicate).
- OLAP/dimensional data models.
- NoSQL databases (such as MongoDB).

Good to have:
- ETL tools (such as SSIS, dbt).
- BI/reporting tools (Power BI, Tableau).
- Cloud platforms (AWS, Microsoft Fabric).
- Real-time databases (Cassandra, DynamoDB).

Additionally:
- Solid understanding of the Agile development process and software release processes.
- Must be a self-starter who is highly organized, hands-on, and a team player.
- Able to create design documents/mapping documents (as PPT or Word documents).
- Able to communicate and collaborate with all stakeholders (director level, business units, other architects, product managers, scrum masters).
- Stay updated on industry trends to continuously improve data systems and processes.

The Accurate Way: We offer a fun, fast-paced environment with lots of room for growth. We have an unwavering commitment to diversity, ensuring everyone has a complete sense of belonging here. To do this, we follow four guiding principles - Take Ownership, Be Open, Stay Curious, Work as One - core values that dictate what we stand for and how we behave.

Take ownership. Be accountable for your actions, your team, and the company. Accept responsibility willingly, especially when it's what's best for our customers. Give others every reason to trust you, believe in you, and count on you. Rise to every occasion with your personal best.

Be open. Be open to new ideas. Be inclusive of people and ways of doing things. Make yourself accessible and approachable, and communicate with genuineness, transparency, honesty, and respect. Embrace differences.

Stay curious. Stay curious even as you move forward. Tirelessly ask questions and challenge the status quo in your pursuit of new ideas, ways to solve problems, and opportunities to continually grow and improve.

Work as one. Work together to create the best customer and workplace experience. Put our customers and employees first, before individual or departmental agendas. Make sure they get the help they need to succeed.

About Accurate Background: Accurate Background's vision is to make every hire the start of a success story. As a trusted provider of employment background screening and workforce monitoring services, Accurate Background gives companies of all sizes the confidence to make smarter, unbiased hiring decisions at the speed of demand. Experience a new standard of support with a dedicated team, comprehensive technology and insight, and the most extensive coverage and search options to advance your business while keeping your brand and people safe.

Special Notice: Accurate is aware of schemes involving fraudulent job postings/offers and/or individuals or entities claiming to be employees of Accurate. Those involved are offering fabricated employment opportunities to applicants, often asking for sensitive personal and financial information. If you believe you have been contacted by anyone misrepresenting themselves as an employee of Accurate, please contact humanresources@accurate.com.
Please be advised that all legitimate correspondence from an Accurate employee will come from "@accurate.com" email accounts. Accurate will not interview candidates via text or email. Our interviews are conducted by recruiters and leaders by phone, via Zoom/Teams, or in person. Accurate will never ask candidates to make any type of personal financial investment related to gaining employment with the Company.
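To make the "automate data quality checks" responsibility above concrete, here is a minimal, hedged sketch using the snowflake-connector-python package, since Snowflake is a must-have for this role. The account, credentials, and table/column names are hypothetical placeholders.

```python
# Sketch: simple automated data-quality checks against Snowflake tables.
# Connection parameters and table/column names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",  # hypothetical account identifier
    user="dq_service",
    password="...",        # a key pair or OAuth is preferable in practice
    warehouse="ANALYTICS_WH",
    database="EDW",
    schema="CORE",
)

# Each check is a query expected to return zero "bad" rows.
checks = {
    "orders: null customer keys":
        "SELECT COUNT(*) FROM FACT_ORDERS WHERE CUSTOMER_KEY IS NULL",
    "orders: negative amounts":
        "SELECT COUNT(*) FROM FACT_ORDERS WHERE NET_AMOUNT < 0",
    "customers: duplicate business keys":
        "SELECT COUNT(*) FROM (SELECT CUSTOMER_ID FROM DIM_CUSTOMER "
        "GROUP BY CUSTOMER_ID HAVING COUNT(*) > 1)",
}

cur = conn.cursor()
try:
    for name, sql in checks.items():
        bad_rows = cur.execute(sql).fetchone()[0]
        status = "PASS" if bad_rows == 0 else f"FAIL ({bad_rows} rows)"
        print(f"{name}: {status}")
finally:
    cur.close()
    conn.close()
```

In a production setup, checks like these would typically run as scheduled Snowflake Tasks or in an orchestration tool, with failures raising alerts rather than just printing.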
Posted 1 month ago
4.0 years
0 Lacs
India
On-site
Immediate Joiners Only - Must Have: Python, SQL, and PySpark

Overview: UsefulBI is looking for highly skilled candidates with expertise in generating powerful business insights from very large datasets, where the primary aim is to enable needle-moving business impact through cutting-edge statistical analysis. We are looking for passionate data engineers who can envision the design and development of analytical infrastructure that supports strategic and tactical decision-making.

Experience Required:
- Minimum 4+ years of experience in data engineering.
- Must have good knowledge of and experience with Python.
- Must have good knowledge of PySpark.
- Must have good knowledge of Databricks.
- Must have good experience with AWS (Glue, Athena, Redshift, EMR).
- Typically requires relevant analysis work and domain-area work experience.
- Expert in the management, manipulation, and analysis of very large datasets.
- Superior verbal and written communication skills; ability to convey rigorous mathematical concepts and considerations to non-experts.
- Good knowledge of scientific programming in scripting languages like Python.

Key Responsibilities:
- Create and maintain optimal data pipeline architecture (see the PySpark sketch at the end of this posting).
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS "big data" technologies.
- Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders, including the executive, product, data, and design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for the analytics and data science team members that help them build and optimize our product into an innovative industry leader.

About UsefulBI: UsefulBI's mission is to enable better business decisions through business intelligence. We do this by starting with a deep understanding of the business context and business need. Our founding team collectively has 40+ years of experience in the domains we focus on, and we staff each engagement with a functional expert who drives business centricity through the engagement. We are obsessed with being experts in the latest tools and technologies in our space, whether for data visualization, analysis, or sourcing: Tableau, Qlik, Spotfire, Hadoop, R, SAS, Matlab, etc. are all part of our core skill set. We are equally obsessive about our data science skills: we carefully select and apply the right data science algorithms and techniques to the right problems. We bring a "full solution" approach that combines very strong data architecture skills with cutting-edge predictive modeling/neural network capabilities and intuitive visualization concepts to ensure the best dissemination of the intelligence created by the data models. These data models use advanced neural networks, predictive modeling, and machine learning concepts to be proactive rather than reactive.

We combine our industry and functional expertise with data, proprietary analytics, and software tools to help organizations gain greater clarity in decision-making and achieve significant long-term performance improvement.
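Here is a minimal sketch of the kind of PySpark batch pipeline the responsibilities above describe: read raw data, clean and aggregate it, and write a curated table. The S3 paths and column names are hypothetical placeholders, not details from the posting.

```python
# Sketch: a minimal PySpark extract-transform-load batch job.
# Paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Extract: raw order events landed on S3 (placeholder path).
orders = spark.read.parquet("s3://my-raw-bucket/orders/")

# Transform: drop malformed rows, then aggregate revenue per customer/day.
daily = (
    orders
    .filter(F.col("order_amount").isNotNull() & (F.col("order_amount") > 0))
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("customer_id", "order_date")
    .agg(
        F.sum("order_amount").alias("daily_revenue"),
        F.count("*").alias("order_count"),
    )
)

# Load: write a partitioned, query-friendly curated table (placeholder path).
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://my-curated-bucket/orders_daily/"
)
```

On AWS, a job like this would typically run on Glue or EMR (or a Databricks cluster) on a schedule, with Athena or Redshift querying the curated output.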
Posted 1 month ago