15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Microsoft Power Apps
Good to have skills: Microsoft Power Business Intelligence (BI), Microsoft Power Platform Infrastructure
Minimum 7.5 years of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to gather requirements, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and optimization in application design and functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Microsoft Power Apps.
- Good-to-Have Skills: Experience with Microsoft Power Business Intelligence (BI), Microsoft Power Platform Infrastructure.
- Strong understanding of the application development lifecycle.
- Experience in integrating applications with various data sources.
- Familiarity with user interface design principles and best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Power Apps.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years full-time education
Posted 1 week ago
4.0 - 8.0 years
10 - 14 Lacs
Gurugram
Work from Office
Overview
We are seeking a self-driven Senior Tableau Engineer with deep expertise in data modeling, visualization design, and BI-tool migrations. You'll own end-to-end dashboard development, translate complex healthcare and enterprise data into actionable insights, and lead migrations from legacy BI platforms (e.g., MicroStrategy, BusinessObjects) to Tableau.

Job Location: Delhi NCR / Bangalore / Pune

Key Responsibilities

Data Modeling & Architecture
- Design and maintain logical and physical data models optimized for Tableau performance.
- Collaborate with data engineers to define star/snowflake schemas, data marts, and semantic layers.
- Ensure data integrity, governance, and lineage across multiple source systems.

Visualization Development
- Develop high-impact, interactive Tableau dashboards and visualizations for executive-level stakeholders.
- Apply design best practices: color theory, UX principles, and accessibility standards.
- Optimize workbooks for performance (efficient calculations, extracts, and queries).

BI Migration & Modernization
- Lead migration projects from MicroStrategy, BusinessObjects, or other BI tools to Tableau.
- Reproduce and enhance legacy reports in Tableau, ensuring feature parity and improved UX.
- Validate data accuracy post-migration through sampling, reconciliation, and automated testing.

Automation & Deployment
- Automate data extract refreshes, alerting, and workbook publishing via Tableau Server/Online.
- Implement CI/CD processes for Tableau content using Git, Tableau APIs, and automated testing frameworks.
- Establish standardized naming conventions, folder structures, and content lifecycle policies.

Collaboration & Mentorship
- Partner with analytics translators, data engineers, and business owners to gather requirements and iterate on solutions.
- Mentor junior BI developers on Tableau best practices, performance tuning, and dashboard design.
- Evangelize self-service BI adoption: train users, develop documentation, and host office hours.
Governance & Quality
- Define and enforce Tableau governance: security, permissions, version control, and change management.
- Implement data quality checks and monitoring for dashboards (row counts, anomalies, thresholds).
- Track and report key metrics on dashboard usage, performance, and user satisfaction.
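As a small illustration of the standardized-naming and CI/CD duties this role describes, a pre-publish naming check could be scripted along these lines. This is a minimal sketch: the `dept-subject-vNN` convention, the function name, and the sample workbook names are all assumptions for illustration, not part of the posting.

```python
import re

# Hypothetical convention: "<dept>-<subject>-v<NN>", e.g. "sales-pipeline-v03".
WORKBOOK_NAME_RE = re.compile(r"^[a-z]+-[a-z0-9]+(?:-[a-z0-9]+)*-v\d{2}$")

def check_workbook_names(names):
    """Return the workbook names that violate the naming convention.

    Intended to run in a CI job before content is published to Tableau
    Server/Online, so the build can fail when any name is off-convention.
    """
    return [n for n in names if not WORKBOOK_NAME_RE.match(n)]

if __name__ == "__main__":
    candidates = ["sales-pipeline-v03", "Quarterly Report", "finance-churn-v12"]
    print(check_workbook_names(candidates))  # only the non-conforming names
```

In a real pipeline this check would run against a content listing fetched via the Tableau REST API before publishing, with violations failing the build.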
Posted 1 week ago
10.0 - 15.0 years
12 - 16 Lacs
Hyderabad, Bengaluru
Work from Office
Notice Period: Early joiners

A Generative AI Developer specializing in Azure-based chatbot development is responsible for designing, building, and deploying AI-powered conversational agents using Microsoft Azure AI services. They leverage natural language processing (NLP), large language models (LLMs), and cloud-based AI tools to create chatbots that enhance user experience and automate interactions.

Key Responsibilities
- Design and develop AI-powered chatbots using Azure OpenAI, Azure Bot Service, and Cognitive Services.
- Implement natural language understanding (NLU) and natural language generation (NLG) for intelligent conversation handling.
- Fine-tune GPT-based models and integrate them with business applications.
- Optimize chatbot performance by training models, improving responses, and handling edge cases.
- Deploy and manage chatbots using Azure Machine Learning, Azure Functions, and Kubernetes.
- Integrate chatbots with databases, APIs, and enterprise applications (e.g., Dynamics 365, Microsoft Teams, or custom apps).
- Ensure chatbot security, compliance, and scalability in Azure cloud environments.
- Monitor chatbot analytics using Azure Monitor, Application Insights, and Power BI.
Posted 1 week ago
10.0 - 15.0 years
11 - 15 Lacs
Gurugram
Work from Office
Position Summary:
We are seeking a Senior BI Platform Engineer with 10+ years of experience and deep expertise in Tableau, Power BI, Alteryx, and MicroStrategy (MSTR). The ideal candidate will serve as a technical lead and administrator for our BI platforms, ensuring reliable performance, advanced user support (L3), and stakeholder engagement. This role also includes implementing and maintaining CI/CD pipelines for BI assets to ensure scalable, automated, and governed deployment processes.

Key Responsibilities:
- Serve as platform administrator for Tableau, Power BI, Alteryx, and MSTR, managing permissions, data sources, server performance, and upgrades.
- Provide Level 3 (L3) support for BI platforms, resolving complex technical issues, performing root cause analysis, and handling platform-level troubleshooting.
- Design, implement, and maintain CI/CD pipelines for BI dashboards, dataflows, and platform configurations to support agile development and deployment.
- Collaborate with cross-functional teams to gather requirements and ensure proper implementation of dashboards and analytics solutions.
- Monitor and optimize BI platform performance, usage, and adoption.
- Work closely with data engineering teams to ensure data quality and availability for reporting needs.
- Create and maintain documentation for governance, support processes, and best practices.
- Train and mentor users and junior team members on BI tools and reporting standards.
- Act as a liaison between business stakeholders and technical teams, ensuring alignment and timely resolution of issues.
- Manage all BI upgrades.
- Manage Power BI gateway, Tableau Bridge, Alteryx Server, and other BI platform capacity optimally.
- Manage and enable new features in each of the BI platforms.
- Manage licenses for each platform: automate assignments, off-board users from licensing, and manage licensing optimally.
- Manage RBAC for all the BI platforms.

Required Qualifications:
- 10+ years of experience in a BI support or engineering role.
- Advanced experience with Tableau, Power BI, Alteryx, and MSTR, including administrative functions, troubleshooting, and user support.
- Proven experience providing L3 support and managing CI/CD pipelines for BI platforms.
- Strong knowledge of BI architecture, data visualization best practices, and data modelling concepts.
- Excellent problem-solving and communication skills, with the ability to interact confidently with senior business leaders.
- Experience with SQL, data warehouses, and cloud platforms (e.g., Azure, Snowflake) is preferred.
- Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications:
- Experience with Tableau Server/Cloud, Power BI Service, and MSTR administration.
- Familiarity with enterprise data governance and access control policies.
- Certifications in Tableau, Power BI, Alteryx, or MSTR are a plus.
Posted 1 week ago
8.0 - 13.0 years
12 - 16 Lacs
Gurugram
Remote
The Data Engineer is responsible for managing and operating Tableau, Tableau Bridge server, Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI. The engineer will work closely with the customer and team to manage and operate the cloud data platform.

Job Description:
- Provides Level 3 operational coverage: troubleshooting incidents/problems, including collecting logs, cross-checking against known issues, and investigating common root causes (for example, failed batches, or infrastructure-related items such as connectivity to source and network issues).
- Knowledge Management: Create/update runbooks as needed; manage entitlements.
- Governance: Watch all configuration changes to batches and infrastructure (cloud platform), mapping them to proper documentation and aligning resources.
- Communication: Lead and act as a POC for the customer from off-site, handling communication, escalation, isolating issues, and coordinating with off-site resources while level-setting expectations across stakeholders.
- Change Management: Align resources for on-demand changes and coordinate with stakeholders as required.
- Request Management: Handle user requests; if a request is not runbook-based, create a new KB article or update the runbook accordingly.
- Incident Management and Problem Management: root cause analysis, and coming up with preventive measures and recommendations, such as enhancing monitoring or systemic changes as needed.

SKILLS
- Good hands-on experience with Tableau, Tableau Bridge server, Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI.
- Ability to read and write SQL and stored procedures.
- Good hands-on experience in configuring, managing, and troubleshooting, along with general analytical and problem-solving skills.
- Excellent written and verbal communication skills; ability to communicate technical information and ideas so others will understand.
- Ability to successfully work and promote inclusiveness in small groups.
JOB COMPLEXITY: This role requires extensive problem-solving skills and the ability to research an issue, determine the root cause, and implement the resolution; research of various sources, such as Databricks/AWS/Tableau documentation, may be required to identify and resolve issues. Must have the ability to prioritize issues and multi-task.

EXPERIENCE/EDUCATION: Requires a Bachelor's degree in Computer Science or another related field, plus 8+ years of hands-on experience in configuring and managing AWS/Tableau and Databricks solutions. Experience with Databricks and Tableau environments is desired.
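The request-management flow this posting describes (serve runbook-based requests directly; flag anything else so a new KB article gets written) can be sketched as a simple lookup. This is purely illustrative: the runbook registry, its contents, and the function name are invented, not part of the posting.

```python
# Minimal sketch of runbook-based request triage. The registry entries
# here are invented examples, not real operational runbooks.
RUNBOOKS = {
    "refresh_extract": "RB-101: restart the Tableau extract refresh schedule",
    "reset_gateway": "RB-202: recycle the bridge/gateway service",
}

def triage_request(request_key):
    """Return the runbook steps if one exists; otherwise flag the request
    so a new KB article can be created and the runbook updated."""
    if request_key in RUNBOOKS:
        return ("runbook", RUNBOOKS[request_key])
    return ("needs_new_kb", request_key)

print(triage_request("refresh_extract"))
print(triage_request("rotate_api_token"))  # not runbook-based: flag it
```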
Posted 1 week ago
10.0 - 15.0 years
12 - 16 Lacs
Gurugram
Remote
Job Summary
The Data Engineer is responsible for managing and operating Tableau, Tableau Bridge server, Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI. The engineer will work closely with the customer and team to manage and operate the cloud data platform.

Job Description
- Provides Level 3 operational coverage: troubleshooting incidents/problems, including collecting logs, cross-checking against known issues, and investigating common root causes (for example, failed batches, or infrastructure-related items such as connectivity to source and network issues).
- Knowledge Management: Create/update runbooks as needed; manage entitlements.
- Governance: Watch all configuration changes to batches and infrastructure (cloud platform), mapping them to proper documentation and aligning resources.
- Communication: Lead and act as a POC for the customer from off-site, handling communication, escalation, isolating issues, and coordinating with off-site resources while level-setting expectations across stakeholders.
- Change Management: Align resources for on-demand changes and coordinate with stakeholders as required.
- Request Management: Handle user requests; if a request is not runbook-based, create a new KB article or update the runbook accordingly.
- Incident Management and Problem Management: root cause analysis, and coming up with preventive measures and recommendations, such as enhancing monitoring or systemic changes as needed.

KNOWLEDGE/SKILLS/ABILITY
- Good hands-on experience with Tableau, Tableau Bridge server, Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI.
- Ability to read and write SQL and stored procedures.
- Good hands-on experience in configuring, managing, and troubleshooting, along with general analytical and problem-solving skills.
- Excellent written and verbal communication skills; ability to communicate technical information and ideas so others will understand.
- Ability to successfully work and promote inclusiveness in small groups.
JOB COMPLEXITY: This role requires extensive problem-solving skills and the ability to research an issue, determine the root cause, and implement the resolution; research of various sources, such as Databricks/AWS/Tableau documentation, may be required to identify and resolve issues. Must have the ability to prioritize issues and multi-task.

EXPERIENCE/EDUCATION: Requires a Bachelor's degree in Computer Science or another related field, plus 10+ years of hands-on experience in configuring and managing AWS/Tableau and Databricks solutions. Experience with Databricks and Tableau environments is desired.
Posted 1 week ago
7.0 - 12.0 years
14 - 18 Lacs
Gurugram, Bengaluru
Work from Office
Summary
The Data Engineer is responsible for managing and operating Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI/Tableau. The engineer will work closely with the customer and team to manage and operate the cloud data platform.

Job Description
- Leads Level 4 operational coverage: resolving pipeline issues, proactive monitoring for sensitive batches, RCA and retrospection of issues, and documenting defects.
- Design, build, test, and deploy fixes to the non-production environment for customer testing; work with the customer to deploy fixes to production upon receiving customer acceptance of the fix.
- Cost/performance optimization and audit/security, including any associated infrastructure changes.
- Troubleshooting incidents/problems: includes collecting logs, cross-checking against known issues, and investigating common root causes (for example, failed batches, or infrastructure-related items such as connectivity to source and network issues).
- Knowledge Management: Create/update runbooks as needed.
- Governance: Watch all configuration changes to batches and infrastructure (cloud platform), mapping them to proper documentation and aligning resources.
- Communication: Lead and act as a POC for the customer from off-site, handling communication, escalation, isolating issues, and coordinating with off-site resources while level-setting expectations across stakeholders.
- Change Management: Align resources for on-demand changes and coordinate with stakeholders as required.
- Request Management: Handle user requests; if a request is not runbook-based, create a new KB article or update the runbook accordingly.
- Incident Management and Problem Management: root cause analysis, and coming up with preventive measures and recommendations, such as enhancing monitoring or systemic changes as needed.

Skills
- Good hands-on experience with Databricks, dbt, SSRS, SSIS, AWS DWS, AWS AppFlow, and Power BI/Tableau.
- Ability to read and write SQL and stored procedures.
- Good hands-on experience in configuring, managing, and troubleshooting, along with general analytical and problem-solving skills.
- Excellent written and verbal communication skills; ability to communicate technical information and ideas so others will understand.
- Ability to successfully work and promote inclusiveness in small groups.

Experience/Education
Requires a Bachelor's degree in Computer Science or another related field, plus 10+ years of hands-on experience in configuring and managing AWS/Tableau and Databricks solutions. Experience with Databricks and Tableau environments is desired.

Job Complexity
This role requires extensive problem-solving skills and the ability to research an issue, determine the root cause, and implement the resolution; research of various sources, such as Databricks/AWS/Tableau documentation, may be required to identify and resolve issues. Must have the ability to prioritize issues and multi-task.

Work Location: Remote (Work From Home)
Shift: Rotational Shifts (24/7)
Posted 1 week ago
2.0 - 5.0 years
25 - 30 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Work from Office
Role involves leading analytics initiatives, developing business intelligence (BI) solutions, and providing actionable insights. The position will focus on improving key performance indicators (KPIs) and running sales and purchase efficiency programs.

Required Candidate Profile
Proven experience in analytics and insight generation. Experience in sales and operations, marketing, category, and logistics analytics is preferred. Proficiency in data analysis tools and techniques such as SQL and Python is an added advantage.
Posted 1 week ago
3.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Ahmedabad, Bengaluru
Work from Office
SUMMARY
Sr. Data Analytics Engineer (Databricks) - Power mission-critical decisions with governed insights

Company: Ajmera Infotech Private Limited (AIPL)
Location: Ahmedabad, Bangalore/Bengaluru, Hyderabad (On-site)
Experience: 5-9 years
Position Type: Full-time, Permanent

Ajmera Infotech builds planet-scale software for NYSE-listed clients, driving decisions that can't afford to fail. Our 120-engineer team specializes in highly regulated domains (HIPAA, FDA, SOC 2) and delivers production-grade systems that turn data into strategic advantage.

Why You'll Love It
- End-to-end impact: build full-stack analytics from lakehouse pipelines to real-time dashboards.
- Fail-safe engineering: TDD, CI/CD, DAX optimization, Unity Catalog, cluster tuning.
- Modern stack: Databricks, PySpark, Delta Lake, Power BI, Airflow.
- Mentorship culture: lead code reviews, share best practices, grow as a domain expert.
- Mission-critical context: help enterprises migrate legacy analytics into cloud-native, governed platforms.
- Compliance-first mindset: work in HIPAA-aligned environments where precision matters.

Requirements

Key Responsibilities
- Build scalable pipelines using SQL, PySpark, and Delta Live Tables on Databricks.
- Orchestrate workflows with Databricks Workflows or Airflow; implement SLA-backed retries and alerting.
- Design dimensional models (star/snowflake) with Unity Catalog and Great Expectations validation.
- Deliver robust Power BI solutions: dashboards, semantic layers, paginated reports, DAX.
- Migrate legacy SSRS reports to Power BI with zero loss of logic or governance.
- Optimize compute and cost through cache tuning, partitioning, and capacity monitoring.
- Document everything, from pipeline logic to RLS rules, in Git-controlled formats.
- Collaborate cross-functionally to convert product analytics needs into resilient BI assets.
- Champion mentorship by reviewing notebooks and dashboards and sharing platform standards.
Must-Have Skills
- 5+ years in analytics engineering, with 3+ in production Databricks/Spark contexts.
- Advanced SQL (incl. windowing); expert PySpark, Delta Lake, Unity Catalog.
- Power BI mastery: DAX optimization, security rules, paginated reports.
- SSRS-to-Power BI migration experience (RDL logic replication).
- Strong Git and CI/CD familiarity, and cloud platform know-how (Azure/AWS).
- Communication skills to bridge technical and business audiences.

Nice-to-Have Skills
- Databricks Data Engineer Associate certification.
- Streaming pipeline experience (Kafka, Structured Streaming).
- dbt, Great Expectations, or similar data quality frameworks.
- BI diversity: experience with Tableau, Looker, or similar platforms.
- Cost governance familiarity (Power BI Premium capacity, Databricks chargeback).

Benefits / What We Offer
- Competitive salary package with performance-based bonuses.
- Comprehensive health insurance for you and your family.
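The must-have "advanced SQL (incl. windowing)" can be illustrated with a running-total window function. The sketch below uses SQLite only so the example is self-contained; the same `SUM(...) OVER (ORDER BY ...)` form should also be valid in Spark/Databricks SQL. The table and figures are invented for illustration.

```python
import sqlite3

# Self-contained window-function demo: a running total over ordered rows.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (day INTEGER, amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [(1, 10), (2, 20), (3, 5)])

rows = con.execute("""
    SELECT day,
           SUM(amount) OVER (ORDER BY day) AS running_total
    FROM sales
    ORDER BY day
""").fetchall()
print(rows)  # [(1, 10), (2, 30), (3, 35)]
```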
Posted 1 week ago
4.0 - 9.0 years
3 - 7 Lacs
Chennai
Work from Office
About The Role
- Interpret business requirements and translate them into technical specifications.
- Design, develop, and maintain Qlik Sense dashboards, reports, and data visualizations.
- Perform data extraction, transformation, and loading (ETL) from various sources.
- Create and manage QVD files and implement data modeling best practices.
- Ensure data accuracy and consistency through validation and testing.
- Optimize Qlik Sense applications for performance and scalability.
- Collaborate with business analysts, data engineers, and stakeholders.
- Provide technical support and troubleshoot issues in Qlik Sense applications.
- Document development processes, data models, and user guides.

Requirements:
- 4+ years of experience in Qlik Sense development and dashboarding.
- Strong knowledge of data modeling, set analysis, and scripting in Qlik.
- Proficiency in SQL and experience with RDBMS such as MS SQL Server or Oracle.
- Familiarity with Qlik Sense integration with web technologies and APIs.
- Understanding of BI concepts and data warehousing principles.
- Excellent problem-solving and communication skills.
- Qlik Sense certification is a plus.

Grade-Specific Focus: Industrial Operations Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.
Posted 1 week ago
0.0 - 1.0 years
2 - 6 Lacs
Noida
Work from Office
Data Analyst

Job Description:
- Interpret data, analyze results using statistical techniques, and provide ongoing reports.
- Develop and maintain dashboards and visualizations using tools like Power BI, Tableau, or similar.
- Identify, analyze, and interpret trends or patterns in complex data sets.
- Work closely with management to prioritize business and information needs.
- Acquire data from primary or secondary sources and maintain databases/data systems.
- Filter and clean data, and review reports and performance indicators to locate and correct problems.
- Provide actionable insights and recommendations based on data analysis.

Experience Range: 0-1 years
Educational Qualifications: Any graduation
Skills Required: Data Analysis
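The "filter and clean data" duty in this posting often amounts to dropping malformed rows and normalizing types before analysis. A stdlib-only sketch (the sample rows and column names are invented for illustration):

```python
import csv
import io

# Invented sample: one row with a non-numeric amount, one blank row.
raw = """date,amount
2024-01-01,100
2024-01-02,n/a
,
2024-01-03,250
"""

clean = []
for row in csv.DictReader(io.StringIO(raw)):
    try:
        amount = float(row["amount"])
    except (TypeError, ValueError):
        continue  # drop rows where amount is missing or non-numeric
    if not row["date"]:
        continue  # drop rows with no date
    clean.append({"date": row["date"], "amount": amount})

print(clean)  # only the two well-formed rows survive
```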
Posted 1 week ago
2.0 - 7.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Company: Oliver Wyman

Description: The Oliver Wyman DNA team is now looking to hire a Senior Data Analytics Specialist - we are looking for individuals with strong experience in Power BI and Tableau.

OW DNA Overview
Oliver Wyman DNA is a distinguished center of excellence for business analytics and data analytics within Oliver Wyman. This group leverages data and information to provide business insights to Oliver Wyman consulting teams, driving positive outcomes and tangible impact for Oliver Wyman's clients. The group combines cutting-edge data science expertise with core consulting expertise to augment engagement teams with analytical firepower to deliver outstanding results.

Key Responsibilities:
- Design and create interactive dashboards to effectively communicate data insights to stakeholders.
- Gather and extract data from various databases, Excel, and other relevant sources to support BI reporting and analysis.
- Engage with stakeholders to understand their reporting needs, define key performance indicators (KPIs), and ensure alignment with business objectives.
- Coordinate with various stakeholders to understand the data structure; collect, clean, transform, validate, and visualize datasets, ensuring the integrity of our data flows.
- Develop data models to identify relationships and trends within the data, enhancing the understanding of business performance.
- Perform analysis in SQL and Python to derive actionable insights and support data-driven decision-making.
- Maintain and optimize existing BI reports and dashboards, ensuring accuracy and relevance of the information presented.
- Manage the design, development, and delivery of self-service data infrastructure.
- Identify and manage dependencies and risks across various analytics datasets and models.

Education: Bachelor's degree in Science, Finance, Mathematics, Economics, or equivalent.
MS or certificate courses in analytics preferred.

Experience:
- Overall experience of 2+ years in business intelligence tools such as Power BI and Tableau.
- Excellent analytical and problem-solving skills with a proven ability to deliver actionable insights, and proficiency in data modelling and visualization.
- Proven experience with data automation, creating dashboards in Power BI and Tableau, and analytical models.
- Strong written and verbal communication skills with a demonstrated ability to interact effectively with all levels of stakeholders (both internal and external).
- Fast learner with the ability to pick up a new tool/platform quickly.
- Advanced skills in MS Office, along with familiarity with Gen AI and other analytical tools, preferred.
- Experience in data analytics tools such as SQL and Python is preferred.
- Willingness to travel as required to meet client needs.

Oliver Wyman, a business of Marsh McLennan (NYSE: MMC), is a management consulting firm combining deep industry knowledge with specialized expertise to help clients optimize their business, improve operations, and accelerate performance. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit oliverwyman.com, or follow on LinkedIn and X. Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment.
We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law. Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.
Posted 1 week ago
4.0 - 9.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Company: Oliver Wyman

Description: The Oliver Wyman DNA team is now looking to hire a Lead Data Analytics Specialist based in our Hyderabad office.

OW DNA Overview
Oliver Wyman DNA is a distinguished center of excellence for business analytics and data analytics within Oliver Wyman. This group leverages data and information to provide business insights to Oliver Wyman consulting teams, driving positive outcomes and tangible impact for Oliver Wyman's clients. The group combines cutting-edge data science expertise with core consulting expertise to augment engagement teams with analytical firepower to deliver outstanding results.

Key Responsibilities:
- Serve as the primary analytical resource for client teams, building trusted relationships with key stakeholders.
- Collaborate with senior team members to identify risks and growth opportunities, utilizing analytical insights for client benefit.
- Extract and manipulate data from various sources using SQL and Python for business intelligence reporting.
- Engage with stakeholders to understand reporting needs and align KPIs with strategic objectives.
- Coordinate with stakeholders to collect, clean, and validate datasets, ensuring data integrity through coding techniques.
- Develop and implement data models in SQL and Python to uncover trends and enhance business performance insights.
- Conduct in-depth analyses with SQL and Python to provide actionable insights for client decision-making.
- Maintain and optimize BI reports through coding improvements to ensure accuracy and relevance.
- Stay updated on market trends and best practices in data analytics to enhance service offerings and quality standards.
- Mentor junior team members in analytical coding practices, fostering a collaborative learning environment.

Education: Bachelor's degree in Science, Finance, Mathematics, Economics, or equivalent. MS or certificate courses in analytics preferred.

Experience: 4+ years of experience in data analytics, focused on SQL and Python coding.
- Proficient in data analytics tools such as SQL and Python, with a solid track record of managing complex analytical projects.
- Strong analytical and problem-solving skills, delivering actionable insights through data modelling and analysis.
- Excellent client-facing abilities, building relationships with senior stakeholders and influencing decision-makers.
- Strong written and verbal communication skills, effectively engaging with stakeholders at all levels.
- Adaptive learner with the ability to rapidly adopt new tools and platforms, including knowledge of Gen AI.
- Entrepreneurial mindset with a collaborative approach to problem-solving and a commitment to mentoring and developing talent.
- Familiarity with creating visualizations in Power BI and Tableau is a plus.

Oliver Wyman, a business of Marsh McLennan (NYSE: MMC), is a management consulting firm combining deep industry knowledge with specialized expertise to help clients optimize their business, improve operations, and accelerate performance. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit oliverwyman.com, or follow on LinkedIn and X. Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law.
Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.
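The "conduct in-depth analyses with SQL and Python to uncover trends" duty from this posting can be as simple as fitting a least-squares slope to a KPI series. A stdlib-only sketch (the monthly figures are invented for illustration):

```python
# Least-squares slope of a KPI trend, stdlib only. Data is invented.
months = [1, 2, 3, 4, 5, 6]
kpi = [100, 104, 109, 113, 118, 122]

n = len(months)
mx = sum(months) / n
my = sum(kpi) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(months, kpi))
         / sum((x - mx) ** 2 for x in months))
print(round(slope, 2))  # average KPI growth per month
```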
Posted 1 week ago
3.0 - 8.0 years
5 - 10 Lacs
Noida
Work from Office
- 4G RAN OSS KPI optimization
- 5G RAN OSS KPI optimization
- Post-launch optimization experience

Required Candidate Profile
- Accessibility and retainability KPI analysis and improvement experience.
- Load balancing and throughput improvement experience.
- Ericsson vendor experience is an added advantage.
- Tools: NetAn, BI, etc.
Posted 1 week ago
4.0 - 8.0 years
5 - 7 Lacs
Coimbatore
Work from Office
MIS Executive (US Healthcare Process) Position: MIS Executive (US Healthcare Process). Must have RCM business knowledge and good knowledge of MS Excel. Experience: 1 to 3 years. Mode: Work from Office. Notice Period: Immediate. Location: Coimbatore. Role & responsibilities: Candidate should have RCM business knowledge along with an MIS skillset; Excel knowledge is a must. Should be able to front-end discussions with internal teams. Provide analytical and strategic support. Good analytical skills for data analysis and generation of reports. Tracks all KPIs and SLAs set by the clients with strict adherence to quality parameters. Reconciliation of data and analysis. Ensure timeliness/accuracy of daily/monthly/quarterly reports. Provide data for all reviews pertaining to operations. Preferred candidate profile: Strong written and verbal communication skills. Strong domain knowledge. Ability to build and maintain strong working relationships. Self-driven and assertive.
Posted 1 week ago
2.0 - 4.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Job Summary: The Oracle ERP VBCS Developer is responsible for making development modifications to the firm's Oracle Cloud ERP system. In this role, the ERP VBCS Developer is charged with analyzing internal user needs to accurately design, construct, and maintain the Oracle Cloud ERP system to meet the individual needs of the user. Job Duties: Administers the day-to-day functions of the Oracle Cloud ERP system. Designs new and modifies existing modules/components based on requirements. Provides Level IV support for Oracle Cloud ERP system issues and questions. Provides Level IV support for specific applications, as needed. Devises strategic solutions to resolve issues while keeping the system operational. Partners with other members of the Applications Services team, as well as other resources within National IT, Human Resources and/or Finance, to resolve issues. Consults with other Application Services team members to analyze diverse issues. Participates in weekly support meetings with the hosting provider. Participates in design meetings as appropriate. Serves as the secondary support person for after-hours support, as needed. Other duties as required. Supervisory Responsibilities: N/A. Qualifications, Knowledge, Skills and Abilities. Education: Bachelor's or master's degree in Computer Science, Information Systems or an equivalent field required. Experience: Three (3) or more years of experience with development/enhancements or support of Oracle Cloud ERP or similar ERP modules required. One (1) or more years of experience working with Visual Builder Cloud Service (VBCS). A minimum of one (1) year of experience with support of cloud-based (SaaS) ERP applications required. Experience in developing custom web applications using Visual Builder Cloud Service (VBCS) and integrating web services is required. Prior experience in Redwood customizations preferred. Experience in JavaScript, HTML, CSS preferred. Prior experience with VB Studio, Git, and CI/CD processes preferred.
Prior experience working with Oracle Integration Cloud (OIC) preferred. Prior experience working with BI Publisher preferred. Prior experience working with databases using SQL and PL/SQL preferred. Intermediate-level understanding of Oracle Cloud ERP FSCM modules preferred. License/Certifications: Oracle Redwood certification preferred. Software: Experience with the following products required: Oracle Fusion FSCM modules; Visual Builder Cloud Service (VBCS); REST, SOAP. Experience with the following products preferred: DevOps; JavaScript, React, Node.js. Other Knowledge, Skills & Abilities: Strong oral and written communication skills. Excellent interpersonal and customer relationship skills. Capacity to work in a deadline-driven environment while handling multiple complex projects/tasks simultaneously with a focus on details. Capable of successfully multi-tasking while working independently or within a group environment. Ability to rely on extensive experience and judgment to plan and accomplish goals. Capable of working well under pressure while dealing with unexpected problems in a professional manner. Capacity to communicate and interact with all levels of employees and management. Ability to interact and build relationships and consensus among people. Advanced knowledge of database optimization efforts such as hints, statistics, and other related experience preferred. Capacity to consistently produce clean code and adhere to appropriate documentation standards.
Posted 1 week ago
4.0 - 8.0 years
12 - 20 Lacs
Gurugram
Work from Office
IA-Consultant-Data Engineer SSIS ADF: Elevate Your Impact Through Innovation and Learning. Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets like Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, and meritocracy-based culture that prioritizes continuous learning and skill development, work-life balance, and equal opportunity for all. About Insights & Advisory: We are a global professional services provider offering research, analytics, and business process support services enabled by our innovative 'mind + machine' approach. We work with over 300 Fortune 1000 companies. Our TMT team caters to 4 of the top 5 global Telecom & Networking Infrastructure companies as well as the biggest public cloud providers. About this role: We are looking for a skilled and motivated Data Engineer/Analyst with 4-5 years of experience in data engineering, particularly in migrating on-premises systems to cloud-based environments. This role requires strong expertise in SQL Server, SSIS, Azure Data Factory (ADF), Power BI and Microsoft Fabric. The ideal candidate will have hands-on experience designing, developing, and deploying scalable data solutions in Azure, ensuring seamless data integration and high performance. What you will be doing at Evalueserve: Lead and execute the migration of on-premises SQL Server databases to Azure SQL.
Migrate and modernize legacy SSIS packages from the file system to Azure Data Factory pipelines. Manage end-to-end Microsoft Fabric migration projects, including planning, execution, and post-migration validation. Design and develop stored procedures, SSIS packages, and ADF pipelines to support business data needs. Collaborate with cross-functional teams to understand requirements and deliver scalable, production-ready data solutions. Ensure data quality, workflow optimization, and performance tuning across all stages of data processing. What we are looking for: 4-5 years of hands-on experience in data engineering. Proven expertise in SQL Server (on-premises and Azure SQL). Strong experience in SSIS package development and migration. Proficiency in Azure Data Factory (ADF) and cloud-based data integration. Experience with Microsoft Fabric migration and implementation. Proficiency with Power BI and semantic data models, measures, and views. Solid knowledge of T-SQL, stored procedures, and query optimization. Preferred Qualifications: Relevant Microsoft certifications (e.g., Azure Data Engineer Associate) are a plus. Experience with DevOps practices for data pipelines. Strong communication and collaboration skills. Follow us on https://www.linkedin.com/company/evalueserve/. Click here to learn more about what our leaders say about our achievements: an AI-powered supply chain optimization solution built on Google Cloud; how Evalueserve is now leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and to accelerate AI capabilities. Know more about how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024! Want to learn more about our culture and what it's like to work with us? Write to us at: careers@evalueserve.com. Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform.
However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances. Please note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the background verification process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
Posted 1 week ago
12.0 - 15.0 years
55 - 60 Lacs
Ahmedabad, Chennai, Bengaluru
Work from Office
Dear Candidate, We are hiring a BI Developer to transform raw data into meaningful insights through dashboards and reports that guide business decisions. Key Responsibilities: Develop and maintain BI dashboards and visualizations. Build data models and define KPIs for reporting. Extract, clean, and transform data from multiple sources. Optimize data queries for speed and accuracy. Work with stakeholders to define business metrics and reporting needs. Required Skills & Qualifications: Expertise in BI tools (Power BI, Tableau, Qlik). Proficiency in SQL and data modeling techniques. Experience with ETL development and data warehousing. Understanding of business processes (finance, sales, operations). Strong analytical and communication skills. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Srinivasa Reddy Kandi, Delivery Manager, Integra Technologies
Posted 1 week ago
3.0 - 6.0 years
4 - 8 Lacs
Pune
Work from Office
At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our client's challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose. Your Role: Should have 5+ years of experience in Informatica PowerCenter. Strong knowledge of ETL concepts, Data Warehouse architecture and best practices. Should be well versed with different file formats for parsing files, such as flat file, XML, and JSON, and various source systems for integration. Must have hands-on development experience as an individual contributor in at least 2 project lifecycles (Data Warehouse/Data Mart/Data Migration) in a client-facing environment. Your Profile: Design, develop, unit test, deploy, and support data applications and infrastructure utilizing various technologies to process large volumes of data. Strong technical and functional understanding of RDBMS and DWH-BI concepts. Should have implemented error handling, exception handling and audit balance control frameworks. Good knowledge of either Unix/Shell or Python scripting and a scheduling tool. Strong SQL and PL/SQL skills, data analytics and performance tuning capabilities. Good to have knowledge of cloud platforms and technologies. What you'll love about working here: We recognize the significance of flexible work arrangements to provide support, be it remote work or flexible work hours; you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth; our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Unix and SQL. Skills (competencies)
Posted 1 week ago
5.0 - 8.0 years
7 - 11 Lacs
Bengaluru
Work from Office
About The Role: Seeking a skilled and detail-oriented OAS/OBIEE Consultant to join our data and analytics team. The ideal candidate will be responsible for designing, developing, and maintaining business intelligence (BI) and dashboarding solutions to support smelter operations and decision-making processes. You will work closely with cross-functional teams to transform raw data into actionable insights using modern BI tools and ETL processes. Key Responsibilities: Develop and maintain interactive dashboards and reports using Microsoft Power BI and Oracle Analytics. Design and implement ETL processes using Oracle Data Integrator and other tools to ensure efficient data integration and transformation. Collaborate with stakeholders to gather business requirements and translate them into technical specifications. Perform data analysis and validation to ensure data accuracy and consistency across systems. Optimize queries and data models for performance and scalability. Maintain and support Oracle Database and other RDBMS platforms used in analytics workflows. Ensure data governance, quality, and security standards are met. Provide technical documentation and user training as needed. Required Skills and Qualifications: Proven experience in BI solutions, data analysis, and dashboard development. Strong hands-on experience with Microsoft Power BI, Oracle Analytics, and Oracle Data Integrator. Proficiency in Oracle Database, SQL, and relational database concepts. Solid understanding of ETL processes, data management, and data processing. Familiarity with business intelligence and business analytics best practices. Strong problem-solving skills and attention to detail. Excellent communication and collaboration abilities. Preferred Qualifications: Experience in the smelting or manufacturing industry is a plus. Knowledge of scripting languages (e.g., Python, Shell) for automation. Certification in Power BI, Oracle Analytics, or related technologies.
Posted 1 week ago
5.0 - 8.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Primary Skill: Provide technical support and maintenance for the Temenos Data Hub & Temenos Analytics platforms. Monitor system performance and troubleshoot issues to ensure high availability and reliability. Collaborate with cross-functional teams to resolve technical problems and implement solutions. Perform regular system updates, patches, and upgrades to maintain system integrity and security. Develop and maintain documentation for system configurations, processes, and procedures. Assist in the integration of the TDH platform with other banking systems and third-party applications. Ensure compliance with banking industry standards and regulatory reporting requirements. Provide training and support to end-users and other technical staff. Stay up-to-date with the latest developments and best practices in the Temenos BI platform. Secondary Skill: 5 to 8 years of experience as a support engineer for the Temenos Data Hub & Temenos Analytics platforms. Strong understanding of DES, ODS, SDS & ADS data flows. Experience in ADS data mapping activity against clients' report requirements. Experience in Azure Kubernetes Service, SQL Server, and the Azure cloud environment. Strong understanding of banking operations and reporting from Analytics & TDH Designer & Scheduler. Proficiency in troubleshooting and resolving technical issues related to the TDH platform. Experience with system monitoring tools and techniques. Knowledge of integration methods and technologies, including APIs and middleware. Strong knowledge of database management and SQL. Strong problem-solving skills and the ability to work independently and as part of a team. Excellent communication skills and the ability to collaborate effectively with team members and stakeholders. Skills (competencies): Abstract Thinking, Active Listening, Agile (Software Development Framework), Analytical Thinking, Backlog Grooming, Business Architecture Modeling, Business Process Modeling (e.g. BPMN), Change Management, Coaching, Collaboration, Commercial Acumen, Conceptual Data Modeling, Conflict Management, Confluence, Critical Thinking, CxO Conversations, Data Analysis, Data Requirements Management, Decision-Making, Emotional Intelligence, Enterprise Architecture Modelling, Facilitation, Functional IT Architecture Modelling, Giving Feedback, Google Cloud Platform (GCP) (Cloud Platform), Influencing, Innovation, Jira, Mediation, Mentoring, Microsoft Office, Motivation, Negotiation, Networking, Power BI, Presentation Skills, Prioritization, Problem Solving, Project Governance, Project Management, Project Planning, Qlik, Relationship-Building, Requirements Gathering, Risk Management, Scope Management, SQL, Stakeholder Management, Story Mapping, Storytelling, Strategic Management, Strategic Thinking, SWOT Analysis, Systems Requirement Analysis (or Management), Tableau, Trusted Advisor, UI-Design / Wireframing, UML, User Journey, User Research, Verbal Communication, Written Communication
Posted 1 week ago
6.0 - 10.0 years
16 - 25 Lacs
Surat
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications
Posted 1 week ago
6.0 - 10.0 years
16 - 25 Lacs
Varanasi
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications
Posted 1 week ago
6.0 - 10.0 years
16 - 25 Lacs
Visakhapatnam
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications.
Posted 1 week ago
6.0 - 10.0 years
16 - 25 Lacs
Lucknow
Work from Office
Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications
Posted 1 week ago