3.0 - 7.0 years
0 Lacs
karnataka
On-site
You will be part of an inclusive, adaptable, and forward-thinking organization at NTT DATA as a SQL Developer in Bangalore, Karnataka (IN-KA), India (IN). Your responsibilities will include developing complex queries with strong SQL knowledge, utilizing ETL tools for data extraction, and performing data cleansing, validation, and pre-processing for structured and unstructured data. You will be involved in creating insurance reports, visualizations, dashboards, analysis, and analytics, particularly focusing on Life Insurance. Proficiency in tools such as SQL, EXL, R, and Python will be essential for this role.

NTT DATA is a $30 billion global innovator in business and technology services, serving 75% of the Fortune Global 100. As a Global Top Employer, NTT DATA has diverse experts in over 50 countries and a partner ecosystem including established and start-up companies. The services provided by NTT DATA encompass business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a major provider of digital and AI infrastructure worldwide and is part of the NTT Group, which invests over $3.6 billion annually in R&D to support organizations and society in confidently transitioning into the digital future. Visit NTT DATA at us.nttdata.com for more information.
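The data cleansing and validation work this listing describes can be illustrated with a small sketch. Everything below is invented for illustration, since the listing provides no schema: the field names (policy_id, holder, premium) and the rules (trim whitespace, require a holder, drop exact duplicates) are assumptions, not NTT DATA's actual process.

```python
import csv
import io

# Hypothetical sample of raw policy records; field names are illustrative only.
raw = io.StringIO(
    "policy_id,holder,premium\n"
    "P001, Asha Rao ,1200\n"
    "P002,,950\n"
    "P001, Asha Rao ,1200\n"  # exact duplicate of the first row
)

def cleanse(rows):
    """Trim whitespace, drop rows missing a holder, and de-duplicate."""
    seen, cleaned = set(), []
    for row in rows:
        row = {k: v.strip() for k, v in row.items()}
        if not row["holder"]:
            continue  # validation: holder name is required
        key = tuple(row.values())
        if key in seen:
            continue  # de-duplication on the full cleaned record
        seen.add(key)
        row["premium"] = float(row["premium"])  # type coercion
        cleaned.append(row)
    return cleaned

records = cleanse(csv.DictReader(raw))
print(len(records))          # only the one valid, unique record survives
print(records[0]["holder"])  # "Asha Rao"
```

In a real pipeline the same trim/require/de-duplicate steps would typically run inside the ETL tool or in SQL rather than in application code.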
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
The Data & Analytics Team is looking for a Data Engineer with a versatile skill set encompassing data integration and application development. As a Data Engineer, your role will be pivotal in the design, engineering, governance, and enhancement of our comprehensive Data Platform, which provides self-service access to customers, partners, and employees. You will be expected to showcase proficiency in various areas including data & metadata management, data integration, data warehousing, data quality, machine learning, and fundamental engineering principles.

With a minimum of 5 years of experience in system/data integration, software development, or implementation of enterprise and/or cloud software, you will play a key role in leading system/data integration efforts. Your responsibilities will include designing and implementing data warehousing solutions and associated pipelines, performing extensive data wrangling, authoring complex queries in SQL and NoSQL environments, and developing and integrating applications using Python and Web APIs (RESTful and SOAP). You will be required to provide operational support for the data platform and applications, including incident management. Additionally, you should excel in creating comprehensive Business Requirement, Functional, and Technical documentation; developing Unit & Functional Test Cases, Test Scripts, and Run Books; and effectively managing incidents using systems like Jira and ServiceNow. Collaboration within cross-functional teams and adherence to Agile software development methodology will be essential for success in this role.

At GlobalLogic, we prioritize a culture of caring where people come first. You will experience an inclusive culture of acceptance and belonging, enabling you to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development opportunities, interesting and impactful work, balance, flexibility, and a high-trust environment are some of the key benefits of being a part of GlobalLogic, a Hitachi Group Company. Since 2000, GlobalLogic has been a trusted digital engineering partner, collaborating with clients to create innovative digital products and experiences that redefine industries and transform businesses.
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
As a Senior Technical Support Specialist at PTC, you will play a crucial role as a technical advisor and escalation point within the Servigistics Support organization. Leveraging your extensive industry experience, you will drive strategic customer success, provide guidance to junior team members, and lead efforts to troubleshoot complex issues. Collaborating with various teams, including engineering, product management, and customer teams, you will ensure seamless technical support delivery.

Your key responsibilities will include:
- Acting as the primary technical contact for high-priority and complex customer escalations.
- Leading the resolution of critical issues related to product functionality, performance, and deployment.
- Partnering with global cross-functional teams to facilitate timely resolution of customer challenges.
- Proactively identifying and implementing improvements in support processes and product usability.
- Contributing to and reviewing knowledge articles aligned with KCS principles to enhance customer self-service.
- Working closely with product and engineering teams to influence the product roadmap based on customer feedback.
- Mentoring and guiding junior technical support engineers, offering coaching and sharing best practices.
- Representing support in customer meetings, escalations, and business reviews.
- Ensuring high SLA compliance for enterprise customers with complex environments.
- Being available for 24x7 rotational shifts and supporting weekend shifts as needed to meet global support requirements.

To excel in this role, you should possess the following skills and competencies:
- Proficiency in diagnosing and resolving enterprise-grade application issues across web, application, and database layers.
- Expertise in SQL (Oracle and SQL Server) and the ability to write and optimize complex queries.
- Hands-on experience with ETL tools and resolving batch job failures.
- Sound knowledge of open-source web technologies such as Apache Tomcat and Apache Web Server.
- Experience in performance tuning, server configuration, log analysis, and application scalability.
- Understanding of Java-based enterprise applications and their lifecycle.
- Familiarity with enterprise IT environments, including networks, security protocols, and integrations.
- Demonstrated ability to work independently under pressure while managing multiple complex issues.

Preferred qualifications for this role include experience with UNIX/Linux environments, cloud platforms like AWS, exposure to machine learning concepts, and a degree in Computer Science, Engineering, or a related field. Additionally, excellent communication skills and the ability to interact confidently with senior stakeholders are essential.

Joining PTC offers you the opportunity to work with innovative products, collaborate with talented global teams, and be part of a culture where your voice matters. You can enjoy extensive benefits, including best-in-class insurance, stock purchase plans, generous leave policies, flexible work hours, and numerous career growth opportunities. At PTC, we value problem-solving through innovation and are committed to creating a transformative work environment. If you are passionate about driving change and innovation, we invite you to explore your next career move with us.
Posted 2 months ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
The role of Data Lead at LERA Technologies involves owning the data strategy, architecture, and engineering roadmap for key client engagements. As a Data Lead, you will lead the design and development of scalable, secure, and high-performance data pipelines, marts, and warehouses. Additionally, you will mentor a team of data engineers and collaborate with BI/reporting teams and solution architects. Your responsibilities will include overseeing data ingestion, transformation, consolidation, and validation across cloud and hybrid environments. It is essential to champion best practices for data quality, data lineage, and metadata management. You will also be expected to evaluate emerging tools, technologies, and frameworks to enhance platform capabilities and engage with business and technical stakeholders to translate analytics needs into scalable data solutions. Monitoring performance and optimizing storage and processing layers for efficiency and scalability are key aspects of this role.

The ideal candidate for this position should have at least 7 years of experience in Data Engineering, including proficiency in SQL/PLSQL/TSQL, ETL development, and data pipeline architecture. A strong command of ETL tools such as SSIS or equivalent and of Data Warehousing concepts is required. Expertise in data modeling, architecture, and integration frameworks is essential, along with experience leading data teams and managing end-to-end data delivery across projects. Hands-on knowledge of BI tools like Power BI, Tableau, SAP BO, or OBIEE and their backend integration is a must. Proficiency in big data technologies and cloud platforms such as Azure, AWS, or GCP is also necessary. Programming experience in Python, Java, or equivalent languages, as well as proven experience in performance tuning and optimization of large datasets, are important qualifications. A strong understanding of data governance, data security, and compliance best practices is required, along with excellent communication, stakeholder management, and team mentoring abilities.

Desirable skills for this role include leadership experience in building and managing high-performing data teams; exposure to data mesh, data lakehouse architectures, or modern data platforms; experience defining and enforcing data quality and lifecycle management practices; and familiarity with CI/CD for data pipelines and infrastructure-as-code.

At LERA Technologies, you will have the opportunity to embrace innovation, creativity, and experimentation while significantly impacting our clients' success across various industries. You will thrive in a workplace that values diversity and inclusive excellence, benefit from extensive opportunities for career advancement, and lead cutting-edge projects with an agile and visionary team. If you are ready to lead data-driven transformation and shape the future of enterprise data, apply now to join LERA Technologies as a Data Lead.
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
The successful candidate will play a key role in designing and implementing data governance solutions using Microsoft Purview. You will work closely with cross-functional teams to establish best practices for data management, ensure compliance, and enhance data visibility across the organization.

Responsibilities:
- Lead the architecture and design of data governance solutions using Microsoft Purview.
- Collaborate with data engineering, data science, and business teams to define data governance policies and standards.
- Implement and manage data classification, lineage, and cataloging processes in Microsoft Purview.
- Develop strategies for data quality, data security, and compliance with regulations such as GDPR and CCPA.
- Conduct training sessions and workshops to educate teams on data governance best practices and the use of Microsoft Purview tools.
- Monitor and optimize the performance of data governance solutions, ensuring they meet the organization's needs.
- Provide technical leadership and mentorship to junior team members.

Requirements:
- Extensive experience with Microsoft Azure and Microsoft Purview.
- Strong knowledge of data governance principles, frameworks, and best practices.
- Proficiency in SQL and data modeling concepts.
- Experience with data visualization tools such as Power BI or Tableau.
- Familiarity with data privacy regulations and compliance requirements.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills to work effectively with stakeholders at all levels.
- Knowledge of cloud architecture and data architecture principles.
- Experience with ETL tools and data transformation processes is a plus.
- Ability to work in a fast-paced, collaborative environment.

If you are a passionate technical architect with a deep understanding of Microsoft Purview and a commitment to driving data governance excellence, we invite you to apply and join our dynamic team at Tavant Technologies.
Posted 2 months ago
2.0 - 15.0 years
0 Lacs
noida, uttar pradesh
On-site
You are a highly skilled and experienced professional tasked with leading and supporting data warehousing and data center architecture initiatives. Your expertise in Data Warehousing, Data Lakes, Data Integration, and Data Governance, along with hands-on experience in ETL tools and cloud platforms such as AWS, Azure, GCP, and Snowflake, will be crucial for this role. You are expected to have strong presales experience, technical leadership capabilities, and the ability to manage complex enterprise deals across various geographies.

Your main responsibilities will include architecting and designing scalable Data Warehousing and Data Lake solutions, leading presales engagements, creating and presenting proposals and solution designs to clients, collaborating with cross-functional teams, estimating efforts and resources for customer requirements, driving Managed Services opportunities and enterprise deal closures, engaging with clients globally, ensuring alignment of solutions with business goals and technical requirements, and maintaining high standards of documentation and presentation for client-facing materials.

To excel in this role, you must possess a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Certifications in AWS, Azure, GCP, or Snowflake are advantageous. You should have experience working in consulting or system integrator environments, a strong understanding of Data Warehousing, Data Lakes, Data Integration, and Data Governance, hands-on experience with ETL tools, exposure to cloud environments, a minimum of 2 years of presales experience, experience in enterprise-level deals and Managed Services, the ability to handle multi-geo engagements, excellent presentation and communication skills, and a solid grasp of effort estimation techniques for customer requirements.
Posted 2 months ago
9.0 - 12.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Primarily looking for a candidate with strong expertise in data-related skills, including:
- SQL & Database Management: Deep knowledge of relational databases (PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake.
- ETL/ELT Tools: Experience with SnapLogic, StreamSets, or DBT for building and maintaining data pipelines; extensive experience with ETL tools and data pipelines.
- Data Modeling & Optimization: Strong understanding of data modeling, OLAP systems, query optimization, and performance tuning.
- Cloud & Security: Familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE).
- Data Warehousing: Experience managing large datasets, data marts, and optimizing databases for performance.
- Agile & CI/CD: Knowledge of Agile methodologies and CI/CD automation tools.

Important: The candidate should have a strong data engineering background with hands-on experience in handling large volumes of data, data pipelines, and cloud-based data systems.

Responsibilities:
- Build the data pipeline for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies.
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data needs.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Quickly analyze existing SQL code and make improvements to enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery for greater scalability, etc.
- Unit test databases and perform bug fixes.
- Develop best practices for database design and development activities.
- Take on technical leadership responsibilities for database projects across various scrum teams.
- Manage exploratory data analysis to support dashboard development (desirable).

Required Skills:
- Strong experience in SQL with expertise in relational databases (PostgreSQL preferable, cloud-hosted in AWS/Azure/GCP) or any cloud-based data warehouse (like Snowflake or Azure Synapse).
- Competence in data preparation and/or ETL/ELT tools like SnapLogic, StreamSets, DBT, etc. (preferably strong working experience in one or more) to build and maintain complex data pipelines and flows handling large volumes of data.
- Understanding of data modelling techniques and working knowledge of OLAP systems.
- Deep knowledge of databases, data marts, and data warehouse enterprise systems, and handling of large datasets.
- In-depth knowledge of ingestion techniques, data cleaning, de-duplication, etc.
- Ability to fine-tune report-generating queries.
- Solid understanding of normalization and denormalization of data, database exception handling, profiling queries, performance counters, debugging, and database and query optimization techniques.
- Understanding of index design and performance-tuning techniques.
- Familiarity with SQL security techniques such as data encryption at the column level, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
- Experience in understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting (desirable).
- Adherence to standards for all databases, e.g., data models, data architecture, and naming conventions.
- Exposure to source control like Git, Azure DevOps.
- Understanding of Agile methodologies (Scrum, Kanban).
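The index-design and query-tuning skills asked for above can be shown in miniature with SQLite, which ships with Python. This is a generic sketch, not this employer's stack: the orders table and its columns are invented, and the same idea carries over to PostgreSQL or Snowflake with their own EXPLAIN facilities.

```python
import sqlite3

# Invented schema and data, purely to demonstrate the technique.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, total REAL)")
con.executemany(
    "INSERT INTO orders (region, total) VALUES (?, ?)",
    [("south" if i % 2 else "north", i * 1.5) for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the plan description in column 3.
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM orders WHERE region = 'north'"
before = plan(query)  # full table scan before the index exists
con.execute("CREATE INDEX idx_orders_region ON orders (region)")
after = plan(query)   # the planner now uses idx_orders_region
print(before)
print(after)
```

Inspecting the plan before and after adding an index is the core loop of the "analyze existing SQL code and make improvements" responsibility: the filter predicate tells you which column to index, and the plan confirms the optimizer actually picked it up.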
Posted 2 months ago
3.0 - 8.0 years
9 - 14 Lacs
Noida
Remote
Role: Data Modeler Lead
Location: Remote
Experience: 10+ years
Healthcare experience is mandatory.

Position Overview:
We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills with the ability to work with ambiguous requirements
- Excellent communication skills with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
nagpur, maharashtra
On-site
The Data & Analytics Team at GlobalLogic is looking for a skilled Data Engineer with expertise in data integration and application development. In this role, you will play a crucial part in designing, engineering, governing, and enhancing the entire Data Platform to provide self-service access to customers, partners, and employees. Your responsibilities will include demonstrating proficiency in data & metadata management, data integration, data warehousing, data quality, machine learning, and core engineering principles.

Requirements:
- Minimum 5 years of experience in system/data integration, development, or implementation of enterprise and/or cloud software.
- Strong experience with Web APIs such as RESTful and SOAP.
- Proficiency in setting up data warehousing solutions and associated pipelines, including ETL tools (preferably Informatica Cloud).
- Demonstrated expertise in Python.
- Strong experience in data wrangling and query authoring in SQL and NoSQL environments for structured and unstructured data.
- Experience in a cloud-based computing environment, specifically GCP.
- Expertise in documenting Business Requirement, Functional & Technical documentation.
- Proficiency in writing Unit & Functional Test Cases, Test Scripts & Run Books.
- Experience with incident management systems like Jira, ServiceNow, etc.
- Working knowledge of Agile software development methodology.
- Strong organizational and troubleshooting skills with attention to detail.
- Analytical ability, judgment, and problem analysis techniques.
- Excellent interpersonal skills and ability to work effectively in a cross-functional team.

Responsibilities:
- Lead system/data integration, development, or implementation efforts for enterprise and/or cloud software.
- Design and implement data warehousing solutions and associated pipelines for internal and external data sources, including ETL processes.
- Perform data wrangling and author complex queries in SQL and NoSQL environments for structured and unstructured data.
- Develop and integrate applications using Python and Web APIs (RESTful and SOAP).
- Provide operational support for the data platform and applications, including incident management.
- Create comprehensive Business Requirement, Functional, and Technical documentation.
- Develop Unit & Functional Test Cases, Test Scripts, and Run Books to ensure solution quality.
- Manage incidents effectively using systems like Jira, ServiceNow, etc.
- Prepare change management packages and implementation plans for migrations across different environments.
- Actively participate in Enterprise Risk Management processes.
- Work within an Agile software development methodology, contributing to team success.
- Collaborate effectively within cross-functional teams.

GlobalLogic offers:
- A culture of caring that prioritizes people and fosters an inclusive environment.
- Continuous learning and development opportunities to help you grow personally and professionally.
- Interesting and meaningful work on impactful projects that shape the world.
- Balance and flexibility in work arrangements to achieve a healthy work-life balance.
- A high-trust organization where integrity is valued and upheld.

About GlobalLogic:
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. Collaborating with forward-thinking companies, GlobalLogic helps transform businesses and redefine industries through intelligent products, platforms, and services.
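The extract-transform-load pipeline work described above can be sketched in a few lines of Python. This is a minimal illustration under invented inputs: the CSV content, the stock table, and the filtering rule are assumptions, not GlobalLogic's pipeline; a production pipeline would typically run in an ETL tool such as Informatica Cloud, as the listing notes.

```python
import csv
import io
import sqlite3

# Invented source data standing in for an external feed.
source = io.StringIO("sku,qty,unit_price\nA1,3,9.99\nB2,0,4.50\nC3,7,2.00\n")

def extract(fh):
    """Read raw rows from the source file handle."""
    return list(csv.DictReader(fh))

def transform(rows):
    """Keep only rows with stock, and derive a total value per SKU."""
    return [
        (r["sku"], int(r["qty"]), round(int(r["qty"]) * float(r["unit_price"]), 2))
        for r in rows
        if int(r["qty"]) > 0
    ]

def load(rows, con):
    """Write the transformed rows into the target table."""
    con.execute("CREATE TABLE IF NOT EXISTS stock (sku TEXT, qty INTEGER, value REAL)")
    con.executemany("INSERT INTO stock VALUES (?, ?, ?)", rows)

con = sqlite3.connect(":memory:")
load(transform(extract(source)), con)
total = con.execute("SELECT SUM(value) FROM stock").fetchone()[0]
print(total)
```

Keeping extract, transform, and load as separate functions mirrors how ETL tools model pipelines as discrete stages, which makes each stage independently testable.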
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
We are looking for a Salesforce Data Integration Lead to be a part of our team in Bengaluru, India. As a Salesforce Data Integration Lead, you will play a crucial role in driving our data integration initiatives. Your responsibilities will revolve around designing and implementing data integration strategies for Salesforce, collaborating with cross-functional teams to enhance data processes, and ensuring seamless data flow across various systems. Your primary tasks will include monitoring, validating, and optimizing data quality throughout the integration lifecycle, utilizing ETL tools for effective management of large datasets, and developing documentation for data flows and integration processes. You will also be responsible for providing training and support to team members on best practices in data management.

The ideal candidate should have proven experience in Salesforce integrations and API management, a strong grasp of database concepts and ETL processes, excellent problem-solving skills with keen attention to detail, and the ability to communicate complex technical information clearly. Experience with middleware tools would be an added advantage. If you are ready to make a significant impact and grow with us, apply now and be a part of our dynamic team!
Posted 2 months ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself and a better working world for all.

The objective of our process mining practice is to support clients in building a process mining capability by offering process mining solutions. As a solution, we perform data extraction and transformations, develop analyses, and derive business cases. You would be expected to develop data-driven process insights and actions and implement the newest features and functionalities of the Celonis software, such as Process Automation, Task Mining, and Machine Learning. As a team, we accelerate our customers' digital transformation and drive our process mining capability expansion by working closely with our customers to generate high-value use cases.

**Your Key Responsibilities**
- Understand the Process Mining solution offered by Celonis and its existing capabilities.
- Own and drive product development for Process Mining by developing relevant assets and offerings for the team.
- Define the product roadmap, business requirements, measures of success, and features for your products and services, and help executives deliver these to market.
- Extract and create the transformations on client data.
- Build and customize the Data Model based on the client's business process.
- Build KPIs to highlight use cases specific to the processes and client requirements.
- Build the Analysis according to the use case, and implement the Next Best Action for process improvements.

**Discover**
- Play a key role in the Celonis implementation project to ensure the optimal solution to tackle the customer's pain points.
- Design innovative analyses and execution apps and enrich them with Machine Learning algorithms or Task Mining to make the customer's processes transparent.
- Use Celonis technology to identify process inefficiencies and understand the root causes, always in close collaboration with the customer.

**Enhance**
- Conduct value creation workshops and align measures to improve process inefficiencies.
- Quantify the business and financial potential and present the findings to management.
- Implement our Process Automation technology to speed up the customer's processes, drive value, and improve the process conformance rate.

**Monitor**
- Implement the most relevant KPIs measuring the customer's success.
- Ensure the enablement of the customer to continuously improve processes.
- Set the foundation of the path to value to make the long-term customer success journey happen.

**Skills And Attributes For Success**
- Experience with and knowledge of Celonis and its various capabilities.
- Excellent project management skills; inspire teamwork and responsibility with engagement team members, and use current technology/tools to enhance the effectiveness of deliverables and services.
- Actively establish client (process owner/functional heads) and internal relationships.
- Good communication skills and the ability to conduct meetings, seminars, and presentations.
- Leadership and the ability to work in a cross-functional or departmental team. In short, you should be a team player.
- Understand EY and its service lines and actively assess what the firm can deliver to serve clients.

**To qualify for the role you must have**
- **Senior Consultant:** A minimum of 4-6 years of Celonis process mining experience along with experience in IT Consulting, Management Consulting, Process Improvement, or a similar area.
- **Consultant:** A minimum of 2-3 years of similar experience in Celonis process mining.
- A minimum of 2 years of experience in Data Analytics and Process Mining, with good knowledge of the various Process Mining tools available in the market.
- Knowledge of major ERPs such as SAP and Oracle, RPA platforms, and/or AI-based solutions.
- Experience working with complex ERP environments.
- Process understanding of P2P, OTC, RTR, HTR, etc.
- Dashboarding experience.
- Experience in data extraction and data model setup and configuration; knowledge of Process Mining capability, Data Analytics, and Data Mining; experience in an ETL tool (Informatica, Talend, DataStage) or a reporting tool (Tableau, QlikView, MicroStrategy).
- Strong communication skills and enjoyment of interacting with various customers.
- The ability to understand and interpret business processes.
- Excellent analytical skills; always well organized and known for being a quick learner.
- Basic knowledge of SQL or other programming languages (Python, R, MATLAB).
- Dedication and vision, and a desire to actively drive the Celonis Process Mining technology forward.
- Willingness to learn and implement technologies to enhance/augment process mining.
- A search for a job with a steep learning curve, in order to think outside the box and continuously broaden your knowledge.
- Very good English skills; other languages are an advantage.

**Ideally, you'll also have**
- Good communication and presentation skills.

**What We Look For**
We're looking for passionate leaders with a strong vision and a desire to stay on top of trends in the BPM industry, offering solutions through leading tools like Celonis. If you have a genuine passion for helping businesses achieve their full potential, this role is for you.

**What Working At EY Offers**
EY is committed to being an inclusive employer, and we are happy to consider flexible working arrangements.
We strive to achieve the right balance for our people, enabling us to deliver excellent client service while allowing you to build your career without sacrificing your personal priorities. While our client-facing professionals can be required to travel, and at times be based at client sites, our flexible working arrangements can help you achieve a lifestyle balance. In addition, EY offers the following:

- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching, and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

The Exceptional EY Experience. It's Yours To Build.

EY is equally committed to being an inclusive employer, and we strive to achieve the right balance for our people, enabling us to deliver excellent client service while allowing our people to build their careers as well as focus on their wellbeing. If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.

Join us in building a better working world. Apply now.
Posted 2 months ago
3.0 - 8.0 years
9 - 14 Lacs
Ahmedabad
Remote
Healthcare experience is mandatory.

Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
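As a rough illustration of the dimensional-modeling work this role describes, here is a toy claims star schema in SQLite. The table and column names are invented for illustration only and do not reflect any actual health plan model:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Dimensions: conformed reference data shared across fact tables.
    CREATE TABLE dim_member  (member_key INTEGER PRIMARY KEY, member_id TEXT, plan_type TEXT);
    CREATE TABLE dim_provider(provider_key INTEGER PRIMARY KEY, npi TEXT, specialty TEXT);
    -- Fact: one row per claim line, with foreign keys into the dimensions.
    CREATE TABLE fact_claim (
        claim_id TEXT, member_key INTEGER, provider_key INTEGER,
        icd10_code TEXT, paid_amount REAL,
        FOREIGN KEY (member_key)   REFERENCES dim_member(member_key),
        FOREIGN KEY (provider_key) REFERENCES dim_provider(provider_key)
    );
""")
con.execute("INSERT INTO dim_member VALUES (1, 'M100', 'Medicare Advantage')")
con.execute("INSERT INTO dim_provider VALUES (1, '1234567890', 'Cardiology')")
con.execute("INSERT INTO fact_claim VALUES ('C-1', 1, 1, 'I10', 250.0)")
con.execute("INSERT INTO fact_claim VALUES ('C-2', 1, 1, 'E11.9', 125.5)")

# A typical analytical query: paid amount by plan type, resolved through the dimension.
rows = con.execute("""
    SELECT m.plan_type, SUM(f.paid_amount)
    FROM fact_claim f JOIN dim_member m USING (member_key)
    GROUP BY m.plan_type
""").fetchall()
print(rows)
```

The same pattern of a fact table keyed into conformed dimensions is what scales up to the HEDIS/Stars-style reporting the posting calls for.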
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
The Data & Analytics Team is looking for a Data Engineer with a unique blend of skills in data integration and application development. In this role, you will play a critical part in the design, engineering, governance, and enhancement of our entire Data Platform. This platform caters to customers, partners, and employees by providing self-service access. Your expertise will be showcased in areas such as data & metadata management, data integration, data warehousing, data quality, machine learning, and core engineering principles. With over 5 years of experience in system/data integration, development, or implementation of enterprise and/or cloud software, you bring a strong background in Web APIs (RESTful and SOAP). Your proficiency extends to setting up data warehousing solutions and associated pipelines, particularly with ETL tools such as Informatica Cloud. Proficiency in Python, data wrangling, and query authoring in both SQL and NoSQL environments is a must. Experience in a cloud-based computing environment, especially GCP, is preferred. You excel in documenting Business Requirement, Functional & Technical documentation, as well as writing Unit & Functional Test Cases, Test Scripts & Run books. Incident management systems like Jira, Service Now, etc., are familiar territories for you. Moreover, you are well-versed in Agile Software development methodology, possess strong organizational and troubleshooting skills, and exhibit excellent interpersonal skills to collaborate effectively within cross-functional teams. As a Data Engineer, you will lead system/data integration, development, or implementation efforts for enterprise and/or cloud software. 
Your responsibilities will include designing and implementing data warehousing solutions and associated pipelines, performing data wrangling and authoring complex queries, developing and integrating applications using Python and Web APIs, providing operational support for the data platform and applications, creating comprehensive documentation, managing incidents effectively, preparing change management packages, and actively participating in Enterprise Risk Management Processes. Additionally, you will work within an Agile Software Development methodology and contribute to team success while collaborating effectively within cross-functional teams.

At GlobalLogic, we offer a culture of caring that prioritizes putting people first, a commitment to continuous learning and development, the opportunity to work on interesting and impactful projects, a belief in the importance of work-life balance and flexibility, and a high-trust organization that values integrity. Join us at GlobalLogic, a trusted digital engineering partner to the world's largest and most forward-thinking companies, where you will have the chance to work on cutting-edge solutions that shape the world today.
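The pipeline work described above — pulling records from a Web API source, wrangling them, and loading them into a warehouse table — can be sketched in miniature. The payload, field names, and cleansing rules below are hypothetical stand-ins for whatever the real source system returns:

```python
import sqlite3

# Hypothetical payload, standing in for JSON returned by a RESTful Web API.
payload = [
    {"id": 1, "name": "Acme ", "region": "EMEA", "revenue": "1200.50"},
    {"id": 2, "name": "Globex", "region": None, "revenue": "not-reported"},
]

def wrangle(records):
    """Basic wrangling: trim names, default missing regions, coerce revenue to float."""
    cleaned = []
    for r in records:
        rev = str(r["revenue"])
        cleaned.append((
            r["id"],
            r["name"].strip(),
            r["region"] or "UNKNOWN",
            float(rev) if rev.replace(".", "", 1).isdigit() else None,
        ))
    return cleaned

con = sqlite3.connect(":memory:")  # stand-in for the warehouse target
con.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT, region TEXT, revenue REAL)")
con.executemany("INSERT INTO accounts VALUES (?, ?, ?, ?)", wrangle(payload))

# A simple analytical query over the loaded data.
rows = con.execute(
    "SELECT region, COUNT(*) FROM accounts GROUP BY region ORDER BY region"
).fetchall()
print(rows)
```

In a production pipeline the payload would come from an authenticated API call and the target would be a managed warehouse, but the extract-wrangle-load shape is the same.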
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
You should have a B.Tech/B.E/MSc/MCA qualification along with a minimum of 10 years of experience. As a Software Architect - Cloud, your responsibilities will include architecting and implementing AI-driven Cloud/SaaS offerings. You will be required to research and design new frameworks and features for various products, ensuring they meet high-quality standards and are designed for scale, resiliency, and efficiency. Additionally, motivating and assisting lead and senior developers in their professional and technical growth, contributing to academic outreach programs, and participating in company branding activities will be part of your role.

To qualify for this position, you must have experience in designing and delivering widely used enterprise-class SaaS applications, preferably in marketing technologies. Knowledge of cloud computing infrastructure, AWS certification, and hands-on experience with scalable distributed systems, AI/ML technologies, big data technologies, in-memory databases, caching systems, ETL tools, containerization solutions like Kubernetes, large-scale RDBMS deployments, SQL optimization, Agile and Scrum development processes, Java, Spring technologies, Git, and DevOps practices are essential requirements for this role.
Posted 2 months ago
11.0 - 15.0 years
0 - 0 Lacs
karnataka
On-site
We are seeking an experienced candidate with a background as a Solutions Architect specifically for Salesforce Platform implementations. The role of CRM-SALESFORCE- Salesforce Solutions Architect requires a minimum of 11 to 14 years of experience, with a corresponding CTC range of 40 - 42 LPA. The position is based in multiple locations in India, with the expectation of working from the office for 5 days a week. In this role, you will be responsible for leading the design and architecture of end-to-end solutions based on business requirements and technical specifications. This includes creating solution blueprints, architecture diagrams, data modeling, process automation, system integrations, and technical documentation. It is crucial to ensure that the solutions you design are scalable, secure, and seamlessly integrate with existing systems and platforms. Your expertise in Salesforce features, encompassing Sales Cloud, Service Cloud, Marketing Cloud, Experience Cloud, and Platform capabilities, will be essential. You will also be tasked with evaluating new Salesforce features and AppExchange products to assess their suitability for the organization. Collaboration with stakeholders is key to aligning solution architecture with business objectives and technology strategies. Providing guidance on adopting emerging technologies and best practices to ensure future-proof solutions is part of the technology strategy aspect of this role. Defining architectural standards, aligning them with organizational goals, and translating business requirements into technical solutions through collaboration with business stakeholders are additional responsibilities. You will work closely with developers, business analysts, and project managers to ensure successful implementation of solutions. Leading technical discussions, architecture reviews, and providing mentorship and leadership to development teams throughout the project lifecycle are also integral components of the position. 
Overseeing system integration, performance monitoring, ensuring data integrity, and designing strategies for system scalability, reliability, and disaster recovery are critical responsibilities. Documentation, governance, risk management, security, and training and enablement aspects are also part of the job scope. The ideal candidate will possess over 10 years of experience in software development, solution design, or similar roles, with expertise in Salesforce Sales Cloud, Service Cloud, Experience Cloud, and Platform tools. Proficiency in programming languages, databases, Apex, Visualforce, Lightning Web Components (LWC), Salesforce APIs, integration tools, data modeling, and excellent problem-solving skills are essential. Strong communication and collaboration skills, fluency in Spanish and English, and Salesforce certifications are highly desirable.

If this opportunity aligns with your expertise and aspirations, please reach out to Ankur Sharma, Sr. HR - Talent Acquisition, by sharing your resume via email at ankur.sharma@ubconsulting.in or contacting him at 9001513258.
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
dehradun, uttarakhand
On-site
As a Salesforce Technical Lead at Cynoteck, you will be responsible for leading the end-to-end technical design, architecture, and implementation of Salesforce solutions. Your role will involve collaborating with functional teams to understand business requirements and translating them into scalable and maintainable Salesforce solutions. You will provide technical leadership and mentorship to Salesforce developers, guiding them through best practices and development challenges. Additionally, you will design and implement custom Salesforce applications, including complex workflows, process builders, flows, triggers, and integrations with third-party systems while ensuring adherence to Salesforce development standards and best practices. In this position, you will lead Salesforce system upgrades, patches, and new feature releases to ensure minimal disruption to operations. You will also be responsible for managing data migration and integration strategies, including integration with other internal and external systems. Your role will involve overseeing testing strategies and ensuring that all deliverables meet the required quality standards. It is essential to stay current with Salesforce updates, new features, and industry best practices and evaluate their relevance to the business. To qualify for this role, you need to have the ability to work in a fast-paced, agile environment with a Bachelor's degree in Computer Science, Information Technology, or a related field. You should have 5+ years of experience working with Salesforce, including hands-on development experience in the Salesforce platform. Strong understanding of Salesforce architecture, data model, and capabilities is required. Excellent problem-solving, analytical, and troubleshooting skills are essential. Experience leading and mentoring development teams is preferred. Expertise in Salesforce development tools such as Apex, Visualforce, Lightning Web Components (LWC), SOQL, and Salesforce APIs is a must. 
Experience with Salesforce integrations using REST/SOAP APIs, middleware, or ETL tools is desirable. Proficiency in Salesforce declarative configuration (Flows, Process Builder, Workflow Rules, etc.) is expected. Experience in deploying Salesforce changes using Salesforce DX, CI/CD processes, and change management tools is an advantage. A strong understanding of security concepts in Salesforce (profiles, permission sets, roles, sharing rules) is required. Effective communication and interpersonal skills are essential for collaborating effectively with cross-functional teams. Hands-on experience with Salesforce Lightning, including Lightning Components and Lightning Experience, is preferred. Salesforce certifications (e.g., Salesforce Platform Developer) are highly preferred for this role.
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
You will be responsible for designing, developing, and maintaining robust data pipelines for extracting, transforming, and loading (ETL) data into the Celonis platform from various source systems. Additionally, you will create and optimize data models within Celonis to support process intelligence activities, ensuring accurate representation of business processes and enabling in-depth analysis. Collaboration with IT and other teams will be essential to integrate data from various sources (ERP, CRM, databases, etc.) into the Celonis platform, ensuring seamless and efficient data flow. Monitoring and optimizing the performance of Celonis data models and pipelines, identifying bottlenecks, and implementing solutions to improve efficiency and scalability will also be part of your responsibilities. It will be your duty to implement data validation and cleansing processes to ensure the accuracy and integrity of the data used for process mining and analysis. Working closely with data analysts and stakeholders to understand their requirements, providing technical support, and training on the use of Celonis for data-driven decision-making will be crucial. You will also be expected to maintain comprehensive documentation of data models, ETL processes, and data pipelines, ensuring transparency and ease of maintenance. Staying up to date with the latest developments in process intelligence, data engineering, and Celonis technologies will be important in order to propose and implement best practices that improve the overall data architecture.

Qualifications required for this position include a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. You should have a minimum of 4 years of proven experience as a Data Engineer, focusing on ETL processes, data modeling, and pipeline optimization. Additionally, you should have at least 2 years of hands-on experience with Celonis, including data integration, model building, and building views and action flows, with a strong portfolio. Familiarity with ERP systems (e.g., SAP, Oracle) and other enterprise software platforms is preferred.

In terms of technical skills, proficiency in SQL and experience with database management systems are required. You should also have strong knowledge of ETL tools and data warehousing concepts, along with experience in scripting languages (e.g., Python, JavaScript) for data manipulation and automation. Domain knowledge of at least one SAP module, according to requirements, is expected. Soft skills necessary for this role include strong analytical and problem-solving skills with attention to detail. Excellent communication and collaboration abilities are essential, along with the capacity to understand and interpret business processes. You should also have the ability to manage multiple tasks and projects, prioritizing effectively in a fast-paced environment.
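At the heart of the Celonis work described here is an event log: one row per case, activity, and timestamp. As a rough, hypothetical sketch (plain SQLite standing in for the Celonis data model), a throughput-time KPI over such a log looks like this:

```python
import sqlite3

# Hypothetical purchase-order event log: (case id, activity, timestamp).
events = [
    ("PO-1", "Create PO", "2024-01-02 09:00"),
    ("PO-1", "Approve PO", "2024-01-03 11:00"),
    ("PO-1", "Goods Receipt", "2024-01-06 09:00"),
    ("PO-2", "Create PO", "2024-01-02 10:00"),
    ("PO-2", "Goods Receipt", "2024-01-04 10:00"),
]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE event_log (case_id TEXT, activity TEXT, ts TEXT)")
con.executemany("INSERT INTO event_log VALUES (?, ?, ?)", events)

# Throughput time per case (days between first and last activity) --
# the kind of process KPI a Celonis data model would expose.
rows = con.execute("""
    SELECT case_id,
           ROUND(julianday(MAX(ts)) - julianday(MIN(ts)), 2) AS days
    FROM event_log
    GROUP BY case_id
    ORDER BY case_id
""").fetchall()
print(rows)
```

In practice the log would be extracted from an ERP's transaction tables, but every downstream analysis, from bottleneck detection to conformance checking, is built on queries of this shape.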
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
nagpur, maharashtra
On-site
The Data & Analytics Team is looking for a Data Engineer with a hybrid skillset in data integration and application development. In this role, you will play a crucial part in designing, engineering, governing, and enhancing our entire Data Platform. This platform serves customers, partners, and employees by providing self-service access. You will showcase your expertise in data & metadata management, data integration, data warehousing, data quality, machine learning, and core engineering principles. To be successful in this role, you should have at least 5 years of experience in system/data integration, development, or implementation of enterprise and/or cloud software. You must have strong experience with Web APIs (RESTful and SOAP) and be proficient in setting up data warehousing solutions and associated pipelines, including ETL tools (preferably Informatica Cloud). Demonstrated proficiency in Python, data wrangling, and query authoring in SQL and NoSQL environments is essential. Experience in a cloud-based computing environment, specifically GCP, is preferred. You should also excel in documenting Business Requirement, Functional & Technical documentation, writing Unit & Functional Test Cases, Test Scripts & Run books, and incident management systems like Jira, Service Now, etc. Working knowledge of Agile Software development methodology is required. As a Data Engineer, you will be responsible for leading system/data integration, development, or implementation efforts for enterprise and/or cloud software. You will design and implement data warehousing solutions and associated pipelines for internal and external data sources, including ETL processes. Performing extensive data wrangling and authoring complex queries in both SQL and NoSQL environments for structured and unstructured data will be part of your daily tasks. You will develop and integrate applications, leveraging strong proficiency in Python and Web APIs (RESTful and SOAP). 
Providing operational support for the data platform and applications, including incident management, will also be a key responsibility. Additionally, you will create comprehensive Business Requirement, Functional, and Technical documentation, develop Unit & Functional Test Cases, Test Scripts, and Run Books, and manage incidents effectively using systems like Jira, Service Now, etc.

At GlobalLogic, we prioritize a culture of caring. We put people first, offering an inclusive culture where you can build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development, providing various opportunities to grow personally and professionally. You'll have the chance to work on interesting and meaningful projects that make an impact. We believe in balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a perfect work-life balance. Join us in a high-trust organization where integrity is key, and trust is a cornerstone of our values to employees and clients. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest companies, helping create innovative digital products and experiences. You'll have the opportunity to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 2 months ago
2.0 - 6.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Salesforce Junior Developer at EY, you will play a crucial role in application design, configuration, testing, and deployment on the Salesforce.com platform. Your responsibilities will include participating in all aspects of the development lifecycle, from solution definition to project planning. Additionally, you will provide technical assistance, end-user troubleshooting, and mentorship to a team of junior developers. With 2+ years of experience working on Salesforce platforms and a Salesforce Platform Developer I certification, you are expected to have a strong foundation in software engineering skills, including Apex, LWC, SOQL, and unit testing. Your expertise in core web technologies such as HTML5, JavaScript, and jQuery, along with experience in relational databases, data modeling, and ETL tools, will be highly valuable. Your role will involve working on CRM projects for middle market and enterprise-size companies, utilizing web services like REST, SOAP, JSON, and XML. Familiarity with Agile development methodologies like Scrum is essential. Your excellent verbal and written communication skills will enable you to collaborate effectively with cross-functional teams and clients.

Join EY to leverage your unique voice and perspective in contributing to a better working world. Be part of a global organization that values inclusivity, diversity, and innovation, and embark on a rewarding career journey where you can unleash your full potential and make a meaningful impact.
Posted 2 months ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
You will need to have strong experience in database programming, specifically Oracle PL/SQL and MongoDB. Proficiency in Unix scripting and Python is essential for this role. A solid understanding of web development, particularly in Java and Angular, would be highly advantageous. You should have at least 8 years of experience as a DB Engineer, along with experience in ETL tools. It is important that you have experience working within an Agile environment and are familiar with JIRA. Candidates should possess excellent written and oral communication skills. Strong collaboration and organizational skills are also required, as you will be expected to be a good team player. Key technical skills for this role include SQL programming (Oracle PL/SQL and MongoDB), Python, Unix scripting, Java, Angular, Agile, and Jira.

About Virtusa: Teamwork, quality of life, and professional and personal development are the values that Virtusa is proud to embody. When you join Virtusa, you become part of a global team of 27,000 people who care about your growth. Virtusa seeks to provide exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with the company. At Virtusa, great minds and great potential come together. Collaboration and the team environment are highly valued, providing a dynamic place for great minds to nurture new ideas and foster excellence.
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
You are a highly experienced Microsoft Purview Technical Architect with a minimum of 10 years of experience. You will play a key role in designing and implementing data governance solutions using Microsoft Purview at Tavant Technologies. Working closely with cross-functional teams, you will establish best practices for data management, ensure compliance, and enhance data visibility across the organization. Your responsibilities will include leading the architecture and design of data governance solutions using Microsoft Purview, collaborating with data engineering, data science, and business teams to define data governance policies and standards, implementing and managing data classification, lineage, and cataloging processes, and developing strategies for data quality, data security, and compliance with regulations such as GDPR and CCPA. Additionally, you will conduct training sessions and workshops to educate teams on data governance best practices and the use of Microsoft Purview tools, monitor and optimize the performance of data governance solutions, provide technical leadership and mentorship to junior team members, and ensure that the solutions meet the organization's needs. To excel in this role, you must have extensive experience with Microsoft Azure and Microsoft Purview, a strong knowledge of data governance principles, frameworks, and best practices, proficiency in SQL and data modeling concepts, experience with data visualization tools such as Power BI or Tableau, familiarity with data privacy regulations and compliance requirements, excellent problem-solving and analytical skills, strong communication and interpersonal skills, knowledge of cloud architecture and data architecture principles, experience with ETL tools and data transformation processes is a plus, and the ability to work in a fast-paced, collaborative environment. 
If you are a passionate technical architect with a deep understanding of Microsoft Purview and a commitment to driving data governance excellence, we invite you to apply and join our dynamic team at Tavant Technologies.
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
The Data Migration Developer is responsible for executing and managing data migration projects within Salesforce environments. Your role will involve expertise in data extraction, transformation, and loading (ETL) processes, with a strong focus on leveraging Informatica tools. You will ensure the accurate, secure, and efficient migration of data while customizing Salesforce to align with business processes and objectives.

You should have 3-4+ years of experience in database migration, specifically focusing on Salesforce applications and handling sensitive data. Proficiency in using ETL tools like Informatica (PowerCenter, Informatica Cloud), Boomi, or other similar tools for data migration is required. Additionally, experience with Salesforce data import/export tools, SQL, ETL processes, and data integration methodologies is expected. Your expertise in data migration tools and techniques, along with familiarity with Salesforce APIs and integration methods, will be crucial.

You will be responsible for migrating and integrating data from different platforms into Salesforce. This includes preparing data migration plans, handling kickouts/fallouts, and developing procedures and scripts for data migration. In this role, you will develop, implement, and optimize stored procedures and functions using T-SQL. You will also perform SQL database partitioning and indexing procedures as required to handle heavy traffic loads. A solid understanding of Salesforce architecture and objects, such as accounts, contacts, cases, custom objects, fields, and restrictions, is essential. Hands-on experience in data migration and integration from various platforms into Salesforce is necessary. You should possess the ability to create fast and efficient database queries, including joins with multiple tables, and have good knowledge of SQL optimization techniques.
Experience in designing, creating, and maintaining databases, as well as familiarity with MuleSoft, Boomi, or similar integration platforms, is preferred. Preferred qualifications for this role include Salesforce Certified Administrator, Salesforce Certified Platform Developer I or II, and relevant certifications in data management, migration, or related areas.
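The kickout/fallout handling this role mentions can be sketched as a simple validate-and-divert loop: records failing validation are set aside for remediation instead of being loaded. The field names and rules below are illustrative, not an actual Salesforce schema:

```python
# Minimal kickout/fallout sketch: validate each record, divert failures
# to a fallout list with the reasons attached, load the rest.

def migrate(records):
    loaded, fallout = [], []
    for rec in records:
        errors = []
        if not rec.get("Email") or "@" not in rec["Email"]:
            errors.append("invalid Email")
        if not rec.get("LastName"):
            errors.append("missing LastName")
        (fallout if errors else loaded).append({**rec, "_errors": errors})
    return loaded, fallout

source = [
    {"LastName": "Singh", "Email": "singh@example.com"},
    {"LastName": "", "Email": "nobody"},
]
loaded, fallout = migrate(source)
print(len(loaded), len(fallout))
```

A real migration would persist the fallout rows (with their error reasons) for review and reprocessing once the source data is corrected.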
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Technical Manager for Oracle MOM Integrations at 7-Eleven Global Solution Center, you will play a vital role in driving the success of Oracle data integration and analytics teams. Your responsibilities will include designing, developing, and maintaining scalable ETL processes for Oracle Merchandising Operations Management (MOM), Oracle EBS, and Oracle Analytics Reporting. Collaborating with cross-functional teams, you will ensure the alignment of data needs and adherence to agile practices and good software development principles. With over 10 years of experience managing enterprise integration teams and a strong technical background in ETL tools such as ODI, Informatica, or Data Stage, you will lead with effective leadership, mentorship, and a focus on high performance. Your ability to manage technical staff, gather requirements, and identify gaps will be crucial in ensuring the success of Oracle data integration projects. In addition to your technical skills, your soft skills including strong analytical abilities, excellent communication skills, and the capacity for continued learning will be key to your success in this role. You will work collaboratively across cross-functional teams, demonstrating excellent organization, time management, and customer relations skills. A Bachelor's degree in Computer Science, Information Technology, or a related field, along with at least 5 years of experience in Oracle MOM ERP integrations, is required for this position. Experience with Oracle MOM functionality is preferred. 7-Eleven Global Solution Center is committed to diversity in the workplace and offers a comprehensive benefits plan to improve the overall experience of its employees. From work-life balance initiatives to well-being and family protection benefits, the organization aims to support its employees both personally and professionally. 
Additionally, privileges such as certification and training programs and hassle-free relocation support further enhance the employee experience at 7-Eleven Global Solution Center.
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You have a fantastic opportunity to join Omnicom Media Group as a Financial Systems Developer in either Hyderabad, Bangalore, or Chennai. With 6-10 years of experience and expertise in TM1 (Planning Analytics) with a background in media, you will play a key role in maintaining and enhancing Planning Analytics models. Your responsibilities will include building and maintaining TM1 Rules, Turbo Integrator processes, cubes, dimensions, and automating data loads. You will also collaborate with business users and system administrators to develop solutions that address business and FP&A requirements, as well as integrate Planning Analytics as a data source for business analytics reporting. In addition, you will provide support to end users, monitor system performance, and ensure change management by training end users on basic functionality. About Annalect India: Annalect India is an integral part of Annalect Global and Omnicom Group, a leading global marketing communications company. As part of Omnicom Media Group, you will have the opportunity to work with global advertising agency networks such as OMD, PHD, and Hearts & Science. Annalect India provides stellar products and services in areas of Creative Services, Technology, Marketing Science (data & analytics), Business Support Services, Market Research, and Media Services. Qualifications: To excel in this role, you should have hands-on experience as a developer of TM1 (Planning Analytics) with proficiency in design, development, architecture, and system administration. Intermediate to Advanced Excel knowledge is required, and familiarity with Tableau/Power BI is a plus. A solid understanding of finance and financial processes, particularly in the media and advertising industry, is preferred. Additionally, experience with databases, ETL tools, and the ability to compare and validate data sets are essential for success in this role. 
Your ability to absorb and present complex ideas accurately, gather and analyze end user requirements, and adhere to tight deadlines will be crucial for your success as a Financial Systems Developer at Omnicom Media Group.
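One skill the listing above calls out is the ability to compare and validate data sets, for example checking that totals in a source extract match what was loaded into a reporting model. A minimal sketch of that pattern in Python is shown below; the column names and tolerance are hypothetical, not taken from the role description.

```python
import csv

def load_totals(path, key_field, value_field):
    """Sum a numeric column per key from a CSV extract."""
    totals = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = row[key_field]
            totals[key] = totals.get(key, 0.0) + float(row[value_field])
    return totals

def compare_totals(source, target, tolerance=0.01):
    """Return the keys whose totals differ by more than the tolerance,
    mapped to the (source, target) pair that disagreed."""
    mismatches = {}
    for key in set(source) | set(target):
        s, t = source.get(key, 0.0), target.get(key, 0.0)
        if abs(s - t) > tolerance:
            mismatches[key] = (s, t)
    return mismatches
```

In practice the two sides would come from the ETL source and the loaded cube; a non-empty result flags accounts to investigate before sign-off.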
Posted 2 months ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
You will be responsible for playing the role of a Technical Subject Matter Expert (SME) with a focus on integrating airline systems. Your primary duties will include configuring, designing, and developing integration components to ensure the accurate and secure flow of data between systems with optimal performance. Additionally, you will collaborate with both internal and external stakeholders to enable integrations, identify areas for improvement, and provide support to the business during and outside of regular working hours.

To excel in this role, you must possess a bachelor's degree in computer science, information technology, or a related field. You should have experience in developing and supporting systems that integrate extensively with other systems, a strong understanding of data mapping and transformation techniques, and excellent analytical and problem-solving skills. A good grasp of real-time and batch processing, as well as working knowledge of networking, protocols, and the Windows, Linux, and Unix operating systems, is essential. Moreover, proficiency in collaboration tools such as Teams and Webex, and a deep understanding of API and data integration techniques, are required. You should be well-versed in REST APIs, SOAP APIs, and other API management platforms, along with ETL tools and middleware used for data transformations.

If you have 8-12 years of experience in a similar role and possess the skill set above, we encourage you to apply for this Technical SME (Integration Specialist) position based in Bangalore. The joining time for this role is between 0-60 days.
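The data mapping and transformation work this listing describes typically means renaming fields and translating coded values when a record moves from one system to another. The sketch below illustrates that pattern in Python; the field names and cabin codes are hypothetical examples, not drawn from any specific airline system.

```python
# Map source-system field names to the target system's schema.
FIELD_MAP = {
    "pax_name": "passengerName",
    "flt_no": "flightNumber",
    "dep_stn": "origin",
}

# Translate coded values into the target system's vocabulary.
CABIN_CODES = {"F": "First", "J": "Business", "Y": "Economy"}

def transform_record(source):
    """Rename fields per FIELD_MAP and expand coded cabin values."""
    target = {FIELD_MAP[k]: v for k, v in source.items() if k in FIELD_MAP}
    if "cabin" in source:
        target["cabinClass"] = CABIN_CODES.get(source["cabin"], "Unknown")
    return target
```

Real integrations layer validation, error handling, and batching on top, but the core of most mappings reduces to tables like these, which keeps the transformation logic declarative and easy to audit.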
Posted 2 months ago