4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
Citrin Cooperman offers a dynamic work environment that fosters professional growth and collaboration. We are continuously seeking talented individuals who bring fresh perspectives, a problem-solving mindset, and sharp technical expertise. Our team of collaborative, innovative professionals is ready to support your professional development. At Citrin Cooperman, we offer competitive compensation and benefits, along with the flexibility to manage your personal and professional life to focus on what matters most to you! As a Financial System Data Integration Senior at Citrin Cooperman, you will play a vital role in supporting the design and development of integrations for clients within Workiva's cloud-based information management platform. Working closely with Citrin Cooperman's Integration Manager, you will be responsible for driving project execution and translating strategic target architecture and business needs into executable designs and technical system solutions. Your contributions will shape the future of how our clients utilize Workiva's platform to achieve success.
Key responsibilities of the role include:
- Analyzing requirements to identify optimal use of existing software functionalities for automation solutions
- Crafting scalable, flexible, and resilient architectures to address clients' business problems
- Supporting end-to-end projects to ensure alignment with original design and objectives
- Creating data tables, queries (SQL), ETL logic, and API connections between client source systems and the software platform
- Developing technical documentation and identifying technical risks associated with application development
- Acting as a visionary in data integration and driving connected data solutions for clients
- Providing architectural guidance and recommendations to promote successful technology partner engagements
- Mentoring and training colleagues and clients
- Communicating extensively with clients to manage expectations and report on project status

Required Qualifications:
- Bachelor's degree in Computer Science, IT, Management IS, or similar with a minimum of 4 years of experience OR at least 7 years of experience without a degree
- Proven ability to lead enterprise-level integration strategy discussions
- Expertise with API connectors in ERP solutions such as SAP, Oracle, NetSuite, etc.
- Intermediate proficiency with Python, SQL, JSON, and/or REST
- Professional experience with database design, ETL tools, multidimensional reporting software, data warehousing, dashboards, and Excel
- Experience in identifying obstacles, managing multiple work streams, and communicating effectively with technical and non-technical stakeholders

Preferred Qualifications:
- Experience with Workiva's platform
- Understanding of accounting activities
- Project management experience and leadership skills
- Participation in business development activities
- Experience in mentoring and training others

At Citrin Cooperman, we are committed to providing exceptional service to clients and acting as positive brand ambassadors.
Join us in driving innovation, shaping the future of data integration, and making a meaningful impact on our clients' success.
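The integration work this posting describes (building SQL tables, ETL logic, and API connections between client source systems and a platform) can be sketched in miniature. The following is an illustrative extract-transform-load example only: the payload, field names, and table schema are all hypothetical, and a real integration would make an authenticated REST call rather than parse a hard-coded string.

```python
import json
import sqlite3

# Hypothetical extract step: in practice this would be an authenticated
# REST call (e.g. via requests) against the client's source system.
def extract_records():
    # Stand-in for an API response payload.
    payload = '[{"account": "A-100", "balance": "2500.50"},' \
              ' {"account": "A-101", "balance": "99.95"}]'
    return json.loads(payload)

def load(records, conn):
    """Transform string amounts to floats and upsert into a staging table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS balances (account TEXT PRIMARY KEY, balance REAL)"
    )
    rows = [(r["account"], float(r["balance"])) for r in records]
    conn.executemany("INSERT OR REPLACE INTO balances VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(extract_records(), conn)
total = conn.execute("SELECT SUM(balance) FROM balances").fetchone()[0]
print(total)  # 2600.45
```

The upsert (`INSERT OR REPLACE`) keeps the load idempotent, so re-running the pipeline does not duplicate rows.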
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
The Data Migration Developer is responsible for executing and managing data migration projects within Salesforce environments. This role requires expertise in data extraction, transformation, and loading (ETL) processes, with a strong focus on leveraging Informatica tools. The specialist will ensure the accurate, secure, and efficient migration of data while customizing Salesforce to align with business processes and objectives.

You should possess 3-4+ years of experience in database migration, with a focus on Salesforce applications and handling sensitive data. Proficiency in ETL tools such as Informatica (PowerCenter, Informatica Cloud), Boomi, or similar is required, as is experience with Salesforce data import/export tools, SQL, ETL processes, and data integration methodologies. Expertise in data migration tools and techniques, along with familiarity with Salesforce APIs and integration methods, is crucial.

You will be responsible for migrating and integrating data from different platforms into Salesforce, preparing data migration plans, handling kickouts/fallouts, and developing procedures and scripts for data migration. Additionally, you will develop, implement, and optimize stored procedures and functions using T-SQL, and perform SQL database partitioning and indexing to handle heavy traffic loads. A solid understanding of Salesforce architecture and objects such as accounts, contacts, cases, custom objects, fields, and restrictions is necessary. Hands-on experience in data migration and integration from different platforms into Salesforce is expected, along with the ability to write fast, efficient database queries, including joins across several tables, and good knowledge of SQL optimization techniques.
Experience in designing, creating, and maintaining databases, familiarity with MuleSoft, Boomi, or similar integration platforms, and experience automating processes within Salesforce are desirable qualifications. Preferred qualifications include Salesforce Certified Administrator, Salesforce Certified Platform Developer I or II, and relevant certifications in data management, migration, or related areas.
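The kickout/fallout handling mentioned above is a common migration pattern: transform legacy rows against a declarative field map and set aside records that fail validation for remediation rather than dropping them silently. A minimal sketch, with hypothetical field names rather than a real Salesforce schema:

```python
# Illustrative mapping from legacy column names to target field names.
# These names are invented for the example, not an actual Salesforce object.
FIELD_MAP = {"cust_name": "Name", "cust_phone": "Phone"}

def migrate(legacy_rows):
    """Transform legacy rows; collect validation failures as kickouts."""
    loaded, kickouts = [], []
    for row in legacy_rows:
        record = {sf: row.get(src, "").strip() for src, sf in FIELD_MAP.items()}
        if not record["Name"]:  # required-field rule
            kickouts.append({"row": row, "reason": "missing Name"})
        else:
            loaded.append(record)
    return loaded, kickouts

rows = [{"cust_name": " Acme ", "cust_phone": "555-0100"},
        {"cust_name": "", "cust_phone": "555-0199"}]
ok, bad = migrate(rows)
print(len(ok), len(bad))  # 1 1
```

In a real project the kickout list would be written to a remediation report and re-run after the source data is corrected.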
Posted 1 month ago
2.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As an experienced IT professional with a passion for data and technology, your role will involve ensuring that data accurately reflects business requirements and targets. Collaborating closely with the Procurement & Logistic department and external providers in an agile environment, you will leverage your deep understanding of technology stack capabilities to facilitate engagements and solve impediments for delivering data use cases to drive business value and contribute to the vision of becoming a data-driven company. You will play a crucial role in the energy transformation at Siemens Energy ABP Procurement team, working alongside a diverse team of innovative and hardworking data enthusiasts and AI professionals. Your impact will be significant, with responsibilities including service operation and end-to-end delivery management, interacting with business users and key collaborators, developing and maintaining data architecture and governance standards, designing optimized data architecture frameworks, providing guidance to developers, ensuring data quality, and collaborating with various functions to translate user requirements into technical specifications. To excel in this role, you should bring 8 to 10 years of IT experience with a focus on ETL tools and platforms, proficiency in Snowflake SQL Scripting, JavaScript, PL/SQL, and data modeling for relational databases. Experience in data warehousing, data migration, building data pipelines, and working with AWS, Azure & GCP data services is essential. Additionally, familiarity with Qlik, Power BI, and a degree in computer science or IT are preferred. Strong English skills, intercultural communication abilities, and a background in international collaboration are also key requirements. Joining the Value Center ERP team at Siemens Energy, you will be part of a dynamic group dedicated to driving digital transformation in manufacturing and contributing to the achievement of Siemens Energy's objectives. 
This role offers the opportunity to work on innovative projects that have a substantial impact on the business and industry, enabling you to be a part of the energy transition and the future of sustainable energy solutions. Siemens Energy is a global leader in energy technology, with a commitment to sustainability and innovation. With a diverse team of over 100,000 employees worldwide, we are dedicated to meeting the energy demands of the future in a reliable and sustainable manner. By joining Siemens Energy, you will contribute to the development of energy systems that drive the energy transition and shape the future of electricity generation. Diversity and inclusion are at the core of Siemens Energy's values, celebrating uniqueness and creativity across over 130 nationalities. The company provides employees with benefits such as Medical Insurance and Meal Card options, supporting a healthy work-life balance and overall well-being. If you are ready to make a difference in the energy sector and be part of a global team committed to sustainable energy solutions, Siemens Energy offers a rewarding and impactful career opportunity.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred, and candidates with Databricks certification are preferred.

Responsibilities:
- Design, develop, and maintain data integration solutions using Databricks.
- Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes.
- Optimize and troubleshoot Databricks workflows and performance issues.
- Ensure data quality and integrity throughout the data lifecycle.
- Provide technical guidance and mentorship to junior developers.
- Stay updated with the latest industry trends and best practices in data integration and Databricks.

Required Qualifications:
- Bachelor's degree in computer science or equivalent.
- Minimum of 5 years of hands-on experience with Databricks.
- Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS).
- Well-versed in data warehousing and data lake concepts.
- Proficient in SQL and Python for data manipulation and analysis.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Excellent problem-solving skills.
- Strong communication and collaboration skills.

Preferred Qualifications:
- Certified Databricks Engineer.
- Experience in the life sciences domain.
- Familiarity with Reltio or similar MDM (Master Data Management) tools.
- Experience with data governance and data security best practices.

IQVIA is a leading global provider of clinical research services, commercial insights, and healthcare intelligence to the life sciences and healthcare industries.
They create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com.
Posted 1 month ago
5.0 - 15.0 years
0 Lacs
karnataka
On-site
The role of Talend Developer and Architect at our company involves designing, developing, testing, and deploying integration processes using Talend. Your responsibilities will include collaborating with team members to understand requirements, coding, debugging, and optimizing code for performance. You will also be involved in maintaining documentation for processes and contributing to technological improvement initiatives. As a Talend Developer and Architect, you will design and develop robust data integration solutions using Talend Studio to meet business requirements. You will also be responsible for implementing data governance frameworks and policies, configuring Talend Data Catalog, managing metadata repositories, data quality rules, and data dictionaries, and optimizing data pipelines for performance and scalability. To excel in this role, you should have a background in Computer Science, proficiency in Back-End Web Development and Software Development, strong programming skills with an emphasis on Object-Oriented Programming (OOP), and experience with ETL tools, particularly Talend. Excellent analytical and problem-solving skills, along with good communication and teamwork abilities, are essential. A Bachelor's degree in Computer Science, Information Technology, or a related field is required. You will work closely with data stewards, business analysts, data engineers, data scientists, and business stakeholders to understand and fulfill data integration requirements. If you are looking for a challenging opportunity to showcase your skills and contribute to the success of our organization, this role is perfect for you.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
The role requires you to understand the business functionalities and technical/database landscapes of the applications under i360. You will collaborate with Database Engineers and Data Analysts to comprehend the requirements and testing needs. Building and maintaining a test automation framework will be a crucial part of your responsibilities, including creating robust and maintainable test cases covering various ETL processes, database systems, and data analyses. Implementing quality engineering strategies, best practices, and guidelines to ensure scalability, reusability, and maintainability is an essential aspect of the role. As part of the position, you will be expected to identify, replicate, and report defects, as well as verify defect fixes. Data accuracy, completeness, and consistency will be validated using ETL tools and SQL queries. Being proactive, adaptable to changes, and possessing strong communication skills (both verbal and written) are key attributes for success in this role. Expertise in DB-related testing and ETL testing, along with strong Python programming skills and proficiency in SQL and libraries such as pandas and Great Expectations, is necessary. Knowledge of SQL and experience working with databases such as Redshift, Elasticsearch, OpenSearch, Postgres, and Snowflake is required. Additionally, familiarity with analyzing population data and demographics, version control using GitLab, pipeline integration, and working under pressure with strong attention to detail are essential qualities for this position. The role also calls for motivation to contribute, good verbal and written communication skills, mentorship and knowledge sharing, experience with Jira, knowledge of Agile methodology, and hands-on DevOps experience such as GitLab CI/CD pipelines. If you possess strong analytical, problem-solving, and troubleshooting skills and stay updated on current market trends, this position might be suitable for you.
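The accuracy, completeness, and consistency checks described above can be illustrated in a few lines of plain Python; in practice they would be expressed as pandas operations or Great Expectations expectations run against warehouse extracts. The column names here are invented for the example:

```python
# Toy extract standing in for rows pulled from a warehouse table.
rows = [
    {"id": 1, "state": "TN", "age": 34},
    {"id": 2, "state": "TN", "age": 41},
    {"id": 3, "state": "",   "age": 29},
]

def check_completeness(rows, column):
    """Fraction of rows with a non-empty value in `column`."""
    filled = sum(1 for r in rows if r[column] not in ("", None))
    return filled / len(rows)

def check_unique(rows, column):
    """True if `column` has no duplicate values (a primary-key check)."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

print(round(check_completeness(rows, "state"), 2))  # 0.67
print(check_unique(rows, "id"))  # True
```

A test framework would assert these results against agreed thresholds (e.g. completeness >= 0.99) and report failures as defects.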
This is a Contractual/Temporary job with a Day shift schedule and an in-person work location. To apply for this position, please send your resumes to gopi@nithminds.com.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
This is a full-time Data Engineer position with D Square Consulting Services Pvt Ltd, available Pan-India with a hybrid work model. You should have at least 5 years of experience and be able to join immediately. As a Data Engineer, you will be responsible for designing, building, and scaling data pipelines and backend services supporting analytics and business intelligence platforms. A strong technical foundation, Python expertise, API development experience, and familiarity with containerized CI/CD-driven workflows are essential for this role. Your key responsibilities will include designing, implementing, and optimizing data pipelines and ETL workflows using Python tools, building RESTful and/or GraphQL APIs, collaborating with cross-functional teams, containerizing data services with Docker, managing deployments with Kubernetes, developing CI/CD pipelines using GitHub Actions, ensuring code quality, and optimizing data access and transformation. The required skills and qualifications for this role include a Bachelor's or Master's degree in Computer Science or a related field, 5+ years of hands-on experience in data engineering or backend development, expert-level Python skills, experience building APIs with frameworks like FastAPI, Graphene, or Strawberry, proficiency in Docker, Kubernetes, SQL, and data modeling, good communication skills, familiarity with data orchestration tools, experience with streaming data platforms like Kafka or Spark, knowledge of data governance, security, and observability best practices, and exposure to cloud platforms like AWS, GCP, or Azure. If you are proactive, self-driven, and possess the required technical skills, then this Data Engineer position is an exciting opportunity for you to contribute to the development of cutting-edge data solutions at D Square Consulting Services Pvt Ltd.
Posted 1 month ago
8.0 - 12.0 years
19 - 22 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Description: Senior Data Architect (Contract)
Company: Emperen Technologies
Location: Remote - Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Type: Contract (8-12 Months)
Experience: 8-12 Years

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:
Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.
MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.
ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error handling mechanisms.
Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.
Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.
Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:
Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.
Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.
Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
NTT DATA is looking for a SQL Developer to join their team in Bangalore, Karnataka, India. As a SQL Developer, you will be responsible for developing complex queries, extracting data using ETL tools, and cleansing and validating both structured and unstructured data. Additionally, you will be involved in creating insurance reports, visualizations, dashboards, and conducting analysis and analytics with a focus on life insurance. The ideal candidate should have a strong proficiency in SQL along with knowledge of tools like EXL, R, and Python. NTT DATA is a global innovator of business and technology services with a commitment to helping clients innovate, optimize, and transform for long-term success. They serve 75% of the Fortune Global 100 and have experts in more than 50 countries. If you are a passionate individual with strong SQL skills and a background in life insurance, this is a great opportunity to be part of an inclusive and forward-thinking organization. Apply now to grow with NTT DATA and contribute to their mission of driving innovation and digital transformation.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
panchkula, haryana
On-site
We are seeking a skilled and experienced Lead/Senior ETL Engineer with 4-8 years of experience to join our dynamic data engineering team. As a Lead/Sr. ETL Engineer, you will play a crucial role in designing and developing high-performing ETL solutions, managing data pipelines, and ensuring seamless integration across systems. Your expertise in ETL tools, cloud platforms, scripting, and data modeling principles will be pivotal in building efficient, scalable, and reliable data solutions for enterprise-level implementations.

Key Skills:
- Proficiency in ETL tools such as SSIS, DataStage, Informatica, or Talend.
- In-depth understanding of Data Warehousing concepts, including Data Marts, Star/Snowflake schemas, and Fact & Dimension tables.
- Strong experience with relational databases like SQL Server, Oracle, Teradata, DB2, or MySQL.
- Solid scripting/programming skills in Python.
- Hands-on experience with cloud platforms like AWS or Azure.
- Knowledge of middleware architecture and enterprise data integration strategies.
- Familiarity with reporting/BI tools such as Tableau and Power BI.
- Ability to write and review high- and low-level design documents.
- Excellent communication skills and the ability to work effectively with cross-cultural, distributed teams.

Roles and Responsibilities:
- Design and develop ETL workflows and data integration strategies.
- Collaborate with cross-functional teams to deliver enterprise-grade middleware solutions.
- Coach and mentor junior engineers to support skill development and performance.
- Ensure timely delivery, escalate issues proactively, and manage QA and validation processes.
- Participate in planning, estimations, and recruitment activities.
- Work on multiple projects simultaneously, ensuring quality and consistency in delivery.
- Experience in Sales and Marketing data domains.
- Strong problem-solving abilities with a data-driven mindset.
- Ability to work independently and collaboratively in a fast-paced environment.
- Prior experience in global implementations and managing multi-location teams is a plus.

If you are a passionate Lead/Sr. ETL Engineer looking to make a significant impact in a dynamic environment, we encourage you to apply for this exciting opportunity. Thank you for considering a career with us. We look forward to receiving your application! For further inquiries, please contact us at careers@grazitti.com.

Location: Panchkula, India
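The Star-schema concepts listed under Key Skills (Fact and Dimension tables) can be shown with a toy example. This is a hedged sketch using SQLite with invented table and column names, not a production warehouse design:

```python
import sqlite3

# One dimension table keyed by a surrogate id, one fact table holding a
# measure plus the dimension's foreign key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (product_key INTEGER REFERENCES dim_product, amount REAL);
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "Widget"), (2, "Gadget")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [(1, 10.0), (1, 5.0), (2, 7.5)])

# Typical analytical query: join facts to the dimension and aggregate.
result = dict(conn.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.name
""").fetchall())
print(result)  # {'Gadget': 7.5, 'Widget': 15.0}
```

Keeping descriptive attributes in the dimension and only keys plus measures in the fact table is what lets the fact table grow large while queries stay simple.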
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
At Improzo, we are dedicated to improving life by empowering our customers through quality-led commercial analytical solutions. Our team of experts in commercial data, technology, and operations collaborates to shape the future and work with leading Life Sciences clients. We prioritize customer success and outcomes, embrace agility and innovation, foster respect and collaboration, and are laser-focused on quality-led execution. As a Data and Reporting Developer (Improzo Level - Associate) at Improzo, you will play a crucial role in designing, developing, and maintaining large-scale data processing systems using big data technologies. You will collaborate with data architects and stakeholders to implement data storage solutions, develop ETL pipelines, integrate various data sources, design and build reports, optimize performance, and ensure seamless data flow.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and big data applications using distributed processing frameworks.
- Collaborate on data architecture, storage solutions, ETL pipelines, data lakes, and data warehousing.
- Integrate data sources into the big data ecosystem while maintaining data quality.
- Design and build reports using tools like Power BI, Tableau, and MicroStrategy.
- Optimize workflows and queries for high performance and scalability.
- Collaborate with cross-functional teams to deliver data solutions that meet business requirements.
- Perform testing, quality assurance, and documentation of data pipelines.
- Participate in agile development processes and stay up-to-date with big data technologies.

Qualifications:
- Bachelor's or master's degree in a quantitative field.
- 1.5+ years of experience in data management or reporting projects with big data technologies.
- Hands-on experience or thorough training in AWS, Azure, GCP, Databricks, and Spark.
- Experience in a Pharma Commercial setting or Pharma data management is advantageous.
- Proficiency in Python, SQL, MDM, Tableau, Power BI, and other tools.
- Excellent communication, presentation, and interpersonal skills.
- Attention to detail, quality, and client centricity.
- Ability to work independently and as part of a cross-functional team.

Benefits:
- Competitive salary and benefits package.
- Opportunity to work on cutting-edge tech projects in the life sciences industry.
- Collaborative and supportive work environment.
- Opportunities for professional development and growth.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Salesforce Lightning Developer at Adita Technologies, you will be part of an exciting project based in Australia & USA. Adita is a proud member of the Ayan Group, an Australian conglomerate headquartered in Sydney, Australia. We are currently seeking experienced Salesforce Lightning Developers to join our team in Delhi NCR / Chennai. The ideal candidate should have at least 4+ years of experience and possess expertise in Lightning, Sales Cloud, Service Cloud, and Force.com.

Title: Salesforce Lightning Developer
Location: Noida, Delhi NCR / Chennai
Type: Permanent, full time
Experience: 4+ Years

In this role, you will be expected to:
- Have a strong background in CRM platforms, either Functional or Technical, with a focus on Salesforce Lightning components.
- Demonstrate proficiency in Sales Cloud, Service Cloud, and the Lightning Platform.
- Utilize your expertise in Lightning development, the Aura framework, Apex, Visualforce, JavaScript, jQuery, and AngularJS.
- Showcase expert knowledge of Salesforce.com's Web services (SOAP, REST) and experience developing custom integration processes using Salesforce.com's Web Services API.
- Work with ETL tools like Starfish ETL, Talend Open Studio, Apex Data Loader, and Pervasive.
- Possess familiarity with integrated development environments such as Eclipse.
- Exhibit excellent oral and written communication skills, customer interaction skills, and the ability to work collaboratively in a team.
- Be self-motivated, analytical, and driven to overcome challenges.

Please be aware that only shortlisted candidates will be contacted for this role. We appreciate your interest in joining our team.

Role: Technical Developer
Industry Type: IT-Software, Software Services
Functional Area: IT Software Application Programming, Maintenance
Employment Type: Full Time
Role Category: System Design/Implementation/ERP/CRM
Education:
- UG: Graduation Not Required, Any Graduate in Any Specialization
- PG: Post Graduation Not Required, Any Postgraduate in Any Specialization
- Doctorate: Any Doctorate in Any Specialization, Doctorate Not Required
Key Skills: salesforce crm, solution consultants, solution specialists, Solution Design, solution designer, CRM
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
navi mumbai, maharashtra
On-site
You will be responsible for developing, configuring, and maintaining SAS Fraud Management (SFM) solutions to detect and prevent fraud effectively. This includes integrating SAS SFM with internal and external systems to ensure seamless data flow and analysis. Your role will involve designing, implementing, and fine-tuning fraud detection models to accurately identify fraudulent transactions. Customizing rules, alerts, and workflows within SAS SFM according to organizational requirements is a crucial aspect of your responsibilities. You will analyze large datasets to identify fraud patterns and trends, generating accurate and actionable insights. Monitoring system performance and optimizing SAS SFM applications for efficiency and scalability will be part of your daily tasks. Thorough testing, including unit, system, and UAT, must be conducted to ensure that solutions align with business needs. Adherence to regulatory requirements and organizational standards is essential in all SFM implementations. Collaboration with business, IT teams, and stakeholders is necessary to understand fraud management needs and deliver effective solutions. Your skills in SAS technologies, especially SAS Fraud Management (SFM), will be utilized to the fullest. Proficiency in Base SAS, SAS Macros, SAS Enterprise Guide, and SAS Visual Analytics is required. Experience in data integration using ETL tools, knowledge of database systems (SQL/Oracle/PostgreSQL), and advanced query skills are essential for this role. A strong understanding of fraud detection methodologies, rules, and algorithms will be beneficial. As an IT Graduate (B.E. (IT), B.Sc. (IT), B.Tech, BCA, MCA, M.Tech), you will be expected to create and maintain detailed technical and functional documentation for all SFM implementations. The posted date for this project is January 20, 2025. This is a permanent position requiring an experienced professional with a background in project management.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You are a detail-oriented and analytical Guidewire PolicyCenter Conversion Data Analyst responsible for supporting data migration initiatives from legacy policy administration systems to Guidewire PolicyCenter. Your role is crucial in ensuring data quality, integrity, and alignment with business and regulatory requirements throughout the conversion lifecycle.

Key responsibilities:
- Analyzing and interpreting legacy data to facilitate its transformation and migration into Guidewire PolicyCenter
- Collaborating with business stakeholders and technical teams to define data mapping, transformation rules, and conversion logic
- Working closely with developers to create, validate, and refine ETL (Extract, Transform, Load) scripts and data models
- Developing and executing data validation, reconciliation, and audit checks to ensure successful conversion
- Identifying and resolving data quality issues, discrepancies, or gaps in source and target systems
- Documenting data dictionaries, mapping specifications, conversion strategies, and post-migration reporting
- Performing mock conversions, supporting dry runs, and participating in production cutover activities
- Assisting QA teams with test planning and conversion-related test cases
- Providing support during UAT (User Acceptance Testing) and post-go-live stabilization
- Ensuring compliance with data privacy, security, and regulatory requirements

Qualifications:
- At least 3 years of experience in data conversion or data migration, with a minimum of 2 years on Guidewire PolicyCenter projects
- Proficiency in SQL for data extraction, analysis, and transformation
- A solid understanding of the Guidewire PolicyCenter data model and core configuration concepts
- Familiarity with ETL tools (e.g., Informatica, Talend, SSIS) and data integration pipelines
- Experience working with legacy policy admin systems (e.g., AS/400, Mainframe, or other proprietary platforms) is beneficial
- Strong analytical skills, attention to detail, and the ability to work with large data sets
- Effective communication skills, especially in explaining technical details to non-technical stakeholders
- Experience in Agile environments and proficiency in tools like JIRA, Confluence, Excel, and Visio

Preferred skills:
- Experience with Guidewire Cloud or PolicyCenter 10.x
- Prior experience with personal or commercial lines of insurance
- Knowledge of Gosu, XML, JSON, or API-based data handling
- Guidewire certifications in PolicyCenter or DataHub (a strong advantage)
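The validation and reconciliation checks described in this role typically start with row-count comparisons and key-level matching between the legacy source and the PolicyCenter target. A minimal sketch of that idea, using SQLite as a stand-in (the table and column names here are illustrative assumptions, not actual Guidewire structures):

```python
import sqlite3

# In-memory stand-ins for a legacy source table and a PolicyCenter staging target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE legacy_policy (policy_no TEXT, premium REAL);
CREATE TABLE pc_policy (policy_no TEXT, premium REAL);
INSERT INTO legacy_policy VALUES ('P-001', 1200.0), ('P-002', 850.0), ('P-003', 430.0);
INSERT INTO pc_policy     VALUES ('P-001', 1200.0), ('P-002', 850.0);
""")

def reconcile(conn, source, target, key):
    """Compare row counts and report keys present in the source but missing in the target."""
    src_count = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    missing = [row[0] for row in conn.execute(
        f"SELECT s.{key} FROM {source} s LEFT JOIN {target} t ON s.{key} = t.{key} "
        f"WHERE t.{key} IS NULL")]
    return {"source_rows": src_count, "target_rows": tgt_count, "missing_keys": missing}

result = reconcile(conn, "legacy_policy", "pc_policy", "policy_no")
```

In practice the same pattern extends to sum-of-premium checks and column-level hash comparisons, but count-and-missing-key reports are usually the first audit artifact produced after each mock conversion.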
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Tech Lead at Carelon Global Solutions India, your primary responsibility will be to define solution architecture for applications for an OBA in alignment with enterprise standards and policies. You will serve as a technical subject matter expert for multiple technologies, ensuring adherence to code standards and policies while supporting various application development projects.

The ideal candidate for this role should have a BE/MCA qualification with over 10 years of IT experience, including at least 5 years of in-depth knowledge of Elevance Health applications/platforms such as WGS, Facets, SPS, data platforms, Member/Provider communications (Sydney/Solution Central), and Carelon services (Carelon BH, Carelon RX, etc.). You should possess a good understanding of ETL tools, database concepts, data modeling, ETL best practices, multi-cloud environments (AWS, Azure, GCP), data security protocols, ERP/CRM tools, and integration technologies such as API management, SOA, microservices, and Kafka topics. Knowledge of EA architecture guidelines and principles will be beneficial for this role.

At Carelon Global Solutions, we believe in offering limitless opportunities to our associates, fostering an environment that promotes growth, well-being, and a sense of purpose and belonging. Our focus on learning and development, innovative culture, comprehensive rewards, competitive health insurance, and employee-centric policies make Life @ Carelon enriching and fulfilling. We are an equal opportunity employer committed to diversity and inclusion, and we provide reasonable accommodations to ensure a supportive work environment for all. If you require assistance due to a disability, please request the Reasonable Accommodation Request Form.

Join us on this exciting journey at Carelon Global Solutions and be a part of our mission to simplify healthcare and improve lives and communities.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
As a Technical Consultant / Technical Architect with Fund Accounting experience and proficiency in Oracle and Informatica, your primary responsibility will be to collaborate with Delivery Managers, System/Business Analysts, and other subject matter experts to understand project requirements. Your role will involve designing solutions, providing effort estimates for new projects/proposals, and developing technical specifications and unit test cases for the interfaces under development. You will be expected to establish and implement standards, procedures, and best practices for data maintenance, reconciliation, and exception management. Your technical leadership skills will be crucial in proposing solutions, estimating projects, and guiding/mentoring junior team members in developing solutions on the GFDR platform.

Key Requirements:
- 10-12 years of experience in technical leadership within data warehousing and business intelligence
- Proficiency in Oracle SQL/PLSQL and stored procedures
- Familiarity with source control tools, preferably ClearCase
- Sound understanding of Data Warehouse, Datamart, and ODS concepts
- Experience in UNIX and Perl scripting
- Proficiency in standard ETL tools such as Informatica PowerCenter
- Technical leadership in Eagle, Oracle, UNIX scripting, Perl, and job scheduling tools like Autosys/Control
- Strong knowledge of data modeling, data normalization, and performance optimization techniques
- Exposure to fund accounting concepts/systems and master data management is desirable
- Ability to work collaboratively with cross-functional teams and provide guidance to junior team members
- Excellent interpersonal and communication skills
- Willingness to work in both development and production support activities

Industry: IT/Computers-Software
Role: Technical Architect
Key Skills: Oracle, PL/SQL, Informatica, Autosys/Control, Fund Accounting, Eagle
Education: B.E/B.Tech
Email ID: jobs@augustainfotech.com

If you meet the specified requirements and are passionate about delivering innovative solutions in a collaborative environment, we encourage you to apply for this exciting opportunity.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
You will be reporting to the Senior Director of SaaS and will be responsible for customizing, developing, and supporting solutions on the Salesforce and ServiceNow platforms. The ideal candidate should possess a strong understanding of the Salesforce.com and ServiceNow platforms, with basic to intermediate knowledge of integrations and security. An interest in and ability to understand problems, design solutions, and execute critical paths is essential, as are exceptional technical, analytical, and problem-solving skills and the ability to interact with all levels of the organization. We are looking for a self-starter with a proactive approach who can identify and suggest process improvements.

Your responsibilities will include:
- Daily administration of the ServiceNow system, such as implementing approved changes to forms, tables, reports, and workflows
- Creating and customizing reports, homepages, and dashboards in ServiceNow
- Keeping the ServiceNow platform up-to-date by testing and installing updates, patches, and new releases
- Designing and developing advanced ServiceNow customizations
- Troubleshooting multiple integrations with ServiceNow and Rally
- Managing ServiceNow security by overseeing roles and access control lists
- Providing training to personnel on ServiceNow usage and processes, including creating supporting documentation
- Collaborating with end-users to resolve support issues within ServiceNow
- Conducting code reviews and developing, configuring, testing, and deploying solutions on the Salesforce platform
- Configuring, designing, ensuring functionality, and providing end-user support on the Force.com platform
- Implementing solutions in an agile environment, delivering high-quality code and configurations
- Managing workflows, process builders, assignment rules, email templates, and other features
- Handling data imports and exports, customizing objects, fields, reports, and 3rd-party apps
- Managing users, profiles, permission sets, security, and other administrative tasks
- Leading testing of various functionalities, creating test data and test plans, and conducting feature testing
- Demonstrating solutions to users, providing training, and documenting as necessary
- Offering ongoing support and system administration to quickly resolve production issues
- Mapping functional requirements to Salesforce.com features and functionalities
- Implementing change control and best practices for system maintenance, configuration, development, testing, and data integrity
- Utilizing hands-on experience with Sales Cloud, ServiceNow, and Salesforce Community
- Applying a programming background to develop custom code using Visualforce, Apex, Lightning, and JavaScript as needed
- Knowing when to use out-of-the-box functionality versus custom code

We are looking for candidates who have:
- Excellent listening, analytical, organizational, and time management skills
- Strong written and oral communication skills, demonstrating diplomacy and professionalism
- A strong work ethic, a customer service mentality, and the ability to work under pressure
- A team player mindset with the ability to work cross-functionally, be self-driven, and stay motivated
- The capability to work independently, lead projects of moderate complexity, and identify areas for process improvement
- Creativity, problem-solving skills, and the ability to develop effective relationships with various stakeholders
- Prioritization skills to meet deadlines in a fast-paced environment and embrace change
- A Bachelor's degree in Computer Science or a related technical field, or equivalent experience
- 3+ years of hands-on experience developing on Salesforce and ServiceNow
- Proficiency in Salesforce and ServiceNow programmatic features
- The ability to dig into data, surface actionable insights, and demonstrate sound judgment and decision-making skills
- Experience with Data Loader and other data loading tools, MS Excel, and database modeling
- Additional knowledge of ServiceNow, Community, CPQ, Marketo, and other integrations (a plus)
- Proactivity and the ability to manage changing priorities and workload efficiently

Additional Information:
- The recruitment process includes online assessments, which will be sent via email
- The position is based in the Pune office.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Engineer, you will be responsible for designing, building, and maintaining scalable ETL pipelines using Java and SQL-based frameworks. Your role involves extracting data from various structured and unstructured sources, transforming it into formats suitable for analytics and reporting, and collaborating with data scientists, analysts, and business stakeholders to gather data requirements and optimize data delivery. Additionally, you will develop and maintain data models, databases, and data integration solutions, while monitoring data pipelines and troubleshooting data issues to ensure data quality and integrity.

Your expertise in Java for backend/ETL development and proficiency in SQL for data manipulation, querying, and performance tuning will be crucial in this role. You should have hands-on experience with ETL tools such as Apache NiFi, Talend, Informatica, or custom-built ETL pipelines, along with familiarity with relational databases like PostgreSQL, MySQL, and Oracle, and with data warehousing concepts. Experience with version control systems like Git is also required.

Furthermore, you will be responsible for optimizing data flow and pipeline architecture for performance and scalability, documenting data flow diagrams, ETL processes, and technical specifications, and ensuring adherence to security, governance, and compliance standards related to data.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field, along with at least 5 years of professional experience as a Data Engineer or in a similar role. Your strong technical skills and practical experience in data engineering will be essential in successfully fulfilling the responsibilities of this role.
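The extract-transform-load flow this role centers on reduces to three composable steps, whatever the tooling. A minimal sketch in Python (the listing asks for Java, but the shape is the same; the CSV layout and column names here are assumptions for illustration):

```python
import csv
import io

# Extract: read raw records from a CSV source (an in-memory sample here;
# a real pipeline would pull from files, APIs, or database tables).
RAW = "order_id,amount\n1,100.5\n2,\n3,75.0\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

# Transform: drop rows with missing amounts and normalize types,
# producing records fit for analytics and reporting.
def transform(rows):
    out = []
    for r in rows:
        if r["amount"]:
            out.append({"order_id": int(r["order_id"]), "amount": float(r["amount"])})
    return out

# Load: collect into a list here; a real pipeline would write to a warehouse table.
def load(rows, sink):
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(transform(extract(RAW)), sink)
```

Keeping the three stages as separate functions is what makes a pipeline testable and monitorable: each stage can be validated in isolation, and data-quality checks slot naturally between transform and load.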
Posted 1 month ago
0.0 - 3.0 years
0 Lacs
karnataka
On-site
You are invited to apply for the position of Software Engineer - Salesforce, based in Bellandur, Bangalore. This is a fantastic opportunity to join our team. Here are the key details:

Location: Bellandur, Bangalore
Work Schedule: 5 days a week
Experience Range: 0-3 years (freshers can also apply)

As a Software Engineer - Salesforce, you will need the following technical skills:
- Proficiency in frontend development using JavaScript and LWC
- Expertise in backend development using Apex, Flows, and Async Apex
- Familiarity with database concepts: SOQL, SOSL, and SQL
- Hands-on experience in API integration using SOAP, REST API, and GraphQL
- Knowledge of ETL tools, data migration, and data governance
- Proficiency in Apex design patterns, integration patterns, and the Apex testing framework
- Experience in agile development using CI/CD tools like Azure DevOps, GitLab, and Bitbucket
- Working knowledge of programming languages such as Java, Python, or C++ with a solid understanding of data structures is preferred

In addition to the technical skills, the following qualifications are required:
- Bachelor's degree in engineering
- Experience in developing with India Stack
- Background in the fintech or banking domain

If you meet the specified requirements and are prepared to embrace a new challenge, we are excited to review your application. To express your interest, please send your resume to pooja@jmsadvisory.in

Job types: Full-time, Fresher, Contractual / Temporary
Contract length: 6-12 months

Benefits include:
- Health insurance
- Paid sick time
- Provident Fund

Work Schedule:
- Day shift
- Fixed shift
- Monday to Friday
- Morning shift
- Weekend availability

Work Location: In person

We look forward to potentially welcoming you to our team!
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
kochi, kerala
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice offers integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way, we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

We're looking for a candidate with 10-12 years of expertise in data science, data analysis, and visualization, who will act as a Technical Lead to a larger team in the EY GDS DnA team working on various Data and Analytics projects.
Your key responsibilities include:
- Understanding the insurance domain (P&C, life, or both)
- Developing conceptual, logical, and physical data models and implementing RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL)
- Overseeing and governing the expansion of the existing data architecture and optimizing data query performance via best practices
- Working independently and collaboratively
- Implementing business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
- Working with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Defining and governing data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identifying the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs
- Working proactively and independently to address project requirements and articulating issues/challenges to reduce project delivery risks
Skills and attributes for success:
- Strong communication, presentation, and team-building skills
- Experience in executing and managing research and analysis of companies and markets
- BE/BTech/MCA/MBA with 8-12 years of industry experience in machine learning, visualization, data science, and related offerings
- At least 4-8 years of experience in BI and Analytics
- Ability to deliver end-to-end data solutions covering analysis, mapping, profiling, ETL architecture, and data modeling
- Knowledge and experience of at least one insurance domain engagement (Life or Property and Casualty)
- Good experience using CA Erwin or other similar modeling tools
- Strong knowledge of relational and dimensional data modeling concepts
- Experience in data management analysis
- Experience with unstructured data is an added advantage
- Ability to effectively visualize and communicate analysis results
- Experience with big data and cloud preferred
- Experience, interest, and adaptability to working in an Agile delivery environment

Ideally, you'll also have:
- Good exposure to ETL tools
- Knowledge of P&C insurance
- Experience leading a team of at least 4 members
- Experience in the insurance and banking domains
- Prior client-facing skills; self-motivated and collaborative

What we look for: A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects.
Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from engaging colleagues
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
maharashtra
On-site
As a Data Migration Developer, you will play a crucial role in managing the integration and migration of FIS Commercial Lending Solution (CLS) as part of a strategic transformation initiative for the Back Office. Your main responsibilities will include contributing to migration activities from legacy lending platforms to FIS CLS, collaborating with business analysts to implement tools for data collection, preparation, mapping, and loading, testing migration loading to CLS in a controlled environment, and supporting end-to-end dress rehearsals and production migration.

To excel in this role, you should have a minimum of 8 years of relevant experience in software engineering, with at least 4 years in ETL tools, ideally including data migration of financial data. The ideal candidate will be an experienced technology professional with a proven track record of solving complex technical challenges.

Your technical skills should encompass:
- Data migration and ETL, with strong experience in end-to-end data migration processes
- Hands-on experience with ETL tools such as Informatica, Talend, and SSIS
- A strong command of SQL
- System integration and API work
- Testing and quality assurance
- Technical documentation and change management
- DevOps and automation
- Security and compliance

Functional skills such as knowledge of the commercial loan lifecycle or basic banking experience, strong hands-on experience with FIS Commercial Lending Solutions, experience with banking core systems and integrations, and a good understanding of the SDLC and Agile Scrum practices, along with soft skills including leadership, problem-solving, communication, and collaboration, will be essential for success in this role.

In summary, as a Data Migration Developer, you will be at the forefront of ensuring a seamless integration and migration process for FIS CLS, leveraging your technical expertise, problem-solving skills, and collaborative approach to drive the success of this strategic transformation initiative for the Back Office.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
navi mumbai, maharashtra
On-site
You will be responsible for developing, configuring, and maintaining SAS Fraud Management (SFM) solutions to detect and prevent fraud. Your role will involve integrating SAS SFM with internal and external systems to ensure seamless data flow and analysis. Additionally, you will design, implement, and fine-tune fraud detection models to effectively identify fraudulent transactions. Customizing rules, alerts, and workflows within SAS SFM to align with organizational requirements will be a key aspect of your responsibilities.

Analyzing large datasets to identify fraud patterns and trends, ensuring accurate and actionable insights, is a crucial part of this role. You will also be required to monitor system performance and optimize SAS SFM applications for efficiency and scalability. You will conduct thorough testing, including unit, system, and UAT, to ensure that the solutions meet business needs and comply with regulatory requirements and organizational standards. Moreover, creating and maintaining detailed technical and functional documentation for all SFM implementations will be part of your routine tasks. Collaborating closely with business, IT teams, and stakeholders to understand fraud management needs and deliver appropriate solutions is another key responsibility.

Skills Required:
- Proficiency in SAS technologies, especially SAS Fraud Management (SFM)
- Strong command of Base SAS, SAS Macros, SAS Enterprise Guide, and SAS Visual Analytics
- Experience in data integration utilizing ETL tools and methods
- Knowledge of database systems such as SQL, Oracle, and PostgreSQL, along with advanced query skills
- Solid understanding of fraud detection methodologies, rules, and algorithms

Educational Qualification:
- IT graduates (B.E. (IT), B.Sc. (IT), B.Tech, BCA, MCA, M.Tech)

This is a permanent position requiring an experienced individual in the field. The role was posted on January 20, 2025.
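The rule customization this role describes boils down to scored predicates over transaction attributes: each rule fires or not, fired scores accumulate, and the total drives the alert decision. A SAS-agnostic illustration of that pattern in Python (the rule names, scores, threshold, and transaction fields are all invented for the example, not SFM specifics):

```python
# Each rule is (name, predicate, score); a transaction's total score drives the alert.
RULES = [
    ("high_amount",  lambda t: t["amount"] > 10_000,              40),
    ("foreign_card", lambda t: t["country"] != t["home_country"], 25),
    ("rapid_repeat", lambda t: t["tx_last_hour"] >= 5,            35),
]

ALERT_THRESHOLD = 60  # total score at or above this raises an alert

def score_transaction(tx):
    """Evaluate every rule against a transaction and aggregate the fired scores."""
    fired = [(name, score) for name, pred, score in RULES if pred(tx)]
    total = sum(score for _, score in fired)
    return {"score": total, "fired": [name for name, _ in fired],
            "alert": total >= ALERT_THRESHOLD}

tx = {"amount": 12_500, "country": "FR", "home_country": "IN", "tx_last_hour": 6}
result = score_transaction(tx)
```

Keeping rules as data rather than branching code is what makes fine-tuning practical: thresholds and scores can be adjusted, and individual rules added or retired, without touching the evaluation loop.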
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You have a great opportunity to join our team as a Financial Systems Developer with a focus on maintaining and enhancing Planning Analytics (TM1) models. In this role, you will also be involved in testing, monitoring performance, addressing user support issues, and building financial and managerial reports using business intelligence reporting tools. We are looking for someone with knowledge of databases, ETL tools, and a basic understanding of business and financial processes. Experience in media and advertising is considered a plus.

As a Financial Systems Developer, your main responsibilities will include building and maintaining TM1 rules, TurboIntegrator processes, cubes, dimensions, and automation of data loads. You will update and troubleshoot TM1 security models, collaborate with our Planning Analytics Systems Manager, and work with business users and system administrators to develop and refine solutions. Additionally, you will assist in integrating Planning Analytics as a data source for business analytics reporting, maintain data models, test and validate results, act as the first line of support for user issues, and monitor system performance.

To qualify for this role, you should have experience as a hands-on developer of TM1 (Planning Analytics) and intermediate to advanced Excel knowledge; familiarity with Tableau/Power BI is a plus. You should possess solid business judgment, an understanding of finance and financial processes, and experience with databases and ETL tools, along with the ability to compare and validate data sets, absorb and present complex ideas quickly and accurately, gather and analyze end-user requirements, and adhere to tight deadlines.

If you are looking to be part of a dynamic team in the media and advertising industry, this is the perfect opportunity for you. Join us in our mission to provide stellar products and services in the areas of Creative Services, Technology, Marketing Science, Business Support Services, Market Research, and Media Services.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Tech Lead at Carelon Global Solutions India, you will play a crucial role in defining the solution architecture for applications while ensuring alignment with enterprise standards and strategic direction. Your responsibilities will involve serving as a technical subject matter expert for multiple technologies, adhering to code standards and policies, and supporting various application development projects.

To excel in this role, you will need a BE/MCA qualification along with over 10 years of IT experience, including a deep understanding of Elevance Health applications/platforms such as WGS, Facets, SPS, data platforms, and Member/Provider communications. Additionally, familiarity with ETL tools, database concepts, data modeling, multi-cloud environments (AWS, Azure, GCP), data security protocols, ERP/CRM tools, and integration technologies like API management, SOA, microservices, and Kafka topics will be essential.

At Carelon Global Solutions, we are committed to improving lives and communities by simplifying healthcare. Our values of Leadership, Community, Integrity, Agility, and Diversity guide us in achieving our mission. Joining our team means entering a world of limitless opportunities where growth, well-being, purpose, and belonging are nurtured. We offer extensive learning and development opportunities, foster an innovative and creative culture, prioritize holistic well-being, and provide competitive health and medical insurance coverage. As an equal opportunity employer, we celebrate diversity and inclusivity in our workforce and work styles. If you require any accommodations due to a disability, we encourage you to request the Reasonable Accommodation Request Form during the application process.

Join us at Carelon, where your potential is limitless and your contributions make a meaningful impact on the healthcare industry.
Posted 1 month ago
3.0 - 8.0 years
9 - 14 Lacs
Gurugram
Remote
Healthcare experience is mandatory.

Position Overview:
We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills with the ability to work with ambiguous requirements
- Excellent communication skills with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
Posted 1 month ago