
157 ETL Tools Jobs - Page 3

Set Up a Job Alert
JobPe aggregates listings so they are easy to find, but applications are submitted directly on the original job portal.

4.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Salesforce Lightning Developer at Adita Technologies, you will be part of an exciting project based in Australia and the USA. Adita is a proud member of the Ayan Group, an Australian conglomerate headquartered in Sydney, Australia. We are currently seeking experienced Salesforce Lightning Developers to join our team in Delhi NCR / Chennai. The ideal candidate should have at least 4 years of experience and expertise in Lightning, Sales Cloud, Service Cloud, and Force.com.

Title: Salesforce Lightning Developer
Location: Noida, Delhi NCR / Chennai
Type: Permanent, full time
Experience: 4+ years

In this role, you will be expected to:
- Have a strong background in CRM platforms, either functional or technical, with a focus on Salesforce Lightning components.
- Demonstrate proficiency in Sales Cloud, Service Cloud, and the Lightning Platform.
- Utilize your expertise in Lightning development, the Aura framework, Apex, Visualforce, JavaScript, jQuery, and AngularJS.
- Showcase expert knowledge of Salesforce.com's web services (SOAP and REST) and experience developing custom integration processes using Salesforce.com's Web Services API.
- Work with ETL tools such as Starfish ETL, Talend Open Studio, Apex Data Loader, and Pervasive.
- Possess familiarity with integrated development environments such as Eclipse.
- Exhibit excellent oral and written communication skills, customer interaction skills, and the ability to work collaboratively in a team.
- Be self-motivated, analytical, and driven to overcome challenges.

Please be aware that only shortlisted candidates will be contacted for this role. We appreciate your interest in joining our team.

Role: Technical Developer
Industry Type: IT-Software, Software Services
Functional Area: IT Software Application Programming, Maintenance
Employment Type: Full Time
Role Category: System Design/Implementation/ERP/CRM
Education:
- UG: Graduation Not Required, Any Graduate in Any Specialization
- PG: Post Graduation Not Required, Any Postgraduate in Any Specialization
- Doctorate: Any Doctorate in Any Specialization, Doctorate Not Required
Key Skills: Salesforce CRM, solution consultants, solution specialists, solution design, solution designer, CRM

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

navi mumbai, maharashtra

On-site

You will be responsible for developing, configuring, and maintaining SAS Fraud Management (SFM) solutions to detect and prevent fraud effectively. This includes integrating SAS SFM with internal and external systems to ensure seamless data flow and analysis. Your role will involve designing, implementing, and fine-tuning fraud detection models to accurately identify fraudulent transactions. Customizing rules, alerts, and workflows within SAS SFM according to organizational requirements is a crucial aspect of your responsibilities. You will analyze large datasets to identify fraud patterns and trends, generating accurate and actionable insights. Monitoring system performance and optimizing SAS SFM applications for efficiency and scalability will be part of your daily tasks. Thorough testing, including unit, system, and UAT, must be conducted to ensure that solutions align with business needs. Adherence to regulatory requirements and organizational standards is essential in all SFM implementations. Collaboration with business, IT teams, and stakeholders is necessary to understand fraud management needs and deliver effective solutions. Your skills in SAS technologies, especially SAS Fraud Management (SFM), will be utilized to the fullest. Proficiency in Base SAS, SAS Macros, SAS Enterprise Guide, and SAS Visual Analytics is required. Experience in data integration using ETL tools, knowledge of database systems (SQL/Oracle/PostgreSQL), and advanced query skills are essential for this role. A strong understanding of fraud detection methodologies, rules, and algorithms will be beneficial. As an IT Graduate (B.E. (IT), B.Sc.(IT), B. Tech, BCA, MCA, M.Tech), you will be expected to create and maintain detailed technical and functional documentation for all SFM implementations. The posted date for this project is January 20, 2025. This is a permanent position requiring an experienced professional with a background in project management.,

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

You are a detail-oriented and analytical Guidewire PolicyCenter Conversion Data Analyst responsible for supporting data migration initiatives from legacy policy administration systems to Guidewire PolicyCenter. Your role is crucial in ensuring data quality, integrity, and alignment with business and regulatory requirements throughout the conversion lifecycle.

Your key responsibilities include analyzing and interpreting legacy data to facilitate its transformation and migration into Guidewire PolicyCenter. You will collaborate with business stakeholders and technical teams to define data mapping, transformation rules, and conversion logic. Working closely with developers, you will create, validate, and refine ETL (Extract, Transform, Load) scripts and data models. Additionally, you will develop and execute data validation, reconciliation, and audit checks to ensure successful conversion. Identifying and resolving data quality issues, discrepancies, or gaps in source and target systems is also part of your role. Documenting data dictionaries, mapping specifications, conversion strategies, and post-migration reporting is essential. You will perform mock conversions, support dry runs, and participate in production cutover activities. Assisting QA teams with test planning and conversion-related test cases, providing support during UAT (User Acceptance Testing) and post-go-live stabilization, and ensuring compliance with data privacy, security, and regulatory requirements are also within your purview.

To qualify for this role, you should have at least 3 years of experience in data conversion or data migration, with a minimum of 2 years in Guidewire PolicyCenter projects. Proficiency in SQL for data extraction, analysis, and transformation is required. A solid understanding of the Guidewire PolicyCenter data model and core configuration concepts is essential. Familiarity with ETL tools (e.g., Informatica, Talend, SSIS) and data integration pipelines is preferred. Experience working with legacy policy admin systems (e.g., AS/400, Mainframe, or other proprietary platforms) is beneficial. Strong analytical skills, attention to detail, and the ability to work with large data sets are necessary. Effective communication skills, especially in explaining technical details to non-technical stakeholders, are important. Experience in Agile environments and proficiency in tools like JIRA, Confluence, Excel, and Visio are advantageous.

Preferred skills for this role include experience with Guidewire Cloud or PolicyCenter 10.x, prior experience with personal or commercial lines of insurance, knowledge of Gosu, XML, JSON, or API-based data handling, and Guidewire certifications in PolicyCenter or DataHub, which are considered a strong advantage.
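For context on the kind of validation and reconciliation checks described above, here is a minimal sketch assuming hypothetical table and column names (legacy_policy, pc_policy_staging) and two generic DB-API connections; it is illustrative only and not part of the posting:

```python
# Illustrative source-vs-target reconciliation for a policy conversion.
# Table and column names are hypothetical placeholders.
def reconcile_counts(source_conn, target_conn) -> dict:
    """Compare policy counts and total premium between legacy and staging."""
    src_cur = source_conn.cursor()
    src_cur.execute(
        "SELECT COUNT(*), COALESCE(SUM(annual_premium), 0) FROM legacy_policy"
    )
    src_count, src_premium = src_cur.fetchone()

    tgt_cur = target_conn.cursor()
    tgt_cur.execute(
        "SELECT COUNT(*), COALESCE(SUM(total_premium_amt), 0) FROM pc_policy_staging"
    )
    tgt_count, tgt_premium = tgt_cur.fetchone()

    return {
        "source_policies": src_count,
        "target_policies": tgt_count,
        "policy_count_diff": src_count - tgt_count,
        "premium_diff": src_premium - tgt_premium,
    }
```

A real conversion run would extend this with field-level checksums and per-segment breakdowns, and log the results as part of the mock-conversion audit trail.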

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

As a Tech Lead at Carelon Global Solutions India, your primary responsibility will be to define solution architecture for applications for an OBA in alignment with enterprise standards and policies. You will serve as a technical subject matter expert for multiple technologies, ensuring adherence to code standards and policies while supporting various application development projects. The ideal candidate for this role should have a BE/MCA qualification with over 10 years of IT experience, including at least 5 years of in-depth knowledge of Elevance Health applications/platforms such as WGS, Facets, SPS, data platforms, Member/Provider communications (Sydney/Solution central), Carelon services (Carelon BH/Carelon RX, etc.). You should possess a good understanding of ETL tools, database concepts, data modeling, ETL best practices, multi-cloud environments (AWS, Azure, GCP), data security protocols, ERP/CRM tools, and integration technologies such as API management, SOA, Microservices, and Kafka topics. Knowledge of EA architecture guidelines and principles will be beneficial for this role. At Carelon Global Solutions, we believe in offering limitless opportunities to our associates, fostering an environment that promotes growth, well-being, and a sense of purpose and belonging. Our focus on learning and development, innovative culture, comprehensive rewards, competitive health insurance, and employee-centric policies make Life @ Carelon enriching and fulfilling. We are an equal opportunity employer committed to diversity and inclusion, and we provide reasonable accommodations to ensure a supportive work environment for all. If you require assistance due to a disability, please request the Reasonable Accommodation Request Form. Join us on this exciting journey at Carelon Global Solutions and be a part of our mission to simplify healthcare and improve lives and communities.,

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

haryana

On-site

As a Technical Consultant / Technical Architect with Fund Accounting experience and proficiency in Oracle and Informatica, your primary responsibility will be to collaborate with Delivery Managers, System/Business Analysts, and other subject matter experts to understand project requirements. Your role will involve designing solutions, providing effort estimates for new projects/proposals, and developing technical specifications and unit test cases for the interfaces under development. You will be expected to establish and implement standards, procedures, and best practices for data maintenance, reconciliation, and exception management. Your technical leadership skills will be crucial in proposing solutions, estimating projects, and guiding/mentoring junior team members in developing solutions on the GFDR platform.

Key Requirements:
- 10-12 years of experience in technical leadership within data warehousing and business intelligence
- Proficiency in Oracle SQL/PLSQL and stored procedures
- Familiarity with source control tools, preferably ClearCase
- Sound understanding of Data Warehouse, Datamart, and ODS concepts
- Experience in UNIX and Perl scripting
- Proficiency in standard ETL tools such as Informatica PowerCenter
- Technical leadership in Eagle, Oracle, Unix scripting, Perl, and job scheduling tools like Autosys/Control
- Strong knowledge of data modeling, data normalization, and performance optimization techniques
- Exposure to fund accounting concepts/systems and master data management is desirable
- Ability to work collaboratively with cross-functional teams and provide guidance to junior team members
- Excellent interpersonal and communication skills
- Willingness to work on both development and production support activities

Industry: IT/Computers-Software
Role: Technical Architect
Key Skills: Oracle, PL/SQL, Informatica, Autosys/Control, Fund Accounting, Eagle
Education: B.E/B.Tech
Email ID: jobs@augustainfotech.com

If you meet the specified requirements and are passionate about delivering innovative solutions in a collaborative environment, we encourage you to apply for this exciting opportunity.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

You will report to the Senior Director of SaaS and will be responsible for customizing, developing, and supporting solutions on the Salesforce and ServiceNow platforms. The ideal candidate has a strong understanding of the Salesforce.com and ServiceNow platforms, with basic to intermediate knowledge of integrations and security, along with the interest and ability to understand problems, design solutions, and execute critical paths. Exceptional technical, analytical, and problem-solving skills are required, as is the ability to interact with all levels of the organization. We are looking for a self-starter with a proactive approach who can identify and suggest process improvements.

Your responsibilities will include:
- Daily administration of the ServiceNow system, such as implementing approved changes to forms, tables, reports, and workflows
- Creating and customizing reports, homepages, and dashboards in ServiceNow
- Keeping the ServiceNow platform up to date by testing and installing updates, patches, and new releases
- Designing and developing advanced ServiceNow customizations
- Troubleshooting multiple integrations with ServiceNow and Rally
- Managing ServiceNow security by overseeing roles and access control lists
- Providing training to personnel on ServiceNow usage and processes, including creating supporting documentation
- Collaborating with end users to resolve support issues within ServiceNow
- Conducting code reviews and developing, configuring, testing, and deploying solutions on the Salesforce platform
- Configuring, designing, ensuring functionality, and providing end-user support on the Force.com platform
- Implementing solutions in an agile environment, delivering high-quality code and configurations
- Managing workflows, process builders, assignment rules, email templates, and other features
- Handling data imports and exports, customizing objects, fields, reports, and third-party apps
- Managing users, profiles, permission sets, security, and other administrative tasks
- Leading testing of various functionalities, creating test data and test plans, and conducting feature testing
- Demonstrating solutions to users, providing training, and documenting as necessary
- Offering ongoing support and system administration to quickly resolve production issues
- Mapping functional requirements to Salesforce.com features and functionality
- Implementing change control and best practices for system maintenance, configuration, development, testing, and data integrity
- Applying hands-on experience with Sales Cloud, ServiceNow, and Salesforce Community
- Using a programming background to develop custom code with Visualforce, Apex, Lightning, and JavaScript as needed
- Knowing when to use out-of-the-box functionality versus custom code

We are looking for candidates who have:
- Excellent listening, analytical, organizational, and time management skills
- Strong written and oral communication skills, demonstrating diplomacy and professionalism
- A strong work ethic, a customer service mentality, and the ability to work under pressure
- A team-player mindset with the ability to work cross-functionally, and who is self-driven and motivated
- The capability to work independently, lead projects of moderate complexity, and identify areas for process improvement
- Creativity, problem-solving skills, and the ability to develop effective relationships with various stakeholders
- Prioritization skills to meet deadlines in a fast-paced environment and embrace change
- A Bachelor's degree in Computer Science or a related technical field, or equivalent experience
- 3+ years of hands-on experience developing on Salesforce and ServiceNow
- Proficiency in Salesforce and ServiceNow programmatic features
- The ability to dig into data, surface actionable insights, and demonstrate sound judgment and decision-making
- Experience with Data Loader and other data loading tools, MS Excel, and database modeling
- Additional knowledge of ServiceNow, Community, CPQ, Marketo, and other integrations is a plus
- Proactivity and the ability to manage changing priorities and workload efficiently

Additional information:
- The recruitment process includes online assessments, which will be sent via email
- The position is based in the Pune office.
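As an illustration of the kind of ServiceNow integration and support work listed above, here is a minimal sketch using the standard ServiceNow Table API (/api/now/table); the instance URL, credentials, and query are placeholders and not details from the posting:

```python
# Read open incidents via the ServiceNow Table API.
# Instance URL and credentials are placeholders; production integrations
# would normally use OAuth instead of basic auth.
import requests

INSTANCE = "https://your-instance.service-now.com"  # placeholder
AUTH = ("integration.user", "<password>")           # placeholder credentials

def fetch_open_incidents(limit: int = 10):
    resp = requests.get(
        f"{INSTANCE}/api/now/table/incident",
        auth=AUTH,
        headers={"Accept": "application/json"},
        params={"sysparm_query": "active=true", "sysparm_limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]
```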

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Data Engineer, you will be responsible for designing, building, and maintaining scalable ETL pipelines using Java and SQL-based frameworks. Your role involves extracting data from various structured and unstructured sources, transforming it into formats suitable for analytics and reporting, and collaborating with data scientists, analysts, and business stakeholders to gather data requirements and optimize data delivery. Additionally, you will develop and maintain data models, databases, and data integration solutions, while monitoring data pipelines and troubleshooting data issues to ensure data quality and integrity.

Your expertise in Java for backend/ETL development and proficiency in SQL for data manipulation, querying, and performance tuning will be crucial in this role. You should have hands-on experience with ETL tools such as Apache NiFi, Talend, Informatica, or custom-built ETL pipelines, along with familiarity with relational databases like PostgreSQL, MySQL, and Oracle, and data warehousing concepts. Experience with version control systems like Git is also required.

Furthermore, you will be responsible for optimizing data flow and pipeline architecture for performance and scalability, documenting data flow diagrams, ETL processes, and technical specifications, and ensuring adherence to security, governance, and compliance standards related to data.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field, along with at least 5 years of professional experience as a Data Engineer or in a similar role. Your strong technical skills and practical experience in data engineering will be essential in successfully fulfilling the responsibilities of this role.
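As a rough illustration of the extract-transform-load flow this role centres on, here is a minimal sketch; Python and SQLite are used only for brevity (the posting itself calls for Java and SQL-based frameworks), and the table names and cleansing rule are hypothetical:

```python
# Minimal ETL sketch: extract from a source table, apply simple cleansing,
# load into a target table. Table names are illustrative placeholders.
import sqlite3

def run_pipeline(source_db: str, target_db: str) -> int:
    src = sqlite3.connect(source_db)
    tgt = sqlite3.connect(target_db)
    tgt.execute(
        "CREATE TABLE IF NOT EXISTS orders_clean "
        "(order_id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
    )

    # Extract
    rows = src.execute("SELECT order_id, customer, amount FROM orders_raw").fetchall()

    # Transform: drop rows with missing amounts, normalise customer names
    cleaned = [
        (oid, (cust or "").strip().upper(), amt)
        for oid, cust, amt in rows
        if amt is not None
    ]

    # Load
    tgt.executemany("INSERT OR REPLACE INTO orders_clean VALUES (?, ?, ?)", cleaned)
    tgt.commit()
    return len(cleaned)
```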

Posted 1 week ago

Apply

0.0 - 3.0 years

0 Lacs

karnataka

On-site

You are invited to apply for the position of Software Engineer - Salesforce, based in Bellandur, Bangalore. This is a fantastic opportunity to join our team; the key details are below.

Location: Bellandur, Bangalore
Work Schedule: 5 days a week
Experience Range: 0-3 years (freshers can also apply)

As a Software Engineer - Salesforce, you will need the following technical skills:
- Proficiency in frontend development using JavaScript and LWC
- Expertise in backend development using Apex, Flows, and Async Apex
- Familiarity with database concepts: SOQL, SOSL, and SQL
- Hands-on experience in API integration using SOAP, REST APIs, and GraphQL
- Knowledge of ETL tools, data migration, and data governance
- Proficiency in Apex design patterns, integration patterns, and the Apex testing framework
- Experience in agile development using CI/CD tools such as Azure DevOps, GitLab, and Bitbucket
- Working knowledge of programming languages such as Java, Python, or C++ with a solid understanding of data structures is preferred

In addition to the technical skills, the following qualifications are required:
- Bachelor's degree in engineering
- Experience developing with India Stack
- Background in the fintech or banking domain

If you meet the specified requirements and are prepared to embrace a new challenge, we are excited to review your application. To express your interest, please send your resume to pooja@jmsadvisory.in

Job types: Full-time, Fresher, Contractual / Temporary
Contract length: 6-12 months

Benefits include:
- Health insurance
- Paid sick time
- Provident Fund

Work Schedule:
- Day shift
- Fixed shift
- Monday to Friday
- Morning shift
- Weekend availability

Work Location: In person

We look forward to potentially welcoming you to our team!
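For a sense of the API-integration skills the posting asks for, here is a minimal sketch of a SOQL query issued through the standard Salesforce REST API query endpoint; the instance URL, access token, and API version are placeholders, not values from the posting:

```python
# Run a SOQL query via the Salesforce REST API (/services/data/<version>/query).
# INSTANCE_URL and ACCESS_TOKEN are placeholders; the token comes from an OAuth flow.
import requests

INSTANCE_URL = "https://your-org.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "<oauth-access-token>"                # placeholder
API_VERSION = "v58.0"                                # assumed API version

def fetch_open_opportunities():
    soql = "SELECT Id, Name, StageName, Amount FROM Opportunity WHERE IsClosed = false"
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/query/",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]
```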

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

kochi, kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice offers integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way, we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

We're looking for a candidate with 10-12 years of expertise in data science, data analysis, and visualization, who will act as a Technical Lead to a larger team in the EY GDS DnA team and work on various Data and Analytics projects.

Your key responsibilities include:
- Understanding of the insurance domain (P&C, life, or both)
- Developing conceptual, logical, and physical data models, and implementing RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL)
- Overseeing and governing the expansion of existing data architecture and the optimization of data query performance via best practices
- Working independently and collaboratively
- Implementing business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
- Working with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Defining and governing data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identifying the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs
- Working proactively and independently to address project requirements and articulating issues/challenges to reduce project delivery risks

Skills and attributes for success include:
- Strong communication, presentation, and team-building skills
- Experience in executing and managing research and analysis of companies and markets
- BE/BTech/MCA/MBA with 8-12 years of industry experience in machine learning, visualization, data science, and related offerings
- At least 4-8 years of experience in BI and analytics
- Ability to deliver end-to-end data solutions covering analysis, mapping, profiling, ETL architecture, and data modeling
- Knowledge and experience of at least one insurance domain engagement (life or Property & Casualty)
- Good experience using CA Erwin or other similar modeling tools
- Strong knowledge of relational and dimensional data modeling concepts
- Experience in data management analysis
- Experience with unstructured data is an added advantage
- Ability to effectively visualize and communicate analysis results
- Experience with big data and cloud preferred
- Experience, interest, and adaptability to working in an Agile delivery environment

Ideally, you'll also have:
- Good exposure to ETL tools
- Knowledge of P&C insurance
- Experience leading a team of at least 4 members
- Experience in the insurance and banking domains
- Prior client-facing skills; self-motivated and collaborative

What we look for: a team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment; an opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals in the only integrated global transaction business worldwide; and opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from engaging colleagues
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

maharashtra

On-site

As a Data Migration Developer, you will play a crucial role in managing the integration and migration of FIS Commercial Lending Solution (CLS) as part of a strategic transformation initiative for the Back Office. Your main responsibilities will include contributing to migration activities from legacy lending platforms to FIS CLS, collaborating with business analysts to implement tools for data collection, preparation, mapping, and loading, testing migration loading to CLS in a controlled environment, and supporting end-to-end dress rehearsals and production migration. To excel in this role, you should have a minimum of 8 years of relevant experience in software engineering, with at least 4 years in ETL tools, and ideally in data migration of financial data. The ideal candidate will be an experienced technology professional with a proven track record of solving complex technical challenges. Your technical skills should encompass data migration & ETL, strong experience with end-to-end data migration processes, hands-on experience with ETL tools such as Informatica, Talend, SSIS, a strong command of SQL, system integration & API work, testing & quality assurance, technical documentation & change management, DevOps & automation, and security & compliance. Additionally, functional skills such as knowledge of the commercial loan lifecycle or basic banking experience, strong hands-on experience with FIS Commercial Lending Solutions, experience with banking core systems and integrations, a good understanding of SDLC and Agile Scrum practices, and soft skills including leadership, problem-solving, communication, and collaboration will be essential for success in this role. In summary, as a Data Migration Developer, you will be at the forefront of ensuring a seamless integration and migration process for FIS CLS, leveraging your technical expertise, problem-solving skills, and collaborative approach to drive the success of this strategic transformation initiative for the Back Office.,

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

navi mumbai, maharashtra

On-site

You will be responsible for developing, configuring, and maintaining SAS Fraud Management (SFM) solutions to detect and prevent fraud. Your role will involve integrating SAS SFM with internal and external systems to ensure seamless data flow and analysis. Additionally, you will design, implement, and fine-tune fraud detection models to effectively identify fraudulent transactions. Customizing rules, alerts, and workflows within SAS SFM to align with organizational requirements will be a key aspect of your responsibilities.

Analyzing large datasets to identify fraud patterns and trends, ensuring accurate and actionable insights, is a crucial part of this role. You will also be required to monitor system performance and optimize SAS SFM applications for efficiency and scalability. You will conduct thorough testing, including unit, system, and UAT, to ensure that the solutions meet business needs and comply with regulatory requirements and organizational standards. Creating and maintaining detailed technical and functional documentation for all SFM implementations will be part of your routine tasks, along with collaborating closely with business, IT teams, and stakeholders to understand fraud management needs and deliver appropriate solutions.

Skills required:
- Proficiency in SAS technologies, especially SAS Fraud Management (SFM)
- Strong command of Base SAS, SAS Macros, SAS Enterprise Guide, and SAS Visual Analytics
- Experience in data integration using ETL tools and methods
- Knowledge of database systems such as SQL, Oracle, and PostgreSQL, along with advanced query skills
- Solid understanding of fraud detection methodologies, rules, and algorithms

Educational qualification:
- IT graduates (B.E. (IT), B.Sc. (IT), B.Tech, BCA, MCA, M.Tech)

This is a permanent position requiring an experienced individual in the field. The role was posted on January 20, 2025.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

You have a great opportunity to join our team as a Financial Systems Developer with a focus on maintaining and enhancing Planning Analytics (TM1) models. In this role, you will also be involved in testing, monitoring performance, addressing user support issues, and building financial and managerial reports using business intelligence reporting tools. We are looking for someone with knowledge of databases, ETL tools, and a basic understanding of business and financial processes. Experience in media and advertising is considered a plus.

As a Financial Systems Developer, your main responsibilities will include building and maintaining TM1 Rules, Turbo Integrator processes, cubes, dimensions, and automation of data loads. You will update and troubleshoot TM1 security models, collaborate with our Planning Analytics Systems Manager, and work with business users and system administrators to develop and refine solutions. Additionally, you will assist in integrating Planning Analytics as a data source for business analytics reporting, maintain data models, test and validate results, act as the first line of support for user issues, and monitor system performance.

To qualify for this role, you should have experience as a hands-on developer of TM1 (Planning Analytics), intermediate to advanced Excel knowledge, and familiarity with Tableau/Power BI is a plus. You should possess solid business judgment, an understanding of finance and financial processes, experience with databases and ETL tools, the ability to compare and validate data sets, absorb and present complex ideas quickly and accurately, gather and analyze end-user requirements, and adhere to tight deadlines.

If you are looking to be part of a dynamic team in the media and advertising industry, this is the perfect opportunity for you. Join us in our mission to provide stellar products and services in areas of Creative Services, Technology, Marketing Science, Business Support Services, Market Research, and Media Services.
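As a rough illustration of programmatic work against a Planning Analytics (TM1) model, here is a minimal sketch assuming the open-source TM1py client; the server address, credentials, cube, and MDX query are hypothetical and not taken from the posting:

```python
# Query a TM1 cube with MDX via TM1py. All connection details, the cube name,
# and the MDX below are illustrative placeholders.
from TM1py.Services import TM1Service

MDX = """
SELECT {[Version].[Actual]} ON COLUMNS,
       {TM1SubsetAll([Account])} ON ROWS
FROM [Finance]
"""

def read_actuals():
    with TM1Service(address="tm1-server", port=8010, user="admin",
                    password="<password>", ssl=True) as tm1:
        # Returns cell values keyed by their element coordinates
        return tm1.cubes.cells.execute_mdx(MDX)
```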

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

As a Tech Lead at Carelon Global Solutions India, you will play a crucial role in defining the solution architecture for applications while ensuring alignment with enterprise standards and strategic direction. Your responsibilities will involve serving as a technical subject matter expert for multiple technologies, adhering to code standards and policies, and supporting various application development projects. To excel in this role, you will need to have a BE/MCA qualification along with over 10 years of IT experience, including a deep understanding of Elevance Health applications/platforms such as WGS, Facets, SPS, Data platforms, and Member/Provider communications. Additionally, familiarity with ETL tools, database concepts, data modeling, multi-cloud environments (AWS, Azure, GCP), data security protocols, ERP/CRM tools, and integration technologies like API management, SOA, Microservices, and Kafka topics will be essential. At Carelon Global Solutions, we are committed to improving lives and communities by simplifying healthcare. Our values of Leadership, Community, Integrity, Agility, and Diversity guide us in achieving our mission. Joining our team means entering a world of limitless opportunities where growth, well-being, purpose, and belonging are nurtured. We offer extensive learning and development opportunities, foster an innovative and creative culture, prioritize holistic well-being, and provide competitive health and medical insurance coverage. As an equal opportunity employer, we celebrate diversity and inclusivity in our workforce and work styles. If you require any accommodations due to a disability, we encourage you to request the Reasonable Accommodation Request Form during the application process. Join us at Carelon, where your potential is limitless and your contributions make a meaningful impact on the healthcare industry.,

Posted 1 week ago

Apply

3.0 - 8.0 years

9 - 14 Lacs

Gurugram

Remote

Healthcare experience is mandatory.

Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills with the ability to work with ambiguous requirements
- Excellent communication skills with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

As a Celonis COE Analyst, you will be responsible for designing, developing, and maintaining robust data pipelines to extract, transform, and load data into the Celonis platform from various source systems. Your role will involve creating and optimizing data models within Celonis to support process intelligence activities, ensuring accurate representation of business processes and enabling in-depth analysis. Additionally, you will collaborate with IT and other teams to integrate data from different sources into the Celonis platform, ensuring seamless and efficient data flow.

Monitoring and optimizing the performance of Celonis data models and pipelines will be a key aspect of your responsibilities. You will identify bottlenecks and implement solutions to improve efficiency and scalability. Implementing data validation and cleansing processes to ensure data accuracy and integrity for process mining and analysis will also be part of your duties.

Your role will involve working closely with data analysts and stakeholders to understand their requirements, providing technical support, and training users on Celonis for data-driven decision-making. Maintaining comprehensive documentation of data models, ETL processes, and data pipelines will be essential for transparency and ease of maintenance. You will also need to stay updated with the latest developments in process intelligence, data engineering, and Celonis technologies to propose and implement best practices for improving the overall data architecture.

Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
- Experience: Minimum 4+ years of experience as a Data Engineer, with a focus on ETL processes, data modeling, and pipeline optimization. 2+ years of hands-on experience with Celonis is required, including data integration, model building, and building views and action flows.
- Technical Skills: Proficiency in SQL, experience with database management systems, and knowledge of ETL tools and data warehousing concepts. Experience with scripting languages like Python and JavaScript for data manipulation and automation is necessary. Strong domain knowledge of at least one SAP module is preferred.
- Soft Skills: Strong analytical and problem-solving skills, excellent communication and collaboration abilities, and the ability to manage multiple tasks and projects in a fast-paced environment.
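To illustrate the data validation and cleansing step mentioned above, here is a minimal sketch in pandas applied to an event log before it is pushed to a process-mining platform; the column names are hypothetical, and nothing here is specific to Celonis itself:

```python
# Validate and cleanse an event log (case id, activity, timestamp) before loading.
# Column names are illustrative placeholders.
import pandas as pd

REQUIRED_COLUMNS = ["case_id", "activity", "event_time"]

def validate_event_log(df: pd.DataFrame) -> pd.DataFrame:
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"Event log is missing required columns: {missing}")

    df = df.dropna(subset=REQUIRED_COLUMNS).copy()
    df["event_time"] = pd.to_datetime(df["event_time"], errors="coerce")
    df = df.dropna(subset=["event_time"])             # drop unparseable timestamps
    df = df.drop_duplicates(subset=REQUIRED_COLUMNS)  # remove duplicate events
    return df.sort_values(["case_id", "event_time"])
```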

Posted 1 week ago

Apply

15.0 - 19.0 years

0 Lacs

noida, uttar pradesh

On-site

You should have proven experience in leading and delivering life insurance policy conversion projects or similar initiatives. Your role will involve overseeing the entire data conversion lifecycle, from planning and design to execution, testing, and go-live, ensuring projects stay on track and within budget. Working closely with client stakeholders to understand their data requirements, facilitate design sessions, and address their concerns throughout the project lifecycle is crucial. Your strong understanding of data conversion processes, data structures, and mapping techniques used in life insurance will be essential. You will be responsible for managing and mentoring a team of data conversion consultants, providing technical guidance, coaching, and performance management. Implementing and maintaining data quality controls and validation processes to ensure the accuracy and completeness of converted data is a key aspect of the role. Effective communication of project status, risks, and issues to stakeholders, both internally and externally, is required. You will need to identify and resolve technical issues and challenges related to data conversion while managing risks, blockers, and challenges proactively to ensure timely delivery on commitments and maintaining full stakeholder transparency. Identifying and implementing process improvements to enhance efficiency and effectiveness of data conversion projects will be part of your responsibilities. The role also involves identifying process and procedure enhancements to drive efficiency and improve customer satisfaction. Your experience should include configuring new product launches in the Life and Annuity Domain, enhancing existing products, and eagerness to learn new digital technologies and cloud platforms while suggesting better ways of doing things. Experience in SDLC methodologies is also expected. You should have at least 15+ years of work experience in successfully delivering complex policy conversion projects end to end. A strong understanding of data conversion processes, data structures, data mapping, ETL tools, and database management systems is required. Knowledge of the Life & Annuity Insurance domain in a TPA environment is a must. Additionally, familiarity with project management methodologies (Agile/Waterfall), excellent written and verbal communication skills, the ability to effectively communicate with both technical and non-technical audiences, and a strong understanding of data quality principles and best practices are essential. Experience with life insurance systems, regulations, and industry standards, as well as project management methodologies and tools, will be beneficial.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

You will be responsible for developing and maintaining data pipelines and ETL processes in Teradata and Snowflake databases. Your main focus will be on optimizing database performance, ensuring data quality, designing data models and schemas, and collaborating with Power BI developers to provide data solutions. Additionally, you will be validating data within the databases.

To excel in this role, you must possess expertise in Teradata and Snowflake database development, strong SQL and data warehousing skills, exposure to Power BI, hands-on experience with ETL tools and processes, and knowledge of data modeling and schema design.

The must-have technologies for this position are Teradata, Snowflake, SQL, and ETL tools. It would be beneficial to have skills in Python, cloud data platforms, or other cloud technologies. This position requires a minimum of 5-6 years of relevant experience.

Please note that after the initial L1 interview, selected candidates may be required to come to any of the mentioned locations (Chennai, Bengaluru, Hyderabad) for the L2 interview. The shift timing for this role is from 2 pm to 11 pm.
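As an example of the kind of data-quality validation this role involves, here is a minimal sketch using the official snowflake-connector-python package; the connection values and table name are placeholders, not details from the posting:

```python
# Run a simple data-quality check against Snowflake.
# All connection parameters and the table name are placeholders.
import snowflake.connector

def count_null_customer_keys() -> int:
    conn = snowflake.connector.connect(
        user="<user>", password="<password>", account="<account_identifier>",
        warehouse="<warehouse>", database="<database>", schema="<schema>",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT COUNT(*) FROM sales_fact WHERE customer_key IS NULL")
        return cur.fetchone()[0]
    finally:
        conn.close()
```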

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

As an Odoo Developer at CGI, you will be responsible for developing and customizing solutions on the Odoo Platform v15+. With a minimum of 4 years of experience, you will have a deep understanding of Odoo modules, architecture, APIs, and the ability to integrate Odoo with other systems and data sources. You will work on Odoo deployments with more than 1000 logged-in users, ensuring scalability for a large number of users and transactions. Proficiency in Python is essential, and experience with other programming languages such as Java or Scala is a plus. In this role, you will have the opportunity to analyze and interpret complex data sets, utilize data visualization tools like Superset, and work with technologies such as Cassandra and Presto for data analysis and reporting. Your experience with SQL, relational databases like PostgreSQL or MySQL, ETL tools, and data warehousing concepts will be crucial for success. Familiarity with big data technologies like Hadoop and Spark is advantageous. DevSecOps practices are integral to the role, requiring experience with containerization, Docker, Kubernetes clusters, and CI/CD using GitLab. Knowledge of SCRUM and Agile methodologies is essential, as well as proficiency in Linux/Windows operating systems and tools like Jira, GitLab, and Confluence. As a successful candidate, you will demonstrate strong problem-solving and analytical skills, effective communication and collaboration abilities, attention to detail, and a commitment to data quality. You will thrive in a fast-paced, dynamic environment and contribute to turning meaningful insights into action. At CGI, you will be part of a team that values ownership, teamwork, respect, and belonging. You will have the opportunity to shape your career, develop innovative solutions, and access global capabilities while being supported by leaders who care about your well-being and growth. Join CGI as a partner and contribute to one of the largest IT and business consulting services firms in the world.,

Posted 1 week ago

Apply

15.0 - 19.0 years

0 Lacs

noida, uttar pradesh

On-site

You should have proven experience in leading and delivering life insurance policy conversion projects or similar initiatives. It will be your responsibility to oversee the entire data conversion lifecycle, from planning and design to execution, testing, and go-live, ensuring that projects stay on track and within budget. Working closely with client stakeholders will be crucial as you understand their data requirements, facilitate design sessions, and address their concerns throughout the project lifecycle. A strong understanding of data conversion processes, data structures, and mapping techniques used in life insurance is essential for this role. You will be expected to manage and mentor a team of data conversion consultants, providing technical guidance, coaching, and performance management. Implementing and maintaining data quality controls and validation processes to ensure the accuracy and completeness of converted data will also be part of your responsibilities. Effective communication of project status, risks, and issues to stakeholders, both internally and externally, is key. You should be able to identify and resolve technical issues and challenges related to data conversion while managing risks, blockers, and challenges proactively to ensure timely delivery on commitments and maintain full stakeholder transparency. Identifying and implementing process improvements to enhance efficiency and effectiveness of data conversion projects will be encouraged. Moreover, you should have experience in configuring new product launches in the Life and Annuity Domain, as well as enhancing existing products. An eagerness to learn new digital technologies and cloud platforms is highly valued, along with the ability to suggest better ways of doing things in the platform. Familiarity with SDLC methodologies is also required. In terms of experience and skills, you should have at least 15 years of work experience in the successful delivery of complex policy conversion projects end to end. A strong understanding of data conversion processes, data structures, data mapping, ETL tools, and database management systems is essential. Knowledge of the Life & Annuity Insurance domain in a TPA environment is a must, as well as experience with project management methodologies such as Agile/Waterfall. Excellent written and verbal communication skills are necessary, including the ability to effectively communicate with both technical and non-technical audiences. You should also be able to identify and resolve technical issues and challenges related to data conversion, while possessing a strong understanding of data quality principles and best practices. Familiarity with life insurance systems, regulations, and industry standards, as well as experience with project management methodologies and tools, will be beneficial for this role.,

Posted 1 week ago

Apply

4.0 - 12.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Salesforce Data Cloud Specialist at our company based in Noida, India, with 5-6 years of experience, you will play a crucial role in implementing and optimizing Salesforce Data Cloud solutions. This role involves integrating large datasets and enhancing data-driven decision-making across marketing, sales, and customer experience functions. Your responsibilities will include implementing and configuring Salesforce Data Cloud for unified customer data profiles, integrating it with Marketing Cloud, Sales Cloud, and external platforms, and designing customer identity resolution strategies. Additionally, you will work on data ingestion pipelines to ensure clean, validated, and structured data flows. Collaborating with cross-functional teams will be essential to enable segmentation, personalization, and advanced analytics. You will also support the activation of unified data across different channels for enhanced customer engagement, monitor performance, troubleshoot issues, and ensure data governance and compliance. To excel in this role, you should have at least 4 years of experience in the Salesforce ecosystem, with a minimum of 2 years specifically in Salesforce Data Cloud. A strong understanding of CDP, Data Modeling, Identity Resolution, and Consent Management is required. Hands-on experience with Salesforce Data Cloud Studio, Data Streams, and Calculated Insights is essential, along with proficiency in APIs, ETL tools, and data integration frameworks. A solid understanding of Salesforce Marketing Cloud, Sales Cloud, and Experience Cloud is also necessary, as well as familiarity with data privacy regulations such as GDPR and CCPA. Possessing Salesforce certifications in Data Cloud or related domains will be a strong advantage. While not mandatory, it would be beneficial to have experience in industry-specific use cases such as Retail, Healthcare, or Finance, as well as strong communication and stakeholder management skills. Join our dynamic team and contribute to driving data-driven decision-making and enhancing customer experiences through Salesforce Data Cloud solutions.,

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

As an Enterprise Snowflake L1/L2 AMS Support engineer, your primary responsibilities will include monitoring and supporting Snowflake data warehouse performance, optimizing queries, and overseeing job execution. You will be tasked with troubleshooting data loading failures, managing access control, and addressing role-based security issues. Additionally, you will be expected to carry out patching, software upgrades, and security compliance checks while upholding SLA commitments for query execution and system performance.

To excel in this role, you should possess 2-5 years of experience working with Snowflake architecture, SQL scripting, and query optimization. Familiarity with ETL tools such as Talend, Matillion, and Alteryx for seamless Snowflake integration would be beneficial.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As an ICM developer within Global Incentive Compensations (GIC), your primary responsibility will be to develop, maintain, and test the systems used to compensate the Salesforce sales organization and the tools used by internal teams. We are looking for a talented and motivated individual with hands-on experience in Incentive Compensation Management (ICM) to join our team. The ideal candidate possesses both strong technical skills and functional expertise to design, evaluate, and develop systems and processes that meet business user requirements and comply with IT and Finance audit guidelines. Expertise in Salesforce data management, including creating scenarios and seeding data, will be crucial for this role.

Your role will include managing and optimizing the ICM platform; customization, configuration, and integration of processes; and data management tasks within Salesforce. You should have a good understanding of Salesforce administration and the ability to design, implement, and maintain ICM solutions that drive effective sales compensation strategies.

Key Responsibilities:

ICM (Incentive Compensation Management):
- Understand incentive compensation models, commission structures, bonus programs, and sales performance metrics within Salesforce
- Strong experience with compensation systems such as Spiff, Everstage, CaptivateIQ, Varicent, or Xactly
- Develop and test processes within Xactly Connect or other ETL tools for data movement
- Provide regular analysis and insights into compensation plans, recommending improvements based on system data
- ICM implementation experience

Salesforce Data Management:
- Conduct data imports/exports and cleanup activities as needed
- Experience in Salesforce administration, SOQL, Apex, JavaScript (Lightning), and Visualforce is preferable
- Ensure data integrity and enforce best practices for data governance in Salesforce

Collaboration and Communication:
- Provide regular updates and reports to stakeholders about ICM performance
- Stay current on best practices and emerging technologies in Salesforce administration and ICM management

Skills and Experience:
- Minimum of 3 years of experience required; 3 to 5 years preferred
- Degree in Computer Science, Information Systems, or equivalent industry experience in application and software development
- Understanding of database infrastructure and concepts
- Experience developing customer-facing interfaces
- Good communication skills
- Problem-solving ability for high-level software and application issues
- Familiarity with the Agile (Scrum) framework is preferred

Posted 1 week ago

Apply

3.0 - 10.0 years

0 Lacs

karnataka

On-site

You should possess a Bachelors or Masters degree in computer science, Engineering, or a related field. With over 10 years of software development experience, you should have a strong focus on JavaScript and Groovy, including at least 3 years as a Pricefx Solution Architect. Additionally, you must be a certified Pricefx configuration engineer with a minimum of 3 years of active experience in this role, demonstrating a proven track record in the solution architect role by implementing a minimum of 2 Pricefx capabilities. Your expertise should include proficiency in data integration techniques such as APIs and ETL tools, along with a solid understanding of REST and SOAP. You should have demonstrated experience in designing and delivering complex enterprise applications in large-scale environments, emphasizing microservices architecture, RESTful API design, pricing strategies, and analytics. Your technical skills should showcase a strong understanding of enterprise architecture frameworks and in-depth knowledge of the Pricefx Platform, including Price Builder, Deal Manager, Rebate Manager, Quote configurator, and Channel Manager. Proficiency in Pricefx Configuration Studio (CS) and the Pricefx promotion engine is essential. You should also excel in JavaScript for UI extensions and dynamic page behavior, possess expertise in Groovy scripting within Pricefx for logic implementation and customization, and have experience in integrating Pricefx with ERPs using ETL tools, RESTful APIs, SOAP, and custom connectors. A strong knowledge of security best practices is crucial. In terms of soft skills, you must have excellent communication skills enabling you to convey technical concepts to both technical and non-technical stakeholders effectively. Proven leadership experience is a must, including a track record of leading architecture initiatives and mentoring engineering teams. A problem-solving mindset is key, with an ability to innovate and propose creative solutions. Certifications required for this role include Pricefx configuration certification and Solution Architect certification. Any additional relevant certifications such as AWS Certified Solutions Architect or TOGAF are considered a plus. This position is open for candidates located across PAN India.,

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As an experienced Salesforce Developer with 4-7 years of experience, you will be responsible for meeting with clients to determine business, functional, and technical requirements. Your key responsibilities will include participating in application design, configuration, testing, and deployment. You will be expected to perform configuration and customization of the Salesforce.com platform and engage in efforts to develop and execute testing, training, and documentation.

In this role, you will actively participate in the sales cycle as needed, including solution definition, pre-sales activities, estimating, and project planning. You should be willing to be hands-on in producing tangible deliverables such as requirement specifications, design deliverables, status reports, and project plans. Additionally, you will proactively engage in continuous improvement efforts for application design, support, and practice development.

You will provide technical assistance and end-user troubleshooting for bug fixes, enhancements, and how-to assistance. It will be part of your responsibilities to perform regular reviews of implementations done by less experienced developers and offer feedback and suggestions for improvement. You will also mentor junior and mid-level developers on the team and delegate tasks to team members effectively. As a senior member of the team, you should be able to set up a development environment independently, mentor a team of junior developers, and communicate independently with both client technical teams and business owners during the design and implementation phases of projects.

Key Skills and Qualifications:
- 3+ years of experience working on Salesforce platforms
- Hands-on design and development of Lightning Web Components
- Experience with Velocity Communication Cloud
- Hands-on experience with Vlocity DataRaptor, Vlocity Cards, Vlocity OmniScript, VlocityDX, and Vlocity Integration Procedures
- Vlocity CPQ certification is preferred
- Experience working on CRM projects for middle-market and enterprise-size companies
- Working knowledge of complex business systems integration and object-oriented design patterns
- Proficiency in Force.com Platform technologies such as Apex, LWC, SOQL, and unit testing
- Familiarity with core web technologies including HTML5, JavaScript, and jQuery
- Knowledge of relational databases, data modeling, and ETL tools
- Experience with web services (REST and SOAP, JSON and XML)
- Familiarity with Agile development methodologies like Scrum
- Excellent organizational, verbal, and written communication skills

If you are a self-motivated individual with the required experience and skills, we invite you to join our team and contribute to our projects.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

indore, madhya pradesh

On-site

As a Salesforce Developer at Techcoopers, you will be a crucial member of our team, focusing on enhancing the efficiency of our CRM system. Your primary responsibilities will include designing, developing, testing, and implementing customizations, applications, extensions, and integrations. Collaborating with a group of engineers and working closely with Sales, Customer Success, and Marketing teams, you will translate business requirements into scalable CRM solutions that drive the growth and success of Techcoopers. Your key responsibilities will involve meeting with clients to gather business and technical requirements, contributing to application design, configuration, testing, and deployment. You will configure and customize the Salesforce.com platform, participate in testing, training, and documentation efforts, and produce various project deliverables. Additionally, you will provide operational support, bug fixes, enhancements, and continuous improvement initiatives for the application design and development. To excel in this role, you should have a Bachelor's or Master's degree in Computer Science or a related technical field, possess excellent organizational and communication skills, and have direct experience working on CRM projects for medium to large companies. Your expertise should include complex business systems integration, object-oriented design patterns, and proficiency in core web technologies such as HTML5, JavaScript, and LWC/Aura framework. You should also have advanced knowledge of Salesforce functionalities like permissions, roles, reports, and dashboards, as well as software engineering skills with Force.com Platform (Apex, Visualforce, SOQL, Unit Testing). Experience with Java SE & EE, relational databases, Agile methodologies, web services, and a passion for adopting new technologies will be beneficial for this role. If you are someone who enjoys working in a fast-paced, collaborative environment and is keen on expanding your professional network, this opportunity at Techcoopers could be the perfect fit for you. If you meet the specified requirements and are interested in this position, please share your resume along with compensation and notice period details to career@techcoopers.com.,

Posted 1 week ago

Apply