5357 Informatica Jobs - Page 22

Set up a Job Alert
JobPe aggregates results for easy access; you apply directly on the employer's job portal.

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

As an Associate Architect (IND) at Elevance Health, you will design and implement scalable, high-performance ETL solutions for data ingestion, transformation, and loading. You will define and maintain data architecture standards, best practices, and governance policies while collaborating with data engineers, analysts, and business stakeholders to understand data requirements. Your role will involve optimizing existing ETL pipelines for performance, reliability, and scalability, and ensuring data quality, consistency, and security across all data flows. You will lead the evaluation and selection of ETL tools and technologies, provide technical leadership and mentorship to junior data engineers, and document data flows, architecture diagrams, and technical specifications. Experience with Snowflake and Oracle is beneficial.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, along with at least 8 years of experience in data engineering or ETL development. Strong expertise in ETL tools such as Informatica, Talend, Apache NiFi, or SSIS is essential, as is proficiency in SQL and experience with relational and NoSQL databases. Experience with cloud platforms such as AWS, Azure, or Google Cloud, and familiarity with data modeling, data warehousing, and big data technologies, are also required. The ideal candidate will possess strong problem-solving and business communication skills, be committed and accountable, and communicate status to stakeholders in a timely manner. Collaboration and leadership skills are vital for this role, as you will be working with global teams.

At Carelon, we promise a world of limitless opportunities to our associates, fostering an environment that promotes growth, well-being, purpose, and a sense of belonging. Our focus on learning and development, innovative culture, comprehensive rewards, and competitive benefits make Carelon an equal opportunity employer dedicated to delivering the best results for our customers. If you require reasonable accommodation during the application process, please complete the Reasonable Accommodation Request Form. This is a full-time position based in Bangalore.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Java, Spring, Spring Boot, Kafka, REST API, Microservices, Azure, CD Developer at NTT DATA in Chennai, Tamil Nadu (IN-TN), India, you will design, develop, and maintain robust RESTful APIs and microservices using Java, Spring, Spring Boot, and Kafka. You will have hands-on experience with event/listener messaging frameworks such as Kafka and with designing high-performance microservices that handle high transactions-per-second traffic. Additionally, you will work with cloud infrastructure on platforms such as Azure, with Kubernetes tools and services, and with CD processes and tools such as GitHub, Jenkins, and uDeploy.

To excel in this role, you should have a Bachelor's degree in Computer Science, Engineering, or equivalent, with 6-9 years of experience in Java, Spring Boot, Oracle, Kubernetes, Kafka, and Azure/AWS cloud technologies. You must possess a strong understanding of data governance principles and of modern programming languages and frameworks. Moreover, you should be well-versed in ITIL processes such as incident management and change management, and have experience leading or mentoring scrum teams.

Key skills for this role include hands-on experience with Java, Spring, Spring Boot, and event/listener messaging frameworks; designing robust RESTful APIs; working with HashiCorp Vault, Terraform, and Packer; and managing cloud infrastructure across platforms such as AWS, Azure, or Google Cloud. You should also be familiar with container-based development, EDA solutions such as Kafka/MQ, database concepts, the OAuth 2.0 framework, and microservices architecture. You must have excellent communication skills and be willing to work a flexible shift from 10:30 AM to 8:30 PM at the client location, Ramanujam IT Park, Taramani, Chennai. Full remote work is not an option, and a return to office is expected by 2025.

Join NTT DATA, a trusted global innovator of business and technology services committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team of experts in more than 50 countries, NTT DATA offers business and technology consulting, data and artificial intelligence, industry solutions, and digital infrastructure services. Become part of our global network to drive innovation and transformation for a sustainable digital future. Visit us at us.nttdata.com.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Description

Job Title: Data Management Analyst

Short Description: Promote and ensure high-quality, fit-for-purpose data as part of the Chief Data Office, shaping industry-leading data management strategies.

Posting Description

The successful candidate will join a team focused on delivering business impact through the transformation of Data Quality management. The role requires working with stakeholders across the CIB, other LOBs, and Corporate Functions to ensure fit-for-purpose data, leveraging best practice across the industry and other J.P. Morgan business units. This role provides an outstanding opportunity for the selected candidate to join a high-profile team at JPMorgan Chase and take a partnership role supporting one of the Bank's most significant change programs.

Job Responsibilities

- Demonstrate a good understanding of data governance, data quality, and data lineage
- Implement and support Data Quality (DQ) practices across the CIB
- Govern and triage DQ issues as they progress through the lifecycle
- Accelerate root cause analysis and resolution of data quality issues through effective program governance, process mapping, and data deep-dives
- Discuss and agree technical resolutions with technology teams to remediate DQ issues
- Discover and document data lineage to trace the end-to-end data journey from point of creation to consumption
- Set up data profiling and DQ rules leveraging DQ tools such as Collibra, Informatica, and other emerging tools
- Leverage productivity tools such as Alteryx and visualization tools such as Tableau to analyze large datasets and draw inferences
- Build strong partnerships with business stakeholders and technology teams to support data quality efforts
- Demonstrate teamwork by collaborating with others to integrate ideas and achieve common goals

Required Qualifications, Capabilities & Skills

- 5+ years' experience in Financial Services with a Data Quality / Data Lineage / Business Analysis background
- Excellent analytical and problem-solving skills; capacity to think laterally and convey an understanding of the big picture
- Proficiency in manipulating and analyzing large data sets in Excel (pivots, formulas, charts, etc.)
- Excellent communication, presentation (both oral and written), and influencing skills; the candidate will deal with stakeholders of varying seniority across Corporate/Firmwide, Finance, Risk, Compliance, Operations, and Technology teams
- Self-starter, able to work autonomously, with strong time management skills; efficient at multi-tasking and able to work under pressure to deliver multiple business demands on time, to a high standard
- Basic understanding of the company's business practices and familiarity with the company's products and services

Preferred Qualifications, Capabilities & Skills

- Experience in the CIB and a CDO/Data Quality organization
- Experience querying databases/data stores for data analysis
- Experience producing PowerPoint presentations for senior audiences
- Exposure to tools/technologies such as Alteryx, Tableau, SQL, Informatica, and Collibra

JPMorgan Chase is an Equal Opportunity and Affirmative Action Employer, M/F/D/V.

About Us

JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses, and many of the world's most prominent corporate, institutional, and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years, and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing, and asset management. We recognize that our people are our strength and that the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

About The Team

J.P. Morgan's Commercial & Investment Bank is a global leader across banking, markets, securities services, and payments. Corporations, governments, and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk, and extends liquidity in markets around the world.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Informatica Production Support associate, you will manage and support Informatica suite applications. Your key responsibilities will include basic operations support management such as handling incidents, fulfilling requests, monitoring application batches, resolving abends, and ensuring adherence to critical business SLAs. You will collaborate with application, business, and infrastructure groups to troubleshoot application issues and perform functionality and impact analysis for new projects, enhancements, and upgrades, and you will perform bug fixes and minor enhancements on demand. Your role will also involve support management activities such as incident management, request fulfillment, file monitoring, ad hoc processing, defining best practices, supporting audits, and developing mitigation plans. Your primary objectives will be to ensure Data Warehouse availability, meet business SLAs, implement minor bug fixes and enhancements to maintain application stability, provide architectural consultation and estimations for upgrades and migrations, and contribute to MassMutual transformation initiatives.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Haryana

On-site

As a Technical Consultant / Technical Architect with Fund Accounting experience and proficiency in Oracle and Informatica, your primary responsibility will be to collaborate with Delivery Managers, System/Business Analysts, and other subject matter experts to understand project requirements. Your role will involve designing solutions, providing effort estimates for new projects/proposals, and developing technical specifications and unit test cases for the interfaces under development. You will establish and implement standards, procedures, and best practices for data maintenance, reconciliation, and exception management. Your technical leadership skills will be crucial in proposing solutions, estimating projects, and guiding and mentoring junior team members in developing solutions on the GFDR platform.

Key Requirements:

- 10-12 years of experience in technical leadership within data warehousing and Business Intelligence
- Proficiency in Oracle SQL/PLSQL and stored procedures
- Familiarity with source control tools, preferably ClearCase
- Sound understanding of Data Warehouse, Datamart, and ODS concepts
- Experience in UNIX and Perl scripting
- Proficiency in standard ETL tools such as Informatica PowerCenter
- Technical leadership in Eagle, Oracle, UNIX scripting, Perl, and job scheduling tools such as Autosys/Control
- Strong knowledge of data modeling, data normalization, and performance optimization techniques
- Exposure to fund accounting concepts/systems and master data management is desirable
- Ability to work collaboratively with cross-functional teams and provide guidance to junior team members
- Excellent interpersonal and communication skills
- Willingness to work in both development and production support activities

Industry: IT/Computers-Software
Role: Technical Architect
Key Skills: Oracle, PL/SQL, Informatica, Autosys/Control, Fund Accounting, Eagle
Education: B.E/B.Tech
Email ID: jobs@augustainfotech.com

If you meet the specified requirements and are passionate about delivering innovative solutions in a collaborative environment, we encourage you to apply for this exciting opportunity.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

You will report to the Senior Director of SaaS and will be responsible for customizing, developing, and supporting solutions on the Salesforce and ServiceNow platforms. The ideal candidate possesses a strong understanding of the Salesforce.com and ServiceNow platforms, with basic to intermediate knowledge of integrations and security, along with the interest and ability to understand problems, design solutions, and execute critical paths. Exceptional technical, analytical, and problem-solving skills are required, as is the ability to interact with all levels of the organization. We are looking for a proactive self-starter who can identify and suggest process improvements.

Your responsibilities will include:

- Daily administration of the ServiceNow system, such as implementing approved changes to forms, tables, reports, and workflows
- Creating and customizing reports, homepages, and dashboards in ServiceNow
- Keeping the ServiceNow platform up to date by testing and installing updates, patches, and new releases
- Designing and developing advanced ServiceNow customizations
- Troubleshooting multiple integrations with ServiceNow and Rally
- Managing ServiceNow security by overseeing roles and access control lists
- Training personnel on ServiceNow usage and processes, including creating supporting documentation
- Collaborating with end users to resolve support issues within ServiceNow
- Conducting code reviews and developing, configuring, testing, and deploying solutions on the Salesforce platform
- Configuring, designing, ensuring functionality, and providing end-user support on the Force.com platform
- Implementing solutions in an agile environment, delivering high-quality code and configurations
- Managing workflows, process builders, assignment rules, email templates, and other features
- Handling data imports and exports, and customizing objects, fields, reports, and third-party apps
- Managing users, profiles, permission sets, security, and other administrative tasks
- Leading testing of various functionalities, creating test data and test plans, and conducting feature testing
- Demonstrating solutions to users, providing training, and documenting as necessary
- Offering ongoing support and system administration to quickly resolve production issues
- Mapping functional requirements to Salesforce.com features and functionality
- Implementing change control and best practices for system maintenance, configuration, development, testing, and data integrity
- Applying hands-on experience with Sales Cloud, ServiceNow, and Salesforce Community
- Developing custom code using Visualforce, Apex, Lightning, and JavaScript as needed, and knowing when to use out-of-the-box functionality versus custom code

We are looking for candidates who have:

- Excellent listening, analytical, organizational, and time management skills
- Strong written and oral communication skills, demonstrating diplomacy and professionalism
- A strong work ethic, a customer service mentality, and the ability to work under pressure
- A team-player mindset and the ability to work cross-functionally while remaining self-driven and motivated
- The ability to work independently, lead projects of moderate complexity, and identify areas for process improvement
- Creativity, problem-solving skills, and the ability to develop effective relationships with various stakeholders
- Prioritization skills to meet deadlines in a fast-paced environment and embrace change
- A Bachelor's degree in Computer Science or a related technical field, or equivalent experience
- 3+ years of hands-on experience developing on Salesforce and ServiceNow
- Proficiency in Salesforce and ServiceNow programmatic features
- The ability to dig into data, surface actionable insights, and demonstrate sound judgment and decision-making
- Experience with Data Loader and other data loading tools, MS Excel, and database modeling
- Additional knowledge of ServiceNow, Community, CPQ, Marketo, and other integrations (a plus)
- Proactivity and the ability to manage changing priorities and workload efficiently

Additional Information:

- The recruitment process includes online assessments, which will be sent via email
- The position is based in the Pune office.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Azurity Pharmaceuticals is a privately held specialty pharmaceutical company that focuses on innovative products that meet the needs of underserved patients. As an industry leader in providing unique, accessible, and high-quality medications, Azurity leverages its integrated capabilities and vast partner network to continually expand its broad commercial product portfolio and robust late-stage pipeline. The company's patient-centric products span the cardiovascular, neurology, endocrinology, gastro-intestinal, institutional, and orphan markets, and have benefited millions of patients. For more information, visit www.azurity.com.

Azurity Pharmaceuticals is proud to be an inclusive workplace and an Equal Opportunity Employer. Azurity's success is attributable to our incredibly talented, dedicated team that focuses on benefiting the lives of patients by bringing the best science and commitment to quality into everything that we do. We seek highly motivated individuals with the dedication, integrity, and creative spirit needed to thrive in our organization.

Brief Team/Department Description

Our Digital team at Azurity is building new capabilities using cutting-edge Salesforce systems. We are looking for a dynamic, change-inspired, self-driven, hands-on team member. The Salesforce Developer - Life Sciences is responsible for designing, developing, and optimizing Salesforce solutions to support Azurity's pharma business. This role focuses on customizing the Salesforce platform to ensure seamless HCP/HCO engagement, sales rep support, regulatory compliance, and commercial operations. The ideal candidate will collaborate with onshore architects, business analysts, and stakeholders to develop scalable, high-performing Salesforce solutions while maintaining compliance with HIPAA, GDPR, the Sunshine Act, and FDA regulations.

Principal Responsibilities

Salesforce Development & Customization

- Develop and enhance Life Sciences-specific CRM functionality on the Salesforce platform and Sales Cloud to support HCP/HCO engagement, commercial operations, KOL (Key Opinion Leader) management, and field rep journeys
- Customize Salesforce objects, Apex triggers, Lightning Web Components (LWC), Visualforce pages, and declarative automation (Flows, Process Builder)
- Implement consent tracking, call planning, sample management, and omnichannel engagement workflows for field reps, MSLs (Medical Science Liaisons), and sales teams
- Ensure territory management, commercial operations, and compliance tracking are seamlessly integrated into Salesforce

Performance Optimization & Security

- Optimize Apex code, SOQL queries, and Lightning Web Components for scalability and high performance
- Implement role-based security, audit logs, and field-level encryption to maintain compliance with HIPAA, GDPR, and FDA regulations
- Conduct code reviews, unit testing, and debugging to ensure high-quality solution delivery

Collaboration & Agile Development

- Work closely with onshore architects, business analysts, and product owners to gather requirements and translate them into technical solutions
- Participate in scrum meetings, sprint planning, and UAT (User Acceptance Testing) as part of an Agile team
- Provide technical documentation and deployment support for Salesforce enhancements

Continuous Improvement & Best Practices

- Stay up to date with Salesforce releases, Life Sciences Cloud advancements, and best practices
- Implement Salesforce DevOps methodologies, using tools such as Gearset, Copado, and Jenkins for CI/CD automation

Preferred Skills And Experience

- 3+ years of experience in Salesforce development, preferably in Life Sciences or Healthcare
- Expertise in Salesforce Health Cloud, Sales Cloud, or Veeva CRM
- Proficiency in Apex, Lightning Web Components (LWC), Visualforce, SOQL, and API development
- Hands-on experience with Salesforce APIs (REST, SOAP), middleware tools (MuleSoft, Informatica, Boomi), and data migration
- Hands-on experience with Salesforce DevOps tools (Gearset, Copado, Jenkins) for release management
- Understanding of HIPAA, GDPR, the Sunshine Act, FDA regulations, and data security best practices
- Experience working in a global offshore-onshore collaboration model using Agile methodologies
- Salesforce Platform Developer I & II certifications required
- Excellent problem-solving and communication skills in a remote, global team setup

By applying for this role, you confirm that you are mentally and physically capable of fulfilling the job responsibilities detailed in the job description without any restrictions. If you have any concerns, or any disability that may affect your ability to perform the job, please inform HR in advance.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Delhi, India

Remote

JOB_POSTING-3-72796-3

Job Description

Role Title: VP, Data Engineering Tech Lead (L12)

Company Overview

Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry's most complete digitally enabled product suites. Our experience, expertise, and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet, and more. We have recently been ranked #5 among India's Best Companies to Work for 2023, #21 on LinkedIn's Top Companies in India list, and received Top 25 BFSI recognition from Great Place To Work India. We have been ranked Top 5 among India's Best Workplaces in Diversity, Equity, and Inclusion, and Top 10 among India's Best Workplaces for Women in 2022. We offer 100% Work from Home flexibility for all our functional employees and provide some of the best-in-class employee benefits and programs catering to work-life balance and overall well-being. In addition, we have Regional Engagement Hubs across India and a co-working space in Bangalore.

Organizational Overview

This role will be part of the Data Architecture & Analytics group within the CTO organization. The Data team is responsible for designing and developing scalable data pipelines for efficient data ingestion, transformation, and loading (ETL); collaborating with cross-functional teams to integrate new data sources and ensure data quality and consistency; and building and maintaining data models to facilitate data access and analysis by Data Scientists and Analysts. The team is responsible for the SYF public cloud platform and services, governs the health, performance, capacity, and costs of resources, ensures adherence to service levels, and builds well-defined processes for cloud application development and service enablement.

Role Summary/Purpose

We are seeking a highly skilled Cloud Technical Lead with expertise in Data Engineering who will work in multi-disciplinary environments harnessing data to provide valuable impact for our clients. The Cloud Technical Lead will work closely with technology and functional teams to drive the migration of legacy on-premises data systems/platforms to cloud-based solutions. The successful candidate will need to develop intimate knowledge of SYF's key data domains (originations, loan activity, collections, etc.) and maintain a holistic view across SYF functions to minimize redundancies and optimize the analytics environment.

Key Responsibilities

- Manage the end-to-end project lifecycle, including planning, execution, and delivery of cloud-based data engineering projects
- Provide guidance on suitable options, and design and create data pipelines for analytical solutions across data lakes, data warehouses, and cloud implementations
- Architect and design robust data pipelines and ETL processes leveraging Ab Initio and Amazon Redshift
- Ensure data integration, transformation, and storage processes are optimized for scalability and performance in the cloud environment
- Ensure data security, governance, and compliance in the cloud infrastructure
- Provide leadership and guidance to data engineering teams, ensuring best practices are followed
- Ensure timely delivery of high-quality solutions in an Agile environment

Required Skills/Knowledge

- Minimum 10+ years of experience with a Bachelor's degree in Computer Science or a similar technical field of study, or, in lieu of a degree, 12+ years of relevant experience
- Minimum 10+ years of experience managing large-scale data platform (Data Warehouse/Data Lake/Cloud) environments
- Minimum 10+ years of financial services experience
- Minimum 6+ years of experience working with Data Warehouses/Data Lakes/Cloud
- 6+ years of hands-on programming experience in ETL tools; Ab Initio or Informatica highly preferred
- Ability to read and reverse engineer the logic in Ab Initio graphs
- Hands-on experience with cloud platforms such as S3, Redshift, Snowflake, etc.
- Working knowledge of Hive, Spark, Kafka, and other data lake technologies
- Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution
- Experience analyzing system requirements and implementing migration methods for existing data
- Ability to develop and maintain strong collaborative relationships at all levels across IT and the business
- Excellent written and oral communication skills, along with a strong ability to lead and influence others
- Experience working iteratively in a fast-paced agile environment
- Demonstrated ability to drive change and work effectively across business and geographical boundaries
- Expertise in evaluating technology and solution engineering, with a strong focus on architecture and deployment of new technology
- Superior decision-making, client relationship, and vendor management skills

Desired Skills/Knowledge

- Prior work experience in a credit card/banking/fintech company
- Experience dealing with sensitive data in a highly regulated environment
- Demonstrated implementation of complex and innovative solutions
- Agile experience using JIRA or similar Agile tools

Eligibility Criteria

- Bachelor's degree in Computer Science or a similar technical field of study (Master's degree preferred)
- Minimum 12+ years of experience managing large-scale data platform (Data Warehouse/Data Lake/Cloud) environments
- Minimum 12+ years of financial services experience
- Minimum 8+ years of experience working with Oracle Data Warehouses/Data Lakes/Cloud
- 8+ years of programming experience in ETL tools; Ab Initio or Informatica highly preferred
- Ability to read and reverse engineer the logic in Ab Initio graphs
- Hands-on experience with cloud platforms such as S3, Redshift, Snowflake, etc.
- Rigorous data analysis through SQL in Oracle and various Hadoop technologies
- Involvement in large-scale data analytics migration from on premises to a public cloud
- Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution
- Experience analyzing system requirements and implementing migration methods for existing data
- Excellent written and oral communication skills, along with a strong ability to lead and influence others
- Experience working iteratively in a fast-paced agile environment

Work Timings: 3:00 PM IST to 12:00 AM IST. (This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time; timings are anchored to US Eastern hours and will adjust twice a year locally. This window is for meetings with India and US teams; the remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.)

For Internal Applicants

- Understand the criteria or mandatory skills required for the role before applying
- Inform your manager and HRM before applying for any role on Workday
- Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format)
- Must not be on any corrective action plan (First Formal/Final Formal, PIP)
- Only L10+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply

Level/Grade: 12
Job Family Group: Information Technology

Posted 1 week ago

Apply

10.0 years

0 Lacs

Kolkata, West Bengal, India

Remote

JOB_POSTING-3-72796-2 Job Description Role Title: VP, Data Engineering Tech Lead (L12) Company Overview COMPANY OVERVIEW: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #5 among India’s Best Companies to Work for 2023, #21 under LinkedIn Top Companies in India list, and received Top 25 BFSI recognition from Great Place To Work India. We have been ranked Top 5 among India’s Best Workplaces in Diversity, Equity, and Inclusion, and Top 10 among India’s Best Workplaces for Women in 2022. We offer 100% Work from Home flexibility for all our Functional employees and provide some of the best-in-class Employee Benefits and Programs catering to work-life balance and overall well-being. In addition to this, we also have Regional Engagement Hubs across India and a co-working space in Bangalore Organizational Overview Organizational Overview: This role will be part of the Data Architecture & Analytics group part of CTO organization Data team is responsible for designing and developing scalable data pipelines for efficient data ingestion, transformation, and loading(ETL). Collaborating with cross-functional teams to integrate new data sources and ensure data quality and consistency. Building and maintaining data models to facilitate data access and analysis by Data Scientists and Analysts. Responsible for the SYF public cloud platform & services. Govern health, performance, capacity, and costs of resources and ensure adherence to service levels Build well defined processes for cloud application development and service enablement. 
Role Summary/Purpose: We are seeking a highly skilled Cloud Technical Lead with expertise in Data Engineering who will work in multi-disciplinary environments harnessing data to provide valuable impact for our clients. The Cloud Technical Lead will work closely with technology and functional teams to drive migration of legacy on-premises data systems/platforms to cloud-based solutions. The successful candidate will need to develop intimate knowledge of SYF key data domains (originations, loan activity, collections, etc.) and maintain a holistic view across SYF functions to minimize redundancies and optimize the analytics environment. Key Responsibilities: Manage the end-to-end project lifecycle, including planning, execution, and delivery of cloud-based data engineering projects. Provide guidance on suitable options, designing and creating data pipelines for analytical solutions across data lakes, data warehouses, and cloud implementations. Architect and design robust data pipelines and ETL processes leveraging Ab Initio and Amazon Redshift. Ensure data integration, transformation, and storage processes are optimized for scalability and performance in the cloud environment. Ensure data security, governance, and compliance in the cloud infrastructure. Provide leadership and guidance to data engineering teams, ensuring best practices are followed. Ensure timely delivery of high-quality solutions in an Agile environment. Required Skills/Knowledge: Minimum 10+ years of experience with a Bachelor's degree in Computer Science or a similar technical field of study, or, in lieu of a degree, 12+ years of relevant experience. Minimum 10+ years of experience managing large-scale data platform (data warehouse/data lake/cloud) environments. Minimum 10+ years of financial services experience. Minimum 6+ years of experience working with data warehouses/data lakes/cloud. 6+ years of hands-on programming experience in ETL tools; Ab Initio or Informatica highly preferred.
Ability to read and reverse engineer the logic in Ab Initio graphs. Hands-on experience with cloud platforms such as S3, Redshift, Snowflake, etc. Working knowledge of Hive, Spark, Kafka, and other data lake technologies. Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution. Experience analyzing system requirements and implementing migration methods for existing data. Ability to develop and maintain strong collaborative relationships at all levels across IT and the business. Excellent written and oral communication skills, along with a strong ability to lead and influence others. Experience working iteratively in a fast-paced agile environment. Demonstrated ability to drive change and work effectively across business and geographical boundaries. Expertise in evaluating technology and solution engineering, with a strong focus on architecture and deployment of new technology. Superior decision-making, client relationship, and vendor management skills. Desired Skills/Knowledge: Prior work experience in a credit card/banking/fintech company. Experience dealing with sensitive data in a highly regulated environment. Demonstrated implementation of complex and innovative solutions. Agile experience using JIRA or similar Agile tools. Eligibility Criteria: Bachelor's degree in Computer Science or a similar technical field of study (Master's degree preferred). Minimum 12+ years of experience managing large-scale data platform (data warehouse/data lake/cloud) environments. Minimum 12+ years of financial services experience. Minimum 8+ years of experience working with Oracle data warehouses/data lakes/cloud. 8+ years of programming experience in ETL tools; Ab Initio or Informatica highly preferred. Ability to read and reverse engineer the logic in Ab Initio graphs. Hands-on experience with cloud platforms such as S3, Redshift, Snowflake, etc. Rigorous data analysis through SQL in Oracle and various Hadoop technologies.
Involvement in large-scale data analytics migration from on-premises to a public cloud. Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution. Experience analyzing system requirements and implementing migration methods for existing data. Excellent written and oral communication skills, along with a strong ability to lead and influence others. Experience working iteratively in a fast-paced agile environment. Work Timings: 3:00 PM IST to 12:00 AM IST (WORK TIMINGS: This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time – 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.) For Internal Applicants: Understand the criteria or mandatory skills required for the role before applying. Inform your manager and HRM before applying for any role on Workday. Ensure that your professional profile is updated (fields such as education, prior experience, other skills), and it is mandatory to upload your updated resume (Word or PDF format). Must not be on any corrective action plan (First Formal/Final Formal, PIP). Only L10+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply. Level / Grade: 12. Job Family Group: Information Technology

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are seeking great talent to help us build The DNA of tech.® Vishay manufactures one of the world's largest portfolios of discrete semiconductors and passive electronic components that are essential to innovative designs in the automotive, industrial, computing, consumer, telecommunications, military, aerospace, and medical markets. We help the world's most in-demand technologies come to life. Every day our products touch your life and the lives of people across the world, though you likely do not know it. Come join us and help us build The DNA of tech.® Vishay Intertechnology, Inc. is a Fortune 1,000 company listed on the NYSE (VSH). Learn more at www.Vishay.com. Do you want to help us build The DNA of tech.®? Vishay, with locations in San Jose, California; Binan, Philippines; and Pune, India, is currently seeking applicants for a Senior Manager of Supply Chain Systems and Processes. What You Will Be Doing: Maintain and Optimize Planning Systems: Oversee the regular upkeep, optimization, and troubleshooting of planning systems to ensure efficient and accurate operations. Assist in configuration design, customization, integration, and testing to support evolving business requirements. Master Data Management: Ensure the integrity, accuracy, and consistency of master data across all systems, including coordinating with relevant stakeholders for updates and corrections. MES experience and integration of data with SAP is desired. Automate Reporting Processes: Develop and implement automation solutions for routine and ad-hoc reporting, improving the accuracy, speed, and efficiency of data delivery. Collaborate with Cross-Functional Teams: Work closely with IT, operations, and business units to identify and address system enhancements and ensure that planning tools meet organizational needs. Troubleshoot and Resolve Issues: Proactively identify system issues and bottlenecks and collaborate with technical teams to implement solutions.
Training and Support: Provide training to end-users on best practices for using planning systems and tools, and offer ongoing technical support as needed. Data Governance & Compliance: Ensure that all data management practices comply with internal policies and industry standards, supporting accurate and timely reporting. Performance Monitoring & Reporting: Continuously monitor the performance of planning systems and reporting tools, implementing improvements based on feedback and performance metrics. Enhance Data Visualization: Develop and maintain dashboards and data visualization tools to enable stakeholders to make data-driven decisions quickly and efficiently. Documentation & Knowledge Management: Maintain up-to-date documentation for system configurations, processes, and troubleshooting guides to ensure consistency and ease of use across teams. Manage a team of 4-6 direct reports. Technical Qualifications: SAP APO Expertise: In-depth experience in implementing, maintaining, and troubleshooting SAP APO modules, especially for demand planning, supply network planning, and production planning. SAP Master Data Management (MDM): Strong knowledge of SAP master data management processes, ensuring accurate and consistent data across the system, including materials, vendor, and customer data. SAP Integration Skills: Experience integrating SAP APO with other SAP modules – SD/PP/MM (in either SAP ECC or S/4HANA) – and with third-party systems to ensure seamless data flow across the enterprise. Advanced Excel Skills: Expertise in using Excel for data manipulation, reporting, and analytics, including knowledge of advanced functions. Data Management Tools: Familiarity with data management tools and platforms such as SQL, Informatica, or other ETL (Extract, Transform, Load) and data reporting tools.
Industry-Specific Qualifications: Semiconductor Industry Knowledge: A solid understanding of semiconductor manufacturing and supply chain processes, including demand forecasting, production scheduling, inventory management, and lead time considerations. Supply Chain & Production Planning Knowledge: Experience in supply chain management and planning processes in a high-tech or semiconductor environment, including the ability to forecast demand and align production schedules with available capacity. Experience & Skills: Experience with SAP APO Modules: Hands-on experience with specific SAP APO modules such as Demand Planning (DP), Supply Network Planning (SNP), Production Planning and Detailed Scheduling (PP/DS), and Global Available-to-Promise (GATP). Master Data Governance: Proven track record of managing and governing master data to ensure alignment with business processes and compliance with internal standards. Problem-Solving & Troubleshooting: Strong analytical and troubleshooting skills to resolve issues related to SAP APO and data inconsistencies. Project Management Experience: Experience in managing projects, including system upgrades, data migrations, and new module implementations. Should be comfortable with both waterfall and agile methodologies. Soft Skills: Collaboration & Communication: Excellent collaboration and communication skills to work effectively with cross-functional teams such as IT, operations, business planning, production, and finance. Active Listening: Should be an active listener, able to interpret, take notes, and connect the dots. Attention to Detail: Ability to ensure the accuracy and quality of master data and planning systems with minimal supervision. Adaptability: Ability to adapt to evolving technologies and business needs within the semiconductor industry, implementing changes with minimal disruption. What You Will Bring Along: Bachelor's degree in Computer Science, Engineering, Supply Chain Management, or related fields.
Master's degree preferred. Certifications in SAP APO or related SAP modules (e.g., SAP Certified Application Associate – SAP Advanced Planning and Optimization). Industry certifications (optional but desirable) such as APICS CPIM (Certified in Production and Inventory Management) or CSCP (Certified Supply Chain Professional) to demonstrate knowledge of supply chain best practices. Additional Desired Qualifications: Experience with S/4HANA: Familiarity with SAP S/4HANA, especially in relation to its integration with APO and data management processes. Continuous Improvement Mindset: Experience with Lean, Six Sigma, or other process improvement methodologies to optimize planning and master data management systems. Readiness to travel 25% annually. What Can We Offer You For Your Talent: Vishay offers a comprehensive suite of benefit programs including health care coverage, financial support programs, and other resources designed to help you achieve your personal and professional goals. With us, you'll experience unique career paths, an open and collaborative culture, a stable business that will be there for you, and opportunities to work globally and locally. Do you have the skills we need? Are you ready to power your career as you power the world? If so, apply today. Vishay is committed to a workplace free of harassment and unlawful discrimination. We do not engage in discrimination or harassment based on race, color, age, gender, sexual orientation, gender identity and expression, ethnicity or national origin, disability, pregnancy, religion, political affiliation, union membership, covered veteran status, protected genetic information or marital status in hiring and employment practices.
It is the policy of Vishay to provide equal employment and advancement opportunities to all colleagues and applicants for employment without regard to race, color, ethnicity, religion, gender, pregnancy/childbirth, age, national origin, sexual orientation, gender identity or expression, disability or perceived disability, genetic information, citizenship, veteran or military status or a person’s relationship or association with a protected veteran, including spouses and other family members, marital or domestic partner status, or any other category protected by federal, state and/or local laws. As an equal opportunity employer, Vishay is committed to a diverse workforce. In order to ensure reasonable accommodation for individuals protected by Section 503 of the Rehabilitation Act of 1973, the Vietnam Veterans' Readjustment Act of 1974, and Title I of the Americans with Disabilities Act of 1990, applicants that require accommodation in the job application process may contact HR.Operations@Vishay.com for assistance.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

JOB_POSTING-3-72796-1 Job Description Role Title: VP, Data Engineering Tech Lead (L12) Company Overview: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #5 among India’s Best Companies to Work for 2023, #21 on LinkedIn’s Top Companies in India list, and received Top 25 BFSI recognition from Great Place To Work India. We have been ranked Top 5 among India’s Best Workplaces in Diversity, Equity, and Inclusion, and Top 10 among India’s Best Workplaces for Women in 2022. We offer 100% Work from Home flexibility for all our Functional employees and provide some of the best-in-class Employee Benefits and Programs catering to work-life balance and overall well-being. In addition, we also have Regional Engagement Hubs across India and a co-working space in Bangalore. Organizational Overview: This role will be part of the Data Architecture & Analytics group within the CTO organization. The Data team is responsible for designing and developing scalable data pipelines for efficient data ingestion, transformation, and loading (ETL); collaborating with cross-functional teams to integrate new data sources and ensure data quality and consistency; and building and maintaining data models to facilitate data access and analysis by Data Scientists and Analysts. The team is responsible for the SYF public cloud platform and services: governing the health, performance, capacity, and costs of resources; ensuring adherence to service levels; and building well-defined processes for cloud application development and service enablement.
Role Summary/Purpose: We are seeking a highly skilled Cloud Technical Lead with expertise in Data Engineering who will work in multi-disciplinary environments harnessing data to provide valuable impact for our clients. The Cloud Technical Lead will work closely with technology and functional teams to drive migration of legacy on-premises data systems/platforms to cloud-based solutions. The successful candidate will need to develop intimate knowledge of SYF key data domains (originations, loan activity, collections, etc.) and maintain a holistic view across SYF functions to minimize redundancies and optimize the analytics environment. Key Responsibilities: Manage the end-to-end project lifecycle, including planning, execution, and delivery of cloud-based data engineering projects. Provide guidance on suitable options, designing and creating data pipelines for analytical solutions across data lakes, data warehouses, and cloud implementations. Architect and design robust data pipelines and ETL processes leveraging Ab Initio and Amazon Redshift. Ensure data integration, transformation, and storage processes are optimized for scalability and performance in the cloud environment. Ensure data security, governance, and compliance in the cloud infrastructure. Provide leadership and guidance to data engineering teams, ensuring best practices are followed. Ensure timely delivery of high-quality solutions in an Agile environment. Required Skills/Knowledge: Minimum 10+ years of experience with a Bachelor's degree in Computer Science or a similar technical field of study, or, in lieu of a degree, 12+ years of relevant experience. Minimum 10+ years of experience managing large-scale data platform (data warehouse/data lake/cloud) environments. Minimum 10+ years of financial services experience. Minimum 6+ years of experience working with data warehouses/data lakes/cloud. 6+ years of hands-on programming experience in ETL tools; Ab Initio or Informatica highly preferred.
Ability to read and reverse engineer the logic in Ab Initio graphs. Hands-on experience with cloud platforms such as S3, Redshift, Snowflake, etc. Working knowledge of Hive, Spark, Kafka, and other data lake technologies. Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution. Experience analyzing system requirements and implementing migration methods for existing data. Ability to develop and maintain strong collaborative relationships at all levels across IT and the business. Excellent written and oral communication skills, along with a strong ability to lead and influence others. Experience working iteratively in a fast-paced agile environment. Demonstrated ability to drive change and work effectively across business and geographical boundaries. Expertise in evaluating technology and solution engineering, with a strong focus on architecture and deployment of new technology. Superior decision-making, client relationship, and vendor management skills. Desired Skills/Knowledge: Prior work experience in a credit card/banking/fintech company. Experience dealing with sensitive data in a highly regulated environment. Demonstrated implementation of complex and innovative solutions. Agile experience using JIRA or similar Agile tools. Eligibility Criteria: Bachelor's degree in Computer Science or a similar technical field of study (Master's degree preferred). Minimum 12+ years of experience managing large-scale data platform (data warehouse/data lake/cloud) environments. Minimum 12+ years of financial services experience. Minimum 8+ years of experience working with Oracle data warehouses/data lakes/cloud. 8+ years of programming experience in ETL tools; Ab Initio or Informatica highly preferred. Ability to read and reverse engineer the logic in Ab Initio graphs. Hands-on experience with cloud platforms such as S3, Redshift, Snowflake, etc. Rigorous data analysis through SQL in Oracle and various Hadoop technologies.
Involvement in large-scale data analytics migration from on-premises to a public cloud. Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution. Experience analyzing system requirements and implementing migration methods for existing data. Excellent written and oral communication skills, along with a strong ability to lead and influence others. Experience working iteratively in a fast-paced agile environment. Work Timings: 3:00 PM IST to 12:00 AM IST (WORK TIMINGS: This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time – 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.) For Internal Applicants: Understand the criteria or mandatory skills required for the role before applying. Inform your manager and HRM before applying for any role on Workday. Ensure that your professional profile is updated (fields such as education, prior experience, other skills), and it is mandatory to upload your updated resume (Word or PDF format). Must not be on any corrective action plan (First Formal/Final Formal, PIP). Only L10+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply. Level / Grade: 12. Job Family Group: Information Technology

Posted 1 week ago

Apply

4.0 - 8.0 years

15 - 25 Lacs

Hyderabad

Work from Office

Job Description: We are looking for a Data Engineer with strong hands-on experience in ETL, cloud data platforms, and scripting to work on scalable data integration solutions. Mandatory Skills: SQL – Strong expertise in writing optimized queries and procedures. Data Warehousing (DWH) – Good understanding of data modeling and warehouse architecture. Shell scripting or Python – For automation and custom transformation logic. ETL Tool – Experience with any ETL tool (Talend, Informatica, DataStage, etc.). Databricks – Used for data transformation and processing. Azure Data Factory (ADF) – Designing and orchestrating data pipelines. Good to Have: Snowflake – For implementing scalable cloud data warehousing solutions. Azure Ecosystem – General familiarity with Azure services, including Data Lake and Storage. Responsibilities: Build and maintain scalable ETL pipelines using Talend, ADF, and Databricks. Extract, transform, and load data from multiple source systems into Snowflake and/or Azure Data Lake. Interpret technical and functional designs and implement them effectively in the data pipeline. Collaborate with teams to ensure high data quality and performance. Support and guide ETL developers in resolving technical challenges and implementing best practices.
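For candidates unfamiliar with the extract/transform/load pattern this posting describes, here is a minimal, illustrative sketch. All table names, columns, and transformation rules are hypothetical, and sqlite3 stands in for the actual warehouse (Snowflake or Azure Data Lake); a production pipeline would use the posting's tools (Talend, ADF, Databricks) rather than hand-rolled code.

```python
import sqlite3

def extract(conn):
    """Pull raw rows from a hypothetical source table."""
    return conn.execute("SELECT id, amount, country FROM raw_orders").fetchall()

def transform(rows):
    """Simple cleansing: drop rows with null amounts, upper-case country codes."""
    return [(i, round(a, 2), c.upper()) for (i, a, c) in rows if a is not None]

def load(conn, rows):
    """Idempotent load into the target table (INSERT OR REPLACE keyed on id)."""
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(conn, transform(extract(conn)))

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, country TEXT)")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                     [(1, 10.0, "in"), (2, None, "us"), (3, 7.0, "uk")])
    run_pipeline(conn)
    # The null-amount row (id=2) is dropped; country codes are upper-cased.
    print(conn.execute("SELECT * FROM orders ORDER BY id").fetchall())
```

The idempotent load step matters in practice: re-running the pipeline after a partial failure should not duplicate rows, which is why the target is keyed and upserted.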

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

Remote

Job Title: Senior Data Engineer (IDMC-CDI) Primary Skills: SQL, IDMC (CDI), ETL Concepts Base Location: Gurugram (Delhi/NCR) Mode of Work: Hybrid; a Work from Home option is available for ideal candidates. Experience: 5-8 Years About Us: We are PDI (Pacific Data Integrators), a leading provider of Data Management, Analytics, and Cloud solutions. As an Informatica Platinum Partner and a reseller of various market-leading products, we help enterprises implement scalable and secure data solutions using tools like Informatica, AWS, Azure, Snowflake, and GCP. We continue to grow as a leader in data integration, data quality, and data security across industries, supporting customers in their digital transformation journeys. Job Overview: We are seeking a highly skilled and experienced Informatica IDMC ETL Developer (Lead-Level) with a deep understanding of ETL design, data pipeline development, and Informatica Cloud Data Integration (CDI). The ideal candidate will have a strong background in developing, documenting, testing, and maintaining ETL applications and will take ownership of delivering scalable solutions that meet business and technical requirements. Key Responsibilities: Lead the design, development, and deployment of complex ETL solutions using Informatica IDMC. Collaborate with business stakeholders, analysts, and data architects to define technical requirements and translate them into efficient ETL workflows. Develop, document, unit test, and maintain high-quality data pipelines that support both batch and incremental (CDC-based) data processing.
Configure and manage Mappings, Mapping Tasks, and Taskflows; Secure Agents, runtime environments, and connection parameters. Design and implement robust data transformation logic, including: cleansing, enrichment, joins, lookups, and aggregations; handling hierarchical (JSON, XML) and semi-structured data (REST APIs, Kafka); and use of parameterized mappings and reusable components. Build and manage Taskflows with conditional logic, error handling, branching, and scheduling mechanisms. Optimize mapping performance through pushdown optimization, partitioning, and data minimization techniques. Provide leadership and guidance to junior developers, conduct code reviews, and ensure adherence to best practices. Troubleshoot and resolve complex data integration issues promptly. Stay updated on new features and industry trends in data integration and cloud technologies. Required Skills: 5+ years of IT experience in data integration and management roles. 4+ years of hands-on experience with Informatica products. 2+ years specifically working with Informatica IDMC/CDI. Experience leading large projects and mentoring junior developers. Proficiency with Secure Agents, runtime configuration, parameterization, and environment management. Experience integrating external applications with IDMC via batch, APIs, and message queues. Strong experience with relational databases such as Oracle, SQL Server, or MySQL. Solid understanding of data warehousing concepts (e.g., Kimball, Inmon). Excellent problem-solving skills and the ability to handle complex data integration scenarios. Strong verbal and written communication skills. Experience working in Agile/Scrum environments. Preferred/Bonus Skills: Experience with cloud platforms: AWS, Azure, or GCP. Exposure to data quality tools and governance frameworks. Familiarity with scripting languages like Python or Shell for automation. Why Join? Work with cutting-edge data integration tools as part of a fast-growing, elite consulting firm.
Collaborate with top-tier clients and deliver impactful data management solutions. Access to ongoing training and opportunities to work with the latest in cloud, AI, and big data technologies.
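The "incremental (CDC-based) data processing" this posting asks for is often implemented with a high-watermark pattern: each run extracts only rows changed since the last recorded timestamp. In IDMC this would be configured through mapping tasks and parameterized filters rather than written by hand; the sketch below only illustrates the underlying pattern, with a hypothetical `src` table and sqlite3 standing in for the real source.

```python
import sqlite3

def incremental_extract(conn, watermark):
    """Fetch only rows changed since the last run (high-watermark pattern).

    Returns the changed rows and the new watermark to persist for the next run.
    """
    rows = conn.execute(
        "SELECT id, payload, updated_at FROM src "
        "WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    new_watermark = rows[-1][2] if rows else watermark
    return rows, new_watermark

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE src (id INTEGER, payload TEXT, updated_at INTEGER)")
    conn.executemany("INSERT INTO src VALUES (?, ?, ?)",
                     [(1, "a", 100), (2, "b", 200), (3, "c", 300)])
    rows, wm = incremental_extract(conn, 0)   # first run: full history
    rows, wm = incremental_extract(conn, wm)  # second run: nothing new
    conn.execute("INSERT INTO src VALUES (4, 'd', 400)")
    rows, wm = incremental_extract(conn, wm)  # picks up only the new row
    print(rows)  # [(4, 'd', 400)]
```

True log-based CDC (reading the database redo/transaction log) also captures deletes, which a timestamp watermark cannot; that distinction is worth knowing for the interview even though the pattern above covers the common batch-incremental case.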

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary: Platform Engineer – Informatica. Primary skills: Informatica with BDM & PowerCenter, IDMC experience. Secondary skills: DBT, Airflow. Provide technical support to customers and internal teams for data integration and visualization platforms. Investigate and troubleshoot software issues reported by users, identify root causes, and implement effective solutions. Collaborate with developers and QA teams to test and validate software fixes and enhancements. Create and maintain documentation related to application support processes, known issues, and resolutions. Debug application code to resolve issues or share workarounds. Document all resolutions in the SOP/KEDB. Work on problem management and root cause analysis of recurring tickets.

Posted 1 week ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also be responsible for troubleshooting issues and providing guidance to team members, fostering a collaborative environment that encourages innovation and efficiency in application development. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Facilitate knowledge-sharing sessions to enhance team capabilities. - Mentor junior team members to support their professional growth. Professional & Technical Skills: - Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Strong understanding of data integration and ETL processes. - Experience with cloud computing platforms and services. - Familiarity with data governance and security best practices. - Ability to work with large datasets and perform data analysis. Additional Information: - The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Bengaluru office. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Snowflake Data Warehouse. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development processes. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application processes and workflows. - Engage in code reviews to ensure quality and adherence to best practices. Professional & Technical Skills: - Must-Have Skills: Proficiency in Snowflake Data Warehouse. - Strong understanding of data modeling and ETL processes. - Experience with SQL and database management. - Familiarity with cloud computing concepts and services. - Ability to troubleshoot and optimize application performance. Additional Information: - The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse. - This position is based at our Chennai office. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Oracle Data Integrator (ODI). Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project specifications, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in troubleshooting and optimizing application performance, while maintaining a focus on delivering high-quality solutions that align with business objectives. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application processes and workflows. - Engage in continuous learning to stay updated with industry trends and technologies. Professional & Technical Skills: - Must-Have Skills: Proficiency in Oracle Data Integrator (ODI). - Strong understanding of data integration techniques and best practices. - Experience with ETL processes and data warehousing concepts. - Familiarity with SQL and database management systems. - Ability to troubleshoot and resolve application issues efficiently. Additional Information: - The candidate should have a minimum of 3 years of experience in Oracle Data Integrator (ODI). - This position is based at our Pune office. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and optimization in the development process.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and analytics.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data governance and security best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: Data Engineering
Good-to-Have Skills: NA
Minimum Experience: 12 year(s)
Educational Qualification: BTECH

Summary: We are seeking a hands-on Senior Engineering Manager of Data Platform to spearhead the development of capabilities that power Vertex products while providing a connected experience for our customers. This role demands a deep engineering background with hands-on experience in building and scaling production-level systems. The ideal candidate will excel in leading teams to deliver high-quality data products and will provide mentorship, guidance, and leadership. In this role, you will work to increase domain data coverage and adoption of the Data Platform by promoting a connected user experience through data. You will increase data literacy and trust by leading our Data Governance and Master Data Management initiatives, and you will contribute to the vision and roadmap of self-serve capabilities through the Data Platform.

Roles & Responsibilities:
- Be hands-on in leading the development of features that enhance the self-service capabilities of our data platform, ensuring the platform is scalable, reliable, and fully aligned with business objectives, and defining and implementing best practices in data architecture, data modeling, and data governance.
- Work closely with Product, Engineering, and other departments to ensure the data platform meets business requirements.
- Influence cross-functional initiatives related to data tools, governance, and cross-domain data sharing; ensure technical designs are thoroughly evaluated and aligned with business objectives.
- Determine appropriate staffing to achieve goals and objectives; interview, recruit, develop, and retain top talent.
- Manage and mentor a team of engineers, fostering a collaborative, high-performance culture and encouraging a growth mindset and accountability for outcomes. Interpret how the business strategy links to individual roles and responsibilities.
- Provide career development opportunities and establish processes and practices for knowledge sharing and communication.
- Partner with external vendors to address issues and technical challenges.
- Stay current with emerging technologies and industry trends to ensure the platform remains cutting-edge.

Professional & Technical Skills:
- 12+ years of hands-on experience in software development (preferably in the data space), with 3+ years of people management experience, demonstrating success in building, growing, and managing multiple teams.
- Extensive experience in architecting and building complex data platforms and products.
- In-depth knowledge of cloud-based services and data tools such as Snowflake, AWS, and Azure, with expertise in data ingestion, normalization, and modeling.
- Strong experience building and scaling production-level cloud-based data systems using data ingestion tools like Fivetran, data quality and observability tools like Monte Carlo, data catalogs like Atlan, and master data tools like Reltio or Informatica.
- Thorough understanding of best practices in agile software development and software testing.
- Experience deploying cloud-based applications using automated CI/CD processes and container technologies.
- Understanding of security best practices when architecting SaaS applications on cloud infrastructure.
- Ability to understand complex business systems and a willingness to learn and apply new technologies as needed.
- Proven ability to influence and deliver high-impact initiatives.
- Forward-thinking mindset with the ability to define and drive the team's mission, vision, and long-term strategies.
- Excellent leadership skills with a track record of managing teams and collaborating effectively across departments; strong written and verbal communication skills.
- Proven ability to work with and lead remote teams to achieve sustainable long-term success.
- A "work together and get stuff done" attitude without losing sight of quality, and a sense of responsibility to customers and the team.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Engineering.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Snowflake Data Warehouse
Good-to-Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will engage in problem-solving discussions, contribute innovative ideas, and refine applications based on user feedback, all while maintaining a focus on quality and efficiency in your work.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Assist in the documentation of application processes and workflows.
- Collaborate with cross-functional teams to ensure alignment on project goals.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and analytics.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with data visualization tools to present insights.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Posted 1 week ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Skill required: Delivery - Search Engine Optimization (SEO)
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do? We're Accenture Marketing Operations, the global managed services arm of Accenture Interactive. We sit in the Operations business to take advantage of industrialized run capabilities, leveraging investments from Accenture Operations. Our quest is to activate the best experiences on the planet by driving value across every customer interaction to maximize marketing performance. We combine deep functional and technical expertise to future-proof our clients' business while accelerating time-to-market and operating efficiently at scale. We are digital professionals committed to providing innovative, end-to-end customer experience solutions focused on operating marketing models that help businesses transform and excel in the new world, with an ecosystem that empowers our clients to implement the changes necessary to support the transformation of their businesses. You will develop an organic search engine optimization strategy based on client goals and objectives, defining keywords and priority content, and ensuring web/mobile content meets marketing needs.

What are we looking for?
- SEO Strategy & Execution: Develop and implement SEO best practices, including technical, on-page, and off-page optimization, to enhance organic search rankings and website traffic.
- SEO Analytics & Performance Tracking: Analyze SEO metrics (rankings, traffic, CTR) and CRM data (customer interactions, conversions, engagement) using tools like Google Analytics (GA4), Search Console, and CRM dashboards.
- Keyword Research & Content Optimization: Conduct keyword research, metadata optimization, and content strategy development to align with search intent and improve visibility.
- Cross-Functional Collaboration: Work closely with marketing, sales, and analytics teams to integrate SEO and CRM strategies for a unified customer acquisition and retention approach.
- Trend Monitoring & Compliance: Stay updated with Google algorithm updates, evolving SEO trends, GDPR/CCPA regulations, and industry best practices to ensure continued business growth and compliance.
- Experience with data warehousing technologies such as Snowflake, GCP, and Databricks SQL.
- Industry experience: Beauty, CPG, Retail.
- Python.

Roles and Responsibilities:
- Understand the strategic direction set by senior management as it relates to team goals.
- Conduct keyword research, content optimization, technical SEO improvements, and performance reporting to enhance organic search visibility.
- Prepare actionable insights from SEO tools.
- In this role you are required to analyze and solve increasingly complex problems.
- Your day-to-day interactions are with peers within Accenture; you are likely to have some interaction with clients and/or Accenture management.
- You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.
- Decisions you make impact your own work and may impact the work of others.
- You would be an individual contributor and/or oversee a small work effort and/or team.

Qualification: Any Graduation
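The SEO role above tracks click-through rate (CTR) alongside rankings and traffic, and lists Python as a skill. CTR is simply clicks divided by impressions; a hedged sketch with made-up page paths and numbers (purely illustrative, not any analytics tool's API):

```python
# Click-through rate per page: CTR = clicks / impressions, as a percentage.
# Paths and counts are invented for illustration.

def ctr(clicks, impressions):
    """CTR as a percentage, guarding against zero impressions."""
    if impressions == 0:
        return 0.0
    return round(100.0 * clicks / impressions, 2)

pages = {
    "/pricing": (120, 4000),        # (clicks, impressions)
    "/blog/seo-tips": (90, 1500),
}
report = {path: ctr(c, i) for path, (c, i) in pages.items()}
print(report)  # → {'/pricing': 3.0, '/blog/seo-tips': 6.0}
```

In practice these counts would come from a Search Console or GA4 export rather than a hard-coded dict, but the arithmetic is the same.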

Posted 1 week ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: SAP BW/4HANA Data Modeling & Development
Good-to-Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive successful project outcomes. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring adherence to best practices in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and data visualization techniques.
- Ability to troubleshoot and resolve technical issues related to data modeling.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Good-to-Have Skills: Informatica PowerCenter
Minimum Experience: 3 year(s)
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also participate in testing and debugging to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Assist in the documentation of application specifications and design processes.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Oracle Procedural Language Extensions to SQL (PL/SQL).
- Good-to-Have Skills: Experience with Informatica PowerCenter.
- Strong understanding of database design and optimization techniques.
- Experience in developing and maintaining PL/SQL procedures and functions.
- Familiarity with application development methodologies and frameworks.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Oracle PL/SQL.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.
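The PL/SQL listing above centers on packaging database logic into reusable procedures. PL/SQL itself is Oracle-specific, so as a hedged stand-in this sketch uses Python's stdlib `sqlite3` to show the same idea — a parameterized query wrapped in a reusable routine (table and column names are invented):

```python
# Generic illustration of "stored-procedure-style" reusable, parameterized
# database logic, using sqlite3 as a stand-in for an Oracle/PL/SQL setup.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
conn.executemany(
    "INSERT INTO employees (name, dept) VALUES (?, ?)",
    [("Asha", "ETL"), ("Ravi", "ETL"), ("Mei", "Reporting")],
)

def employees_in_dept(conn, dept):
    """Reusable routine, analogous to a PL/SQL procedure returning a cursor.
    The ? placeholder keeps the query parameterized (no string concatenation)."""
    cur = conn.execute(
        "SELECT name FROM employees WHERE dept = ? ORDER BY name", (dept,)
    )
    return [row[0] for row in cur]

print(employees_in_dept(conn, "ETL"))  # → ['Asha', 'Ravi']
```

The design point carries over to PL/SQL directly: bind variables instead of concatenated SQL, and one named routine per query pattern.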

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with data visualization tools to present insights effectively.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Posted 1 week ago

Apply