3.0 - 6.0 years
1 - 4 Lacs
Hyderabad
Work from Office
Career Category: Supply Chain

Job Description: The Master Data Analyst at Amgen will support the accuracy and consistency of master data (Material, Production, Quality, Customer, Transportation and/or Plant) across the organization. This role will manage data validation, cleansing, and enrichment while collaborating with teams to resolve issues and ensure data integrity. The analyst will support key performance monitoring, data governance, and compliance efforts, as well as assist in data migration and integration projects. Candidates should have experience in enterprise applications like SAP or Oracle, familiarity with data governance frameworks and compliance standards, and strong analytical skills.

Roles & Responsibilities:
- Perform data operations tasks, mainly maintenance and validation, to ensure the accuracy and integrity of master data
- Support process optimization initiatives to improve data management workflows and enhance efficiency
- Conduct data analysis to identify trends, discrepancies, and opportunities for improvement
- Provide training and support to partners, customers, and end-users on master data processes, tools, and best practices
- Maintain data quality reports to monitor performance metrics and ensure data compliance
- Collaborate cross-functionally with business, IT, and operations teams to resolve data-related issues and ensure alignment with organizational goals

Basic Qualifications and Experience:
- Bachelor's degree in a STEM discipline and 2-3 years of experience in SAP ECC, master data management, data governance, or data operations, preferably in healthcare or biotech supply chains
- Technical Proficiency: Experience in SAP/Oracle, Microsoft Office (Excel, PowerPoint), and other data management tools (e.g., Informatica, Oracle MDM)
- Analytical Skills: Strong ability to analyze large datasets and deliver actionable insights
- Problem Solving: Skilled at identifying root causes of data issues and implementing effective solutions
- Attention to Detail: High accuracy and attention to detail, with a strong focus on data quality
- Communication: Excellent written and verbal communication skills, with the ability to present findings to both technical and non-technical stakeholders

Functional Skills:

Must-Have Skills:
- Working knowledge of SAP/Oracle
- Understanding of master data management processes, frameworks, and governance
- Proficiency in Excel and the MS Office Suite, with experience in data analysis
- Basic understanding of data governance frameworks and of ensuring data accuracy and quality
- Strong communication skills for presenting data insights to both technical and non-technical audiences

Good-to-Have Skills:
- SAP S/4, SAP MDG, SAP TM

Professional Certifications (please mention if the certification is preferred or mandatory for the role):

Soft Skills:
- Good analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation, centered around data perfection
- Team-oriented, with a focus on achieving team goals
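The validation and cleansing duties above (mandatory-field checks, duplicate detection) can be sketched in a few lines. This is a minimal illustration, not Amgen's actual process: the field names `material_id`, `base_uom`, and `plant` are invented stand-ins for a real material master schema.

```python
# Hypothetical master-data quality check: flag material records with
# missing mandatory fields or duplicate material numbers.
# Field names are illustrative, not a real SAP/Oracle schema.

MANDATORY = ("material_id", "base_uom", "plant")

def validate_materials(records):
    """Return (errors, duplicates) found in a list of material records."""
    errors, seen, duplicates = [], set(), []
    for rec in records:
        missing = [f for f in MANDATORY if not rec.get(f)]
        if missing:
            errors.append((rec.get("material_id"), missing))
        mid = rec.get("material_id")
        if mid in seen:
            duplicates.append(mid)
        seen.add(mid)
    return errors, duplicates

sample = [
    {"material_id": "M-100", "base_uom": "EA", "plant": "HYD1"},
    {"material_id": "M-101", "base_uom": "", "plant": "HYD1"},   # missing UoM
    {"material_id": "M-100", "base_uom": "EA", "plant": "HYD2"}, # duplicate ID
]
errors, duplicates = validate_materials(sample)
```

In practice checks like these would feed the data quality reports the role maintains, with results tracked against compliance metrics.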
Posted 1 week ago
2.0 - 4.0 years
8 - 12 Lacs
Pune
Work from Office
Who are we

Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both individual end users and enterprise customers. Our employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $5.00 billion in fiscal 2024. For more information, visit www.amdocs.com

In one sentence

The business/systems analysis specialist has broad and deep knowledge of customers' business environments, operations, and relevant regulatory, compliance and governance frameworks, as well as industry-standard processes, key trends, customer segments and lines of business. He/she applies this knowledge and experience to advise on, capture and define business and system requirements and processes that best reflect customer needs, based on product capabilities and best practices.

What will your job look like

ETL Work Description
- Design & Build: Extracts from OEC, ASPA, and SFDC-Cloudsense, built in Informatica to produce a unified input file for migration
- Testing Support: Across all stages
- Migration Support: Loading data into SFDC (Billing Accounts, Assets replacing subscriptions, Contacts)
- Data Cleanup: Elisa data cleanup using extracts
- B2B Support: Implementation of B2B extracts and support during analysis

SFDC Work Description
- Extracts & Loads: Design for Billing Accounts, Assets, Contacts
- Product Mapping: From legacy SFDC products to new Core Commerce products
- Migration Execution: Subscriptions, service removals, agreement removals
- Testing & Integration: Support for testing and migration of entities from SFDC to ABP
- Error Analysis: Support for analyzing data cleanup errors and designing fixes
- B2B Analysis: Within SFDC-Cloudsense

All you need is...
- Computer Science degree or Industrial Engineering & Management - Information Systems
- 10 years of experience in the telecom industry and/or IT, with at least 5 years of practical experience in business/system analysis, working with customers on requirement and business-process definition in a variety of complex situations
- Customer-facing experience: ability to lead and facilitate sessions, resolve challenges and suggest solutions using various methods (presentations, demos, business processes, etc.)
- Wide knowledge of and experience with a set of products, and mastery of E2E business processes in at least one, if not most, business domains
- Proficient experience in partner and client management

Why you will love this job:
- Accurately use insights into customers' business environments to influence decisions impacting efficient solution design
- Be a key member of a global, dynamic and highly collaborative team with various possibilities for personal and professional development
- Get the opportunity to work in a multinational environment for the global market leader in its field
- We offer a wide range of stellar benefits including health, dental, vision, and life insurance as well as paid time off, sick time, and parental leave!
Posted 1 week ago
5.0 years
4 - 7 Lacs
Hyderābād
On-site
Welcome to Warner Bros. Discovery… the stuff dreams are made of.

Who We Are…
When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.

Your New Role:
The WBD Integration team is seeking a Senior Integration Developer who will be responsible for providing technical expertise for supporting and enhancing the Integration suite (Informatica PowerCenter, IICS). We are specifically looking for a candidate with solid technical skills and experience in integrating ERP applications and SaaS/PaaS platforms such as SAP, Salesforce, and Workday, and data warehouses such as Teradata, Snowflake, and Redshift. Experience with the Informatica cloud platform would be ideal for this position. The candidate's primary job functions include, but are not limited to, the day-to-day configuration/development of the Informatica platform. The candidate must possess strong communication and analytical skills to effectively work with peers within the Enterprise Technology Group, various external partners/vendors, and business users to determine requirements and translate them into technical solutions. The candidate must have the ability to independently complete individual tasks in a dynamic environment to achieve departmental and company goals.

Qualifications & Experiences: Leads the daily activities of a work group/team. Typically leads complex professional undertakings or teams.
May be assigned to new work groups or teams. Interacts with peers and other internal and external stakeholders regularly. Demonstrates advanced proficiency in the full range of skills required to perform the role. Acts as a mentor.
- Work on POCs with new connectors and work closely with the Network team
- Perform peer reviews of objects developed by other developers on the team as needed
- Work with the Business and QA teams during various phases of deployment, i.e., requirements, QAST, and SIT phases
- Report any functional gaps in the existing application, suggest business process improvements, and provide support for bug fixes and reported issues
- Coordinate activities between the different LOBs/teams
- Translate conceptual system requirements into technical data and integration requirements
- Proficiency in using Informatica Cloud application and data integrations is a must
- Proficiency in developing custom APIs to handle bulk volumes, pagination, etc.
- Design, develop, and implement integration solutions using Informatica Intelligent Cloud Services
- Configure data mappings, transformations, and workflows to ensure data consistency and accuracy
- Develop and maintain APIs and connectors to integrate with various data sources and applications
- Prepare data flow diagrams and/or process models
- Strong knowledge of integration protocols and technologies (e.g., REST, SOAP, JSON, XML)
- Perform unit testing and debugging of applications to ensure the quality of delivered requirements and the overall health of the system
- Develop standards and processes to support and facilitate integration projects and initiatives
- Educate other team members and govern tool usage
- Participate in research and make recommendations on integration products and services
- Monitor integration processes and proactively identify and resolve performance or data quality issues
- Provide ongoing maintenance and support for integration solutions
- Perform regular updates and upgrades to keep integrations current
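The pagination pattern mentioned for custom bulk APIs follows one common loop: request a page, accumulate results, and stop when the endpoint reports no further pages. The sketch below is a hedged illustration only; `fetch_page` stands in for a real HTTP call (e.g. via a requests session against an IICS or source-system endpoint), and the in-memory data source is invented.

```python
# Generic offset/limit pagination loop for bulk API extraction.
# fetch_page is a placeholder for a real HTTP call; the fake endpoint
# below exists only so the sketch is self-contained.

def fetch_all(fetch_page, page_size=2):
    """Accumulate records across pages.

    fetch_page(offset, limit) -> (rows, has_more)
    """
    records, offset = [], 0
    while True:
        rows, has_more = fetch_page(offset, page_size)
        records.extend(rows)
        if not has_more:
            return records
        offset += page_size

# Fake endpoint backed by an in-memory list, for illustration only.
DATA = list(range(5))

def fake_page(offset, limit):
    chunk = DATA[offset:offset + limit]
    return chunk, offset + limit < len(DATA)

result = fetch_all(fake_page)
```

Real bulk APIs often return a cursor or next-page token instead of supporting raw offsets; the loop shape is the same, with the token replacing `offset`.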
Proficiency in Informatica Intelligent Cloud Services. Experience with cloud platforms (e.g., AWS, Azure, Google Cloud). Excellent problem-solving and troubleshooting skills. Strong communication and teamwork skills.

Qualifications & Experiences:
- 5+ years' developer experience in Informatica IICS Application Integration and Data Integration
- Experience in PowerCenter along with IICS, and complete knowledge of the SDLC process
- Experience in API development, including best practices, testing methods, and deployment strategies
- Experience in designing, creating, refining, deploying, and managing the organization's data architecture, including the end-to-end vision for how data will flow from system to system, for multiple applications and across different territories
- Expertise with tools like SOA, ETL, ERP, XML, etc.
- Understanding of Python, AWS Redshift, Snowflake, and relational databases
- Knowledge of UNIX shell scripts; should be able to write/debug shell scripts
- Ability to work well within an agile team environment and apply related working methods
- Able to analyze and understand complex customer scenarios; thrives on difficult challenges
- Team player and multitasker with excellent communication skills (conveying highly technical information in business terms, clear email communication) and the ability to mentor team members

Preferred Qualifications:
- Informatica certification in Informatica Intelligent Cloud Services
- Experience with other integration tools and middleware
- Knowledge of data governance and data quality best practices

Not required but preferred experience:
- Public speaking and presentation skills

How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day.
We hope they resonate with you and look forward to discussing them during your interview. Championing Inclusion at WBD Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
Posted 1 week ago
7.0 years
5 - 10 Lacs
Hyderābād
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Become part of the Operations, Support & Maintenance team
- We need someone technical who can review existing scripts and code to debug, fix, and enhance them
- Should be able to write intermediate-level SQL and MongoDB queries; this is needed to support customers with their issues and enhancement requests
- Support applications/products/platforms during testing and post-production
- Develop new code/scripts (not heads-down development!)
- Analyze and report on data; manage data (data validation, data cleanup)
- Monitor scheduled jobs and take proactive action; resolve job failures and communicate to stakeholders
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Bachelor's degree
- 7+ years of relational database experience (Oracle, SQL Server, DB2, etc.)
- 5+ years of ETL tool experience (Talend, SQL Server SSIS, Informatica, etc.)
- 5+ years of programming experience (e.g., Java, JavaScript, Visual Basic)
- 3+ years of experience with NoSQL (e.g., MongoDB)
- 3+ years of experience in the SDLC development process
- 2+ years of experience with a job scheduler (e.g., Rundeck, Tivoli)
- Thorough understanding of production control, such as the change control process
- Thorough understanding of REST API services

Preferred Qualifications: Understanding of the following:
- Vaults such as CyberArk and HashiCorp
- Document management such as Nuxeo
- Version control tooling (Git)
- Healthcare terminology
- Atlassian tools such as Jira / Bitbucket / Crowd / Confluence
- ETL/ELT tools such as Fivetran
- Agile methodology

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.
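As a hedged illustration of the intermediate-level SQL this support role calls for, the snippet below runs a job-failure triage query against an in-memory SQLite table. The table and job names are invented, SQLite merely stands in for the production database, and the MongoDB analogue is only sketched in a comment.

```python
# Support-style triage query: which jobs failed, how often, and with
# what average runtime? Schema and data are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (name TEXT, status TEXT, runtime_sec INTEGER)")
conn.executemany("INSERT INTO jobs VALUES (?, ?, ?)", [
    ("load_claims", "FAILED", 120),
    ("load_members", "SUCCESS", 340),
    ("load_claims", "FAILED", 95),
])

# MongoDB analogue (illustrative only):
#   db.jobs.aggregate([{"$match": {"status": "FAILED"}},
#                      {"$group": {"_id": "$name", "n": {"$sum": 1},
#                                  "avg": {"$avg": "$runtime_sec"}}}])
rows = conn.execute(
    "SELECT name, COUNT(*), AVG(runtime_sec) FROM jobs "
    "WHERE status = 'FAILED' GROUP BY name"
).fetchall()
```

A query like this is the typical first step before resolving job failures and communicating impact to stakeholders.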
Posted 1 week ago
1.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description

Want to participate in building the next generation of an online payment system that supports multiple countries and payment methods? Amazon Payment Services (APS) is a leading payment service provider in the MENA region, with operations spanning 8 countries, and offers online payment services to thousands of merchants. The APS team is building a robust payment solution for driving the best payment experience on and off Amazon. Over 100 million customers send tens of billions of dollars through our systems annually, moving at light speed. We build systems that process payments at an unprecedented scale with accuracy, speed and mission-critical availability. We innovate to improve customer experience, with support for currency of choice, in-store payments, pay on delivery, credit and debit card payments, seller disbursements and gift cards. Many new exciting and challenging ideas are in the works.

Key job responsibilities
Data Engineers focus on managing data requests, maintaining operational excellence, and enhancing core infrastructure. You will be collaborating closely with both technical and non-technical teams to design and execute roadmaps.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, Datastage, etc.

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
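The "building ETL pipelines" qualification can be pictured with a minimal extract-transform-load sketch. Every stage and field name here is an assumption for illustration; a production pipeline would read from real sources and load into a warehouse rather than a Python list.

```python
# Minimal ETL sketch: extract raw rows, normalize them, load the result.
# Sources, fields, and the in-memory "warehouse" are all invented.

def extract():
    # Stand-in for reading from a source system or file.
    return [{"amount": "19.99", "currency": "usd"},
            {"amount": "5.00", "currency": "inr"}]

def transform(rows):
    # Normalize types and casing before loading.
    return [{"amount": float(r["amount"]), "currency": r["currency"].upper()}
            for r in rows]

def load(rows, warehouse):
    # Stand-in for a warehouse insert; returns the number of rows loaded.
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

The separation into three functions is the point: each stage can be tested, retried, and monitored independently, which is what makes pipelines operable at scale.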
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company: ADCI MAA 15 SEZ - K20
Job ID: A3040337
Posted 1 week ago
5.0 years
10 - 20 Lacs
Hyderābād
On-site
Greetings from Ashra Technologies!

We are hiring.
Role: Informatica PowerCenter
Experience: 5+ years
Locations: Hyderabad/Noida

- 5 to 7 years of experience with Informatica PowerCenter and IDMC on Data Warehousing or Data Integration projects
- Expert with data warehousing standards, strategies and tools
- Expert with SDLC processes
- Experience in Informatica PowerExchange for Mainframe, Salesforce and other new-age data sources
- Experience in Informatica Web Services, XML, Java transformations, etc.
- Strong knowledge of relational databases, preferably Oracle and SQL Server
- Good knowledge of UNIX/Linux shell scripting
- Strong SQL background
- Strong problem-solving, multi-tasking and organizational skills
- Good written and verbal communication skills
- Experience working with US or other international customers
- Demonstrated experience leading a team spread across multiple locations

Preferred Qualifications:
- AWS cloud knowledge: EC2, S3, AWS Glue ETL
- TWS/Tidal scheduler hands-on experience
- BFSI domain knowledge
- Good to have: experience in Informatica Cloud

Interested? Share your resume at akshitha@ashratech.com / 8688322632

Job Type: Full-time
Pay: ₹1,000,000.00 - ₹2,000,000.00 per year
Benefits: Cell phone reimbursement
Work Location: In person
Posted 1 week ago
3.0 - 7.0 years
4 - 8 Lacs
Hyderābād
On-site
- 3 to 7 years of experience with Informatica PowerCenter and IDMC on Data Warehousing or Data Integration projects
- Expert with data warehousing standards, strategies and tools
- Expert with SDLC processes
- Experience in Informatica PowerExchange for Mainframe, Salesforce and other new-age data sources
- Experience in Informatica Web Services, XML, Java transformations, etc.
- Strong knowledge of relational databases, preferably Oracle and SQL Server
- Good knowledge of UNIX/Linux shell scripting
- Strong SQL background
- Strong problem-solving, multi-tasking and organizational skills
- Good written and verbal communication skills
- Experience working with US or other international customers
- Demonstrated experience leading a team spread across multiple locations
Posted 1 week ago
5.0 years
25 Lacs
Hyderābād
On-site
Job Information
Date Opened: 07/23/2025
Job Type: Full time
Industry: Technology
Salary: 25 LPA
State/Province: Karnataka
Zip/Postal Code: 560001
City: Bangalore & Hyderabad
Country: India

About Us
At Innover, we endeavor to see our clients become connected, insight-driven businesses. Our integrated Digital Experiences, Data & Insights and Digital Operations studios help clients embrace digital transformation and drive unique, outstanding experiences that apply to the entire customer lifecycle. Our connected studios work in tandem to reimagine the convergence of innovation, technology, people, and business agility to deliver impressive returns on investments. We help organizations capitalize on current trends and game-changing technologies, molding them into future-ready enterprises. Take a look at how each of our studios represents deep pockets of expertise and delivers on the promise of data-driven, connected enterprises.

Job Description
Summary: We are seeking a highly motivated and self-driven Business Intelligence Engineer with 5+ years of experience in enterprise and/or hosted application development, with demonstrable, proven expertise in cloud application development, design, and implementation, and broad experience in various types of databases and in distributed, web-based reporting and analytics applications.

Technical Skills and Key Interests:

Minimum Qualifications
- Bachelor's in computer science or equivalent work experience
- Minimum 2 years of experience in the development of reports, views, summarizations, and dashboards in at least one reporting tool such as SAP BO, Tableau, Cognos, Power BI, or TIBCO Jaspersoft
- Experienced in Jaspersoft, or willing to learn TIBCO Jaspersoft report development (willing to grow from an elementary to an advanced level of expertise)
- Experienced in ETL development in one or more technologies such as Informatica, Hadoop, or Spark (overall combined ETL and report development experience should be 5 years or more)

Additional Technical Skills (Value-add):
- Experience in Java, the Spring framework, and object-oriented design methodologies would be a value-add
- Experience in using AI tools for problem solving and development
- Hands-on experience building cloud-native applications for one or more cloud providers, e.g., AWS or Google Cloud, would be a value-add

Responsibilities:
The area of development would be report development in TIBCO Jaspersoft and ETL (SQL, PySpark). If you have experience in other reporting tools, you should be willing to learn TIBCO Jaspersoft during the initial period of your tenure with the team. Development would involve one or both of these areas depending on business priorities. You will be responsible for designing and implementing product enhancements, designing product functions, troubleshooting and resolving product defects, and unit and integration testing. You will be responsible for enhancing and building features end to end for a reporting-based application, involving sourcing data from relational and non-relational databases, transforming it using complex SQL queries, and developing reports. Active interaction with internal customers, other developers, Quality Assurance, and Business System Analysts is an integral part of the role. Some of the key tasks you will perform include:
- Participate in project/work planning sessions to analyze and understand requirements, to the level of being able to contribute to their creation, in collaboration with capability/product and/or business owners
- Develop and integrate applications per specification and translate technical requirements into application code and modules
- Approach development work with a DevOps and continuous-integration mindset
- Ensure consistency with cloud architectural guiding principles for assigned projects
- Develop prototype or proof-of-concept implementations of projects where the technical solution is unknown or unproven
- Be proactive in raising problems, identifying solutions, and giving/receiving feedback
- Assist in identifying and correcting software performance bottlenecks
- Work in a highly collaborative and dynamic agile team environment with multiple levels of technology staff across various geographical locations
- Provide technical expertise and peer code reviews to other team members, and assist team leads and project managers in work breakdown and story planning

Other specialized knowledge and skills required:
- Must be able to work independently as well as collaboratively
- Comfortable working in an offshore-onshore model
- Proven strong analytical design and troubleshooting skills
- Highly accountable for meeting all commitments and deadlines
- Effective communication skills, both written and verbal, for technical and non-technical audiences
- Drive for continuous process improvement
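One way to picture the report-side transformation work this role describes is summarizing sourced rows into the aggregates a Jaspersoft dashboard would render. The region/revenue schema below is invented for illustration; in the real application the aggregation would typically live in a complex SQL query or PySpark job.

```python
# Toy report summarization: group sourced rows by region and total
# the revenue per region. Field names are illustrative only.
from collections import defaultdict

def summarize(rows):
    """Group sales rows by region and total the revenue per region."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["revenue"]
    return dict(totals)

rows = [
    {"region": "APAC", "revenue": 100.0},
    {"region": "EMEA", "revenue": 50.0},
    {"region": "APAC", "revenue": 25.0},
]
report = summarize(rows)
```

The same shape — source rows in, keyed aggregates out — is what a report engine binds to chart and table components.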
Posted 1 week ago
3.0 - 5.0 years
7 - 8 Lacs
Hyderābād
On-site
Required Qualifications:
- 3-5 years of production support experience in Informatica/Python/AWS technologies and applications
- Good understanding of, and technical knowledge of, Informatica architecture and client components such as Workflow Manager, Mapping Designer, Workflow Monitor and Repository Manager
- Excellent knowledge of AWS/Python concepts
- Informatica-to-cloud migration
- Hands-on expertise in debugging Informatica ETL mappings to narrow down issues
- Hands-on experience with ETL transformations such as Lookup, Joiner, Source Qualifier and Normalizer
- Hands-on experience dealing with various types of sources such as flat files, mainframes, XML files and databases
- Experience with the AWS environment, data pipelines, RDS, and reporting tools
- Hands-on experience in Unix scripting and file operations
- Strong knowledge of SQL/PL-SQL and Oracle databases; able to debug complex queries
- Good understanding of scheduling tools such as TWS/Tidal/others
- At least 2 years on ServiceNow for application incident management and problem management in a 24x7 model
- Strong communication skills, both written and verbal, with the ability to follow processes

Preferred Qualifications:
- Experience working with US clients and business partners
- Exposure to the BFSI domain is good to have
- Experience in mainframe technologies is a plus

Job Responsibilities:
- Provide production support for the Informatica/Python/AWS suite of applications in a 24x7 environment
- Good understanding of the Informatica/Python and AWS environments; able to handle batch recoveries and provide batch support
- Assess and recommend solutions for permanent fixes to improve application stability and resiliency
- Ability to handle production incident bridge calls for P1 and high-priority P2 incidents
- Strong analytical capability for independent ticket analysis and resolution
- Incident management, problem management and change management
- Ability to do root cause analysis, recap issues and problems in an email, and communicate to all stakeholders and partner teams
- Ability to drive production-issue bridges and work across teams to collate impacts and send leadership communications
- Propose solutions to perform complex troubleshooting
- Should be able to identify areas of improvement and re-engineering scope
- Help other team members resolve their technical issues
- Mentor interns and new joiners
- Monitor and report issues and work with the required team/vendor for quick resolution
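A first pass at the root-cause-analysis step above often buckets batch failures by error signature before digging into any single job. The sketch below is purely illustrative: the log line format, workflow names, and ORA error codes are invented, and a real triage would parse scheduler or Informatica session logs.

```python
# Bucket raw scheduler log lines by error code as an RCA first pass.
# Log format and error codes are invented for illustration.
import re
from collections import Counter

LOG = [
    "2025-07-23 01:02 wf_claims FAILED ORA-01555 snapshot too old",
    "2025-07-23 01:10 wf_members FAILED ORA-00001 unique constraint",
    "2025-07-23 02:05 wf_claims FAILED ORA-01555 snapshot too old",
]

def bucket_errors(lines):
    """Count occurrences of each error code following 'FAILED'."""
    counts = Counter()
    for line in lines:
        m = re.search(r"FAILED (\S+)", line)
        if m:
            counts[m.group(1)] += 1
    return counts

summary = bucket_errors(LOG)
```

A repeated signature (here, two occurrences of the same code) is what gets escalated from incident management into problem management for a permanent fix.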
Posted 1 week ago
4.0 years
0 Lacs
Gurgaon
On-site
A Day in Your Life at MKS:

Summary & Objectives
We are looking for an exceptional Senior Systems Analyst who can perform the development, implementation and usage of information technology and management information systems with a focus on application integrations: working in partnership with business relationship managers, super-users, end-users and the technical team to ensure full adoption, effective usage, and efficient deployment of our IT solutions; and effectively managing the change control process, gathering end-user requirements, and communicating IT priorities and delivery status to the business.

You Will Make an Impact By
Major Duties and Responsibilities:
- Collaborate with business partners to understand and document integration processes and solutions
- Work closely with the development team to implement and test required functionality
- Actively demonstrate a passion for continuous improvement focused on end-user productivity and enterprise process integration
- Work with various business groups in the organization to facilitate cross-functional implementation of new or improved business process requirements for all IT-related business, financial, and operations systems critical to core organizational functions
- Effectively manage the IT change control process, gathering end-user requirements, preparing functional specifications and communicating IT priorities and delivery status to the business
Skills You Bring: Bachelor's degree in computer science, Information Technology, Information Systems or other 4-year degree focusing on information technology 6+ years of systems analyst experience with a focus on application integrations required, with 2 years of Informatica Intelligent Data Management Cloud or SAP PI experience preferred Strong knowledge of SQL and DDL scripting Strong communication skills with experience drafting technical documents Be dissatisfied with the status quo with a thirst to introduce change Energetic team player with a can-do attitude Globally, our policy is to recruit individuals from wide and diverse backgrounds. However, certain positions require access to controlled goods and technologies subject to the International Traffic in Arms Regulations (ITAR) or Export Administration Regulations (EAR). Applicants for these positions may need to be “U.S. persons.” “U.S. persons” are generally defined as U.S. citizens, noncitizen nationals, lawful permanent residents (or, green card holders), individuals granted asylum, and individuals admitted as refugees. MKS Instruments, Inc. and its affiliates and subsidiaries (“MKS”) is an affirmative action and equal opportunity employer: diverse candidates are encouraged to apply. We win as a team and are committed to recruiting and hiring qualified applicants regardless of race, color, national origin, sex (including pregnancy and pregnancy-related conditions), religion, age, ancestry, physical or mental disability or handicap, marital status, membership in the uniformed services, veteran status, sexual orientation, gender identity or expression, genetic information, or any other category protected by applicable law. Hiring decisions are based on merit, qualifications and business needs. We conduct background checks and drug screens, in accordance with applicable law and company policies. MKS is generally only hiring candidates who reside in states where we are registered to do business. 
MKS is committed to working with and providing reasonable accommodations to qualified individuals with disabilities. If you need a reasonable accommodation during the application or interview process due to a disability, please contact us at: accommodationsatMKS@mksinst.com . If applying for a specific job, please include the requisition number (ex: RXXXX), the title and location of the role
Posted 1 week ago
0 years
4 - 7 Lacs
Noida
On-site
POSITION OVERVIEW: Technical Analyst POSITION GENERAL DUTIES AND TASKS: Informatica Technical Analyst We are seeking a highly skilled and experienced Informatica Lead to join our IT team. The ideal candidate will lead a team of ETL developers and oversee the design, development, and implementation of ETL solutions using Informatica PowerCenter and Cloud Data Integration. This role requires expertise in data integration, leadership skills, and the ability to work in a dynamic environment to deliver robust data solutions for business needs. Responsibilities: - Primary skillset: Informatica, data mapping, data modelling, scripting languages, SQL, TOAD, Oracle. - S2T mapping; documenting business, functional, and mapping specs; complex SQL; exposure to working in a hybrid/Agile methodology. - Attend requirements/BRD/brainstorming sessions to understand business requirements/changes. - Design technical documents such as STM (Source to Target Mapping), EMD (Extract Mapping Document), and TRD (Technical Requirement Document) to develop business requirements/changes. - Conduct technical-document hand-off sessions with developers and QAs. - Review the test plan and test scenarios created by the QA team and sign off. - Review test results created by the QA team and provide review comments on any additional or regression test cases to be covered. - Monitor production deployments and conduct smoke tests in production for recent deployments. - ETL Development and Maintenance: Lead the design, development, and maintenance of ETL workflows and mappings using Informatica PowerCenter and Cloud Data Integration. Ensure the reliability, scalability, and performance of ETL solutions to meet business requirements. Optimize ETL processes for data integration, transformation, and loading into data warehouses and other target systems. - Solution Architecture and Implementation: Collaborate with architects and business stakeholders to define ETL solutions and data integration strategies. 
Develop and implement best practices for ETL design and development. Ensure seamless integration with on-premises and cloud-based data platforms. - Data Governance and Quality: Establish and enforce data quality standards and validation processes. Implement data governance and compliance policies to ensure data integrity and security. Perform root cause analysis and resolve data issues proactively. - Team Leadership: Manage, mentor, and provide technical guidance to a team of ETL developers. Delegate tasks effectively and ensure timely delivery of projects and milestones. Conduct regular code reviews and performance evaluations for team members. - Automation and Optimization: Develop scripts and frameworks to automate repetitive ETL tasks. Implement performance tuning for ETL pipelines and database queries. Explore opportunities to improve efficiency and streamline workflows. - Collaboration and Stakeholder Engagement: Work closely with business analysts, data scientists, and application developers to understand data requirements and deliver solutions. Communicate project updates, challenges, and solutions to stakeholders effectively. Act as the primary point of contact for Informatica-related projects and initiatives. Required Skills: Experience in ETL development and data integration. Proven experience with Informatica PowerCenter, Informatica Cloud Data Integration, and large-scale ETL implementations. Experience in integrating data from various sources such as databases, flat files, and APIs. Preferred Skills: Strong expertise in Informatica PowerCenter, Informatica Cloud, and ETL frameworks. Proficiency in SQL, PL/SQL, and performance optimization techniques. Knowledge of cloud platforms like AWS, Azure, or Google Cloud. Familiarity with big data tools such as Hive, Spark, or Snowflake is a plus. Strong understanding of data modeling concepts and relational database systems.
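The source-to-target (STM) documents mentioned above map each target column to a source column plus a transformation rule. A minimal sketch of applying such a mapping, with entirely hypothetical table and column names:

```python
# Minimal sketch of applying a source-to-target (S2T) mapping.
# Column names and transformation rules are hypothetical illustrations,
# not taken from any real mapping specification.
S2T_MAPPING = {
    # target column: (source column, transformation)
    "customer_id": ("cust_no", str.strip),
    "full_name":   ("cust_name", str.title),
    "country":     ("ctry_cd", str.upper),
}

def transform_row(source_row: dict) -> dict:
    """Apply the mapping to one source record, producing a target record."""
    return {tgt: fn(source_row[src]) for tgt, (src, fn) in S2T_MAPPING.items()}

row = {"cust_no": " 1042 ", "cust_name": "jane doe", "ctry_cd": "in"}
print(transform_row(row))
# {'customer_id': '1042', 'full_name': 'Jane Doe', 'country': 'IN'}
```

In a real Informatica implementation the same mapping would live in a PowerCenter mapping or STM spreadsheet; the dictionary here just makes the source column / target column / rule structure concrete.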
Posted 1 week ago
5.0 - 10.0 years
10 - 15 Lacs
Noida, Chennai
Hybrid
Must Have Experience in ETL Testing Must Have Experience in DWH Testing Must Have Experience in INFORMATICA Must Have Experience in Banking Domain Strong SQL skills to perform database queries, data validations, and data integrity checks. Familiarity with relational databases and data management concepts. Working experience with cloud-based data warehouse platforms like Snowflake and Azure. Experience in creating and implementing ETL testing strategy Experience in data integrity, data accuracy and completeness testing
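The SQL-based data validations this posting asks for usually reduce to reconciliation queries between source and target. A hedged sketch using SQLite in place of a real warehouse, with invented table names and data:

```python
import sqlite3

# Sketch of two common ETL test checks: row-count reconciliation and a
# column-sum completeness check between a source and a target table.
# Table names and data are illustrative; a real test would run the same
# SQL against the actual source system and the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_txn (id INTEGER, amount REAL);
    CREATE TABLE tgt_txn (id INTEGER, amount REAL);
    INSERT INTO src_txn VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_txn VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def counts_match(c) -> bool:
    """Row-count reconciliation: did every source row land in the target?"""
    s = c.execute("SELECT COUNT(*) FROM src_txn").fetchone()[0]
    t = c.execute("SELECT COUNT(*) FROM tgt_txn").fetchone()[0]
    return s == t

def sums_match(c) -> bool:
    """Completeness check on a measure column, rounded to avoid float noise."""
    s = c.execute("SELECT ROUND(SUM(amount), 2) FROM src_txn").fetchone()[0]
    t = c.execute("SELECT ROUND(SUM(amount), 2) FROM tgt_txn").fetchone()[0]
    return s == t

print(counts_match(conn), sums_match(conn))  # True True
```

The same pattern extends to MINUS/EXCEPT queries for row-level differences and to per-column null and duplicate checks.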
Posted 1 week ago
3.0 - 7.0 years
5 - 9 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Description: KPI Partners is seeking an experienced Senior Snowflake Administrator to join our dynamic team. In this role, you will be responsible for managing and optimizing our Snowflake environment to ensure performance, reliability, and scalability. Your expertise will contribute to designing and implementing best practices to facilitate efficient data warehousing solutions. Key Responsibilities: - Administer and manage the Snowflake platform, ensuring optimal performance and security. - Monitor system performance, troubleshoot issues, and implement necessary solutions. - Collaborate with data architects and engineers to design data models and optimal ETL processes. - Conduct regular backups and recovery procedures to protect data integrity. - Implement user access controls and security measures to safeguard data. - Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs. - Participate in the planning and execution of data migration to Snowflake. - Provide support for data governance and compliance initiatives. - Stay updated with Snowflake features and best practices, and provide recommendations for continuous improvement. Qualifications: - Bachelor's degree in Computer Science, Information Technology, or a related field. - 5+ years of experience in database administration, with a strong focus on Snowflake. - Hands-on experience with SnowSQL, SQL, and data modeling. - Familiarity with data ingestion tools and ETL processes. - Strong problem-solving skills and the ability to work independently. - Excellent communication skills and the ability to collaborate with technical and non-technical stakeholders. - Relevant certifications in Snowflake or cloud data warehousing are a plus. If you are a proactive, detail-oriented professional with a passion for data and experience in Snowflake administration, we would love to hear from you. 
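The user access control responsibility above is typically handled in Snowflake with role-based grants. A sketch that only assembles the statements (role, warehouse, schema, and user names are hypothetical; the statements follow standard Snowflake GRANT syntax):

```python
# Sketch of role-based access control statements a Snowflake administrator
# might issue for a read-only reporting role. All object names here are
# invented for illustration; run against a real account via a Snowflake
# connector/session of your choice.
def rbac_statements(role: str, warehouse: str, schema: str, users: list[str]) -> list[str]:
    stmts = [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {schema} TO ROLE {role};",
    ]
    # Finally attach the role to each user.
    stmts += [f"GRANT ROLE {role} TO USER {u};" for u in users]
    return stmts

for stmt in rbac_statements("reporting_ro", "analytics_wh", "sales_db.public", ["jdoe"]):
    print(stmt)
```

Granting to roles rather than directly to users is the usual Snowflake practice, since membership changes then require only a single GRANT ROLE or REVOKE ROLE.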
Join KPI Partners and be part of a team that is dedicated to delivering exceptional data solutions for our clients.
Posted 1 week ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: ETL Developer Location: Hyderabad (5 days WFO) Experience Required: 4+ years as an ETL Developer We are looking for a talented Talend Developer with hands-on experience in Talend Management Console on Cloud and Snowflake to join our growing team. The ideal candidate will play a key role in building and optimizing ETL/ELT data pipelines, integrating complex data systems, and ensuring high performance across cloud environments. While experience with Informatica is a plus, it is not mandatory for this role. As a Talend Developer, you will be responsible for designing, developing, and maintaining data integration solutions to meet the organization’s growing data needs. You will collaborate with business stakeholders, data architects, and other data professionals to ensure the seamless and secure movement of data across platforms, ensuring scalability and performance. Key Responsibilities: Develop and maintain ETL/ELT data pipelines using Talend Management Console on Cloud to integrate data from various on-premises and cloud-based sources. Design, implement, and optimize data flows for data ingestion, processing, and transformation in Snowflake to support analytical and reporting needs. Utilize Talend Management Console on Cloud to manage, deploy, and monitor data integration jobs, ensuring robust pipeline management and process automation. Collaborate with data architects to ensure that the data integration solutions align with business requirements and follow best practices. Ensure data quality, performance, and scalability of Talend-based data solutions. Troubleshoot, debug, and optimize existing ETL processes to ensure smooth and efficient data integration. Document data integration processes, including design specifications, mappings, workflows, and performance optimizations. Collaborate with the Snowflake team to implement best practices for data warehousing and data transformation. 
Implement error-handling and data validation processes to ensure high levels of accuracy and data integrity. Provide ongoing support for Talend jobs, including post-deployment monitoring, troubleshooting, and optimization. Participate in code reviews and collaborate in an agile development environment. Required Qualifications: 2+ years of experience in Talend development, with a focus on using the Talend Management Console on Cloud for managing and deploying jobs. Strong hands-on experience with Snowflake data warehouse, including data integration and transformation. Expertise in developing ETL/ELT workflows for data ingestion, processing, and transformation. Experience with SQL and working with relational databases to extract and manipulate data. Experience working in cloud environments (e.g., AWS, Azure, or GCP) with integration of cloud-based data platforms. Strong knowledge of data integration, data quality, and performance optimization in Talend. Ability to troubleshoot and resolve issues in data integration jobs and processes. Solid understanding of data modeling concepts and best practices for building scalable data pipelines. Preferred Qualifications: Experience with Informatica is a plus but not mandatory. Experience with scripting languages such as Python or Shell scripting for automation. Familiarity with CI/CD pipelines and working in DevOps environments for continuous integration of Talend jobs. Knowledge of data governance and data security practices in cloud environments.
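The error-handling and data-validation responsibility above usually takes the form of row-level rules with a reject flow (in Talend, typically a tMap with reject links). A minimal sketch, with invented field names and rules:

```python
# Sketch of the row-level validation / reject-routing pattern described in
# the posting. Field names and rules are illustrative assumptions; in a
# Talend job this logic would live in component expressions and reject flows.
RULES = {
    "order_id": lambda v: isinstance(v, int) and v > 0,
    "amount":   lambda v: isinstance(v, (int, float)) and v >= 0,
}

def split_valid_rejects(rows):
    """Route each row to the valid stream or the reject stream with reasons."""
    valid, rejects = [], []
    for row in rows:
        failed = [field for field, ok in RULES.items() if not ok(row.get(field))]
        if failed:
            rejects.append((row, failed))  # keep the failed-rule names for auditing
        else:
            valid.append(row)
    return valid, rejects

rows = [{"order_id": 1, "amount": 9.5}, {"order_id": -3, "amount": 2.0}]
good, bad = split_valid_rejects(rows)
print(len(good), len(bad))  # 1 1
```

Persisting the reject stream (rather than silently dropping bad rows) is what makes post-deployment monitoring and accuracy reporting possible.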
Posted 1 week ago
6.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
We are seeking a talented individual to join our Metrics, Analytics & Reporting team at Marsh. This role will be based in Mumbai. This is a hybrid role that has a requirement of working at least three days a week in the office. Senior Manager - Metrics, Analytics & Reporting (Scrum Master) We will count on you to: Promote Agile principles and practices across teams, ensure Agile/Scrum concepts and principles are adhered to, and, where necessary, coach the teams in implementing and practicing Agile principles. Act as a bridge between development teams and stakeholders. Foster a culture of trust, collaboration, and accountability. Organize and facilitate Scrum ceremonies for Scrum teams. Track Scrum metrics, including team velocity and sprint/release progress, and communicate these internally and externally, improving transparency. Help and coach the product owner to establish and enforce sprint priorities and release delivery deadlines. Ensure business objectives are understood and achieved as per sprint commitments. Identify and remove obstacles to team progress. Prevent distractions that interfere with the team's ability to deliver sprint goals through mediation, arbitration, and mitigation, addressing impediments with team members and the organizational hierarchy. Enable self-organizing, cross-functional teams. Ensure the Definition of Ready (DoR) is met for all prioritized requirements, and reinforce the Definition of Done (DoD) and its importance. Drive a collaborative and supportive team culture through team building and engagement practices. Drive continuous improvement through team retrospectives and facilitating process enhancements. Identify and resolve conflicts, promote constructive dialogue, and encourage innovation. Work closely with other Scrum Masters to align cross-team dependencies and best practices. What you need to have: 6+ years of experience as a Scrum Master in a distributed Agile team with CSM or equivalent certification. 
Solid understanding of Agile frameworks (Scrum, Kanban, SAFe, etc.). Proficiency in Jira/Confluence and Azure DevOps, and familiarity with different Agile practices such as Kanban/Lean. Proven track record of being a servant leader in a Scrum team, driving teams, removing blockers, and improving processes through retrospectives. Strong facilitation, conflict resolution, and mentoring skills. Ability to assist technical team members and senior non-technical product owners in making appropriate decisions (stakeholder management). Comfortable with responsibility for delivering results and resilient enough to handle pressure in balancing time, quality, and scope. Proven ability to coach and mentor others, a positive approach to complex problems, and a can-do attitude. Assertive and fact-based communicator, able to explain technical issues to a business audience and vice versa. Experience as a self-starter in a rapidly evolving and ambiguous environment, continuously learning and problem-solving quickly. Ability to identify and articulate risks and constructively challenge assumptions. Strong team player with influencing and negotiation skills in a virtual/remote environment, working with customers/developers across the globe. Excellent communication and interpersonal skills. Experience working with distributed or hybrid teams. What makes you stand out? Understanding of the Data Quality domain and experience in delivering KPI dashboards Track record of successful Agile transformations or scaling initiatives Strong analytical mindset with a data-driven approach to problem-solving. Exposure to solutions such as SQL, QlikView, Qlik Sense, Informatica DQ, Power BI Strong insurance and/or insurance broking business domain knowledge SAFe 6 certification would be a big plus. Why join our team: We help you be your best through professional development opportunities, interesting work and supportive leaders. 
We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients and communities. Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being. Marsh, a business of Marsh McLennan (NYSE: MMC), is the world’s top insurance broker and risk advisor. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit marsh.com, or follow on LinkedIn and X. Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law. Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one “anchor day” per week on which their full team will be together in person.
Posted 2 weeks ago
7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job Information Date Opened 07/23/2025 Job Type Permanent RSD NO 10371 Industry IT Services Min Experience 15+ Max Experience 15+ City Chennai State/Province Tamil Nadu Country India Zip/Postal Code 600018 Job Description Job Summary: We are seeking a Data Architect to design and implement scalable, secure, and efficient data solutions that support Convey Health Solutions' business objectives. This role will focus on data modeling, cloud data platforms, ETL processes, and analytics solutions, ensuring compliance with healthcare regulations (HIPAA, CMS guidelines). The ideal candidate will collaborate with data engineers, BI analysts, and business stakeholders to drive data-driven decision-making. Key Responsibilities: Enterprise Data Architecture: Design and maintain the overall data architecture to support Convey Health Solutions’ data-driven initiatives. Cloud & Data Warehousing: Architect cloud-based data solutions (AWS, Azure, Snowflake, BigQuery) to optimize scalability, security, and performance. Data Modeling: Develop logical and physical data models for structured and unstructured data, supporting analytics, reporting, and operational processes. ETL & Data Integration: Define strategies for data ingestion, transformation, and integration, leveraging ETL tools such as Informatica, Talend, dbt, or Apache Airflow. Data Governance & Compliance: Ensure data quality, security, and compliance with HIPAA, CMS, and SOC 2 standards. Performance Optimization: Optimize database performance, indexing strategies, and query performance for real-time analytics. Collaboration: Partner with data engineers, software developers, and business teams to align data architecture with business objectives. Technology Innovation: Stay up to date with emerging data technologies, AI/ML applications, and industry trends in healthcare data analytics. 
Required Qualifications: Education: Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field. Experience: 7+ years of experience in data architecture, data engineering, or related roles. Technical Skills: Strong expertise in SQL, NoSQL, and data modeling techniques Hands-on experience with cloud data platforms (AWS Redshift, Snowflake, Google BigQuery, Azure Synapse) Experience with ETL frameworks (Informatica, Talend, dbt, Apache Airflow, etc.) Knowledge of big data technologies (Spark, Hadoop, Databricks) Strong understanding of data security and compliance (HIPAA, CMS, SOC 2, GDPR) Soft Skills: Strong analytical, problem-solving, and communication skills. Ability to work in a collaborative, agile environment. Preferred Qualifications: Experience in healthcare data management, claims processing, risk adjustment, or pharmacy benefit management (PBM). Familiarity with AI/ML applications in healthcare analytics. Certifications in cloud data platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.). At Indium, diversity, equity, and inclusion (DEI) are the cornerstones of our values. We champion DEI through a dedicated council, expert sessions, and tailored training programs, ensuring an inclusive workplace for all. Our initiatives, including the WE@IN women empowerment program and our DEI calendar, foster a culture of respect and belonging. Recognized with the Human Capital Award, we are committed to creating an environment where every individual thrives. Join us in building a workplace that values diversity and drives innovation.
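The data modeling skills listed above typically center on dimensional (star-schema) design. A hedged sketch using SQLite, with one fact and one dimension table whose names and columns are invented for illustration (loosely themed on healthcare claims, not taken from the posting):

```python
import sqlite3

# Minimal star schema: a claims fact table joined to a member dimension.
# All table/column names and figures are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_member (member_key INTEGER PRIMARY KEY, plan_type TEXT);
    CREATE TABLE fact_claim (
        claim_id    INTEGER PRIMARY KEY,
        member_key  INTEGER REFERENCES dim_member(member_key),
        paid_amount REAL
    );
    INSERT INTO dim_member VALUES (1, 'HMO'), (2, 'PPO');
    INSERT INTO fact_claim VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# Typical analytic query: aggregate the fact measure by a dimension attribute.
totals = conn.execute("""
    SELECT d.plan_type, SUM(f.paid_amount)
    FROM fact_claim f JOIN dim_member d USING (member_key)
    GROUP BY d.plan_type ORDER BY d.plan_type
""").fetchall()
print(totals)  # [('HMO', 200.0), ('PPO', 50.0)]
```

Keeping measures in the fact table and descriptive attributes in dimensions is what lets warehouse queries stay as simple join-and-aggregate statements like the one above.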
Posted 2 weeks ago
10.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Location: Bangalore - Karnataka, India - EOIZ Industrial Area Job Family: Artificial Intelligence & Machine Learning Worker Type Reference: Regular - Permanent Pay Rate Type: Salary Career Level: T4(A) Job ID: R-46721-2025 Description & Requirements Introduction: A Career at HARMAN Technology Services (HTS) We’re a global, multi-disciplinary team that’s putting the innovative power of technology to work and transforming tomorrow. At HARMAN HTS, you solve challenges by creating innovative solutions. Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity’s needs Work at the convergence of cross channel UX, cloud, insightful data, IoT and mobility Empower companies to create new digital business models, enter new markets, and improve customer experiences Role: Data Architect with Microsoft Azure + Fabric + Purview Skill Experience Required: 10+ Years Key Responsibilities of the role include: Data Engineer Develop and implement data engineering projects, including data lakehouse or big data platforms Knowledge of Azure Purview is a must Knowledge of Microsoft Fabric Ability to define reference data architecture Cloud-native data platform experience in the Microsoft data stack, including Azure Data Factory and Databricks on Azure Knowledge of the latest data trends, including data fabric and data mesh Robust knowledge of ETL, data transformation, and data standardization approaches Key contributor to the growth of the COE, influencing client revenues through data and analytics solutions Lead the selection, deployment, and management of data tools, platforms, and infrastructure. Ability to technically guide a team of data engineers Oversee the design, development, and deployment of data solutions Define, differentiate & strategize new data services/offerings and create reference architecture assets Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc. 
Guide and inspire the organization about the business potential and opportunities around Data Network with domain experts Collaborate with client teams to understand their business challenges and needs. Develop and propose Data solutions tailored to client specific requirements. Influence client revenues through innovative solutions and thought leadership. Lead client engagements from project initiation to deployment. Build and maintain strong relationships with key clients and stakeholders. Build re-usable Methodologies, Pipelines & Models Create data pipelines for more efficient and repeatable data science projects Design and implement data architecture solutions that support business requirements and meet organizational needs Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems Develop and implement data governance policies and procedures to ensure that data is managed securely and efficiently Develop and maintain a deep understanding of data platforms, technologies, and tools, and evaluate new technologies and solutions to improve data management processes Ensure compliance with regulatory and industry standards for data management and security. Develop and maintain data models, data warehouses, data lakes and data marts to support data analysis and reporting. Ensure data quality, accuracy, and consistency across all data sources. Knowledge of ETL and data integration tools such as Informatica, Qlik Talend, and Apache NiFi. Experience with data modeling and design tools such as ERwin, PowerDesigner, or ER/Studio Knowledge of data governance, data quality, and data security best practices Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform. Familiarity with programming languages such as Python, Java, or Scala. 
Experience with data visualization tools such as Tableau, Power BI, or QlikView. Understanding of analytics and machine learning concepts and tools. Knowledge of project management methodologies and tools to manage and deliver complex data projects. Skilled in using relational database technologies such as MySQL, PostgreSQL, and Oracle, as well as NoSQL databases such as MongoDB and Cassandra. Strong expertise in cloud-based data services such as Azure Data Lake, Synapse, and Azure Data Factory, as well as AWS Glue, AWS Redshift, and Azure SQL. Knowledge of big data technologies such as Hadoop, Spark, Snowflake, Databricks, and Kafka to process and analyze large volumes of data. Proficient in data integration techniques to combine data from various sources into a centralized location. Strong data modeling, data warehousing, and data integration skills. People & Interpersonal Skills Build and manage a high-performing team of data engineers and other specialists. Foster a culture of innovation and collaboration within the data team and across the organization. Demonstrate the ability to work in diverse, cross-functional teams in a dynamic business environment. Candidates should be confident, energetic self-starters, with strong communication skills. Candidates should exhibit superior presentation skills and the ability to present compelling solutions which guide and inspire. Provide technical guidance and mentorship to the data team Collaborate with other stakeholders across the company to align the vision and goals Communicate and present the data capabilities and achievements to clients and partners Stay updated on the latest trends and developments in the data domain What is required for the role? 
10+ years of experience in the information technology industry with a strong focus on data engineering and architecture, preferably as an Azure data engineering lead 8+ years of data engineering or data architecture experience in successfully launching, planning, and executing advanced data projects. Data governance experience is mandatory MS Fabric certified Experience in working on RFPs/proposals, presales activities, business development, and overseeing delivery of data projects is highly desired Educational Qualification: A master’s or bachelor’s degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics. Candidates should have demonstrated the ability to manage data projects and diverse teams. Should have experience in creating data and analytics solutions. Experience building data solutions in any one or more domains – Industrial, Healthcare, Retail, Communication Problem-solving, communication, and collaboration skills. Good knowledge of data visualization and reporting tools Ability to normalize and standardize data as per key KPIs and metrics Benefits: Opportunities for professional growth and development. Collaborative and supportive work environment. What We Offer Access to employee discounts on world class HARMAN/Samsung products (JBL, Harman Kardon, AKG etc.) Professional development opportunities through HARMAN University’s business and leadership academies. An inclusive and diverse work environment that fosters and encourages professional and personal development. You Belong Here HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you – all within a support-minded culture that celebrates what makes each of us unique. 
We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want. About HARMAN: Where Innovation Unleashes Next-Level Technology Ever since the 1920s, we’ve been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today’s most sought-after performers, while our digital transformation solutions serve humanity by addressing the world’s ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners and each other. If you’re ready to innovate and do work that makes a lasting impact, join our talent community today! Important Notice: Recruitment Scams Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, personal financial information or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to us through: harmancareers@harman.com. HARMAN is proud to be an Equal Opportunity employer. HARMAN strives to hire the best qualified candidates and is committed to building a workforce representative of the diverse marketplaces and communities of our global colleagues and customers. 
All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. HARMAN attracts, hires, and develops employees based on merit, qualifications and job-related performance. (www.harman.com)
Posted 2 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Educational: Bachelor of Engineering, BSc, BCA, MCA, MTech, MSc Service Line: Infosys Quality Engineering Responsibilities: A day in the life of an Infoscion As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Knowledge of design principles and fundamentals of architecture Understanding of performance engineering Knowledge of quality processes and estimation techniques Basic understanding of project domain Ability to translate functional/nonfunctional requirements to systems requirements Ability to design and code complex programs Ability to write test cases and scenarios based on the specifications Good understanding of SDLC and agile methodologies Awareness of latest technologies and trends Logical thinking and problem-solving skills along with an ability to collaborate Technical and Professional: Primary skills: Technology-ETL & Data Quality-ETL - Others Preferred Skills: Technology-ETL & Data Quality-ETL - Others
Posted 2 weeks ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description Job Title : Salesforce Architect – Data Cloud & Marketing Cloud Hiring Locations : Bangalore, Pune, Trivandrum, Kochi, Hyderabad, Chennai Experience Range Total IT Experience: 8+ years Salesforce Marketing Cloud Experience: Minimum 5 years (hands-on) Salesforce Data Cloud (CDP) Experience: Minimum 2 years Leadership Experience: Experience in leading cross-functional teams and mentoring junior architects Must Have Skills Platform Expertise Strong hands-on experience with Salesforce Data Cloud (formerly CDP): Data unification, identity resolution, calculated insights, segmentation, data streams, harmonization rules Deep hands-on expertise in Salesforce Marketing Cloud: Journey Builder, Email Studio, Mobile Studio, Automation Studio, Contact Builder Development using AMPscript, SSJS, SQL, HTML/CSS, JavaScript Integration experience using REST/SOAP APIs Data model design and audience segmentation for large-scale, multi-channel campaigns Design of real-time and batch-based data ingestion and activation flows Proven ability to translate complex business requirements into scalable Salesforce architecture Strong experience integrating Salesforce Marketing Cloud with Sales Cloud, Service Cloud, and third-party platforms Experience in delivering projects in Agile environments, including sprint planning and estimation Experience with ETL tools like MuleSoft, Informatica, or Talend Ability to create architecture diagrams, reusable frameworks, and technical documentation Awareness of data privacy laws (e.g., GDPR, CAN-SPAM) and compliance standards Good To Have Skills Experience with: Marketing Cloud Personalization (Interaction Studio) Datorama, Pardot, Social Studio AWS / GCP for data storage or event processing Familiarity with: Salesforce Administrator and Platform Developer I capabilities Salesforce Marketing Cloud Personalization Experience developing POCs and custom demos for client presentations Experience working with enterprise architecture frameworks 
Exposure to data governance, security models, and compliance audits Certifications Required: Salesforce Marketing Cloud Consultant Salesforce Marketing Cloud Developer Salesforce Data Cloud Consultant Nice To Have Salesforce Administrator (ADM201) Platform Developer I Marketing Cloud Personalization Specialist Key Responsibilities Architect and implement unified customer data strategies using Data Cloud Lead technical discussions and requirement-gathering with business and technical teams Design scalable multi-channel SFMC solutions for campaign execution Manage integrations with Salesforce core clouds and external systems Mentor developers, review code/designs, and ensure delivery Create documentation, standards, and best practices Ensure governance, compliance, and high delivery quality across engagements Skills: Salesforce, AMPscript, JavaScript
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
We are looking for a skilled ETL Tester with hands-on experience in SQL and Python to join our Quality Engineering team. The ideal candidate will be responsible for validating data pipelines, ensuring data quality, and supporting the end-to-end ETL testing lifecycle in a fast-paced environment. Design, develop, and execute test cases for ETL workflows and data pipelines. Perform data validation and reconciliation using advanced SQL queries. Use Python for automation of test scripts, data comparison, and validation tasks. Work closely with Data Engineers and Business Analysts to understand data transformations and business logic. Perform root cause analysis of data discrepancies and report defects in a timely manner. Validate data across source systems, staging, and target data stores (e.g., Data Lakes, Data Warehouses). Participate in Agile ceremonies, including sprint planning and daily stand-ups. Maintain test documentation including test plans, test cases, and test results. Required qualifications to be successful in this role: 5+ years of experience in ETL/Data Warehouse testing. Strong proficiency in SQL (joins, aggregations, window functions, etc.). Experience in Python scripting for test automation and data validation. Hands-on experience with tools like Informatica, Talend, Apache NiFi, or similar ETL tools. Understanding of data models, data marts, and star/snowflake schemas. Familiarity with test management and bug tracking tools (e.g., JIRA, HP ALM). Strong analytical, debugging, and problem-solving skills. Good to Have: Exposure to Big Data technologies (e.g., Hadoop, Hive, Spark). Experience with Cloud platforms (e.g., AWS, Azure, GCP) and related data services. Knowledge of CI/CD tools and automated data testing frameworks. Experience working in Agile/Scrum teams. Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. 
Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
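The reconciliation duty described in this posting, validating that records in a source system actually landed in the target store, can be sketched with a few lines of SQL driven from Python. This is a minimal sketch, not CGI's actual tooling: the table and column names are invented, and sqlite3 stands in for the real staging and target databases.

```python
import sqlite3

def reconcile(conn, source_table, target_table, key_column):
    """Return key values present in the source table but missing from the target."""
    cur = conn.execute(
        f"SELECT s.{key_column} FROM {source_table} s "
        f"LEFT JOIN {target_table} t ON s.{key_column} = t.{key_column} "
        f"WHERE t.{key_column} IS NULL"
    )
    return [row[0] for row in cur.fetchall()]

# Hypothetical staging data: one record failed to load into the target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
conn.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.0), (3, 30.0)])

missing = reconcile(conn, "src", "tgt", "id")
print(missing)  # ids present in the source but absent from the target
```

An anti-join like this is a common first check in ETL testing because it localizes the defect to specific keys, which can then be traced through the pipeline for root-cause analysis.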
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
The team responsible for Business Analytics at Seagate is seeking a skilled individual to join as a SAP BODS Developer. In this role, you will be involved in SAP BODS Development projects, including requirements analysis, solution conception, implementation, and development. Working closely with various cross-functional teams, you will be tasked with developing solutions related to BODS, architecting, developing, and maintaining BODS jobs, as well as designing complex data flows and workflows. Your responsibilities will also include ensuring timely delivery, adherence to scope, and industry best practices. To excel in this role, you should possess expertise in SAP BODS, excellent verbal and written communication skills, and strong analytical capabilities. Familiarity with offshore/onsite work models, the ability to articulate complex problems and solutions clearly, and experience collaborating virtually with professionals from diverse backgrounds are crucial. Problem-solving skills and a collaborative team player mindset are essential traits for success. Ideal candidates will have hands-on experience in SAP BODS tool implementation, designing and developing ETL data flows and jobs, and executing data migration strategies involving SAP BW4HANA and Enterprise HANA. Proficiency in various Data Services transformations, such as Map Operation, Table Comparison, Row-Generation, and SQL transformations, is required. Additionally, knowledge of integrating non-SAP/Cloud systems with SAP BW4HANA using Data Services, SQL/PLSQL, and SAP BW is highly beneficial. Familiarity with BODS administration, SAP BW, and exposure to SDI/SDA/Informatica will be advantageous. The position is based in Pune, India, offering a dynamic work environment with innovative projects and various on-site amenities for personal and professional development. 
Employees can enjoy meals from multiple cafeterias, participate in recreational activities like walkathons and sports competitions, and engage in technical learning opportunities through the Technical Speaker Series. Cultural festivals, celebrations, and community volunteer activities further enrich the vibrant workplace culture. Join our team in Pune and contribute to our cutting-edge work in Business Analytics by leveraging your SAP BODS expertise and collaborative skills effectively.
Posted 2 weeks ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Knowledge, Skills, And Abilities Ability to translate a logical data model into a relational or non-relational solution Expert in one or more of the following ETL tools: SSIS, Azure Data Factory, AWS Glue, Matillion, Talend, Informatica, Fivetran Hands-on experience in setting up end-to-end cloud-based data lakes Hands-on experience in database development using views, SQL scripts and transformations Ability to translate complex business problems into data-driven solutions Working knowledge of reporting tools like Power BI, Tableau etc. Ability to identify data quality issues that could affect business outcomes Flexibility in working across different database technologies and propensity to learn new platforms on-the-fly Strong interpersonal skills Team player prepared to lead or support depending on situation
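The data-quality skill listed above, spotting issues such as duplicate business keys before they affect business outcomes, is commonly exercised with a window-function query. A minimal sketch under stated assumptions: the customers table and its columns are invented, and sqlite3 stands in for whatever warehouse the role actually uses.

```python
import sqlite3

# Hypothetical customer table containing a duplicated business key; a window
# function numbers each copy so every row beyond the first can be flagged.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customer_id TEXT, loaded_at TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [("C001", "2024-01-01"), ("C002", "2024-01-01"), ("C001", "2024-01-02")],
)

dupes = conn.execute(
    """
    SELECT customer_id, loaded_at FROM (
        SELECT customer_id, loaded_at,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY loaded_at
               ) AS rn
        FROM customers
    ) WHERE rn > 1
    """
).fetchall()
print(dupes)  # the later duplicate copy of C001
```

Partitioning by the business key and ordering by load time keeps the earliest record and surfaces only the later duplicates, which is usually what a cleanup or quarantine step wants.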
Posted 2 weeks ago
15.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Project Role : Technology OpS Support Practitioner Project Role Description : Own the integrity and governance of systems, including best practices for delivering services. Develop, deploy and support infrastructures, applications and technology initiatives from an architectural and operational perspective in conjunction with existing standards and methods of delivery. Must have skills : Informatica PowerCenter, Data warehouse implementation, TDM licensing package support Good to have skills : NA Minimum 15 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Technology OpS Support Practitioner, you will be responsible for ensuring the integrity and governance of systems while adhering to best practices for service delivery. Your typical day will involve developing, deploying, and supporting infrastructures, applications, and technology initiatives, all while aligning with existing standards and methods of delivery. You will collaborate with various teams to ensure operational excellence and contribute to the overall success of technology initiatives within the organization. Roles & Responsibilities: - Expected to be a Subject Matter Expert with deep knowledge and experience. - Should have influencing and advisory skills. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Facilitate training sessions and workshops to enhance team capabilities. - Monitor and evaluate the effectiveness of implemented solutions and make necessary adjustments. Professional & Technical Skills: - Must Have Skills: Proficiency in Informatica PowerCenter. - Strong understanding of data integration and ETL processes. - Experience with data warehousing concepts and methodologies. - Familiarity with database management systems and SQL. - Ability to troubleshoot and resolve technical issues efficiently. 
Additional Information: - The candidate should have minimum 15 years of experience in Informatica PowerCenter. - This position is based at our Mumbai office. - A 15 years full time education is required.
Posted 2 weeks ago
10.0 - 15.0 years
20 - 35 Lacs
Noida, Bengaluru
Work from Office
Description: We are looking for a Python Developer with working knowledge of ETL workflows. Experience in data extraction using APIs and writing queries in PostgreSQL is mandatory. Requirements: Need a Python developer with good experience in Python programming and problem solving. Should be good in data structures and their implementation. Should be good in databases, i.e. relational databases and SQL. Should be proficient in requirements analysis and implementation. Should have a degree in Computer Science. Should have good communication, prioritization, and organization skills. Should be keen on learning and upskilling. Job Responsibilities: Need a Python developer with good experience in Python programming and problem solving. Should be good in data structures and their implementation. Should be good in databases, i.e. relational databases and SQL. Should be proficient in requirements analysis and implementation. Should have a degree in Computer Science. Should have good communication, prioritization, and organization skills. Should be keen on learning and upskilling. What We Offer: Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them. Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centers or client facilities! Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toast Master), stress management programs, professional certifications, and technical and soft skill trainings. 
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can drink coffee or tea with your colleagues over a game of table tennis, and we offer discounts for popular stores and restaurants!
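The core of the role above is a Python ETL flow: extract records from an API, transform them, and load them with SQL into PostgreSQL. A minimal sketch under stated assumptions — the payload, schema, and table name are invented for illustration, the API response is a canned string rather than a live HTTP call, and sqlite3 stands in for PostgreSQL so the example is self-contained.

```python
import json
import sqlite3

def extract(payload):
    """Parse an API response body (here a canned JSON string) into records."""
    return json.loads(payload)

def transform(records):
    """Drop rows missing an id and shape the rest for the target table."""
    return [(r["id"], r["name"].strip()) for r in records if r.get("id") is not None]

def load(conn, rows):
    """Create the target table if needed, insert the rows, return the row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO items VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]

# Simulated API response; a real pipeline would fetch this over HTTP
# and connect to PostgreSQL instead of an in-memory SQLite database.
payload = '[{"id": 1, "name": " widget "}, {"id": null, "name": "bad"}, {"id": 2, "name": "gadget"}]'
conn = sqlite3.connect(":memory:")
count = load(conn, transform(extract(payload)))
print(count)  # rows loaded after filtering the invalid record
```

Keeping extract, transform, and load as separate functions is the usual structure for such pipelines: each stage can be unit-tested on its own, which matches the requirements-analysis and problem-solving emphasis in the posting.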
Posted 2 weeks ago