Jobs
Interviews

6 Informatica ETL Jobs

Set up a Job Alert
JobPe aggregates results for easy access, but applications are submitted directly on the original job portal.

4.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

NTT DATA is looking for an Oracle Business Intelligence (BI) Publisher, XML Developer to join their team in Chennai, Tamil Nadu, India. As a Developer, you will be responsible for the hands-on development and support of OBIEE reports using Oracle BI Publisher along with XML. The ideal candidate should have at least 4 years of experience in this field. Hands-on experience with Informatica ETL data integration and data movement design, PL/SQL, and an understanding of relational databases such as Oracle Exadata 12c would be beneficial. Experience with cloud-native data warehousing and data lake solutions, Oracle Analytics Server report development, and SnapLogic workflow development is also preferred.

Candidates are expected to have 4 to 9 years of experience in the key skills mentioned above. Good communication skills are a must, and the candidate should be willing to work a 10:30 AM to 8:30 PM shift. Flexibility to work at client locations in Chennai or Bangalore is required, and the candidate must be open to a hybrid office environment. Before submitting profiles, candidates must ensure they have genuine, digitally signed Form 16 documents for all employments, with complete employment history present in UAN/PPF statements. Candidates should have real work experience in the mandatory skills mentioned in the job description and a notice period of 0 to 3 weeks.

NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. They are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, NTT DATA has diverse experts in more than 50 countries and a robust partner ecosystem. Their services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a leading provider of digital and AI infrastructure and part of the NTT Group, which invests significantly in R&D to support organizations and society in moving confidently into the digital future.

Posted 2 days ago

Apply

10.0 - 14.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The Senior Commission System Engineer is a specialized technical expert within the Global IT team, responsible for the end-to-end technical design of the Commission System, specifically SAP SuccessFactors Incentive Management (SSFIM) on HANA. This role holds accountability for all technical decisions throughout the project lifecycle, ensuring that the solution architecture aligns with key principles of performance, scalability, maintainability, reliability, security, and compliance with enterprise IT quality standards. Acting as a technical leader, you will collaborate closely with Business Analysts, who define and deliver functional specifications, as well as with QA teams and Project Managers. This is a hands-on, high-impact role requiring deep technical expertise and leadership. Your exceptional knowledge of software development practices will ensure the delivery of robust, scalable, and high-performing IT solutions.

As the Senior Commission System Engineer, you will understand business needs and scenarios to propose and deliver the most fitting solutions. You will provide support during functional requirements definition to ensure that functionality is technically achievable and feasible within project constraints. Your responsibilities will include performing high-level technical impact assessments and assisting in detailed analysis, requirements capture, and project planning. You will configure, design, build, test, and deploy solutions to optimize performance and scalability, ensuring seamless integration with upstream and downstream systems. Leading the solution and architectural design and implementation of SSFIM to meet business requirements will be a key aspect of your role. In addition, you will support the ongoing maintenance, rule creation, calculations, workflow management, and data administration of SSFIM.

As part of a project team, you will define and document the application architecture for software development projects and maintenance activities. Your role will involve leading the technical aspects of systems development for projects and solution support, including developing project plans, test plans, and deployment plans. You will also be responsible for defining and organizing development tasks, providing accurate task estimations, mentoring software developers, reviewing code, and identifying technical project risks and issues.

About You:
- 10+ years of SAP Commissions (Callidus) specialist experience
- Technology-related bachelor's degree or equivalent work experience
- At least one end-to-end implementation of SAP Commissions with Oracle/HANA as the backend database
- At least 10 years of experience in AMS activities, especially ticket handling
- Expertise in configuring compensation plans, loading data via the Commissions UI, setting up data integrations, and writing advanced SQL and PL/SQL queries

About Us: Study Group is a leading international education provider dedicated to helping students around the world reach their full potential. With university partnerships and a variety of study programmes, we provide the resources and guidance for student success. Our global network ensures the best educational services, and our Insendi platform delivers innovative digital learning experiences.

Organisational Compliance: Study Group maintains high standards in safeguarding and conducts checks such as a Criminal Background Check, references, ID and Right to Work checks, and Education Verification. We are committed to equal opportunities and to creating a diverse and inclusive workplace based on skills and experience alone. Our rolling recruitment process reviews applications as they are submitted, so apply as soon as possible to be considered.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data and Solution Architect at our company, you will play a crucial role in requirements definition, analysis, and the design of logical and physical data models, including Dimensional, NoSQL, and Graph data models. You will lead data discovery discussions with the Business in Joint Application Design (JAD) sessions and translate business requirements into logical and physical data modeling solutions. It will be your responsibility to conduct data model reviews with project team members and capture technical metadata using data modeling tools. Your expertise will be essential in ensuring that database designs efficiently support Business Intelligence (BI) and end-user requirements. You will collaborate closely with ETL/Data Engineering teams to create data process pipelines for data ingestion and transformation, and work with Data Architects on data model management, documentation, and version control. Staying current with industry trends and standards will be crucial in driving continual improvement of existing systems.

To excel in this role, you must possess strong data analysis and data profiling skills. Experience in conceptual, logical, and physical data modeling for Very Large Database (VLDB) data warehouses and graph databases will be highly valuable. Hands-on experience with modeling tools such as ERWIN or other industry-standard tools is required, as is proficiency in both normalized and dimensional modeling disciplines and techniques. A minimum of 3 years' experience with Oracle Database, along with hands-on experience in Oracle SQL, PL/SQL, or Cypher, is expected. Exposure to tools such as Databricks Spark, Delta technologies, Informatica ETL, and other industry-leading tools will be beneficial, as will good knowledge of AWS Redshift and graph database design and management. Working knowledge of AWS Cloud technologies, particularly services like VPC, EC2, S3, DMS, and Glue, will be advantageous.

You should hold a Bachelor's degree in Software Engineering, Computer Science, or Information Systems (or equivalent experience). Excellent verbal and written communication skills are necessary, including the ability to describe complex technical concepts in relatable terms. The ability to confidently manage and prioritize multiple workstreams and make prioritization decisions will be crucial. A data-driven mentality, self-motivation, responsibility, conscientiousness, and a detail-oriented approach are highly valued.

In terms of education and experience, a Bachelor's degree in Computer Science, Engineering, or a relevant field, along with 3+ years of experience as a Data and Solution Architect supporting enterprise data and integration applications or in a similar role for large-scale enterprise solutions, is required. You should have at least 3 years of experience in big data infrastructure and tuning in a lakehouse data ecosystem, including data lakes, data warehouses, and graph databases. AWS Solutions Architect Professional certification will be advantageous, as will extensive experience in data analysis on critical enterprise systems such as SAP, E1, mainframe ERP, SFDC, Adobe Platform, and eCommerce systems.

If you are someone who thrives in a dynamic environment and enjoys collaborating with enthusiastic individuals, this role is perfect for you. Join our team and be a part of our exciting journey towards innovation and excellence!

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

As an experienced SQL Developer with 5 to 8 years of experience, based in Gurgaon, you will work on exciting projects in industries such as high-tech, communication, media, healthcare, retail, and telecom. Your role will involve collaborating with a diverse team of highly talented individuals in an open and laid-back environment, both locally and potentially abroad in our global centers or client facilities.

Your primary responsibilities will include writing complex SQL queries, applying your expertise in SQL Server and PL/SQL, and demonstrating familiarity with basic Informatica ETL concepts. You will also have the opportunity to optimize database performance and gain exposure to AWS technologies.

At GlobalLogic, we prioritize work-life balance and offer flexible work schedules, opportunities for remote work, paid time off, and holidays. We are committed to your professional development and provide resources for enhancing communication skills, stress management, professional certifications, and technical and soft-skill training. You will enjoy competitive salaries, family medical insurance, group term life insurance, group personal accident insurance, the National Pension Scheme (NPS), health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Our vibrant offices feature dedicated zones, rooftop decks, and a club where you can socialize with colleagues over coffee or tea, along with perks such as sports events, cultural activities, food at subsidized rates, and corporate parties.

As part of GlobalLogic, a leader in digital engineering, you will have the opportunity to collaborate with global brands and leaders to design and build innovative products, platforms, and digital experiences. By combining experience design, complex engineering, and data expertise, we help clients envision the future of digital business and accelerate their digital transformation. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers worldwide, serving customers in industries such as automotive, communications, financial services, healthcare, manufacturing, media and entertainment, semiconductor, and technology. As a Hitachi Group Company under Hitachi, Ltd., we are dedicated to driving innovation through data and technology to create a sustainable society with a higher quality of life.

Posted 3 weeks ago

Apply

14.0 - 20.0 years

45 - 50 Lacs

Bengaluru

Work from Office

Job Title: Data Technical Lead

As the Data Management Platform (DMP) Technical Lead, you will be responsible for embedding a world-class product development and engineering culture and organization. You will work with development, architecture, operations, and platform teams to ensure we are delivering a best-in-class technology solution. You will work closely with the Business Platform Owner to ensure an integrated end-to-end view across people and technology for the Business Platform. You will also champion the platform and work with stakeholders across the enterprise to ensure we are developing the right solutions. In parallel, you will focus on building a high-performing team that will thrive in a fast-paced, continuous-delivery engineering environment. The role involves architecting, designing, and delivering solutions in a tool stack including Informatica MDM SaaS, Informatica Data Quality, Collibra Data Governance, and other data tools.

Key responsibilities:
- Shape technical strategy (e.g., build vs. buy decisions, technical road-mapping) in collaboration with architects
- Evaluate and identify appropriate technology stacks, platforms, and vendors, including web application frameworks and cloud providers, for solution development
- Attend team ceremonies as required, in particular feature refinement and cross-team iteration reviews/demos
- Drive the resolution of technical impediments
- Own the success of foundational enablers
- Champion research and innovation
- Lead in scaled agile ceremonies and activities, such as quarterly reviews, quarterly increment planning, and OKR writing
- Collaborate with the Platform Owner in the writing and prioritization of technical capabilities and enablers
- Present platform delivery metrics, OKR health, and platform finance status to executive audiences
- Collaborate with other Technical Leads
- Create and maintain the technical roadmap for in-scope products and services at the platform/portfolio level

Key experience:
- B.E./B.Tech or equivalent engineering degree; a Master's degree or equivalent experience in marketing, business, or finance is an added advantage
- 10+ years of experience in technical architecture, solution design, and platform engineering
- Strong experience in MDM, Data Quality, and Data Governance practices; experience with a tool stack such as Informatica MDM SaaS, Informatica Data Quality, and Collibra is a plus
- Good experience with major cloud platforms and cloud data tools, including but not limited to AWS, Microsoft Azure, Kafka, CDC, Tableau, and data virtualization tools
- Good experience in ETL and BI solution development and tool stacks; Informatica ETL experience is a plus
- Good experience in data architecture, SQL, NoSQL, REST APIs, data security, and AI concepts
- Familiarity with agile methodologies and data factory operations processes, including tools such as Confluence, Jira, and Miro
- Strong knowledge of industry standards and regulations related to data management, such as HIPAA, PCI-DSS, and GDPR
- Proven knowledge of working in financial services, preferably in the insurance space
- Experience in senior engineering and technology roles, working with teams to build and deliver digital products
- Experience providing guidance and insight to establish governance processes, direction, and control, ensuring objectives are achieved and risks are managed appropriately for product development
- A leader with a track record of onboarding and developing engineering and product teams
- Experience as a technology leader who has defined and implemented technical strategies within complex organizations and can influence and contribute to the higher-level engineering strategy
- Insight into the newest technologies and trends, and expertise in product development, with experience in code delivery and management of full-stack technology
- Experience in digital capabilities such as DevSecOps, CI/CD, and agile release management
- Wide experience and understanding of architecture in terms of solution, data, and integration
- Ability to provide direct day-to-day engineering and technology problem-solving, coaching, direction, and guidance to Technical Leads and Senior Technical Leads within the Platform
- Strong leadership skills, with an ability to influence a diverse group of stakeholders
- Ability to influence technical strategy at the BG and enterprise level
- Experience working in agile teams, with a strong understanding of agile ways of working
- Experience managing technical priorities within an evolving product backlog
- Understands how to decompose large technical initiatives into actionable technical enablers
- Experience in the continuous improvement of software development workflows and pipelines
- Proven leadership and the ability to articulate ideas to both technical and non-technical audiences
- Ability to communicate strategy and objectives and align organizations around a common goal
- Strong problem solver with the ability to lead the team in pushing solutions forward
- Ability to empower teams, encourage collaboration, and inspire people, building momentum around a vision
- Critical thinker with a passion for challenging the status quo to find new solutions and drive out-of-the-box ideas
- Believes in a non-hierarchical culture of collaboration, transparency, and trust across the team
- Experimental mindset to drive innovation and continuous improvement of the team

Posted 1 month ago

Apply

6.0 - 10.0 years

18 - 22 Lacs

Hyderabad

Work from Office

Sr. Informatica Developer for a reputed US-based IT MNC. If you meet the criteria, email your CV to jagannaath@kamms.net

Experience: 6+ years
Role: Sr. Informatica Developer with IICS
Position Type: Full time/Permanent
Location: Hyderabad (Work from Office)
Salary: As per experience

Responsibilities:
- 6+ years of hands-on experience with Informatica PowerCenter and a minimum of 3 years of experience using Informatica Intelligent Cloud Services (IICS)
- Identify and understand requirements; design, profile, and analyze source data (mainframe extracts, flat files, SQL Server, web services, etc.) using Informatica ETL
- Able to create mapplets
- Analyze and evaluate data sources, data volume, and business rules
- Snowflake experience is nice to have
- Develop ETL components using Informatica Intelligent Cloud Services
- Extract data and create data models
- Automation, job scheduling, dependencies, and monitoring
- Extraction, transformation, and load of data using ETL tools
- Configure/script business rules and transformation rules in Informatica
- Experience understanding complex stored procedures and the ability to enhance them based on requirements to optimize the code
- Expertise in developing a strategy to transform stored-procedure logic into ETLs using IICS as the ETL tool
- Defect fixes for System Test, Operational Acceptance Test, and Production
- Ability to work in an agile team, communicating impediments and getting them resolved
- Expertise in SQL is a MUST

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies