36 ETL Informatica Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8 - 11 years

10 - 15 Lacs

Hyderabad

Work from Office

• 8+ years of analysis, design, development, testing and implementation experience with Informatica ETL tools (PowerCenter or Data Quality)
• Expertise in implementing complex business rules by creating reusable transformations and mapplets/mappings
• Expertise in data remediation activities
• Ability to fully understand business rules from high-level design specifications and implement the data transformation methodologies
• Proficient in UNIX commands and scripting
• Good experience in developing database packages using SQL/PL/SQL

Band: U4
Competency: Data & Analytics
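The listing's emphasis on reusable transformations for business rules can be sketched outside Informatica as well. The following is an illustrative plain-Python sketch of the same idea, a cleansing rule reused across mappings; the rule, column names and sample record are hypothetical, and PowerCenter itself expresses this as reusable transformations/mapplets rather than code.

```python
# Illustrative only -- a "mapplet"-style reusable data-remediation rule in
# plain Python. Rule names and sample records are hypothetical.

def standardize_phone(raw: str) -> str:
    """Remediation rule: keep digits only, format clean 10-digit numbers."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    if len(digits) == 10:
        return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"
    return digits  # pass through anything that isn't a clean 10-digit number

def apply_rules(record: dict, rules: dict) -> dict:
    """Apply each reusable rule to its column, like a mapplet inside a mapping."""
    return {col: rules.get(col, lambda v: v)(val) for col, val in record.items()}

rules = {"phone": standardize_phone}
print(apply_rules({"name": "Asha", "phone": "(040) 234-5678"}, rules))
# -> {'name': 'Asha', 'phone': '040-234-5678'}
```

The point of the pattern is that the rule is written once and wired into any mapping that needs it, which is exactly what the posting means by "reusable transformations".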

Posted 3 months ago

Apply

6 - 11 years

12 - 20 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid

Dear Candidate, Greetings for the day!

Job description
Responsible for delivering highly available heterogeneous database servers on multiple technology platforms. Strong in SQL, with knowledge of Python for ETL.
• Lead all database maintenance and tuning activities, ensuring continued availability, performance and capacity of all database services across every business application and system.
• Consider current practices to develop innovative and reliable solutions that continuously improve service quality to the business.
• Create, update and maintain the data warehouse for business/product reporting needs.
• Refine physical database design to meet system performance requirements.
• Identify inefficiencies in current databases and investigate solutions.
• Diagnose and resolve database access and performance issues.
• Develop, implement, and maintain change control and testing processes for modifications to databases.
• Ensure all database systems meet business and performance requirements.
• Coordinate with other technical staff to develop relational databases and data warehouses.
• Advocate and implement standards, best practices, technical troubleshooting processes, and quality assurance.
• Develop and maintain database documentation, including data standards, procedures, and definitions for the data dictionary.
• Produce ad-hoc queries and develop reports to support business needs.
• Create and maintain technical documentation.
• Perform other tasks as assigned by management.

Qualifications (Must Haves)
• Bachelor's or Master's degree in Computer Science, Mathematics or another STEM discipline
• 3+ years of experience working with relational databases (e.g. Redshift, PostgreSQL, Oracle, MySQL)
• 2+ years of experience with NoSQL database solutions (e.g. MongoDB, DynamoDB, Hadoop/HBase)
• 3+ years of experience with ETL/ELT tools (e.g. Talend, Informatica, AWS Data Pipeline; preferably Talend)
• Strong knowledge of data warehousing basics, relational database management systems and dimensional modelling (star and snowflake schemas)
• Configuration of ETL ecosystems and regular data maintenance activities such as data loads, data fixes, schema updates and database copies
• Experience in data cleansing, enterprise data architecture, data quality and data governance
• Good understanding of Redshift database design using distribution style, sorting and encoding features
• Working experience with cloud computing technologies: AWS EC2, RDS, Data Migration Service (DMS), Schema Conversion Tool (SCT), AWS Glue
• Well versed in advanced query development and design using SQL and PL/SQL, query optimization, and performance tuning of applications on various databases
• Experience supporting multiple DBMS platforms in Production/QA/UAT/DEV, both on premise and in AWS cloud environments

Strong Pluses
• Experience with database partitioning strategies on various databases (PostgreSQL, Oracle)
• Experience migrating, automating and supporting a variety of AWS-hosted databases (both RDBMS and NoSQL) in RDS and EC2 using CFT
• Experience with the big data technology stack: Hadoop, Spark, Hive, MapR, Storm, Pig, Oozie, Kafka, etc.
• Experience with shell scripting for process automation
• Experience with source code versioning using Git and Stash
• Ability to work across multiple projects simultaneously
• Strong experience in all aspects of the software lifecycle, including design, testing, and delivery
• Ability to understand and start projects quickly
• Ability and willingness to work with teams located in multiple time zones

Regards, Sushma A
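Several of these listings ask for dimensional modelling with star schemas: a central fact table of additive measures joined to small dimension tables. A minimal sketch of that layout, using SQLite purely for illustration (the postings' real warehouses are Redshift/PostgreSQL/Oracle, and the table and column names here are made up):

```python
# Minimal star-schema sketch: one fact table, two dimensions, one rollup query.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, month TEXT);
    -- Fact table: foreign keys to each dimension plus an additive measure.
    CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, amount REAL);
""")
con.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Laptops"), (2, "Phones")])
con.executemany("INSERT INTO dim_date VALUES (?, ?)", [(10, "Jan"), (11, "Feb")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 900.0), (1, 11, 1100.0), (2, 10, 400.0)])

# Typical dimensional query: slice the fact table by a dimension attribute.
rows = con.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
print(rows)  # -> [('Laptops', 2000.0), ('Phones', 400.0)]
```

A snowflake schema differs only in that the dimensions themselves are normalized into further lookup tables; on Redshift, the distribution style and sort keys the posting mentions decide how these joins are physically co-located.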

Posted 3 months ago

Apply

4 - 6 years

10 - 15 Lacs

Bangalore Rural

Hybrid

Hi, we are looking for an ETL Informatica Developer. Skills: Snowflake, Control-M, Python/Java. Experience: 4 to 6 years. Location: Bangalore. Notice period: immediate to 15 days. Interested candidates, send your resume to sreeram.sekhar@thakralone.in

Posted 3 months ago

Apply

4 - 6 years

10 - 15 Lacs

Bengaluru

Hybrid

Hi, we are looking for an ETL Informatica Developer. Skills: Snowflake, Control-M, Python/Java. Experience: 4 to 6 years. Location: Bangalore. Notice period: immediate to 15 days. Interested candidates, send your resume to sreeram.sekhar@thakralone.in

Posted 3 months ago

Apply

6 - 11 years

15 - 20 Lacs

Pune

Work from Office

Job Position: Informatica/DMExpress Developer
Experience: 5+ years
Location: Pune
Notice Period: Immediate joiner

Mandatory Skills:
• 5+ years' experience with Informatica/DMExpress ETL tools.
• 5+ years' experience with SQL Server/DB2/Teradata; able to design complex procedures and optimize query performance.
• 3+ years' experience in Unix shell scripting.
• SQL for data profiling and strong knowledge of navigating databases and data structures; experience with operational data stores and/or enterprise data warehouses preferred.
• Experience in GCP (BigQuery) for query design and ETL workflow design.
• Reporting tools experience; experience building and supporting downstream SAP BusinessObjects reporting solutions.
• A functional role in gathering requirements and working with business users; experience gathering and analysing requirements and translating them into detailed functional specifications.
• Strong analytical and problem-solving skills.
• Strong written and oral communication skills and an open communication style.
• Strong execution skills, with the ability to work efficiently and deliver quality results within standards to meet deadlines.

Posted 3 months ago

Apply

8 - 13 years

12 - 22 Lacs

Hyderabad

Remote

Job Description: Anaplan Architect - Remote Opportunity (8+ Years of Experience)

Position Overview: We are seeking a highly skilled and experienced Anaplan Architect to join our team in a remote capacity. As an Anaplan Architect, you will play a pivotal role in designing, implementing, and optimizing Anaplan solutions to meet the business planning and forecasting needs of our clients. You will collaborate with cross-functional teams, lead solution architecture, and ensure best practices in model design and implementation. This position requires deep expertise in Anaplan, strong problem-solving skills, and excellent communication abilities to work effectively in a client-facing environment.

Roles and Responsibilities:
• Solution Architecture: Design scalable and efficient Anaplan models to support business planning, forecasting, and performance management processes. Define and implement best practices in Anaplan model architecture, ensuring optimal performance and usability.
• Model Development and Implementation: Lead the end-to-end development of Anaplan models, including data integration, model building, and testing. Ensure alignment of Anaplan solutions with business objectives and requirements. Optimize existing models for performance, maintainability, and scalability.
• Client Engagement: Act as a trusted advisor to clients, understanding their business needs and translating them into effective Anaplan solutions. Conduct workshops and presentations to demonstrate the capabilities and value of Anaplan.
• Team Leadership and Mentoring: Provide guidance and mentorship to Anaplan model builders and junior team members. Collaborate with cross-functional teams, including finance, operations, and IT, to deliver integrated planning solutions.
• Data Integration and Governance: Design and implement data integration strategies between Anaplan and other systems (e.g. ERP, CRM, BI tools). Establish data governance practices to ensure data accuracy and consistency within Anaplan models.
• Documentation and Support: Create and maintain comprehensive documentation for model architecture, processes, and best practices. Provide post-implementation support and troubleshooting for Anaplan solutions.

Required Technical Skill Set:
• Anaplan Expertise: Minimum 8 years of experience in Anaplan implementation, with at least 2 years in a solution architect role. Proficiency in Anaplan model building, performance optimization, and data integration.
• Planning and Forecasting: Strong understanding of financial planning, demand planning, supply chain, and workforce planning processes.
• Data Integration and Analysis: Experience with ETL tools (e.g. Informatica, MuleSoft) and integration with ERP/CRM systems (e.g. SAP, Salesforce). Ability to work with large datasets and develop efficient data workflows.
• Programming and Scripting: Familiarity with programming languages like Python or SQL is a plus.
• Collaboration and Tools: Experience with Agile methodologies and tools like Jira or Confluence. Proficiency in the Microsoft Office suite, especially Excel.

Preferred Skills and Certifications:
• Certifications: Certified Anaplan Solution Architect (required); Certified Master Anaplanner (preferred).
• Additional Tools: Knowledge of BI tools like Power BI, Tableau, or Qlik. Familiarity with cloud platforms like AWS or Azure.

Required Qualifications:
• Bachelor's or Master's degree in Finance, Business Administration, Computer Science, or a related field.
• 8+ years of experience in Anaplan implementation and architecture, with a proven track record of delivering successful planning solutions.

Posted 3 months ago

Apply

7 - 12 years

9 - 14 Lacs

Hyderabad

Work from Office

Experienced IMS Production Support (L2) engineer to provide technical support and resolve production issues.
• Minimum 7+ years of work experience in a similar role; familiar with hybrid work mode.
• Experience in Ab Initio, Informatica ETL and AWS.
• Experience with models for transformations and jobs to enhance the entire ETL flow; review of transformation job code.
• Good knowledge of databases (SQL, Oracle); able to write SQL queries.
• Able to write UNIX commands.
• Good hands-on experience with Control-M, Hadoop, Snowflake, Splunk and Dynatrace.

Posted 3 months ago

Apply

3 - 5 years

8 - 12 Lacs

Gurgaon

Hybrid

Role & responsibilities
• Understand the project scope; identify activities/tasks, task-level estimates, schedule, dependencies and risks, and provide inputs to the Program Lead for review.
• Lead the analysis, design and development phases to build robust ETL solutions that best meet the business requirements for investments applications.
• Prepare/modify design documents (high-level and detailed design documents) based on business requirements; suggest design changes on technical grounds.
• Coordinate delivery of assigned tasks with onshore partners and/or business analysts.
• Ensure timely notification and escalation of possible issues/problems, with options and recommendations for prompt resolution.
• Follow development standards to ensure code is clear, efficient, logical and easily maintainable.
• Create and maintain application documentation such as network diagrams, the technical project handbook, technical data mapping documents, unit test documents and implementation plans.
• Ensure SLF Information Security Policies and General Computing Controls are complied with in all situations.
• Take complete ownership of work assignments and ensure successful completion of assigned tasks.
• Ensure all written and verbal communication is clear, understandable and audience-appropriate.

Preferred candidate profile
• Minimum 3 to 5 years of overall IT experience with Informatica PowerCenter.
• Minimum 1 year of experience with AWS data services (Glue) and PySpark.
• Strong knowledge of ETL and relational databases (Microsoft SQL Server, PostgreSQL).
• Good knowledge of data warehousing concepts.
• Good analytical and problem-solving skills.
• Working knowledge of job scheduling tools such as Control-M and Autosys.
• Experience with Git/Bitbucket versioning tools; understanding of DevOps processes.
• Good hands-on experience with AWS Step Functions, Lambda, SQS, SNS, Redshift and other data services preferred.
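The "technical data mapping doc" this role maintains is typically a table of source field, target field, and transformation rule that the ETL mapping then implements. A hypothetical sketch of that idea in plain Python (the role's actual tools are Informatica PowerCenter and AWS Glue/PySpark; the field names and rules below are invented for illustration):

```python
# Illustrative data-mapping-document-driven transform. MAPPING plays the role
# of the technical data mapping doc; source/target fields are hypothetical.

MAPPING = [  # (source_field, target_field, transform)
    ("cust_nm", "customer_name", str.title),
    ("bal_amt", "balance",       float),
    ("open_dt", "opened_on",     lambda s: s[:10]),  # keep the ISO date part
]

def transform(source_row: dict) -> dict:
    """Apply the mapping document entry by entry, as an ETL mapping would."""
    return {tgt: fn(source_row[src]) for src, tgt, fn in MAPPING}

row = {"cust_nm": "ravi kumar", "bal_amt": "1520.50",
       "open_dt": "2024-06-01T00:00:00"}
print(transform(row))
# -> {'customer_name': 'Ravi Kumar', 'balance': 1520.5, 'opened_on': '2024-06-01'}
```

Keeping the mapping as data rather than scattered code is what makes the design document and the implementation stay in sync, which is the point of the documentation duties listed above.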

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Hyderabad

Work from Office

About The Role

Role Purpose: Design, test and maintain software programs for operating systems or applications to be deployed at a client site, and ensure they meet 100% of quality assurance parameters.

Do:
1. Understand the requirements and design of the product/software:
• Develop software solutions by studying information needs, systems flow, data usage and work processes.
• Investigate problem areas throughout the software development life cycle.
• Facilitate root cause analysis of system issues and problem statements.
• Identify ideas to improve system performance and availability.
• Analyze client requirements and convert them into feasible designs.
• Collaborate with functional teams or systems analysts who carry out detailed investigation of software requirements.
• Confer with project managers to obtain information on software capabilities.
2. Perform coding and ensure optimal software/module development:
• Determine operational feasibility by evaluating analysis, problem definition, requirements and proposed software.
• Develop and automate processes for software validation by designing and executing test cases, scenarios and usage cases.
• Modify software to fix errors, adapt it to new hardware, improve its performance or upgrade interfaces.
• Analyze information to recommend and plan the installation of new systems or modifications to existing systems.
• Ensure code is error-free, with no bugs or test failures.
• Prepare reports on programming project specifications, activities and status.
• Raise all code as per the norms defined for the project/program/account, with clear descriptions and replication patterns.
• Compile timely, comprehensive and accurate documentation and reports as requested.
• Coordinate with the team on daily project status and progress, and document it.
• Provide feedback on usability and serviceability; trace results to quality risks and report them to the concerned stakeholders.
3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
• Capture all requirements and clarifications from the client for better-quality work.
• Take feedback regularly to ensure smooth, on-time delivery.
• Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
• Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements.
• Document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments and clear code.
• Document all necessary details and reports formally, for proper understanding of the software from client proposal through implementation.
• Ensure good quality of customer interaction (e-mail content, fault report tracking, voice calls, business etiquette, etc.).
• Respond to customer requests in a timely manner, with no instances of internal or external complaints.

Deliver:
No. | Performance Parameter | Measure
1. | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. | Quality & CSAT | On-time delivery, software management, troubleshooting of queries, customer experience, completion of assigned certifications for skill upgradation
3. | MIS & reporting | 100% on-time MIS & report generation

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Pune, Mumbai, Bengaluru

Work from Office

Job Summary and Responsibilities

About Capability Network: If you are looking for a career with unparalleled global impact, then Accenture invites you to learn more about our rapidly expanding Capability Network. Over 2,000 management consulting and strategy professionals work in the Capability Network at Accenture. Based in a network of prominent locations, Capability Network professionals specialize in providing cutting-edge industry and functional expertise, leveraging the power of Accenture to bring measurable value to our clients worldwide. Working closely with our clients in Utilities, Consulting professionals design, build and implement strategies that can help enhance business performance. They develop specialized expertise (strategic, industry, functional, technical) in a diverse project environment that offers multiple opportunities for career growth.

Key responsibilities of the role: The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities.
• Process & functional expertise: Utilities - Electric Transmission & Distribution (T&D), Enterprise Asset Management, Asset Performance Management, asset data analytics, risk and failure prediction models, criticality assessment, risk-based investment planning.
• Work with clients to understand how analytics can help their business and define a data strategy.
• Help drive the requirements to assess and select the suitable data architecture and solution blueprints.
• Help solve key business problems and challenges by enabling an architecture transformation, painting a picture of, and charting a journey from, the current state to a "to-be" enterprise environment.
• Data modelling: functional understanding of the different asset health models, failure probability models, fault prediction models, asset criticality, risk forecasting, and investment models for critical asset classes in the Utilities space.
• Understanding of the critical asset health parameters for major asset classes in the electric and gas utilities space (such as power transformers, wood poles, circuit breakers, wind turbines).
• Knowledge of asset optimization and maintenance & reliability practices such as Reliability Centered Maintenance (RCM), Risk Based Inspection (RBI) and asset analytics, with a willingness to learn and grow.
• Understanding of workflows related to SCADA, PI Historian, OMS and field-device triggers resulting in inspection and maintenance work orders.
• Understanding of asset and location hierarchical structures for utilities transmission, distribution, underground cables and gas pipelines.
• Good knowledge and experience in digital & system integration projects.
• Guide clients through the stages of data and analytics transformations; lead program delivery for our utilities clients.
• Motivate and manage teams; provide thought leadership to our internal team and educate peers on leading practices.

Skills/Experience
• 5 years' experience in a consulting/industry role with utilities exposure, delivering data-driven capabilities and solutions.
• 3 years of implementation experience with Asset Investment Planning tools (IBM Maximo/APM, Copperleaf, SAP IAM, Bentley APM, Aveva APM, Infor APM, GE Predix).
• 2 years' experience designing and utilizing analytic solutions in enterprise-wide organizations, or in a role defining data architecture for a cloud-native solution or journey to cloud.
• Experience guiding and managing a team of practitioners to execute data and analytics solutions, preferably both onshore and offshore.

Other desired/nice-to-have skills
• Proven utilities knowledge and experience delivering data, analytics and AI solutions driving business outcomes.
• Strong data management and data governance background.
• Strong cloud background delivering data-driven solutions on Azure, AWS and/or GCP.
• Well versed in leading analytics techniques and confident in applying them to generate insights and recommendations.
• Experience guiding the data formulation process and exploratory data analysis.
• Experience or knowledge in any of the following: data modeling; big data platforms (e.g. Cloudera, Hortonworks, AWS, Talend); data migration and quality (using Informatica ETL); MySQL/NoSQL; CQRS/event sourcing; Hadoop, MapReduce, Spark, Pig and Hive technologies; data ingestion into data lakes; data storage on key cloud providers (AWS, Azure, GCP); Redshift; data retrieval on big data platforms; interfacing data science and data visualization tools with big data platforms.
• Familiarity with: data storage tools & techniques, data archiving patterns & techniques, cloud backup architecture and design, data warehouse tools, data conversion & migration, master data management, metadata management, Network Data Management Protocol (NDMP), Informatica MDM, data management and integration, the systems development lifecycle (SDLC), data modeling techniques and methodologies, database management, database technical design and build, Extract, Transform & Load (ETL) tools, cloud data architecture, data architecture principles, SAS DataFlux ETL tools, Online Analytical Processing (OLAP), data processes, and data architecture estimation.
• Certifications in leading APM vendors' products (IBM MAS, Copperleaf, Aveva).

Qualifications
• Bachelor's degree in a quantitative field such as statistics, econometrics, mathematics, engineering or computer science.
• MBA from a Tier-1 college (preferable).

Posted 3 months ago

Apply

5 - 9 years

1 - 6 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid

Job description: Hiring an ETL Informatica developer with 5 or more years of experience.
Mandatory Skills: Network Engineer - F5 Load Balancer
Education: BE/B.Tech/MCA/M.Tech/MSc/MS

Responsibilities
• Responsible for all activities related to the development, implementation, administration, and support of ETL processes for large-scale data warehouses using Informatica PowerCenter.
• Good SQL knowledge; experience with Netezza and Yellowbrick is a plus.
• Basic UNIX shell scripting knowledge.
• In-depth understanding of fundamental data warehousing concepts such as dimensional modelling, star and snowflake schemas, data marts, security and deployment, fact and dimension tables, and logical and physical data modelling.
• Strong skills in data analysis, data requirement analysis and data mapping for ETL processes.
• Hands-on experience tuning mappings, and identifying and resolving performance bottlenecks at various levels: sources, targets, mappings and sessions.
• Extensive experience in ETL design, development and maintenance using SQL Server SQL, PL/SQL and SQL*Loader.
• Well versed in developing complex SQL queries, unions and multi-table joins, with experience using views.
• Experience in database programming in PL/SQL (stored procedures, triggers and packages).
• Well versed in UNIX shell scripting.
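On the database side, the bottleneck hunting this role describes usually starts by reading the query plan before and after a change. A small illustrative sketch with SQLite's EXPLAIN QUERY PLAN (SQLite stands in here purely for demonstration; the posting's actual targets are SQL Server, Netezza and Yellowbrick, and Informatica session tuning is a separate exercise; the table and data are invented):

```python
# Inspect a query plan before and after adding an index (SQLite, illustrative).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, cust TEXT, amt REAL)")
con.executemany("INSERT INTO orders (cust, amt) VALUES (?, ?)",
                [("c%d" % (i % 50), float(i)) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3.
    return " ".join(r[3] for r in con.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT SUM(amt) FROM orders WHERE cust = 'c7'"
print(plan(q))   # without an index on cust, the plan is a full table scan

con.execute("CREATE INDEX idx_orders_cust ON orders (cust)")
print(plan(q))   # the plan now reports a search USING INDEX idx_orders_cust
```

The same before/after-plan habit carries over to tuning ETL mappings: measure where the time goes (source, target, or transformation) before changing anything.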

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies