5.0 - 8.0 years
15 - 25 Lacs
Chennai, Bengaluru
Work from Office
Job Profile: We are seeking an experienced Informatica BDM Developer to join our data engineering team. The ideal candidate will act as a primary point of contact for technical support, troubleshoot complex issues across big data platforms, and deliver solutions efficiently. This role requires hands-on expertise in Informatica BDM, Cloudera, Spark, and Python, along with a solid understanding of ITIL processes.

Responsibilities:
• Serve as the first point of contact for customers seeking technical assistance via phone, email, or ITIL tools.
• Diagnose and resolve medium-complexity technical issues in Informatica BDM, Cloudera, Spark, Python, and related data engineering technologies.
• Perform remote troubleshooting using diagnostic techniques and detailed questioning.
• Determine the best solution based on the issue and customer-provided details, ensuring high customer satisfaction.
• Escalate unresolved issues to higher support tiers when necessary.
• Walk customers through problem-solving processes and provide step-by-step technical help.
• Document all support activities, including issues, resolutions, and actions taken, in logs.
• Support BAU (Business As Usual), DPO, and DPS services with accurate and timely responses.
• Carry out incident management tasks including configuration, basic-to-medium tuning, and operational support in low-risk environments.
• Create and maintain operational documentation and incident/change records.
• Mentor junior team members and assist in knowledge transfer.
• Collaborate with infrastructure teams to coordinate maintenance activities and ensure system stability.
• Apply ITIL v3 methodologies for effective incident, problem, and change management.

Candidate Profile:
• BE/B.Tech or BCA/MCA with 5+ years of experience in Informatica BDM and PowerCenter.
• Ready for a 6-month contract role in Chennai in hybrid mode.
• Able to join within 15 days.
• Proficient in Cloudera (Hadoop), Spark, and Python.
• Understanding of ETL processes, data pipelines, and big data architectures.
• Familiarity with incident management and ITIL-based support workflows.
• Excellent problem-solving skills with a proactive mindset.
• Strong verbal and written communication skills.
• Ability to handle pressure and resolve customer issues effectively.
Posted 1 month ago
7.0 - 9.0 years
14 - 20 Lacs
Pune
Work from Office
Strong proficiency in SQL (Structured Query Language) for querying, manipulating, and optimizing data. Experience in ETL development with Informatica, data warehousing, ADF, GCP, and Databricks. Extensive experience with popular ETL tools.

Required Candidate Profile: 2+ years of experience in Informatica and complex SQL queries.
• SQL: Oracle, MS SQL, Teradata, Netezza
• ETL: Informatica PowerCenter (Must)
• Cloud: ADF or Databricks or GCP or Google Dataproc
Posted 1 month ago
6.0 - 10.0 years
1 - 1 Lacs
Bengaluru
Remote
We are looking for a highly skilled Senior ETL Consultant with strong expertise in Informatica Intelligent Data Management Cloud (IDMC) components such as IICS, CDI, CDQ, IDQ, CAI, along with proven experience in Databricks.
Posted 1 month ago
4.0 - 9.0 years
10 - 15 Lacs
Mumbai
Work from Office
Hiring ETL SQL Database Developer (4-9 yrs) - Mumbai. Design, develop, and optimize ETL pipelines, write advanced SQL queries, and manage large-scale databases. Experience with tools like Informatica and SSIS required. Strong DB performance tuning skills.
Posted 1 month ago
3.0 - 8.0 years
20 - 30 Lacs
Hyderabad, Pune
Hybrid
Job Summary: Join our team and what we'll accomplish together. As an MDM Developer, you will be responsible for implementing and managing Master Data Management (MDM) projects. The ideal candidate will have extensive experience with Informatica MDM and proficiency in configuring MDM tools and integrating them with cloud environments. You will utilize your expertise in data engineering to build and maintain data pipelines, optimize performance, and ensure data quality. You will be working as part of a friendly, cross-discipline agile team that helps each other solve problems across all functions. As a custodian of customer trust, you will employ best practices in development, security, accessibility, and design to achieve the highest quality of service for our customers.

Our development team uses a range of technologies to get the job done, including ETL and data quality tools from Informatica, streaming via Apache NiFi, and Google-native tools on GCP (Dataflow, Composer, BigQuery, etc.). We also do some API design and development with Postman and Node.js. You will be part of the team building data pipelines that support our marketing, finance, campaign, and Executive Leadership teams, as well as implementing Informatica Master Data Management (MDM) hosted on Amazon Web Services (AWS). Specifically, you'll be building pipelines that support insights to enable our business partners' analytics and campaigns. You are a fast learner and a highly technical, passionate person looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.

Here's how:
• Learn new skills and advance your data development practice
• Analyze and profile data
• Design, develop, test, deploy, maintain, and improve batch and real-time data pipelines
• Assist with design and development of solution prototypes
• Support consumers in understanding the data outcomes and technical design
• Collaborate closely with multiple teams in an agile environment

What you bring:
• You are a senior developer with 3+ years of experience in IT platform implementation in a technical capacity
• Bachelor of Computer Science, Engineering, or equivalent
• Extensive experience with Informatica MDM (Multi-Domain Edition) version 10
• Proficiency in MDM configuration, including the Provisioning Tool, Business Entity Services, Customer 360, data modeling, match rules, cleanse rules, and metadata analysis
• Expertise in configuring data models, match and merge rules, database schemas, and trust and validation settings
• Understanding of data warehouse/cloud architectures and ETL processes
• Working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases
• Experience with the Google Cloud Platform (GCP) and its related technologies (Kubernetes, CloudSQL, PubSub, Storage, Logging, Dashboards, Airflow, BigQuery, BigTable, Python, BQ SQL, Dataplex, Datastream, etc.)
• Experience with Informatica IDQ/PowerCenter/IICS, Apache NiFi, and other related ETL tools
• Experience with Informatica MDM preferred, but strong skills in other MDM tools are still an asset
• Experience working with message queues like JMS, Kafka, and PubSub
• A passion for data quality

Great-to-haves:
• Experience with Informatica MDM SaaS
• Experience with Python and software engineering best practices
• API development using Node.js and testing using Postman/SoapUI
• Understanding of TMF standards
Posted 1 month ago
7.0 - 12.0 years
25 - 27 Lacs
Kolkata, Bengaluru
Hybrid
Required Experience:
• Design, develop, and maintain ETL/ELT workflows using Informatica IICS.
• Collaborate with business and technical teams to understand requirements and translate them into robust data integration solutions.
• Optimize data pipelines for performance and scalability.
• Integrate IICS solutions with cloud-based data stores like Google BigQuery and cloud storage solutions.
• Develop data mappings, task flows, parameter files, and reusable objects.
• Manage deployments, migrations, and version control for IICS assets.
• Perform unit testing, debugging, and troubleshooting of ETL jobs.
• Document data flow and architecture as part of the SDLC.
• Work in an Agile environment and participate in sprint planning, reviews, and retrospectives.
• Provide mentorship and code reviews for junior developers, ensuring adherence to best practices and coding standards.

Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• 7+ years of experience in ETL development, with at least 2-3 years in Informatica IICS.
• Strong experience in data integration, transformation, and orchestration using IICS.
• Good working knowledge of cloud data platforms, preferably Google Cloud Platform (GCP).
• Hands-on experience with Google BigQuery (GBQ), including writing SQL queries, data ingestion, and optimization.
• Strong SQL skills and experience with RDBMS (e.g., Oracle, SQL Server, PostgreSQL).
• Experience in integrating data from various sources, including on-prem systems, SaaS applications, and cloud data lakes.
• Familiarity with data governance, data quality, and data cataloging tools.
• Understanding of REST APIs and experience with API integration in IICS.
• Excellent problem-solving skills and attention to detail.
• Strong communication skills and the ability to work effectively in a team.
Posted 1 month ago
10.0 - 20.0 years
0 - 0 Lacs
Hyderabad, Bengaluru
Hybrid
10+ years of experience in Quality Assurance, focusing on ETL testing, data validation, and data migration projects.
* Proven experience creating detailed test cases, test plans, and test scripts.
* Hands-on experience with ETL tools like Talend (preferred), Informatica PowerCenter, or DataStage.
* Proficiency in SQL for complex query writing and optimization for data validation and testing (see the illustrative sketch below).
* Experience with cloud data migration projects, specifically working with databases like Snowflake.
* Strong understanding of semi-structured data formats like JSON and XML, with hands-on testing experience.
* Proven ability to lead QA efforts, manage teams, and coordinate with onshore and offshore teams effectively.
* Strong analytical and troubleshooting skills for resolving data quality and testing challenges.

Preferred Skills:
* Experience with automated testing tools and frameworks, particularly for ETL processes.
* Knowledge of data governance and data quality best practices.
* Familiarity with AWS or other cloud-based ecosystems.
* ISTQB or equivalent certification in software testing.
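To illustrate the kind of SQL-driven data validation this QA role describes, here is a minimal, self-contained Python sketch that compares row counts and a column checksum between a source and a target table. It is only a conceptual example: the table and column names are invented, and a real migration test would run against the actual source and target (e.g., Snowflake) rather than an in-memory SQLite database.

```python
import sqlite3

# Hypothetical source and target tables; in practice these would live in
# different systems (e.g., an on-prem source and a Snowflake target).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def validate(conn, src_table, tgt_table, sum_col):
    """Compare row counts and a simple column checksum between two tables."""
    checks = {}
    for name, table in (("source", src_table), ("target", tgt_table)):
        count, total = conn.execute(
            f"SELECT COUNT(*), ROUND(SUM({sum_col}), 2) FROM {table}"
        ).fetchone()
        checks[name] = (count, total)
    return checks["source"] == checks["target"], checks

ok, details = validate(conn, "src_orders", "tgt_orders", "amount")
print("row count / checksum match:", ok, details)
```

The same pattern extends naturally to per-column null counts, min/max checks, or key-level diffs when reconciling migrated data.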
Posted 1 month ago
7.0 - 12.0 years
9 - 18 Lacs
Pune, Chennai, Coimbatore
Hybrid
Role Description: The Informatica/ETL PowerCenter Developer would need to have at least 7 years of experience.

Responsibilities and Qualifications:
• Participates in ETL design of new or changing mappings and workflows with the team and prepares technical specifications.
• Creates ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x and prepares corresponding documentation.
• Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.); a conceptual sketch of type-2 dimension handling follows below.
• Performs source system analysis as required.
• Works with DBAs and Data Architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse.
• Implements versioning of the ETL repository and supporting code as necessary.
• Develops stored procedures, database triggers, and SQL queries where needed.
• Implements best practices and tunes SQL code for optimization.
• Loads data from SF PowerExchange to relational databases using Informatica.
• Works with XML, the XML parser, Java, and HTTP transformations within Informatica.
• Works with Informatica Data Quality (Analyst and Developer).
• Primary skill is Informatica PowerCenter.
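As a conceptual illustration of the type-2 dimension loads mentioned above, the sketch below shows the expire-and-insert pattern in plain Python. The record layout and column names are invented for the example; in a real project this logic would typically live in an Informatica mapping or a SQL MERGE statement rather than application code.

```python
from datetime import date

# Current dimension rows: one "current" version per business key (invented layout).
dim_customer = [
    {"customer_id": 42, "city": "Pune", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim_rows, incoming, business_key, tracked_cols, load_date):
    """Expire the current version when a tracked attribute changes, then insert a new version."""
    for rec in incoming:
        current = next(
            (r for r in dim_rows
             if r[business_key] == rec[business_key] and r["is_current"]),
            None,
        )
        changed = current is None or any(current[c] != rec[c] for c in tracked_cols)
        if not changed:
            continue
        if current is not None:                      # expire the old version
            current["valid_to"] = load_date
            current["is_current"] = False
        dim_rows.append({                            # insert the new version
            business_key: rec[business_key],
            **{c: rec[c] for c in tracked_cols},
            "valid_from": load_date, "valid_to": None, "is_current": True,
        })

apply_scd2(dim_customer, [{"customer_id": 42, "city": "Chennai"}],
           "customer_id", ["city"], date(2024, 6, 1))
print(dim_customer)  # two versions: the expired Pune row and the current Chennai row
```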
Posted 1 month ago
6.0 - 8.0 years
13 - 23 Lacs
Bengaluru
Hybrid
Job Description

Primary skillsets:
• 5 years of hands-on experience in Informatica PowerCenter (PWC) ETL development
• 7 years of experience in SQL, analytical STAR-schema data modeling, and Informatica PowerCenter
• 5 years of Redshift, Oracle, or comparable database experience with BI/DW deployments

Secondary skillsets:
• Good to know cloud services such as AWS
• Must have proven experience with STAR and SNOWFLAKE schema techniques
• Proven track record as an ETL developer, with the potential to grow into an Architect leading development teams to deliver successful business intelligence solutions with complex data sources
• Strong analytical skills; enjoys solving complex technical problems
• Knowledge of additional ETL tools such as Qlik Replicate
• End-to-end understanding of data from ingestion to transformation to consumption in analytics is a great benefit
Posted 1 month ago
3.0 - 8.0 years
20 - 30 Lacs
Hyderabad, Pune
Hybrid
Job Summary: We are seeking a highly skilled Informatica MDM Developer to join our data integration and management team. The ideal candidate will have extensive experience in Informatica Master Data Management (MDM) solutions and a deep understanding of data quality, data governance, and master data modeling.

Key Responsibilities:
• Design, develop, and deploy Informatica MDM solutions (including Hub, IDD, SIF, and MDM Hub configurations).
• Work closely with data architects, business analysts, and stakeholders to understand master data requirements.
• Configure and manage Trust, Merge, and Survivorship rules and Match/Merge logic (a conceptual sketch follows below).
• Implement data quality (DQ) checks and profiling using Informatica DQ tools.
• Develop batch and real-time integration using Informatica MDM SIF APIs and ETL tools (e.g., Informatica PowerCenter).
• Monitor and optimize MDM performance and data processing.
• Document MDM architecture, data flows, and integration touchpoints.
• Troubleshoot and resolve MDM issues across environments (Dev, Test, UAT, Prod).
• Support data governance and metadata management initiatives.

Required Skills:
• Strong hands-on experience with Informatica MDM (10.x or later).
• Proficient in match/merge rules, data stewardship, hierarchy management, and SIF APIs.
• Experience with Informatica Data Quality (IDQ) is a plus.
• Solid understanding of data modeling, relational databases, and SQL.
• Familiarity with REST/SOAP APIs, web services, and real-time data integration.
• Experience in Agile/Scrum environments.
• Excellent problem-solving and communication skills.
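For readers unfamiliar with the match/merge and survivorship terminology above, here is a small conceptual Python sketch, not Informatica MDM's actual engine or SIF API: it groups candidate records on a simple match key and keeps the latest non-null value per attribute. The field names and the "latest non-null wins" rule are invented for illustration; real MDM hubs use configurable fuzzy match rules and per-source trust scores.

```python
from collections import defaultdict

# Hypothetical source records from two systems, already standardized.
records = [
    {"source": "CRM", "email": "a.rao@example.com", "name": "A. Rao",
     "phone": None, "updated": "2024-05-01"},
    {"source": "ERP", "email": "a.rao@example.com", "name": "Anita Rao",
     "phone": "98400-00000", "updated": "2024-06-15"},
]

def merge_golden_records(records, match_key, attrs):
    """Group on a match key and apply a 'latest non-null wins' survivorship rule."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[match_key]].append(rec)

    golden = []
    for key, members in groups.items():
        members.sort(key=lambda r: r["updated"])      # oldest -> newest
        merged = {match_key: key}
        for attr in attrs:
            for rec in members:                       # later records overwrite...
                if rec[attr] is not None:             # ...but only with non-null values
                    merged[attr] = rec[attr]
        golden.append(merged)
    return golden

print(merge_golden_records(records, "email", ["name", "phone"]))
# [{'email': 'a.rao@example.com', 'name': 'Anita Rao', 'phone': '98400-00000'}]
```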
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Mumbai, Navi Mumbai, Mumbai (All Areas)
Work from Office
Hello everyone! PwC India is inviting applicants for the role below; kindly apply if found suitable.

JD - ETL Developer
Work Location: Mumbai
Years of experience: Around 5 to 10 years
Level: Senior Associate, Tech Lead, Manager
Work Model: Work from office
Mandatory Skill Set: ETL - Informatica IICS or PowerCenter

Job Description:
Skill: Informatica PowerCenter or IICS or ODI or Talend
• 5-10 years of professional experience with working knowledge in a Data and Analytics role with a global organization (mandatory).
• Hands-on development experience in an ETL tool: Informatica PowerCenter or IICS or ODI or Talend (mandatory).
• Should have implemented the end-to-end ETL life cycle, including preparing ETL design frameworks and execution.
• Must have rich experience building operational data stores, data marts, and enterprise data warehouses.
• Must have very good SQL skills (specifically in Oracle, MySQL, PostgreSQL).
• Should be able to create and execute ETL designs and test cases, and write complex SQL queries for testing/analysis depending on the functional/technical requirement.
• Should have worked on performance optimization, error handling, writing stored procedures, etc.
• Demonstrated ability to communicate effectively with both technical and business stakeholders.
• Should have good knowledge of SCD Type 1 and SCD Type 2 concepts.
• Should have an understanding of data modelling and data warehousing concepts.

Please fill in the application form below:
https://lnkd.in/gVTyzjmu
Posted 1 month ago
5.0 - 8.0 years
15 - 25 Lacs
Chennai
Hybrid
Warm greetings from SP Staffing!
Role: Informatica Developer
Experience Required: 5 to 8 years
Work Location: Chennai
Required Skills: Informatica PowerCenter
Interested candidates can send resumes to nandhini.spstaffing@gmail.com
Posted 1 month ago
5.0 - 7.0 years
1 - 1 Lacs
Hyderabad
Remote
Databricks and Informatica Intelligent Data Management Cloud (IDMC) consultant
Posted 1 month ago
5.0 - 10.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Contract duration: 6 months
Experience: 5+ years
Location: WFH (should have a good internet connection)

Informatica ETL role:
• Informatica IDMC (Must have)
• SQL knowledge (Must have)
• Data warehouse concepts and ETL design best practices (Must have)
• Data modeling (Must have)
• Snowflake knowledge (Good to have)
• SAP knowledge (Good to have)
• SAP functional knowledge (Good to have)
• Good communication skills, team player, self-motivated, strong work ethic
• Flexibility in working hours: 12pm Central time (overlap with US team)
• Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tools/expertise gaps (fast learner)
Posted 1 month ago
6.0 - 10.0 years
4 - 7 Lacs
Hyderabad
Work from Office
Job Information:
• Job Opening ID: ZR_1762_JOB
• Date Opened: 21/03/2023
• Industry: Technology
• Work Experience: 6-10 years
• Job Title: ETL Informatica
• City: Hyderabad
• Province: Telangana
• Country: India
• Postal Code: 500081
• Number of Positions: 1

Roles & Responsibilities:
• This position requires someone with strong Informatica experience to be part of an application support and maintenance project.
• Gather requirements from clients and work with architects to propose designs for ETL solutions.
• Design ETL via source-to-target mappings and design documents that consider best practices and performance.
• Develop source-to-target mappings in Informatica PowerCenter.
• Expert skills in creating Informatica mappings, sessions, workflows, mapplets, and worklets.
• Work on error handling.
• Test ETL and other technical components and support UAT/test activities.
• Deploy and test code from lower environments to production.
• Expertise in ETL design, ETL and database code development, scheduling, data delivery, and reconciliation.
• Experience in ETL code development, database code development, and SQL writing.
• Effective communicator, confident liaising with business/senior stakeholders.
• Provide L1/L2 support for Informatica job failures.

Experience and Qualifications:
• 6+ years of experience in ETL Informatica PowerCenter development and production support.
• Core skills in Oracle Database.
• Experience in design and development of Informatica mappings and workflows.
• Experience working with enterprise job schedulers and monitoring tools.
• Experience in working with SQL/PLSQL scripting.
• Extensive experience optimizing the performance of mappings and workflows to match the assigned execution windows.
• Team player who can mentor junior support resources.
• Experience with the Control-M scheduler tool preferred.
Posted 1 month ago
5.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Information:
• Job Opening ID: ZR_2334_JOB
• Date Opened: 01/08/2024
• Industry: IT Services
• Work Experience: 5-8 years
• Job Title: Informatica ETL Developer
• City: Bangalore South
• Province: Karnataka
• Country: India
• Postal Code: 560066
• Number of Positions: 1

Contract duration: 6 months
Experience: 5+ years
Location: WFH (should have a good internet connection)

Informatica ETL role:
• Informatica IDMC (Must have)
• SQL knowledge (Must have)
• Data warehouse concepts and ETL design best practices (Must have)
• Data modeling (Must have)
• Snowflake knowledge (Good to have)
• SAP knowledge (Good to have)
• SAP functional knowledge (Good to have)
• Good communication skills, team player, self-motivated, strong work ethic
• Flexibility in working hours: 12pm Central time (overlap with US team)
• Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tools/expertise gaps (fast learner)
Posted 1 month ago
2.0 - 7.0 years
5 - 10 Lacs
Pune, Bangalore Rural
Hybrid
Analyse and develop SSRS, SSAS, SSMS, and SSIS reports, plus Power BI reports and dashboards. Experience in Power BI, Excel, SQL databases, MS Azure, data analytics/visualization, Power BI tools, ETL packages using Visual Studio or Informatica, macros, Python, and R.

Required Candidate Profile: 3-5 years of experience in Power BI, query design tools, Databricks, SQL, data validation, DAX queries/functions, MS Azure, Power BI Desktop, Power BI Service, SSRS, SSAS, ETL reporting and packages, dashboards, and databases.
Posted 2 months ago
8.0 - 13.0 years
9 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Greetings from LTIMindtree!

Dear Candidate,

We have a wonderful job opportunity for Advanced Informatica PowerCenter. I saw your profile on Naukri and was really impressed by your experience with Advanced Informatica Data Quality. As you know, LTIMindtree is growing rapidly, and this is an opportunity to be a part of that journey. We are currently looking for someone like you to join our team. I'd love to tell you a little more about this position and learn a few things about you as well.

If you are interested, please share your updated CV, click the link below, and provide the required details to proceed further:
https://forms.office.com/r/HQ5yLBHY7v
Posted 2 months ago
5.0 - 8.0 years
7 - 11 Lacs
Gurugram
Work from Office
Role Description: As an Informatica PL/SQL Developer, you will be a key contributor to our client's data integration initiatives. You will be responsible for developing ETL processes, performing database performance tuning, and ensuring the quality and reliability of data solutions. Your experience with PostgreSQL, DBT, and cloud technologies will be highly valuable.

Responsibilities:
- Design, develop, and maintain ETL processes using Informatica and PL/SQL.
- Implement ETL processes using DBT with Jinja and automated unit tests (see the illustrative Jinja sketch below).
- Develop and maintain data models and schemas.
- Ensure adherence to best development practices.
- Perform database performance tuning in PostgreSQL.
- Optimize SQL queries and stored procedures.
- Identify and resolve performance bottlenecks.
- Integrate data from various sources, including Kafka/MQ and cloud platforms (Azure).
- Ensure data consistency and accuracy across integrated systems.
- Work within an agile environment, participating in all agile ceremonies.
- Contribute to sprint planning, daily stand-ups, and retrospectives.
- Collaborate with cross-functional teams to deliver high-quality solutions.
- Troubleshoot and resolve data integration and database issues.
- Provide technical support to stakeholders.
- Create and maintain technical documentation for ETL processes and database designs.
- Clearly articulate complex technical issues to stakeholders.

Qualifications and Experience:
- 5 to 8 years of experience as an Informatica PL/SQL Developer or in a similar role.
- Hands-on experience with data models and DB performance tuning in PostgreSQL.
- Experience in implementing ETL processes using DBT with Jinja and automated unit tests.
- Strong proficiency in PL/SQL and Informatica.
- Experience with Kafka/MQ and cloud platforms (Azure).
- Familiarity with ETL processes using DataStage is a plus.
- Strong SQL skills.
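As a rough illustration of the Jinja-templated SQL that DBT models rely on, the sketch below uses plain jinja2 in Python rather than dbt's own runner; the schema, table, and column names are invented, and the jinja2 package is assumed to be installed.

```python
from jinja2 import Template  # third-party: pip install jinja2

# A dbt-style model body: Jinja placeholders parameterize the SQL.
# The table and column names here are invented for the example.
MODEL_SQL = """
SELECT order_id, customer_id, amount, updated_at
FROM {{ source_schema }}.orders
{% if incremental %}
WHERE updated_at > '{{ last_loaded_at }}'
{% endif %}
"""

rendered = Template(MODEL_SQL).render(
    source_schema="staging",
    incremental=True,
    last_loaded_at="2024-06-01",
)
print(rendered)  # the fully rendered SQL that would be submitted to the warehouse
```

In dbt itself the template would live in a .sql model file and be rendered by the dbt CLI, but the parameterization idea is the same.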
Posted 2 months ago
7.0 - 12.0 years
0 Lacs
Hyderabad
Work from Office
Total Experience: 7+ years
Relevant Experience: 7 years in Informatica and TIBCO

Job Description: We are looking for a highly motivated and experienced Senior ETL Developer with strong hands-on experience in Informatica and TIBCO. The ideal candidate will have deep technical expertise in ETL design and development, along with strong problem-solving and analytical skills. Experience in working with distributed teams and knowledge of data warehousing principles is essential.

Key Responsibilities:
• Design, develop, and maintain ETL solutions using Informatica PowerCenter and TIBCO.
• Write and optimize complex SQL and PL/SQL queries to support data extraction, transformation, and loading.
• Collaborate with business and technical teams to gather requirements and deliver data solutions that support business objectives.
• Work effectively within an onshore/offshore delivery model.
• Ensure best practices in ETL development, including performance tuning and data quality assurance.
• Apply dimensional data modeling concepts such as star and snowflake schemas using the Kimball methodology (a minimal star-schema sketch follows below).
• Participate in code reviews, testing, and deployment activities.
• Maintain documentation and ensure compliance with data governance policies.

Mandatory Skills:
• 7+ years of experience in ETL development using Informatica
• Strong hands-on experience in TIBCO
• Proficient in SQL/PL-SQL
• Understanding of data warehousing concepts and dimensional modeling
• Excellent problem-solving, communication, and interpersonal skills

Good to Have:
• Experience with any BI reporting tools (e.g., Tableau, Power BI, Cognos)
• Exposure to Agile methodologies
• Familiarity with industry-standard ETL and data quality tools

Work Environment:
• Opportunity to work in a diverse, collaborative, and fast-paced global team
• Exposure to enterprise-level projects with onshore-offshore delivery models
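For illustration of the star-schema modeling called out above, here is a minimal, self-contained sketch using Python's built-in sqlite3 module: one fact table with foreign keys to two dimension tables, and a typical aggregate query joining them. The table and column names are invented; a Kimball-style design would add surrogate keys, conformed dimensions, and slowly changing dimension handling on top of this basic shape.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables (descriptive attributes)
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    -- Fact table (measures plus foreign keys to the dimensions)
    CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);

    INSERT INTO dim_date    VALUES (20240601, 2024, 6), (20240702, 2024, 7);
    INSERT INTO dim_product VALUES (1, 'ETL Tools'), (2, 'BI Tools');
    INSERT INTO fact_sales  VALUES (20240601, 1, 500.0), (20240702, 1, 300.0),
                                   (20240702, 2, 200.0);
""")

# A typical star-schema query: join the fact table to its dimensions and aggregate.
rows = conn.execute("""
    SELECT d.year, d.month, p.category, SUM(f.amount) AS total_sales
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key    = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, d.month, p.category
    ORDER BY d.year, d.month, p.category
""").fetchall()
print(rows)  # [(2024, 6, 'ETL Tools', 500.0), (2024, 7, 'BI Tools', 200.0), (2024, 7, 'ETL Tools', 300.0)]
```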
Posted 2 months ago
8.0 - 12.0 years
19 - 20 Lacs
Hyderabad
Work from Office
Role & Responsibilities:
Mandatory skills: Informatica, SQL, Teradata, Oracle
• 8+ years of strong ETL Informatica experience.
• Should have Oracle, Hadoop, and MongoDB experience.
• Strong SQL/Unix knowledge.
• Experience in working with RDBMS; preference for Teradata.
• Good to have Big Data/Hadoop experience.
• Good to have Python or other programming knowledge.
Posted 2 months ago
7 - 12 years
20 - 35 Lacs
Bengaluru
Hybrid
• A bachelor's degree in Computer Science or a related field.
• 5-7 years of experience working as a hands-on developer in Sybase, DB2, and ETL technologies.
• Worked extensively on data integration, designing and developing reusable interfaces.
• Advanced experience in Python, DB2, Sybase, shell scripting, Unix, Perl scripting, DB platforms, and database design and modeling.
• Expert-level understanding of data warehouses, core database concepts, and relational database design.
• Experience in writing stored procedures, optimization, and performance tuning.
• Strong technology acumen and a deep strategic mindset.
• Proven track record of delivering results.
• Proven analytical skills and experience making decisions based on hard and soft data.
• A desire and openness to learning and continuous improvement, both of yourself and your team members.
• Hands-on experience in development of APIs is a plus.
• Good to have experience with Business Intelligence tools, Source-to-Pay applications such as SAP Ariba, and Accounts Payable systems.

Skills Required: Familiarity with Postgres and Python is a plus.
Posted 2 months ago
5 - 10 years
5 - 9 Lacs
Pune
Work from Office
Required Experience: 5-8 years
Skills: ETL Informatica, Power BI

Roles and Responsibilities:

BI Development & Data Visualization
• Design, develop, and optimize BI dashboards and reports using Power BI and Tableau, ensuring high-performance and visually appealing analytics.
• Build advanced data visualizations, KPI scorecards, and interactive reports that provide real-time insights for business users.
• Ensure the usability, scalability, and efficiency of all BI solutions, aligning them with business objectives.
• Automate data extraction and processing workflows using Alteryx, improving efficiency and reducing manual effort.
• Work closely with stakeholders from finance, sales, operations, and risk teams to translate business requirements into impactful BI solutions.
• Standardize and document BI processes, ensuring consistency and usability across different teams.

ETL Development & Data Engineering
• Design and implement ETL workflows using Alteryx to automate, transform, and optimize data pipelines.
• Develop and maintain data models, ensuring efficiency, accuracy, and scalability for large datasets.
• Optimize SQL queries and database structures to improve data retrieval speeds and enhance system performance.
• Integrate and manage SAP BEx, SAP BO, and SAP BW data sources for effective reporting and analysis.
• Monitor data quality and consistency, ensuring data governance and compliance with industry standards.

Performance Optimization & Troubleshooting
• Continuously monitor, troubleshoot, and enhance BI dashboards and reports for optimal performance and usability.
• Conduct root-cause analysis for data discrepancies and implement data validation techniques to maintain accuracy.
• Evaluate and benchmark BI tools and performance metrics, identifying areas for process automation and improvement.

Collaboration, Training & Best Practices
• Work closely with senior business leaders, understanding strategic goals and aligning BI solutions accordingly.
• Provide training and support to business users on BI tools and self-service analytics capabilities.
• Stay updated with emerging BI trends, best practices, and technologies, continuously improving internal BI frameworks.
• Act as a mentor and subject matter expert, providing guidance to junior BI engineers and analysts.
• Collaborate with data engineering and IT teams to enhance the data architecture and improve analytics capabilities.

Critical Skills to Possess:
• Bachelor's degree in Computer Science, Data Analytics, Business Intelligence, Information Systems, or a related field.
• 5+ years of experience in BI development, data visualization, and analytics, with a strong track record of delivering high-impact BI solutions.
• Strong expertise in Power BI, Alteryx, and Tableau for dashboard development, analytics, and automation.
• Proficiency in SQL, with hands-on experience in query optimization and database performance tuning.
• Hands-on experience with ETL tools and data automation techniques.
• Excellent problem-solving and analytical skills, with the ability to interpret complex datasets and provide meaningful insights.
• Excellent communication and stakeholder management skills, with the ability to present complex data insights in a clear and actionable manner.

Preferred Qualifications:
• BS degree in Computer Science or Engineering, or equivalent experience
Posted 2 months ago