
2257 Informatica Jobs - Page 41

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site


Company Description

Strategy (Nasdaq: MSTR) is at the forefront of transforming organizations into intelligent enterprises through data-driven innovation. We don't just follow trends; we set them and drive change. As a market leader in enterprise analytics and mobility software, we have pioneered the BI and analytics space, empowering people to make better decisions and revolutionizing how businesses operate.

But that's not all. Strategy is also leading a groundbreaking shift in how companies approach their treasury reserve strategy, boldly adopting Bitcoin as a key asset. This visionary move is reshaping the financial landscape and solidifying our position as a forward-thinking, innovative force in the market. Four years after adopting the Bitcoin Standard, Strategy's stock has outperformed every company in the S&P 500.

Our people are the core of our success. At Strategy, you'll join a team of smart, creative minds working on dynamic projects with cutting-edge technologies. We thrive on curiosity, innovation, and a relentless pursuit of excellence. Our corporate values (bold, agile, engaged, impactful, and united) are the foundation of our culture. As we lead the charge into the new era of AI and financial innovation, we foster an environment where every employee's contributions are recognized and valued. Join us and be part of an organization that lives and breathes innovation every day. At Strategy, you're not just another employee; you're a crucial part of a mission to push the boundaries of analytics and redefine financial investment.

Job Description

Job Location: Working full time from Strategy's Pune office.

We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. This role is crucial in ensuring the integrity, usability, and performance of our data solutions. The ideal candidate will have extensive experience with ETL processes, database design, and Informatica PowerCenter/IICS.
Key Responsibilities

ETL Development and Maintenance: Engage with stakeholders to understand business objectives and design effective ETL processes aligned with organizational goals. Maintain existing ETL processes, ensuring data accuracy and adequate process performance.

Data Warehouse Design & Development: Develop and maintain essential database objects, including tables, views, and stored procedures, to support data analysis and reporting functions. Proficiently utilize SQL queries to retrieve and manipulate data as required.

Data Quality and Analysis: Analyze datasets to identify gaps, inconsistencies, and other quality issues, and devise strategic solutions to enhance data quality. Implement data quality improvement strategies to ensure the accuracy and reliability of data.

Performance Optimization: Investigate and resolve database and query performance issues to ensure optimal system functionality. Continuously monitor system performance and make recommendations for improvements.

Business Collaboration: Collaborate with business users to gather comprehensive data and reporting requirements. Facilitate user-acceptance testing in conjunction with the business, resolving any issues that arise.

Qualifications

- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of hands-on experience with Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS).
- Proven expertise in designing, implementing, and managing ETL processes and data warehouses.
- Proficiency with SQL and experience in optimizing queries for performance.
- Strong analytical skills with the ability to diagnose data issues and recommend comprehensive solutions.
- Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams.
- Detail-oriented with strong problem-solving capabilities.
Additional Information

The recruitment process includes online assessments (English, logic, business) as a first step. We send them via email; please also check your spam folder.
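The ETL responsibilities described above (extract from staging, validate, transform, load) can be sketched in miniature. The sketch below uses Python's sqlite3 with invented table and column names purely for illustration; the actual role would implement the same pattern as Informatica PowerCenter/IICS mappings rather than hand-written code.

```python
import sqlite3

def run_etl(conn):
    """One minimal extract-transform-load pass: pull raw rows from a
    staging table, clean them, and load the survivors into a warehouse
    table. Table and column names are illustrative, not from the posting."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS stg_orders (id INTEGER, amount TEXT, region TEXT)")
    cur.execute("CREATE TABLE IF NOT EXISTS dw_orders (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
    cur.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)",
                    [(1, " 10.50 ", "west"), (2, "n/a", "east"), (3, "7.25", "west")])
    rows = cur.execute("SELECT id, amount, region FROM stg_orders").fetchall()
    loaded = 0
    for row_id, amount, region in rows:
        try:
            value = float(amount.strip())          # transform: trim and cast
        except ValueError:
            continue                               # reject rows that fail validation
        cur.execute("INSERT OR REPLACE INTO dw_orders VALUES (?, ?, ?)",
                    (row_id, value, region.upper()))
        loaded += 1
    conn.commit()
    return loaded

loaded = run_etl(sqlite3.connect(":memory:"))
print(loaded)  # 2 (one row is rejected by validation)
```

The rejected-row path is the part that matters in practice: data quality work of the kind the listing describes is largely about deciding what happens to rows that fail the cast.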

Posted 2 weeks ago


4.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description

Job Title: Informatica IICS Developer
Experience: 2–4 Years
Work Locations: Trivandrum, Kochi, Chennai, Bangalore
Job Type: Full-Time
Mode of Work: Hybrid (3 days work from office)

Job Summary

We are looking for a skilled and motivated Informatica Intelligent Cloud Services (IICS) Developer with 2 to 4 years of experience to join our data engineering team. The ideal candidate should have hands-on experience in developing and maintaining ETL pipelines and data integration solutions in a cloud environment, along with strong skills in SQL, working knowledge of Oracle or Snowflake, and scripting abilities in Unix, PowerShell, or Python.

Key Responsibilities

- Design, develop, and deploy ETL processes using Informatica IICS.
- Build scalable and efficient data integration workflows for transforming and moving data across systems.
- Work with Snowflake (or similar platforms) for data warehousing and ensure efficient data load/query performance.
- Write, test, and optimize complex SQL queries for data extraction, transformation, and validation.
- Create automation scripts using Unix shell scripting, PowerShell, or Python for operational and support tasks.
- Monitor and troubleshoot ETL jobs and data pipelines, ensuring high reliability and performance.
- Collaborate with data architects, analysts, and stakeholders to understand data needs and implement solutions.
- Participate in code reviews and maintain documentation of workflows and best practices.

Mandatory Skills

- Hands-on experience with Informatica IICS (Informatica Cloud).
- Proficient in SQL, with experience in Oracle or Snowflake databases.
- Scripting ability in Unix, PowerShell, or Python.
- Solid understanding of ETL concepts and cloud-based data integration.

Good-to-Have Skills

- Exposure to data warehousing and cloud platforms.
- Familiarity with CI/CD tools, version control, or deployment pipelines.
- Knowledge of data governance or metadata management practices.
Qualifications

- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2 to 4 years of experience in ETL development and data integration using Informatica IICS.

Why Join Us

- Work with modern cloud data technologies in a collaborative environment.
- Be part of real-world enterprise data transformation initiatives.
- Flexible opportunities to work from the Trivandrum, Kochi, Chennai, or Bangalore offices.

Skills: Informatica, IICS, Snowflake, SQL
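One of the operational-support scripting tasks this listing mentions, reconciling row counts between a source and a target after a load, can be sketched as a small Python helper. The table names and counts below are invented for illustration; a real script would pull the counts from the two databases.

```python
def reconcile_counts(source_counts, target_counts, tolerance=0):
    """Compare per-table row counts from a source and a target system
    and return the tables whose difference exceeds the tolerance."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        if abs(src - tgt) > tolerance:
            mismatches[table] = {"source": src, "target": tgt}
    return mismatches

# Hypothetical post-load counts from a source database and a Snowflake target.
src = {"customers": 1000, "orders": 5400}
tgt = {"customers": 1000, "orders": 5398}
print(reconcile_counts(src, tgt))  # {'orders': {'source': 5400, 'target': 5398}}
```

A `tolerance` parameter is a common design choice when the target is loaded continuously and exact counts are only expected to converge, not match at every instant.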

Posted 2 weeks ago


0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

Preferred Education: Master's degree

Required Technical and Professional Expertise

- Expertise in data warehousing, information management, data integration, and business intelligence using the ETL tool Informatica PowerCenter
- Knowledge of cloud, Power BI, and data migration on cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, Big Data, etc.

Preferred Technical and Professional Experience

- Knowledge of the MS Azure cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python

Posted 2 weeks ago


3.0 - 5.0 years

5 - 8 Lacs

Hyderabad

Work from Office


- Understanding of Spark core concepts: RDDs, DataFrames, Datasets, Spark SQL, and Spark Streaming.
- Experience with Spark optimization techniques.
- Deep knowledge of Delta Lake features such as time travel, schema evolution, and data partitioning.
- Ability to design and implement data pipelines using Spark, with Delta Lake as the data storage layer.
- Proficiency in Python/Scala/Java for Spark development and integration with ETL processes.
- Knowledge of data ingestion techniques from various sources (flat files, CSV, APIs, databases).
- Understanding of data quality best practices and data validation techniques.

Other Skills:

- Understanding of data warehouse concepts and data modelling techniques.
- Expertise in Git for code management.
- Familiarity with CI/CD pipelines and containerization technologies.
- Nice to have: experience using data integration tools like DataStage, Prophecy, Informatica, or Ab Initio.
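The map/filter/reduce pipeline style behind the Spark core concepts listed above can be illustrated with plain Python builtins. This is only an analogy under simplifying assumptions: the records are invented, and real work on this stack would use PySpark DataFrames executing the same chain in parallel across partitions.

```python
from functools import reduce

# Toy "records" standing in for rows read from a partitioned source.
records = [
    {"region": "west", "amount": 10.0},
    {"region": "east", "amount": 4.0},
    {"region": "west", "amount": 6.0},
]

# map -> filter -> reduceByKey, expressed with plain Python.
mapped = [(r["region"], r["amount"]) for r in records]    # map to (key, value)
filtered = [kv for kv in mapped if kv[1] >= 5.0]          # filter out small amounts
totals = reduce(
    lambda acc, kv: {**acc, kv[0]: acc.get(kv[0], 0.0) + kv[1]},
    filtered, {})                                         # reduceByKey: sum per region
print(totals)  # {'west': 16.0}
```

The reason the distinction between transformations (map, filter) and the final reduction matters in Spark is that transformations are lazy and only the reduction forces execution, which is what makes the optimization techniques the listing asks about possible.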

Posted 2 weeks ago


9.0 - 14.0 years

8 - 18 Lacs

Hyderabad

Work from Office


- Hands-on experience in Hadoop and Informatica development, with 4+ years of expertise in the Big Data technology stack (candidates are required to have all three in combination)
- Serve as the first point of contact for customers seeking technical assistance over the phone, by email, or via an ITIL tool
- Has knowledge of next-generation technologies
- Should have an understanding of the Big Data ecosystem and the relevant technologies across: a) data processing frameworks such as Spark, as well as related tools like Informatica, Talend, Pentaho, and NiFi; b) data transfer frameworks such as Kafka, Flume, and Sqoop; c) data storage such as HDFS and NoSQL data stores; d) programming languages such as Scala, Java, or Python; e) architectural understanding of Hadoop and the relevant tooling from major Hadoop distribution vendors like Cloudera and Hortonworks; f) data warehousing and SQL frameworks (Hive, Impala); g) requirement gathering and solution design
- Has experience on multiple Big Data projects such as data lake implementation, EDW augmentation/ETL off-loading, and NoSQL-based real-time data stores
- Architects solutions by mapping customer business problems to end-to-end technology solutions; seeks out accountability and takes the lead when required
- Demonstrates consulting skill with effective communication and engagement with customers (internal and external)
- Should be able to drive the architectural detailing of a solution covering functional as well as non-functional requirements such as security, performance, and availability. Should be able to clearly provide the pros and cons of relevant approaches and technologies as applicable
- Demonstrates effectiveness in fostering executive-level relationships
- Is resourceful, confident under pressure, and has demonstrated skill in both expectation and crisis management
- Open-source contributions to the Big Data space are a nice-to-have attribute
- Strong ITIL experience and a service-delivery mindset

Posted 2 weeks ago


0.0 - 1.0 years

2 - 3 Lacs

Chennai

Work from Office


Ford Credit's Tech Team in India is actively seeking a highly skilled and experienced Senior Test Data Management Engineer to join our team. In this crucial role, you will be responsible for defining, implementing, and managing test data strategies and solutions specifically for our complex financial applications. Leveraging your deep expertise with IBM Optim, you will ensure that our testing environments are populated with realistic, compliant, and high-quality data, enabling robust and efficient testing across various phases. You'll play a key role in safeguarding sensitive financial data through effective masking and subsetting techniques while supporting our development and QA teams. This is a senior-level position requiring deep technical expertise, strategic thinking, and the ability to mentor others and drive TDM excellence throughout the organization.

Must Have:

- 10+ years of overall experience in IT, with a strong focus on Quality Assurance, Data Management, or Software Engineering.
- 6+ years of dedicated experience in Test Data Management (TDM).
- Proven experience implementing and managing TDM solutions for complex enterprise applications, preferably in the financial services industry.
- Strong hands-on experience with industry-standard TDM tools like IBM Optim and open-source tools.
- Experience working in highly regulated environments with a strong understanding of data privacy and compliance challenges in finance.
- Strong SQL skills and experience working with various relational databases (e.g., Oracle, SQL Server, DB2, PostgreSQL, BQ, etc.).
- Solid understanding of data modeling concepts and database structures.
- Proficiency in data masking, subsetting, and synthetic data generation techniques.
- Experience with scripting languages (e.g., Python, Shell, Perl) for automation and data manipulation.
- Experience with RBAC; has worked with infra teams to achieve CI/CD automation to produce masked test data from production on demand.
- Familiarity with the Linux/Unix command line.
- Solid understanding of the data refresh process.
- Solid understanding of financial industry data structures, workflows, and testing challenges (e.g., trading, payments, banking, accounting, regulatory reporting).
- In-depth knowledge of relevant data privacy regulations (e.g., GDPR, CCPA) and their impact on test data handling.
- Excellent analytical and problem-solving skills with the ability to tackle complex data challenges.
- Strong communication and interpersonal skills, with the ability to explain technical concepts to non-technical stakeholders.
- Must have experience working with US public companies, with a strong understanding of security processes and how to apply them to Test Data Management (TDM) to ensure compliance with regulations.
- Ability to work effectively both independently and as a leader or contributor within a team.
- Proven ability to mentor junior team members and drive adoption of best practices.

Nice to Have:

- Experience with specific TDM tools like Informatica TDM, Alteryx, PyETL, Deequ, Google DVT, etc.
- Experience with data virtualization tools.
- Experience with AI for synthetic data generation.
- Experience with cloud platforms (AWS, Azure, GCP) and cloud database services.
- Experience integrating TDM processes into CI/CD pipelines.
- Familiarity with performance testing concepts and data needs.
- Relevant certifications in TDM, databases, or cloud technologies.

Preferred Qualification: Bachelor's degree in Computer Science, Engineering, or equivalent work experience; minimum of 6+ years as a Test Data Management Engineer.

Test Data Management Engineer - Role & Responsibilities:

TDM Strategy & Design: Define, develop, and implement comprehensive Test Data Management (TDM) strategies, frameworks, and processes tailored for financial applications. Analyze complex application data models and relationships to design effective data subsetting, masking, and generation solutions using IBM Optim. Collaborate with stakeholders (QA, Development, Business Analysts, Compliance) to understand test data requirements and translate them into technical TDM solutions.

TDM Solution Implementation & Management: Design, configure, and execute TDM processes using relevant TDM tools (subsetting, masking, generation, synthetic data creation, data refresh). Implement and manage data masking/obfuscation techniques to comply with data privacy regulations (e.g., GDPR, CCPA) and internal policies for sensitive financial data. Manage the lifecycle of test data environments, including planning refresh cycles, executing data provisioning requests, and managing data retention. Troubleshoot and resolve complex test data-related issues across different environments.

Data Analysis & Provisioning: Perform in-depth data analysis to identify critical data elements, sensitive data, and complex data relationships required for various testing cycles (functional, performance, UAT). Provision timely and relevant test data sets to different testing environments based on project needs. Troubleshoot and resolve test data-related issues, ensuring data integrity and quality.

Compliance & Security: Ensure all TDM activities adhere strictly to internal data governance policies and external financial regulations regarding data privacy and security. Work closely with Compliance, Security, and Audit teams to validate TDM processes and controls.

Performance & Automation: Optimize IBM Optim processes and underlying database interactions for performance and efficiency. Identify opportunities for automation in test data provisioning and management workflows.

Collaboration & Business Alignment: Establish and promote TDM best practices, standards, and guidelines across the organization. Create and maintain detailed documentation for TDM processes, tools, and environments. Work closely with Database Administrators (DBAs) to manage test data storage, performance, and access. Collaborate with DevOps engineers to integrate TDM processes into CI/CD pipelines where applicable. Collaborate closely with Product Owners, Business Analysts, and Software Engineers to understand complex financial requirements, define precise testing criteria, and prioritize automation efforts.
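The data masking this listing emphasizes can be sketched in a few lines of Python. This is a simplified, assumed approach (deterministic hashing with a fixed salt), not IBM Optim's actual masking algorithm; it illustrates why deterministic masking matters for TDM, namely that the same production value masks to the same token everywhere, so joins and referential integrity across subsetted tables still hold.

```python
import hashlib

def mask_value(value, salt="demo-salt"):
    """Deterministically mask a sensitive value: the same input always
    yields the same token, so cross-table joins still line up, but the
    original cannot be read back from the token. The fixed salt here is
    illustrative only; a real TDM setup would manage keys in a vault."""
    digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
    return "MASK_" + digest[:12]

account = "4111-1111-1111-1111"
print(mask_value(account) == mask_value(account))  # True: stable across tables
print(mask_value(account) == mask_value("other"))  # False: distinct inputs differ
```

Synthetic data generation, the other technique the listing names, is the complementary approach: instead of transforming production values, it fabricates rows that never contained real data in the first place.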

Posted 2 weeks ago


2.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office


About Sanofi: We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people's lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions.

Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing & Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives.

Who You Are: You are a dynamic Data Engineer interested in challenging the status quo to design and develop globally scalable solutions that are needed by Sanofi's advanced analytics, AI, and ML initiatives for the betterment of our global patients and customers. You are a valued influencer and leader who has contributed to making key datasets available to data scientists, analysts, and consumers throughout the enterprise to meet vital business needs. You have a keen eye for improvement opportunities while continuing to fully comply with all data quality, security, and governance standards.

Our vision for digital, data analytics and AI: Join us on our journey in enabling Sanofi's digital transformation through becoming an AI-first organization. This means:

- AI Factory - versatile teams operating in cross-functional pods: utilizing digital and data resources to develop AI products, bringing data management, AI, and product development skills to products, programs, and projects to create an agile, fulfilling, and meaningful work environment.
- Leading-edge tech stack: experience building products that will be deployed globally on a leading-edge tech stack.
- World-class mentorship and training: working with renowned leaders and academics in machine learning to further develop your skill set.

There are multiple vacancies across our Digital profiles and the NA region. Further assessments will be completed to determine the specific function and level of hired candidates.

Job Highlights:

- Propose and establish technical designs to meet business and technical requirements
- Develop and maintain data engineering solutions based on requirements and design specifications using appropriate tools and technologies
- Create data pipelines/ETL pipelines and optimize performance
- Test and validate developed solutions to ensure they meet requirements
- Coach other members of the data engineering teams on workflows, technical topics, and pipeline management
- Create design and development documentation based on standards for knowledge transfer, training, and maintenance
- Work with business and product teams to understand requirements and translate them into technical needs
- Adhere to and promote best practices and standards for code management, automated testing, and deployments
- Leverage existing or create new standard data pipelines within Sanofi to bring value through business use cases
- Develop automated tests for CI/CD pipelines
- Gather/organize large and complex data assets, and perform relevant analysis
- Conduct peer reviews for quality, consistency, and rigor for production-level solutions
- Actively contribute to the Data Engineering community and define leading practices and frameworks
- Communicate results and findings in a clear, structured manner to stakeholders
- Remain up to date on the company's standards, industry practices, and emerging technologies

Key Functional Requirements & Qualifications:

- Experience working with cross-functional teams to solve complex data architecture and engineering problems
- Demonstrated ability to learn new data and software engineering technologies in a short amount of time
- Good understanding of agile/scrum development processes and concepts
- Able to work in a fast-paced, constantly evolving environment and manage multiple priorities
- Strong technical analysis and problem-solving skills related to data and technology solutions
- Excellent written, verbal, and interpersonal skills with the ability to communicate ideas, concepts, and solutions to peers and leaders
- Pragmatic and capable of solving complex issues, with technical intuition and attention to detail
- Service-oriented, flexible, and approachable team player
- Fluent in English (other languages a plus)

Key Technical Requirements & Qualifications:

- Bachelor's degree or equivalent in Computer Science, Engineering, or a relevant field
- 6+ years of experience in data engineering, integration, data warehousing, business intelligence, business analytics, or a comparable role with relevant technologies and tools, such as Spark/Scala and Informatica/IICS/dbt
- Understanding of data structures and algorithms
- Working knowledge of scripting languages (Python, shell scripting)
- Experience with cloud-based data platforms (Snowflake is a plus)
- Experience with job scheduling and orchestration (Airflow is a plus)
- Good knowledge of SQL and relational database technologies/concepts
- Experience working with data models and query tuning

Nice to haves:

- Experience working in the life sciences/pharmaceutical industry
- Familiarity with data ingestion through batch, near real-time, and streaming environments
- Familiarity with data warehouse concepts and architectures (data mesh a plus)
- Familiarity with source code management tools (GitHub a plus)

Pursue Progress. Discover Extraordinary.

Better is out there. Better medications, better outcomes, better science. But progress doesn't happen without people: people from different backgrounds, in different locations, doing different roles, all united by one thing, a desire to make miracles happen. So, let's be those people.

Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!

Sanofi is an equal opportunity employer committed to diversity and inclusion. Our goal is to attract, develop and retain highly talented employees from diverse backgrounds, allowing us to benefit from a wide variety of experiences and perspectives. We welcome and encourage applications from all qualified applicants. Accommodations for persons with disabilities required during the recruitment process are available upon request. Thank you in advance for your interest. Only those candidates selected for interviews will be contacted.

Posted 2 weeks ago


5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office


Our vision for the future is based on the idea that transforming financial lives starts by giving our people the freedom to transform their own. We have a flexible work environment and fluid career paths. We not only encourage but celebrate internal mobility. We also recognize the importance of purpose, well-being, and work-life balance. Within Empower and our communities, we work hard to create a welcoming and inclusive environment, and our associates dedicate thousands of hours to volunteering for causes that matter most to them. Chart your own path and grow your career while helping more customers achieve financial freedom. Empower Yourself.

You will help shape our journey in realizing increased business value from our data for our business partners. This team works on building data pipelines, integration projects, and data warehousing to facilitate business intelligence, visualization, and analytics for the Canada Life community. You will be part of a winning team that leads the way in delivering value for our business, helping accelerate cloud adoption, building Azure Data Factory and ETL processes, writing complex SQL, and supporting operational processes in both our data warehouse and our Microsoft Azure data hub. This role requires a strong technical background and solid development experience in data warehousing, ETL development tools, data modeling and design, and experience in the MS Azure / Azure Data Factory / Databricks ecosystem.

Essential Functions:

- Build data pipelines on the MS Azure platform that support the consumption of structured and unstructured data from various sources, using Azure Data Factory and Databricks
- Collaborate with cross-functional teams to analyze, transform, and model large, complex data sets for business insights, operational reporting, and analytics
- Develop data solutions that facilitate re-usability across various business functions
- Develop and support the creation of conceptual, logical, and physical data models
- Lead and support the migration of on-premises data solutions to the MS Azure platform
- Actively contribute to and improve our data solutions and best practices
- Collaborate with internal technical stakeholders and business partners to solve data-related technical issues
- Advise others on data modelling and the use of database technologies
- Enhance existing databases

Qualifications:

- Bachelor's degree in Computer Science, Mathematics, Statistics, or a related field
- 5+ years of hands-on experience in developing data solutions with a strong background in data warehousing (incorporating a Kimball dimensional design methodology)
- Strong relational database development experience in MS SQL is required; exposure to NoSQL databases is a plus
- Experience in developing ETL constructs using Informatica or an equivalent tool
- Recent development experience on the MS Azure platform (Databricks, ADF), Kafka, Python (or R), Spark, and Change Data Capture (CDC)
- Good communication and collaboration skills
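Change Data Capture (CDC), mentioned in the qualifications, is the problem of classifying row-level changes between two states of a table as inserts, updates, or deletes. A toy snapshot-diff version in Python is below; the keys and values are invented, and production CDC (e.g., in Databricks or ADF pipelines) typically reads the database transaction log rather than comparing full snapshots.

```python
def diff_snapshots(old, new):
    """Naive change-data-capture by comparing two keyed snapshots of a
    table: returns (inserts, updates, deletes). Log-based CDC avoids the
    full scan, but the change classification is the same."""
    inserts = {k: v for k, v in new.items() if k not in old}
    updates = {k: v for k, v in new.items() if k in old and old[k] != v}
    deletes = {k: v for k, v in old.items() if k not in new}
    return inserts, updates, deletes

# Two hypothetical snapshots of a customers table, keyed by id.
old = {1: "alice", 2: "bob", 3: "carol"}
new = {1: "alice", 2: "bobby", 4: "dave"}
print(diff_snapshots(old, new))
# ({4: 'dave'}, {2: 'bobby'}, {3: 'carol'})
```

The three change sets are exactly what an incremental warehouse load applies: insert the new keys, update the changed ones, and soft- or hard-delete the missing ones.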

Posted 2 weeks ago


10.0 - 12.0 years

35 - 40 Lacs

Bengaluru

Work from Office


Job Title: Senior Technical Specialist
Education: B.E./B.Tech/MCA/B.Sc.
Experience: 10+ Years
Location: Bangalore/Hyderabad/Mumbai
Key Skills: SQL, PL/SQL / T-SQL, Data Warehouse, Snowflake, Matillion, Python, PySpark

Qualifications:

- 10+ years of ETL and/or Business Intelligence experience
- Proficient SQL writing skills
- Strong Snowflake developer with extensive development experience and the data analysis skills required to develop a new, complex data warehouse
- At least 3 full years of recent Snowflake development experience
- Hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe; able to administer and monitor the Snowflake computing platform
- Hands-on experience with data loads and managing a cloud DB
- Evaluate Snowflake design considerations for any change in the application
- Build the logical and physical data model for Snowflake as per the changes required
- Define the roles and privileges required to access different database objects
- Define virtual warehouse sizing for Snowflake for different types of workloads
- Design and code required database structures and components
- Deploy fully operational data warehouse solutions into production on Snowflake
- Experience in the creation and modification of user accounts and security groups per request
- Handling large and complex sets of XML, JSON, and CSV from various sources and databases
- Solid grasp of database engineering and design
- Experience using Matillion and an understanding of data integration tools
- Good knowledge of cloud computing on AWS and/or Azure
- Experience with scripting languages, preferably Python
- Experience writing code that aggregates and transforms data from multiple data sources
- Experience designing, building, and optimizing analytics models in support of downstream BI platforms
- Experience with relational databases
- Knowledge of Git source control and CI/CD
- Strong technical writing/documentation skills
- Effective written and oral communication skills
- Experience with processes that extract data from different sources, transform the data into a usable and trusted resource, and load that data into systems end users can access and use downstream to solve business problems (ETL/ELT processes)

Nice to have:

- Scripting with Python
- SnowPro certification
- Experience with an ETL tool, like Informatica, DataStage, or Matillion
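Handling complex JSON, as this listing requires, usually means flattening nested documents into relational columns before or during the load (Snowflake exposes this as the LATERAL FLATTEN table function). A minimal Python sketch of the idea is below; the record is invented, and a real load would also need to handle arrays and type casting.

```python
def flatten(record, prefix=""):
    """Flatten nested JSON-style dicts into dotted column paths, the
    shape a staging load or a FLATTEN-style query would produce.
    A sketch only: arrays and type coercion are deliberately omitted."""
    flat = {}
    for key, value in record.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, path))   # recurse into nested objects
        else:
            flat[path] = value                  # leaf becomes a column
    return flat

doc = {"id": 7, "customer": {"name": "Acme", "address": {"city": "Pune"}}}
print(flatten(doc))  # {'id': 7, 'customer.name': 'Acme', 'customer.address.city': 'Pune'}
```

Dotted paths are a common convention because they round-trip: the original nesting can be reconstructed from the column names alone.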

Posted 2 weeks ago


4.0 - 6.0 years

13 - 18 Lacs

Pune

Work from Office


ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it , our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage an d passion to drive life-changing impact to ZS. Our most valuable asset is our people . At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. What You’ll Do Take primary ownership in driving both self and team efforts across all phases of the project lifecycle, ensuring alignment with business objectives. Translate business requirements into technical specifications and lead team efforts to design, build, and manage technology solutions that effectively address business problems. Develop and apply advanced statistical models and leverage analytic techniques to utilize data for guiding decision-making for clients and internal teams. Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely project completion. 
Partner with project and program leads to deliver projects and assist in project management responsibilities, including project planning, people management, staffing, and risk mitigation. Collaborate with team members globally, ensuring seamless communication, sharing responsibilities, and undertaking tasks effectively. Manage a diverse team of skill sets (programmers, cloud analysts, BI developers, reporting, operations, etc.), mentoring and coaching junior members to enhance their skills and capabilities. Lead task planning and distribution across team members, ensuring timely completion with high quality and providing accurate status reports to senior management. Design custom analyses in programming languages (e.g., R, Python), data visualization tools (e.g., Tableau), and other analytical platforms (e.g., SAS, Visual Basic, Excel) to address client needs. Synthesize and communicate results to clients and internal teams through compelling oral and written presentations. Create project deliverables and implement solutions, while exhibiting a continuous improvement mindset and the capability to learn new technologies, business domains, and project management processes. Guide and mentor Associates within teams, fostering a collaborative environment and enhancing team performance. Demonstrate advanced problem-solving skills, ensuring the team continuously improves its capabilities and approaches to challenges. Exhibit a proactive approach to decision-making, considering the broader picture, especially regarding technical nuances and strategic planning. What You’ll Bring: Education: Bachelor’s or Master’s degree in Computer Science, Engineering, MIS, or related fields, with strong academic performance, especially in analytic and quantitative coursework. 
Experience: Consulting Industry: 4-6 years of relevant consulting experience, ideally in medium-to-large scale technology solution delivery projects. Technical Skills: 1+ year of hands-on experience in data processing solutions, data modeling, and experience with ETL technologies (e.g., Hadoop, Spark, PySpark, Informatica, Talend, SSIS). Proficiency in programming languages like Python, SQL, Java, Scala, and understanding of data structures. Experience with cloud platforms such as AWS, Azure, or GCP, and exposure to distributed computing. Deep expertise in SQL and data management best practices, with a focus on data analytics and visualization. Consulting/Project Leadership: Proven experience leading project teams and managing end-to-end delivery, mentoring team members, and maintaining high standards. Ability to translate complex data and analytics concepts into accessible presentations and frameworks for both technical and non-technical stakeholders. Deep understanding of data management best practices and data analytics methodologies, ensuring high-quality data insights. Effective in a global team environment with a readiness to travel as needed. Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel: Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. 
While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At www.zs.com

Posted 2 weeks ago

Apply

2.0 - 7.0 years

10 - 14 Lacs

Gurugram

Work from Office


ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage and passion to drive life-changing impact to ZS. Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. What you’ll do: Build complex solutions for clients using programming languages, ETL service platforms, Cloud, etc. 
Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements; Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments; Collaborate with other team members to leverage expertise and ensure seamless transitions; Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management; Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, and technical architecture (if needed), test cases, and operations management; Bring transparency in driving assigned tasks to completion and report accurate status; Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise and collaborate across other teams; Assist senior team members and delivery leads in project management responsibilities. What you’ll bring: Bachelor's degree with specialization in Computer Science, IT or other computer-related disciplines with a record of academic success; Up to 2 years of relevant consulting industry experience working on small/medium-scale technology solution delivery engagements; Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc.; Experience in data warehousing & SQL; Exposure to Cloud Platforms will be a plus - AWS, Azure, GCP. 
Additional Skills: Strong verbal and written communication skills with ability to articulate results and issues to internal and client teams; Proven ability to work creatively and analytically in a problem-solving environment; Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects; Willingness to travel to other global offices as needed to work with client or other internal project teams. Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel: Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. 
If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At www.zs.com

Posted 2 weeks ago

Apply

2.0 - 7.0 years

10 - 14 Lacs

Pune

Work from Office


ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage and passion to drive life-changing impact to ZS. Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. What you'll do: Build complex solutions for clients using programming languages, ETL service platforms, Cloud, etc. 
Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements; Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments; Collaborate with other team members to leverage expertise and ensure seamless transitions; Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management; Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, and technical architecture (if needed), test cases, and operations management; Bring transparency in driving assigned tasks to completion and report accurate status; Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise and collaborate across other teams; Assist senior team members and delivery leads in project management responsibilities. What you'll bring: Bachelor's degree with specialization in Computer Science, IT or other computer-related disciplines with a record of academic success; Up to 2 years of relevant consulting industry experience working on small/medium-scale technology solution delivery engagements; Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc.; Experience in data warehousing & SQL; Exposure to Cloud Platforms will be a plus - AWS, Azure, GCP. 
Strong verbal and written communication skills with ability to articulate results and issues to internal and client teams; Proven ability to work creatively and analytically in a problem-solving environment; Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects; Willingness to travel to other global offices as needed to work with client or other internal project teams. Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel: Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. 
If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At www.zs.com

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Hyderabad, Chennai

Work from Office


Are you ready to make an impact at DTCC? Pay and Benefits: Competitive compensation, including base pay and annual incentive. Comprehensive health and life insurance and well-being benefits, based on location. Pension/retirement benefits. Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical and financial well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (Tuesdays, Wednesdays, and a day unique to each team or employee). The impact you will have in this role: The Data Quality and Integration role is a highly technical position; the holder is considered a technical expert in system implementation, with an emphasis on providing design, ETL, data quality, and warehouse modeling expertise. This role will be accountable for knowledge of capital development efforts. Performs, at an experienced level, the technical design of application components; builds applications and interfaces between applications; and understands data security, retention, and recovery. Can research technologies independently and recommend appropriate solutions. Contributes to technology-specific best practices and standards; contributes to success criteria from design through deployment, including reliability, cost-effectiveness, performance, data integrity, maintainability, reuse, extensibility, usability, and scalability; provides expertise on significant application components, vendor products, programming languages, databases, operating systems, etc.; completes the plan by building components, testing, configuring, tuning, and deploying solutions. The Software Engineer (SE) for Data Quality and Integration applies specific technical knowledge of data quality and data integration to assist in the design and construction of critical systems. The SE works as part of an AD project squad and may interact with the business, Functional Architects, and domain experts on related integrating systems. 
The SE will contribute to the design of components or individual programs and participate fully in construction and testing. This involves working with the Senior Application Architect and other technical contributors at all levels. This position contributes expertise to project teams through all phases, including post-deployment support. This means researching specific technologies and applications, contributing to the solution design, supporting development teams, testing, troubleshooting, and production support. The SE must possess experience in integrating large volumes of data efficiently and in a timely manner. This position requires working closely with the functional and governance functions, and more senior technical resources, reviewing technical designs and specifications, and contributing to cost estimates and schedules. What You'll Do: Technology Expertise: serves as a domain expert on one or more of: programming languages; vendor products (specifically Informatica Data Quality and Informatica Data Integration Hub); DTCC applications; data structures; business lines. 
Platforms: works with Infrastructure partners to stand up development, testing, and production environments. Elaboration: works with the Functional Architect to ensure designs satisfy functional requirements. Data Modeling: reviews and extends data models. Data Quality Concepts: experience in data profiling, scorecards, monitoring, matching, and cleansing; is aware of frameworks that promote concepts of isolation, extensibility, and extendibility. System Performance: contributes to solutions that satisfy performance requirements; constructs test cases and strategies that account for performance requirements; tunes application performance issues. Security: implements solutions and completes test plans, mentoring other team members in standard processes. Standards: is aware of technology standards and understands that technical solutions need to be consistent with them. Documentation: develops and maintains system documentation. Is familiar with different software development methodologies (Waterfall, Agile, Scrum, Kanban). Aligns risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalates appropriately. Educational background and work experience that includes mathematics and conversion of expressions into run-time executable code. Ensures own and team's practices support success across all geographic locations. Mitigates risk by following established procedures and monitoring controls, spotting key errors and demonstrating strong ethical behavior. Helps roll out standards and policies to other team members. Financial Industry Experience: including trades, clearing, and settlement. Education: Bachelor's degree or equivalent experience. Talents Needed for Success: Minimum of 3+ years in Data Quality and Integration. 
Basic understanding of logical data modeling and database design is a plus. Technical experience with multiple database platforms (Sybase, Oracle, DB2) and distributed databases (Teradata/Greenplum/Redshift/Snowflake) containing high volumes of data. Knowledge of data management processes and standard methodologies preferred. Proficiency with Microsoft Office tools required. Supports the team in managing client expectations and resolving issues on time. Strong technical and analytical skills highly preferred.
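The data-profiling and scorecard concepts this role calls for can be sketched in a few lines. The following is a minimal, hypothetical illustration in plain Python with sqlite3 (an Informatica Data Quality scorecard would report similar column-level metrics; the table and column names here are invented, not DTCC's):

```python
import sqlite3

# Column-level profiling sketch: for each column, measure completeness
# (non-null rate) and cardinality (distinct values) -- the kind of
# metrics a data-quality scorecard tracks. All names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id INTEGER, cusip TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [(1, "037833100", 500.0), (2, None, 120.5),
     (3, "594918104", None), (4, "037833100", 75.0)],
)

def profile(conn, table, columns):
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    scorecard = {}
    for col in columns:
        # COUNT(col) skips NULLs, so non_null / total is the completeness rate
        non_null = conn.execute(f"SELECT COUNT({col}) FROM {table}").fetchone()[0]
        distinct = conn.execute(f"SELECT COUNT(DISTINCT {col}) FROM {table}").fetchone()[0]
        scorecard[col] = {"completeness": non_null / total, "distinct": distinct}
    return scorecard

print(profile(conn, "trades", ["trade_id", "cusip", "amount"]))
```

In a real engagement the thresholds (e.g., flagging any column below 95% completeness) would be driven by the governance rules the listing mentions, not hard-coded.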

Posted 2 weeks ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Mumbai

Hybrid


PF Detection is mandatory. Job Description: Minimum 5 years of experience in database development and ETL tools. Strong expertise in SQL and database platforms (e.g., SQL Server, Oracle, PostgreSQL). Proficiency in ETL tools (e.g., Informatica, SSIS, Talend, DataStage) and scripting languages (e.g., Python, Shell). Experience with data modeling and schema design. Familiarity with cloud databases and ETL tools (e.g., AWS Glue, Azure Data Factory, Snowflake). Understanding of data warehousing concepts and best practices.
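The extract–transform–load pattern at the heart of this role can be shown in a minimal sketch. This is a hypothetical plain-Python/SQLite illustration (real ETL tools like Informatica or SSIS express the same three stages visually; every table and column name here is invented):

```python
import sqlite3

# Minimal ETL sketch: extract from a source table, transform in Python,
# load into a target dimension table. All schema names are hypothetical.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_customers (id INTEGER, name TEXT, country TEXT)")
src.executemany("INSERT INTO raw_customers VALUES (?, ?, ?)",
                [(1, "  alice ", "IN"), (2, "BOB", None), (3, "carol", "US")])

# Extract: pull the raw rows
rows = src.execute("SELECT id, name, country FROM raw_customers").fetchall()

# Transform: trim and title-case names, reject rows missing a country
clean = [(i, n.strip().title(), c) for (i, n, c) in rows if c is not None]

# Load: write the cleansed rows into the target table
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, country TEXT)")
tgt.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)", clean)
print(tgt.execute("SELECT * FROM dim_customer ORDER BY id").fetchall())
```

Rejected rows (here, the record with no country) would normally be routed to an error table for data-quality review rather than silently dropped.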

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Delhi, India

On-site


Experience: More than 3 years in data integration, pipeline development, and data warehousing, with a strong focus on AWS Databricks. Technical Skills: Proficiency in Databricks platform management and optimization. Strong experience in AWS Cloud, particularly in data engineering and administration, with expertise in Apache Spark, S3, Athena, Glue, Kafka, Lambda, Redshift, and RDS. Proven experience in data engineering performance tuning and analytical understanding in business and program contexts. Solid experience in Python development, specifically in PySpark within the AWS Cloud environment, including experience with Terraform. Knowledge of databases (Oracle, SQL Server, PostgreSQL, Redshift, MySQL, or similar) and advanced database querying. Experience with source control systems (Git, Bitbucket) and Jenkins for build and continuous integration. Understanding of continuous deployment (CI/CD) processes. Experience with Airflow and additional Apache Spark knowledge is advantageous. Exposure to ETL tools, including Informatica.
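The PySpark-style work described above typically reduces to map/aggregate pipelines over keyed records. As a rough, hypothetical illustration, here is a plain-Python analogue of a `map`/`reduceByKey` aggregation (PySpark itself is not imported, and the event data and field names are invented):

```python
# Plain-Python analogue of a PySpark aggregation: group event records
# by user and sum amounts -- the shape of a typical map/reduceByKey
# pipeline. Data and field names are invented for illustration.
events = [
    {"user": "u1", "amount": 10.0},
    {"user": "u2", "amount": 5.0},
    {"user": "u1", "amount": 2.5},
]

# map step: emit (key, value) pairs
pairs = [(e["user"], e["amount"]) for e in events]

# reduceByKey step: fold values together per key
def reduce_by_key(pairs):
    totals = {}
    for key, value in pairs:
        totals[key] = totals.get(key, 0.0) + value
    return totals

print(reduce_by_key(pairs))  # {'u1': 12.5, 'u2': 5.0}
```

In actual PySpark the same shape would be a DataFrame `groupBy("user").sum("amount")` distributed across the cluster; the local version just makes the key/value logic visible.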

Posted 2 weeks ago

Apply

6.0 - 11.0 years

7 - 15 Lacs

Bengaluru

Work from Office


Position : IICS Developer Experience : 6 + Years Location : Bangalore Notice Period : Immediate to 15 Days Primary Skill : IICS Developer, IICS, Informatica Intelligent Cloud Services, Informatica, ETL Best Regards, Shaheen J shaheen.jameelahmed@buzzworks.in

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Hyderabad

Work from Office


Role - Informatica Customer 360 Cloud MDM Expert Client: Global Pharmaceutical Leader Engagement: 6-month contract Mode: Remote Skills: 5+ years in Data Management with 3+ years in Informatica Customer 360 Cloud, MDM, and Reference 360. Led end-to-end MDM implementations and data modeling. Strong in data governance, quality rules, and dashboards. Hands-on with API/batch integrations and performance tuning. Skilled in troubleshooting, documentation, and stakeholder communication.

Posted 2 weeks ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Noida

Work from Office


About the Role: Grade Level (for internal use): 09. S&P Global – Dow Jones Indices. About the Role: Software Developer - Enterprise Data Management. The Team: We are seeking a highly skilled Enterprise Data Management (EDM) Software Engineer to join our dynamic team. This role will focus on building, enhancing, and optimizing our enterprise data management solutions, ensuring efficient data processing, governance, and integration across multiple platforms. The ideal candidate will have a strong background in data engineering, software development, and enterprise data architecture. Responsibilities and Impact: Design, develop, and maintain robust EDM solutions to support business needs. Implement data ingestion, validation, and transformation pipelines for large-scale structured and unstructured data. Develop and optimize SQL databases for data storage, retrieval, and reporting. Ensure high data quality and compliance with regulatory and security requirements. Collaborate with analysts and business stakeholders to design scalable data solutions. Automate data workflows, monitoring, and alerting to improve system performance and resilience. Work on system integrations, including APIs, ETL/ELT processes, and cloud-based data services. Troubleshoot and resolve data-related technical issues, ensuring high availability and reliability. Stay up to date with industry trends and emerging technologies in data management and cloud computing. What We’re Looking For: Basic Required Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field. 1 to 10 years of experience in software development with a focus on enterprise data management. Strong proficiency in SQL and Python for data processing and automation. Experience with relational and NoSQL databases. Hands-on experience with ETL/ELT tools and frameworks (e.g., EDM, Informatica). Familiarity with AWS and its data services. 
Strong understanding of data governance, metadata management, and data security best practices. Excellent problem-solving skills, analytical mindset, and ability to work in an agile environment. Effective communication skills to collaborate with cross-functional teams. We are a global team, and the candidate should be flexible in their work hours. Additional Preferred Qualifications: Experience with data modeling, master data management (MDM), and data lineage tools. Knowledge of financial or market data processing and corporate actions is a plus. Experience working in a DevOps environment with CI/CD pipelines. About S&P Dow Jones Indices: At S&P Dow Jones Indices, we provide iconic and innovative index solutions backed by unparalleled expertise across the asset-class spectrum. By bringing transparency to the global capital markets, we empower investors everywhere to make decisions with conviction. We’re the largest global resource for index-based concepts, data and research, and home to iconic financial market indicators, such as the S&P 500® and the Dow Jones Industrial Average®. More assets are invested in products based upon our indices than any other index provider in the world. With over USD 7.4 trillion in passively managed assets linked to our indices and over USD 11.3 trillion benchmarked to our indices, our solutions are widely considered indispensable in tracking market performance, evaluating portfolios and developing investment strategies. S&P Dow Jones Indices is a division of S&P Global (NYSE: SPGI). S&P Global is the world’s foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world’s leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit www.spglobal.com/spdji. What’s In It For You: Our Purpose: Progress is not a self-starter. 
It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We’re more than 35,000 strong worldwide—so we’re able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world’s leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. 
Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit https://spgbenefits.com/benefit-summaries. Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. ----------------------------------------------------------- Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. 
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)

Posted 2 weeks ago

Apply

7.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Naukri logo

Job location: Bangalore. Experience: 7-10 years. Job Description: Must have hands-on experience (minimum 6-8 years) in SnapLogic pipeline development with good debugging skills. Experience migrating ETL jobs into SnapLogic, platform moderation, and cloud exposure on AWS. Good to have: SnapLogic developer certification and hands-on experience in Snowflake. Should be strong in SQL, PL/SQL, and RDBMS. Should be strong in ETL tools like DataStage, Informatica, etc., with data quality. Proficiency in configuring SnapLogic components, including snaps, pipelines, and transformations. Designing and developing data integration pipelines using the SnapLogic platform to connect various systems, applications, and data sources. Building and configuring SnapLogic components such as snaps, pipelines, and transformations to handle data transformation, cleansing, and mapping. Experience in designing, developing, and deploying reliable solutions. Ability to work with business partners and provide long-lasting solutions. SnapLogic integration and pipeline development. Staying updated with the latest SnapLogic features, enhancements, and best practices to leverage the platform effectively.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Summary: Markit EDM Proficiency (Must Have): In-depth knowledge and hands-on experience with Markit EDM, including development, configuration, customization, and troubleshooting. Proficient with Data Constructors, Data Porters, Core Matcher, Mastering UI, Data Illustrators, and the various Builders (Table, View, Stored Procedure, Function, and Rule). Experience managing multiple Markit EDM environments and migrating components. Responsibilities: Data Management (Nice to Have): Strong understanding of data management principles and practices. Ability to design and implement data models, workflows, and data validation rules. Database Skills (Must Have): Proficiency in SQL and experience working with relational databases. Knowledge of database optimization techniques and performance tuning. Experience with Oracle and SQL Server databases. Programming (Nice to Have): Skills in programming languages with an ETL focus commonly used in conjunction with Markit EDM, such as C# and Informatica. Ability to write scripts and automate tasks as needed. Experience working in an Agile environment. Integration and API Knowledge (Nice to Have): Experience integrating Markit EDM with other systems and applications. Knowledge of APIs and data exchange protocols for seamless integration. Financial Industry Knowledge (Must Have): Familiarity with financial instruments, markets, and industry-specific data requirements. Ability to translate requirements into deliverables.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Job Title: Data Quality Lead. Department: Data Governance / IT. Location: Pune / Bangalore. Experience: 6-8 years. Notice period: 30 days. Key Responsibilities: Lead the development and implementation of the enterprise-wide data quality program. Define and monitor key data quality metrics across various business domains. Collaborate with IT and data governance teams to establish and enforce data governance policies and frameworks. Conduct regular data quality assessments to identify gaps and areas for improvement. Implement data cleansing, validation, and enrichment processes to enhance data accuracy and reliability. Preferred Skills: Experience with tools like Informatica, Talend, Collibra, or similar. Familiarity with regulatory requirements. Certification in Data Management or Data Governance.
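The data-quality responsibilities described in this role (defining metrics, running validation rules, surfacing gaps) can be sketched in plain Python. This is only an illustrative toy, not any vendor's API: the rule names, field names, and pass-rate metric are all invented for the example.

```python
# Hypothetical rule-based data-quality check: each rule is a named
# predicate over a record; the metric is the fraction of records passing.
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]  # True when the record passes

def run_quality_checks(records, rules):
    """Return a per-rule pass rate, a simple data-quality metric."""
    results = {}
    for rule in rules:
        passed = sum(1 for r in records if rule.check(r))
        results[rule.name] = passed / len(records) if records else 1.0
    return results

rules = [
    QualityRule("customer_id not null", lambda r: r.get("customer_id") is not None),
    QualityRule("email has @", lambda r: "@" in (r.get("email") or "")),
]

records = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": None, "email": "b@example.com"},
    {"customer_id": 3, "email": "not-an-email"},
]

print(run_quality_checks(records, rules))
```

Commercial tools such as Informatica Data Quality or Collibra express the same idea declaratively; the value of the pattern is that pass rates can be tracked over time as the "key data quality metrics" the posting mentions.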

Posted 2 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Software Development Engineer. Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work. Must-have skills: SAP BW/4HANA. Good-to-have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users while adhering to best practices in software development. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Mentor junior team members to enhance their skills and knowledge. Continuously evaluate and improve development processes to increase efficiency. Professional & Technical Skills: Must-have skills: Proficiency in SAP BW/4HANA. Strong understanding of data modeling and ETL processes. Experience with the SAP HANA database and its functionalities. Familiarity with reporting tools and techniques within the SAP ecosystem. Ability to troubleshoot and resolve technical issues effectively. Additional Information: The candidate should have a minimum of 7.5 years of experience in SAP BW/4HANA. This position is based at our Bengaluru office. A 15 years full-time education is required.

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Introduction: We believe that every candidate brings something special to the table, including you! So, even if you feel that you're close but not an exact match, we encourage you to apply. We'd be thrilled to receive applications from exceptional individuals like yourself. Gallagher, a global industry leader in insurance, risk management, and consulting services, boasts a team of over 50,000 professionals worldwide. Our culture, known as "The Gallagher Way," is driven by shared values and a passion for excellence. At the heart of our global operations, the Gallagher Center of Excellence (GCoE) in India, founded in 2006, upholds the values of quality, innovation, and teamwork. With 10,000+ professionals across five India locations, GCoE is where knowledge-driven individuals make a significant impact and build rewarding, long-term careers. How You'll Make An Impact: Should be able to work as an individual contributor and maintain good relationships with stakeholders. Should be proactive in learning new skills per business requirements. Familiar with extracting relevant data and cleansing and transforming data into insights that drive business value, through the use of data analytics, data visualization, and data modeling techniques. About You:
  • Strong MS Power BI skills.
  • Good experience in the HR & Talent Data and Analytics domain.
  • Excellent knowledge of SQL.
  • Good MS Excel skills.
  • Data analysis and business analysis.
  • Database management and reporting.
  • Critical thinking and problem solving.
  • Excellent verbal and written communication skills.
  • Review and validate customer data as it's collected.
  • Oversee the deployment of data to the data warehouse.
  • Cooperate with the IT department to deploy software and hardware upgrades that make it possible to leverage big data use cases.
  • Monitor analytics and metrics results.
  • Good knowledge of ETL packages using Visual Studio or Informatica.
  • Implement new data analysis methodologies.
  • Perform data profiling to identify and understand anomalies.
  • Python & R.
3+ years of relevant experience Additional Information We value inclusion and diversity Inclusion and diversity (I&D) is a core part of our business, and it’s embedded into the fabric of our organization. For more than 95 years, Gallagher has led with a commitment to sustainability and to support the communities where we live and work. Gallagher embraces our employees’ diverse identities, experiences and talents, allowing us to better serve our clients and communities. We see inclusion as a conscious commitment and diversity as a vital strength. By embracing diversity in all its forms, we live out The Gallagher Way to its fullest. Gallagher believes that all persons are entitled to equal employment opportunity and prohibits any form of discrimination by its managers, employees, vendors or customers based on race, color, religion, creed, gender (including pregnancy status), sexual orientation, gender identity (which includes transgender and other gender non-conforming individuals), gender expression, hair expression, marital status, parental status, age, national origin, ancestry, disability, medical condition, genetic information, veteran or military status, citizenship status, or any other characteristic protected (herein referred to as “protected characteristics”) by applicable federal, state, or local laws. Equal employment opportunity will be extended in all aspects of the employer-employee relationship, including, but not limited to, recruitment, hiring, training, promotion, transfer, demotion, compensation, benefits, layoff, and termination. In addition, Gallagher will make reasonable accommodations to known physical or mental limitations of an otherwise qualified person with a disability, unless the accommodation would impose an undue hardship on the operation of our business.
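The Gallagher role above lists "data profiling to identify and understand anomalies" as a skill. A minimal, stdlib-only sketch of what column profiling means in practice is shown below; the column names and data are made up for illustration, and real profiling would add type inference, min/max, and pattern checks.

```python
# Toy column profiler: for each column, count nulls, count distinct
# non-null values, and report the most frequent value. These three stats
# alone often surface anomalies (unexpected nulls, low cardinality, skew).
from collections import Counter

def profile(rows):
    """rows: list of dicts sharing the same keys. Returns per-column stats."""
    stats = {}
    if not rows:
        return stats
    for col in rows[0]:
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        stats[col] = {
            "null_count": len(values) - len(non_null),
            "distinct": len(set(non_null)),
            "top": Counter(non_null).most_common(1),
        }
    return stats

rows = [
    {"dept": "HR", "salary": 50},
    {"dept": "HR", "salary": None},
    {"dept": "IT", "salary": 70},
]
print(profile(rows))
```

Dedicated tools (Informatica profiling, pandas `describe`) compute the same shape of summary at scale; the point is that profiling is aggregation per column, not row-by-row inspection.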

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Job Description: Value Proposition: Responsible for designing and building data pipelines for enterprise data through ETL/ELT processes. Develop and maintain large-scale data platforms, data lakes, and cloud solutions. Job Details: Position Title: Data Engineer II. Career Level: P2. Job Category: Senior Associate. Role Type: Hybrid. Job Location: Bengaluru. About the Team: The data engineering team is a community of dedicated professionals committed to designing, building, and maintaining data platform solutions for the organization. Impact (Job Summary/Why this Role Matters): The enterprise data warehouse supports several critical business functions for the bank, including Regulatory Reporting, Finance, Risk Steering, and Customer 360. This role is vital for building and maintaining the enterprise data platform and data processes, and for supporting business objectives. Our values of inclusivity, transparency, and excellence drive everything we do. Join us and make a meaningful impact on the organization. Key Deliverables (Duties and Responsibilities): Responsible for building and maintaining a data platform that supports data integrations for the Enterprise Data Warehouse, Operational Data Store, Data Marts, etc., with appropriate data access, data security, data privacy, and data governance. Create data ingestion pipelines in data warehouses and other large-scale data platforms. Create data ingestion pipelines for a variety of sources: files (flat, delimited, Excel), databases, APIs (with Apigee integration), and SharePoint. Build reusable data pipelines and frameworks using Python. Create scheduled as well as trigger-based ingestion patterns using scheduling tools. Create performance-optimized DDLs for row-based or columnar databases such as Oracle, Postgres, or Netezza per the Logical Data Model. Performance tuning of complex data pipelines and SQL queries. Perform impact analysis of proposed changes on existing architecture, capabilities, system priorities, and technology solutions. 
Working in an Agile framework: participating in various agile ceremonies; coordinating with the scrum master, tech lead, and PO on sprint planning, backlog creation, refinement, demo, and retrospective. Working with Product Owners to understand PI goals, PI planning, requirement clarification, and delivery coordination. Technical support for production incidents and failures. Work with global technology teams across different time zones (primarily US) to deliver timely business value. Skills and Qualifications (Functional and Technical Skills): Functional Skills: 5+ years of experience, 3+ years relevant to Snowflake. Team Player: Support peers, team, and department management. Communication: Excellent verbal, written, and interpersonal communication skills. Problem Solving: Excellent problem-solving skills, incident management, root cause analysis, and proactive solutions to improve quality. Partnership and Collaboration: Develop and maintain partnerships with business and IT stakeholders. Attention to Detail: Ensure accuracy and thoroughness in all tasks. Technical/Business Skills: Data Engineering: Experience in designing and building data warehouses and data lakes. Good knowledge of data warehouse principles and concepts. Technical expertise working with large-scale data warehousing applications and databases such as Oracle, Netezza, Teradata, and SQL Server. Experience with public cloud-based data platforms, especially Snowflake and AWS. Data Integration Skills: Expertise in the design and development of complex data pipeline solutions using industry-leading ETL tools such as SAP Business Objects Data Services (BODS), Informatica Cloud Data Integration Services (IICS), or IBM DataStage. Knowledge of ELT tools such as dbt, Fivetran, and AWS Glue. Expert in SQL, with development experience in at least one scripting language (Python, etc.), and adept at tracing and resolving data integrity issues. 
Data Modeling: Knowledge of logical and physical data models using relational or dimensional modeling practices, and of high-volume ETL/ELT processes. Performance tuning of data pipelines and DB objects to deliver optimal performance. Experience with GitLab version control and CI/CD processes. Experience working in the financial industry is a plus. Relationships & Collaboration: Reports to: Associate Director - Data Engineering. Partners: Senior leaders and cross-functional teams. Collaborates: A team of Data Engineering associates. Accessibility Needs: We are committed to providing an inclusive and accessible hiring process. If you require accommodations at any stage (e.g. application, interviews, onboarding), please let us know, and we will work with you to ensure a seamless experience.
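The Data Engineer role above asks for "reusable data pipelines / frameworks using Python" with a common load step across file, database, and API sources. As a hedged sketch of that pattern (not the employer's actual framework), here is a minimal extract-and-load pipeline using only the standard library, with an invented CSV layout and table name:

```python
# Sketch of a reusable ingestion framework: pluggable extractors return
# lists of dicts, and a single generic load step writes any of them to a
# target table. SQLite stands in for the real warehouse.
import csv
import io
import sqlite3

def extract_csv(text, delimiter=","):
    """File-based source: parse delimited text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text), delimiter=delimiter))

def load(conn, table, rows):
    """Generic load step shared by all sources; returns rows loaded."""
    if not rows:
        return 0
    cols = list(rows[0])
    conn.execute(f"CREATE TABLE IF NOT EXISTS {table} ({', '.join(cols)})")
    conn.executemany(
        f"INSERT INTO {table} ({', '.join(cols)}) "
        f"VALUES ({', '.join('?' * len(cols))})",
        [tuple(r[c] for c in cols) for r in rows],
    )
    return len(rows)

conn = sqlite3.connect(":memory:")
raw = "id,amount\n1,100\n2,250\n"  # hypothetical delimited feed
n = load(conn, "staging_payments", extract_csv(raw))
print(n, conn.execute("SELECT COUNT(*) FROM staging_payments").fetchone()[0])
```

In a production framework the extractor family would grow (Excel, JDBC, API-with-Apigee, SharePoint), and a scheduler such as Airflow would provide the scheduled and trigger-based patterns the listing mentions; the reusable part is the shared load contract.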

Posted 2 weeks ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Pune

Work from Office

Naukri logo

This team is responsible for Business Analytics at Seagate. About the role - you will: Be responsible for SAP BODS support and development projects. Main tasks include requirements analysis, conception, and implementation/development of solutions per requirements. Work closely with different cross-functional teams to develop solutions related to BODS. Architect, develop, and maintain BODS jobs. Design and develop complex dataflows and workflows. Responsible for delivering from offshore on time, on schedule, within scope, adopting industry best practices and quality. About you: Excellent verbal and written communication skills and analytical skills. Well versed in working with an offshore/onsite model. Ability to articulate and clearly communicate complex problems and solutions in a simple, logical, and impactful manner in a virtual collaboration environment. Good at problem solving and a team player. Your experience includes: Good development experience with the SAP BODS tool. Experience in the design and development of ETL dataflows and jobs. Experience with data integration from SAP and non-SAP systems to SAP BW/4HANA and Enterprise HANA using SAP Business Objects Data Services (BODS). Good experience with delta data processing concepts. Experience with all Data Services transformations, such as Map Operation, Table Comparison, Row Generation, History Preserving, Query, and SQL transformations. Experience integrating non-SAP/cloud systems with SAP BW/4HANA using Data Services. Experience in SQL/PL-SQL. Good to have: BODS administration experience. Good to have: SAP BW knowledge and experience. Knowledge of SDI/SDA/Informatica is a plus. Location: Our site in Pune is dynamic, both in our cutting-edge, innovative work and in our vibrant on-site food, athletic, and personal development opportunities for our employees. You can enjoy breakfast, lunch, or dinner from one of four cafeterias in the park. 
Take a break from your workday and participate in one of our many walkathons, or compete against your colleagues in carrom, chess, and table tennis. Learn about a technical topic outside your area of expertise at one of our monthly Technical Speaker Series, or attend one of the frequent on-site cultural festivals, celebrations, and community volunteer opportunities. Location: Pune, India. Travel: None.

Posted 2 weeks ago

Apply

Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies based on experience and expertise:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:

  1. Junior Developer
  2. Informatica Developer
  3. Senior Developer
  4. Informatica Tech Lead
  5. Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:

  • SQL
  • Data warehousing
  • ETL tools
  • Data modeling
  • Data analysis

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
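One of the harder questions in the list above concerns slowly changing dimensions. In Informatica PowerCenter, a Type 2 SCD is typically built with Lookup, Router, and Update Strategy transformations (expire the current dimension row, insert a new version). As a hand-rolled illustration of that same logic, here is a small Python/SQLite sketch; the table and column names are invented for the example:

```python
# Type 2 SCD by hand: when an attribute changes, close the current row
# (set valid_to and is_current=0) and insert a new current version.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER
);
INSERT INTO dim_customer VALUES (1, 'Pune', '2024-01-01', NULL, 1);
""")

def apply_scd2(conn, customer_id, new_city, as_of):
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[0] == new_city:
        return  # no change: nothing to do (the "unchanged" router group)
    # expire the current version, then insert the new one
    conn.execute(
        "UPDATE dim_customer SET valid_to=?, is_current=0 "
        "WHERE customer_id=? AND is_current=1", (as_of, customer_id))
    conn.execute(
        "INSERT INTO dim_customer VALUES (?,?,?,NULL,1)",
        (customer_id, new_city, as_of))

apply_scd2(conn, 1, 'Bengaluru', '2024-06-01')
print(conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall())
```

The history is preserved: the old 'Pune' row survives with `is_current = 0`, and queries "as of" a date filter on the `valid_from`/`valid_to` window. In an interview, being able to narrate this flow in terms of the Lookup/Router/Update Strategy transformations is usually what the question is probing.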

Closing Remark

As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies