
2331 Informatica Jobs - Page 36

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 6.0 years

3 - 7 Lacs

Salem

Work from Office


- Grade Specific Skills (competencies)

Posted 1 week ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Chennai

Work from Office


Your Role: As a senior software engineer with Capgemini, you should have 4+ years of experience as a Snowflake Data Engineer with a strong project track record. In this role you will need: strong customer orientation, decision-making, problem-solving, communication, and presentation skills; very good judgement and the ability to shape compelling solutions and solve unstructured problems under assumptions; very good collaboration skills and the ability to interact with multicultural, multi-functional teams spread across geographies; strong executive presence and spirit; superb leadership and team-building skills, with the ability to build consensus and achieve goals through collaboration rather than direct line authority. Your Profile: 4+ years of experience in data warehousing and cloud data solutions. Minimum 2+ years of hands-on experience with end-to-end Snowflake implementation. Experience in developing data architecture and roadmap strategies, with the knowledge to establish data governance and quality frameworks within Snowflake. Expertise or strong knowledge of Snowflake best practices, performance tuning, and query optimisation. Experience with cloud platforms like AWS or Azure, and familiarity with Snowflake's integration with these environments.
Strong knowledge of at least one cloud (AWS or Azure) is mandatory. Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark Shell Script, Snowflake, Spark, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
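The Snowflake performance-tuning and incremental-implementation skills listed above often come down to patterns like MERGE-based upserts from a staging table instead of full reloads. A minimal sketch, with hypothetical table and column names (none of these come from the posting):

```python
# Hypothetical sketch: composing a Snowflake-style MERGE statement for an
# incremental load. Table/column names are illustrative assumptions.

def build_merge_sql(target: str, staging: str, key: str, cols: list[str]) -> str:
    """Return a MERGE that upserts staged delta rows into the target table."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    val_list = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({val_list})"
    )

sql = build_merge_sql("dw.orders", "stg.orders_delta", "order_id", ["status", "amount"])
print(sql)
```

Running the MERGE against a small staged delta touches far fewer micro-partitions than rewriting the whole target table, which is where the query-optimisation benefit comes from.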

Posted 1 week ago

Apply

8.0 - 13.0 years

0 Lacs

Chennai, Bengaluru

Work from Office


Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Hands-on experience with Informatica PowerCenter or IICS. Worked on various basic and advanced transformations in Informatica across multiple source systems. Experience creating source-to-target mapping documents. Gather requirements from stakeholders and convert them into detailed design documents. Ability to debug issues in Informatica IICS and PowerCenter. Worked on API and REST V2 connectors. Hands-on experience in Oracle/SQL.
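The source-to-target mapping documents mentioned above can be thought of as a table of (source field, target field, transformation rule) rows. A toy sketch of applying such a mapping in plain Python — the field names and rules are invented for illustration, and this is not an Informatica API:

```python
# Illustrative only: a source-to-target mapping represented as data,
# then applied row by row (mimicking an expression transformation).
MAPPING = [
    # (source_field, target_field, transformation)
    ("CUST_NM", "customer_name", lambda v: v.strip().title()),
    ("ORD_AMT", "order_amount",  lambda v: round(float(v), 2)),
    ("ORD_DT",  "order_date",    lambda v: v),  # pass-through
]

def apply_mapping(row: dict) -> dict:
    """Produce the target-side record from a source record per MAPPING."""
    return {tgt: fn(row[src]) for src, tgt, fn in MAPPING}

out = apply_mapping({"CUST_NM": "  john smith ", "ORD_AMT": "99.999", "ORD_DT": "2025-06-01"})
print(out)
```

Keeping the mapping as data rather than code is what makes the mapping document itself the single source of truth for the transformation.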

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Summary We are seeking an experienced Integration Developer with hands-on experience to join our development team. In this role you will be responsible for designing, developing, and maintaining integration solutions that connect various systems and applications within the organization. You will work with a range of technologies to ensure seamless data exchange, workflow automation, and overall system interoperability. Responsibilities The ideal candidate will have a solid understanding of integration patterns, API development, and cloud-based solutions, and will be able to collaborate effectively with cross-functional teams. Design and develop robust integration solutions using middleware technologies, APIs, and services to ensure smooth communication between various internal and external systems. Create and maintain data flows, transformations, and mappings between different platforms, databases, and applications. Leverage enterprise integration patterns and best practices to implement scalable, secure, and high-performing integrations. Design and implement RESTful APIs, web services, and microservices for integrating applications across various platforms. Work with API gateways and manage the API lifecycle from design to deployment, ensuring security, versioning, and performance optimization. Integrate third-party APIs and services into existing systems, ensuring seamless functionality and data exchange. Ensure data synchronization and integration between disparate systems (CRM, ERP, HRMS, cloud platforms, etc.) while maintaining data consistency and quality. Design, develop, and implement ETL (Extract, Transform, Load) processes for efficient data migration and integration between systems. Troubleshoot and resolve integration issues, ensuring minimal downtime and impact to business operations. Work closely with business analysts, project managers, and other developers to gather requirements and deliver integration solutions aligned with business needs.
Collaborate with infrastructure and cloud teams to design and implement scalable integration solutions leveraging cloud platforms (e.g. AWS, Azure, GCP). Coordinate with QA teams to ensure thorough testing of integration components, ensuring they meet performance, security, and functional requirements. Contribute to the development of integration best practices and guidelines to ensure consistent, high-quality solutions across the organization. Act as a key point of contact for troubleshooting integration issues, providing timely resolution and post-mortem analysis for recurring problems. Provide support to operational teams for maintaining the health of integration solutions in production environments. Skills Experience in developing and implementing integration solutions using middleware technologies such as MuleSoft, Dell Boomi, Apache Camel, or IBM Integration Bus (IIB). Strong experience with RESTful APIs, SOAP web services, and microservices. Proficiency in integration technologies like JMS, Kafka, and RabbitMQ for message-driven integrations. Experience with ETL tools (e.g. Talend, Informatica) for data transformation and loading. Strong knowledge of SQL and experience working with relational and NoSQL databases (e.g. MySQL, PostgreSQL, MongoDB). Familiarity with cloud-based integration solutions (AWS, Azure, GCP), API management platforms (e.g. Apigee, Kong, or AWS API Gateway), and containerization technologies (e.g. Docker, Kubernetes). Skills in programming languages such as Java, Python, JavaScript, or similar for building integration solutions. Experience with scripting languages for automating integration tasks (e.g. Bash, PowerShell, Python). Experience in debugging integration problems, identifying root causes, and implementing corrective actions. Excellent verbal and written communication skills, with the ability to document integration solutions, processes, and guidelines clearly. Good to have: Telecom domain knowledge.
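At the heart of the data-mapping work this role describes is translating a payload from one system's schema to another's. A minimal sketch using only the standard library; the CRM/ERP field names are invented for illustration:

```python
import json

# Minimal sketch of schema translation between two hypothetical systems.
# All field names here are illustrative assumptions, not a real API.

def crm_to_erp(payload: str) -> str:
    """Map a CRM-style JSON record onto an ERP-side schema."""
    src = json.loads(payload)
    dst = {
        "customerId": src["id"],
        "fullName": f'{src["first_name"]} {src["last_name"]}',
        "creditLimit": src.get("credit", 0),  # default when source omits it
    }
    return json.dumps(dst)

msg = crm_to_erp('{"id": 7, "first_name": "Asha", "last_name": "Rao", "credit": 5000}')
print(msg)
```

In a middleware platform (MuleSoft, Boomi, Camel) this same mapping would typically live in a declarative transformation step rather than hand-written code, but the shape of the problem is identical.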

Posted 1 week ago

Apply

3.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote


About This Role At BlackRock, we are looking for a Data Engineer who enjoys building and supporting high-impact data pipelines to solve complex challenges while working closely with colleagues throughout the business. We recognize that strength comes from diversity, and will embrace your outstanding skills, curiosity, drive, and passion while giving you the opportunity to grow technically while learning from hands-on leaders in technology and finance. With over USD $11 trillion of assets, we have an outstanding responsibility: our technology empowers millions of investors to save for retirement, pay for college, buy a home, and improve their financial wellbeing. Being a financial technologist at BlackRock means you get the best of both worlds: working for one of the most successful financial companies while also working in a software development team responsible for next-generation technology and solutions. We are seeking a high-reaching individual to help implement financial data engineering projects, initially focusing on our Index Fixed Income Group for the BGM DnA ("Data and Analytics") team in India. We are a community of highly qualified Data Engineers and Content & DevOps Specialists who have a passion for working on data solutions that help drive the agenda for our business partners. Our team is based in San Francisco, London, and Hungary, and we will complete the global circle with a new engineering team in Mumbai. About BlackRock Global Markets BlackRock Global Markets ("BGM") functions are at the core of BlackRock's markets and investments platform, including ETF and Index Investments ("Engine"), Global Trading, Securities Lending, Fixed Income, Liquidity and Financing. BGM is passionate about advancing the investment processes and platform architecture in these areas and ensuring we engage with other market participants in a collaborative, strategic way. You should be: Someone who is passionate about solving sophisticated business problems through data!
Capable of designing, implementing, and optimizing data pipelines, ETL processes, and data storage solutions. Able to work closely with multi-functional teams (e.g., Data Science, Product, Analytics, and Citizen Developer teams) to ensure the data infrastructure meets business needs. Enthusiastic about establishing and maintaining standard methodologies for data engineering, focusing on data quality, security, and scalability. Key Requirements 3-6 years of data engineering experience, preferably in the financial sector. Familiarity with any aspect of Fixed Income Index and Market Data, including ICE, Bloomberg, JP Morgan, FTSE/Russell, and IBOXX; Liquidity, Venue, and Direct Broker Dealer Market Maker Axe Data; and Pricing Data from sources like S&P Global Live Bond Pricing or Bloomberg's IBVAL. Understanding of portfolio management fundamentals: asset management and FI trading. A passion for financial and capital markets. Proven experience working in an agile development team. Strong problem-solving skills. Strong SQL and Python skills with a proven track record of optimizing SQL queries. Curiosity about financial markets. Good To Have Bachelor's degree in Computer Science, Engineering, Finance, Economics, or a related field; a Master's degree or equivalent experience is a plus. Knowledge of Linux and scripting languages such as Bash. Experience with MySQL, PostgreSQL, Greenplum, Snowflake, or similar databases. Strong experience with ETL/ELT tools like DBT, Pentaho, Informatica, or similar technologies. Experience with DevOps and tools like Azure DevOps. Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.
Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
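The role above calls for data-quality-focused pipeline work. A hedged sketch of the kind of validation gate such pipelines typically run before data reaches downstream consumers; the rules and field names are illustrative assumptions, not BlackRock's actual checks:

```python
# Hypothetical data-quality gate: validate rows before loading downstream.
# Field names and rules are invented for illustration.
RULES = {
    "price":    lambda v: v is not None and v >= 0,
    "currency": lambda v: v in {"USD", "EUR", "INR"},
}

def partition_rows(rows):
    """Split rows into (valid, rejected) according to RULES."""
    valid, rejected = [], []
    for row in rows:
        ok = all(check(row.get(field)) for field, check in RULES.items())
        (valid if ok else rejected).append(row)
    return valid, rejected

good, bad = partition_rows([
    {"price": 101.5, "currency": "USD"},
    {"price": -3.0,  "currency": "USD"},  # fails the non-negative price rule
])
print(len(good), len(bad))
```

Routing failures to a reject queue rather than dropping them silently is what makes the quality check auditable.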

Posted 1 week ago

Apply

4.0 - 9.0 years

3 - 8 Lacs

Bengaluru

Work from Office


We are organizing a direct walk-in drive at Bengaluru location. Please find below details and skills for which we have a walk-in at TCS - Bengaluru on 14th June 2025 Experience: 4 - 8 years Skill Name :- (1) SharePoint Developer (2) SPFX (3) PowerPlatform (4) AWS Data Engineer (5) AWS Devops (6) Azure Dot Net Fullstack Developer (7) Azure Data Engineer (8) Azure devops (9) Data Analyst (10) Node JS (11) Informatica Developer (12) Java Springboot / Microservice

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description Process Manager - GCP Data Engineer Mumbai/Pune | Full-time (FT) | Technology Services Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel Requirements - NA The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors. Process Manager Roles And Responsibilities Participate in stakeholder interviews, workshops, and design reviews to define data models, pipelines, and workflows. Analyse business problems and propose data-driven solutions that meet stakeholder objectives. Experience working on-premise as well as on cloud platforms (AWS/GCP/Azure). Should have extensive experience in GCP with a strong focus on BigQuery, and will be responsible for designing, developing, and maintaining robust data solutions to support analytics and business intelligence needs (GCP is preferable over AWS & Azure). Design and implement robust data models to efficiently store, organize, and access data for diverse use cases. Design and build robust data pipelines (Informatica / Fivetran / Matillion / Talend) for ingesting, transforming, and integrating data from diverse sources.
Implement data processing pipelines using various technologies, including cloud platforms, big data tools, and streaming frameworks (optional). Develop and implement data quality checks and monitoring systems to ensure data accuracy and consistency. Technical And Functional Skills Bachelor's degree with 5+ years of experience, including 3+ years of relevant hands-on experience in GCP with BigQuery. Good knowledge of at least one database scripting platform (Oracle preferable). Work will involve analysis, development of code/pipelines at a modular level, reviewing peers' code, performing unit testing, and owning push-to-prod activities. 5+ years of work experience, having worked as an individual contributor for 5+ years. Direct interaction and deep dives with VPs of deployment. Should work with cross-functional teams/stakeholders. Participate in backlog grooming and task prioritization. Worked with the Scrum methodology. GCP certification desired. About eClerx eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.
About eClerx Technology eClerx’s Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
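The BigQuery pipeline work described in this posting commonly uses an incremental-watermark pattern: each run reads only partitions newer than the last successfully loaded date. A sketch of composing such a query as a string; the project/dataset/table names are made up, and `_PARTITIONDATE` is BigQuery's pseudo-column for date-partitioned tables:

```python
from datetime import date

# Illustrative sketch of incremental ingestion against a date-partitioned
# BigQuery table. Table name is a made-up example.

def incremental_query(table: str, watermark: date) -> str:
    """Select only partitions strictly after the stored watermark."""
    return (
        f"SELECT * FROM `{table}` "
        f"WHERE _PARTITIONDATE > DATE '{watermark.isoformat()}'"
    )

q = incremental_query("demo-project.sales.events", date(2025, 6, 1))
print(q)
```

Filtering on the partition pseudo-column lets BigQuery prune untouched partitions, so each incremental run scans (and bills for) only the new data.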

Posted 1 week ago

Apply

2.0 - 5.5 years

0 Lacs

Bengaluru, Karnataka, India

On-site


At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the process of continuous improvement and optimising of the managed services process, tools and services. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. 
refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. Role: Associate Tower: Data, Analytics & Specialist Managed Service Experience: 2.0 - 5.5 years Key Skills: AWS Educational Qualification: BE / B Tech / ME / M Tech / MBA Work Location: India Job Description As an Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket quality and deliverables review, status reporting for the project. Adherence to SLAs; experience in incident management, change management and problem management. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. Good team player; take up cross-competency work and contribute to COE activities. Escalation/risk management. Position Requirements Required Skills: AWS Cloud Engineer Job description: The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas: Should have a minimum of 2 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms.
Should have a minimum of 1-3 years of Operate/Managed Services/Production Support experience. Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Should have experience in building efficient ETL/ELT processes using industry-leading tools like AWS, AWS Glue, AWS Lambda, AWS DMS, PySpark, SQL, Python, DBT, Prefect, Snowflake, etc. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in AWS. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Implementing data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage. Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Should have experience in building and maintaining Data Governance solutions (Data Quality, Metadata management, Lineage, Master Data Management and Data security) using industry-leading tools. Scaling and optimizing schema and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments. Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
Should have experience with ITIL processes like Incident Management, Problem Management, Knowledge Management, Release Management, Data DevOps, etc. Should have strong communication, problem-solving, quantitative, and analytical abilities. Nice To Have AWS certification Managed Services - Data, Analytics & Insights Managed Service At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making things better for our clients. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our client’s enterprise through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC’s Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today’s dynamic business environment. Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients’ Data and Analytics ecosystem.
Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced work environment, capable of working on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective but also a relationship perspective.
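The extract-transform-load responsibilities this posting describes can be sketched end to end in a few lines. This is a toy in-memory version under invented data and schema; in practice the same stages run as AWS Glue or PySpark jobs at scale:

```python
# Toy ETL sketch: extract source records, cast/cleanse them, load the clean
# rows, and route unparseable rows to a reject list. All data is made up.

def extract():
    # Stand-in for reading from a source system
    return [{"id": "1", "amt": "10.5"}, {"id": "2", "amt": "abc"}]

def transform(rows):
    """Cast types; collect rows that fail casting (the cleansing step)."""
    clean, rejects = [], []
    for r in rows:
        try:
            clean.append({"id": int(r["id"]), "amt": float(r["amt"])})
        except ValueError:
            rejects.append(r)
    return clean, rejects

def load(rows, warehouse):
    warehouse.extend(rows)  # stand-in for a warehouse write

warehouse = []
clean, rejects = transform(extract())
load(clean, warehouse)
print(len(warehouse), len(rejects))
```

Separating the three stages behind plain function boundaries is what lets each one be monitored, retried, and scaled independently, which is the operational concern the posting's "monitoring and troubleshooting data pipelines" bullet points at.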

Posted 1 week ago

Apply

2.0 - 5.5 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the process of continuous improvement and optimising of the managed services process, tools and services. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. 
refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. Role: Associate Tower: Data, Analytics & Specialist Managed Service Experience: 2.0 - 5.5 years Key Skills: AWS Educational Qualification: BE / B Tech / ME / M Tech / MBA Work Location: India.;l Job Description As a Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket Quality and deliverables review, Status Reporting for the project. Adherence to SLAs, experience in incident management, change management and problem management. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working, with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. Good team player, take up cross competency work and contribute to COE activities. Escalation/Risk management. Position Requirements Required Skills: AWS Cloud Engineer Job description: Candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas: Should have minimum 2 years hand on experience building advanced Data warehousing solutions on leading cloud platforms. 
Should have minimum 1-3 years of Operate/Managed Services/Production Support Experience Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data that for downstream consumption like Business Intelligence systems, Analytics modelling, Data scientists etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Should have experience in building efficient, ETL/ELT processes using industry leading tools like AWS, AWS GLUE, AWS Lambda, AWS DMS, PySpark, SQL, Python, DBT, Prefect, Snoflake, etc. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in AWS. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Implementing data validation and cleansing procedures will ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage. Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases Should have experience in Building and maintaining Data Governance solutions (Data Quality, Metadata management, Lineage, Master Data Management and Data security) using industry leading tools Scaling and optimizing schema and performance tuning SQL and ETL pipelines in data lake and data warehouse environments. Should have Hands-on experience with Data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake etc. 
Should have experience with ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps, etc. Should have strong communication, problem-solving, quantitative, and analytical abilities. Nice To Have AWS certification Managed Services - Data, Analytics & Insights Managed Service At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making things better for our clients. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our client’s enterprise through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC’s Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today’s dynamic business environment. Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients’ Data and Analytics ecosystem.
Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced environment, capable of working on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
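As a concrete illustration of the extract-transform-load flow this listing describes, here is a minimal, stdlib-only sketch with validation built into the transform step. The field names, CSV source, and the "reject negative amounts" rule are invented for the example; a real pipeline on this stack would typically use AWS Glue, PySpark, or DBT against S3 and a warehouse.

```python
import csv
import io

def extract(src):
    """Extract: parse raw CSV text into dicts (stand-in for an S3/Glue source)."""
    return list(csv.DictReader(io.StringIO(src)))

def transform(rows):
    """Transform: standardize types/formats and quarantine records that fail validation."""
    clean, rejected = [], []
    for row in rows:
        try:
            row["amount"] = round(float(row["amount"]), 2)
            row["customer_id"] = row["customer_id"].strip().upper()
            if row["amount"] < 0:
                raise ValueError("negative amount")
            clean.append(row)
        except (KeyError, ValueError):
            rejected.append(row)  # quarantined for review rather than loaded
    return clean, rejected

def load(rows):
    """Load: returns the rows here; in practice this would write to the warehouse."""
    return rows

raw = "customer_id,amount\n a17 ,12.50\nB22,-3\nC09,7.1\n"
clean, rejected = transform(extract(raw))
loaded = load(clean)
print(len(loaded), len(rejected))  # prints: 2 1
```

The point of the split is that validation failures are captured and quarantined inside the pipeline, not silently loaded downstream.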

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Summary We are seeking a skilled and proactive Data Engineer with a strong background in ETL development and a focus on integrating data quality frameworks. In this role, you will be responsible for designing, developing, and maintaining ETL pipelines while ensuring data quality is embedded throughout the process. You will play a crucial role in building robust and reliable data pipelines that deliver high-quality data to our data warehouse and other systems. Responsibilities Design, develop, and implement ETL processes to extract data from various source systems, transform it according to business requirements, and load it into target systems (e.g., data warehouse, data lake) Implement data validation and error handling within ETL pipelines. Build and maintain scalable, reliable, and efficient data pipelines. Design and implement data quality checks, validations, and transformations within ETL processes. Automate data quality monitoring, alerting, and reporting within ETL pipelines. Develop and implement data quality rules and standards within ETL processes. Integrate data from diverse sources, including databases, APIs, flat files, and cloud-based systems. Utilize ETL tools and technologies (e.g., SnapLogic, Informatica PowerCenter, Talend, AWS Glue, Apache Airflow, Azure Data Factory, etc.). Write SQL queries to extract, transform, load, and validate data. Use scripting languages (e.g., Python) to automate ETL processes, data quality checks, and data transformations.
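A minimal sketch of the kind of embedded data-quality check this role describes: computing batch metrics and a pass/fail gate that a pipeline could use for alerting. The field names, the duplicate-key rule, and the 10% missing-values threshold are illustrative assumptions, not taken from the listing.

```python
def run_quality_checks(records, key="id", required=("id", "email")):
    """Compute simple data-quality metrics and a pass/fail result for a batch.
    Rules (illustrative): no duplicate keys; at most 10% of required values missing."""
    total = len(records)
    missing = sum(1 for r in records for f in required if not r.get(f))
    keys = [r.get(key) for r in records]
    dup_count = total - len(set(keys))
    return {
        "row_count": total,
        "missing_required": missing,
        "duplicate_keys": dup_count,
        "passed": dup_count == 0 and missing <= 0.10 * total * len(required),
    }

batch = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": ""},         # missing required field
    {"id": 2, "email": "c@x.com"},  # duplicate key
]
report = run_quality_checks(batch)
print(report["passed"])  # prints: False
```

In a real pipeline, a failed report would typically halt the load or fire an alert rather than just print.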

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Description 👋🏼We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in! Job Description REQUIREMENTS: Total experience 5+ years. Hands-on working experience in data engineering. Strong working experience in SQL, Python or Scala. Deep understanding of cloud design patterns and their implementation. Experience working with Snowflake as a data warehouse solution. Experience with Power BI data integration. Design, develop, and maintain scalable data pipelines and ETL processes. Work with structured and unstructured data from multiple sources (APIs, databases, flat files, cloud platforms). Strong understanding of data modelling, warehousing (e.g., Star/Snowflake schema), and relational database systems (PostgreSQL, MySQL, etc.). Hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar. Strong problem-solving skills and a passion for continuous improvement. Strong communication skills and the ability to collaborate effectively with cross-functional teams. RESPONSIBILITIES: Writing and reviewing great quality code. Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements. Mapping decisions to requirements and translating them for developers. Identifying different solutions and narrowing down the best option that meets the clients' requirements.
Defining guidelines and benchmarks for NFR considerations during project implementation. Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers. Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed. Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it. Understanding and relating technology integration scenarios and applying these learnings in projects. Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and being able to justify the decisions taken. Carrying out POCs to make sure that the suggested design/technologies meet the requirements. Qualifications Bachelor’s or master’s degree in Computer Science, Information Technology, or a related field.
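To make the star-schema modelling requirement in this listing concrete, here is a hedged, stdlib-only sketch of a fact-table load step that resolves natural keys to dimension surrogate keys. The table and column names are invented for illustration; in practice this join happens in SQL or a pipeline tool such as Airflow or Talend.

```python
def build_fact_rows(events, dim_customer, dim_product):
    """Resolve natural keys to surrogate keys, as a star-schema fact load would,
    counting events whose dimension rows are missing (late-arriving dimensions)."""
    facts, missing = [], 0
    for e in events:
        cust_sk = dim_customer.get(e["customer_id"])
        prod_sk = dim_product.get(e["sku"])
        if cust_sk is None or prod_sk is None:
            missing += 1  # in practice: route to a reject table or insert a stub dimension row
            continue
        facts.append({
            "customer_sk": cust_sk,
            "product_sk": prod_sk,
            "quantity": e["qty"],
            "amount": e["qty"] * e["unit_price"],
        })
    return facts, missing

dim_customer = {"C1": 101, "C2": 102}  # natural key -> surrogate key
dim_product = {"P9": 901}
events = [
    {"customer_id": "C1", "sku": "P9", "qty": 2, "unit_price": 5.0},
    {"customer_id": "C3", "sku": "P9", "qty": 1, "unit_price": 5.0},  # unknown customer
]
facts, missing = build_fact_rows(events, dim_customer, dim_product)
print(len(facts), missing)  # prints: 1 1
```

Storing surrogate keys rather than natural keys in the fact table is the standard dimensional-modelling choice: it keeps facts narrow and insulates them from source-system key changes.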

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Description Ivy is a global, cutting-edge software and support services provider, partnering with one of the world’s biggest online gaming and entertainment groups. Founded in 2001, we’ve grown from a small tech company in Hyderabad to one creating innovative software solutions used by millions of consumers around the world, with billions of transactions taking place, rivaling even some of the biggest technology giants. Focused on quality at scale, we deliver excellence to our customers day in and day out, with everyone working together to make what sometimes feels impossible, possible. This means that not only do you get to work for a dynamic organization delivering pioneering technology, gaming and business solutions, you can also have an exciting and entertaining career. At Ivy, Bright Minds Shine Brighter. Job Description As a Senior Data Engineer, you'll design, develop, deploy, and maintain software features in a specialized technical domain (back end/front end). Reporting to the Tech Lead, you'll be part of the Product & Technology Team, focused on creating and enhancing software components of moderate complexity. Are you ready to be a part of our journey delivering excellence and collaborating with one of the world's biggest online gaming and entertainment groups? What You Will Do Oversee Data Warehousing and Data Integration projects (4-8 years of experience required). Utilize IDMC ETL tools, including installation, configuration, and administration on Linux (IICS). Manage users, groups, roles, and privileges; handle capacity increases and health monitoring. Develop and refine dashboards for capacity planning and platform resilience. Optimize and tune ETL/ELT jobs and Linux (Red Hat) applications. Administer data project life cycles, and enforce ETL/ELT standards and best practices around quality and profiling. Design ETL mappings using the Informatica suite.
Experience with Snowflake and data quality (DQ) is a plus; integration of IICS/ETL with Snowflake is preferred. Manage migration of on-premises data platforms to the cloud. DBT knowledge is required. Experience with the AWS cloud environment. Work with vendors on product road maps and upgrades. Qualifications Strong experience in data integration with on-premises and cloud databases (e.g., Snowflake on AWS/Azure). Proficient in Snowflake. Expert in Python, SQL, and shell scripting. Skilled in Linux-based IDMC administration (IICS). Expertise in data project lifecycle management, including ETL/ELT standards and best practices. Experience in performance optimization and troubleshooting of ETL processes. Capable of maintaining comprehensive ETL documentation and collaborating with cross-functional teams. Knowledgeable in security and compliance standards, including GDPR regulations. Experience in real-time, batch, and ETL data conversions. Knowledge of CI/CD processes and automated deployment. Experience in UAT, migration, and go-live support. Additional Information At Ivy, we know that signing top players requires a great starting package, and plenty of support to inspire peak performance. Join us, and a competitive salary is just the beginning. Depending on your role and location, you can expect to receive benefits like: Safe home pickup and home drop (Hyderabad office only) Group Mediclaim policy Group Critical Illness policy Communication & relocation allowance Annual health check And outside of this, you’ll have the chance to turn recognition from leaders and colleagues into amazing prizes. Join a winning team of talented people and be a part of an inclusive and supportive community where everyone is celebrated for being themselves. Should you need any adjustments or accommodations to the recruitment process, at either application or interview, please contact us. At Ivy, we do what’s right.
It’s one of our core values and that’s why we're taking the lead when it comes to creating a diverse, equitable and inclusive future - for our people, and the wider global sports betting and gaming sector. However you identify, across any protected characteristic, our ambition is to ensure our people across the globe feel valued, respected and their individuality celebrated. We comply with all applicable recruitment regulations and employment laws in the jurisdictions where we operate, ensuring ethical and compliant hiring practices globally.
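The listing above mentions building dashboards for capacity planning and health monitoring of ETL jobs. One hedged, stdlib-only sketch of that idea: summarizing job runtimes and flagging degradation. The job names, runtimes, and the 1.5x-average threshold are invented for the example; a real setup would pull run history from the IDMC/IICS job logs.

```python
from statistics import mean

def runtime_summary(runs):
    """Summarize ETL job runtimes (seconds) for a capacity/health dashboard.
    Flags a job when its latest run exceeds 1.5x its historical average
    (an illustrative threshold, not a product default)."""
    summary = {}
    for job, times in runs.items():
        history, latest = times[:-1], times[-1]
        avg = mean(history)
        summary[job] = {"avg": avg, "latest": latest, "degraded": latest > 1.5 * avg}
    return summary

# Hypothetical run history: last entry is the most recent run.
runs = {
    "load_orders": [120, 130, 125, 300],    # latest run is a clear outlier
    "load_customers": [60, 62, 58, 61],     # stable
}
s = runtime_summary(runs)
print(s["load_orders"]["degraded"], s["load_customers"]["degraded"])  # prints: True False
```

Feeding this kind of summary into a dashboard turns raw job logs into an early-warning signal for capacity problems before SLAs are breached.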

Posted 1 week ago

Apply

7.0 years

0 Lacs

Hyderābād

On-site


Senior Engineer, Database Engineering Hyderabad, India; Gurgaon, India Information Technology 316329 Job Description About The Role: Grade Level (for internal use): 10 What's in it for you: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P Capital IQ Pro to serve up value-added Ratings, Research and related information to institutional clients. The Team: Our team is responsible for gathering data from multiple sources spread across the globe using different mechanisms (ETL/GG/SQL Rep/Informatica/Data Pipeline) and converting it to a common format which can be used by client-facing UI tools and other data-providing applications. This application is the backbone of many S&P applications and is critical to our client needs. You will get to work on a wide range of technologies and tools like Oracle/SQL/.Net/Informatica/Kafka/Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global? Responsibilities: Our team is responsible for delivering essential and business-critical data with applied intelligence to power the market of the future. This enables our customers to make decisions with conviction. Contribute significantly to the growth of the firm by: Developing innovative functionality in existing and new products Supporting and maintaining high-revenue productionized products Achieving the above intelligently and economically using best practices This is the place to hone your existing database skills while having the chance to become exposed to fresh technologies.
As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated and collaborate with developers, business analysts and product managers who are experts in their domain. Your skills: You should be able to demonstrate outstanding knowledge and hands-on experience in the areas below: Complete SDLC: architecture, design, development and support of tech solutions Play a key role in the development team to build high-quality, high-performance, scalable code Engineer components and common services based on standard corporate development models, languages and tools Produce technical design documents and conduct technical walkthroughs Collaborate effectively with technical and non-technical stakeholders Be part of a culture of continuously improving the technical design and code base Document and demonstrate solutions using technical design docs, diagrams and stubbed code Our Hiring Manager says: I’m looking for a person who gets excited about technology and is motivated by seeing how our individual contributions and teamwork on world-class web products affect the workflows of thousands of clients, resulting in revenue for the company. Qualifications Required: Bachelor’s degree in Computer Science, Information Systems or Engineering. 7+ years of experience with transactional databases like SQL Server, Oracle, PostgreSQL and NoSQL databases like Amazon DynamoDB, MongoDB Strong database development skills on SQL Server, Oracle Strong knowledge of database architecture, data modeling and data warehousing. Knowledge of object-oriented design and design patterns. Familiar with various design and architectural patterns Strong development experience with Microsoft SQL Server Experience in cloud-native development and AWS is a big plus Experience with Kafka/Sonic broker messaging systems Nice to have: Experience in developing data pipelines using Java or C# is a significant advantage.
Strong knowledge of ETL tools (Informatica, SSIS); exposure to Informatica is an advantage. Familiarity with Agile and Scrum models. Working knowledge of VSTS. Working knowledge of AWS cloud is an added advantage. Understanding of fundamental design principles for building a scalable system. Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options, Index/Benchmarks is desirable. Additionally, experience with Python and Spark applications is a plus. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it.
We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. 
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316329 Posted On: 2025-05-27 Location: Hyderabad, Telangana, India

Posted 1 week ago

Apply

4.0 - 8.0 years

2 - 2 Lacs

Hyderābād

On-site


Company Description Ivy is a global, cutting-edge software and support services provider, partnering with one of the world’s biggest online gaming and entertainment groups. Founded in 2001, we’ve grown from a small tech company in Hyderabad to one creating innovative software solutions used by millions of consumers around the world, with billions of transactions taking place, rivaling even some of the biggest technology giants. Focused on quality at scale, we deliver excellence to our customers day in and day out, with everyone working together to make what sometimes feels impossible, possible. This means that not only do you get to work for a dynamic organization delivering pioneering technology, gaming and business solutions, you can also have an exciting and entertaining career. At Ivy, Bright Minds Shine Brighter. Job Description As a Senior Data Engineer, you'll design, develop, deploy, and maintain software features in a specialized technical domain (back end/front end). Reporting to the Tech Lead, you'll be part of the Product & Technology Team, focused on creating and enhancing software components of moderate complexity. Are you ready to be a part of our journey delivering excellence and collaborating with one of the world's biggest online gaming and entertainment groups? What you will do Oversee Data Warehousing and Data Integration projects (4-8 years of experience required). Utilize IDMC ETL tools, including installation, configuration, and administration on Linux (IICS). Manage users, groups, roles, and privileges; handle capacity increases and health monitoring. Develop and refine dashboards for capacity planning and platform resilience. Optimize and tune ETL/ELT jobs and Linux (Red Hat) applications. Administer data project life cycles, and enforce ETL/ELT standards and best practices around quality and profiling. Design ETL mappings using the Informatica suite.
Experience with Snowflake and data quality (DQ) is a plus; integration of IICS/ETL with Snowflake is preferred. Manage migration of on-premises data platforms to the cloud. DBT knowledge is required. Experience with the AWS cloud environment. Work with vendors on product road maps and upgrades. Qualifications Strong experience in data integration with on-premises and cloud databases (e.g., Snowflake on AWS/Azure). Proficient in Snowflake. Expert in Python, SQL, and shell scripting. Skilled in Linux-based IDMC administration (IICS). Expertise in data project lifecycle management, including ETL/ELT standards and best practices. Experience in performance optimization and troubleshooting of ETL processes. Capable of maintaining comprehensive ETL documentation and collaborating with cross-functional teams. Knowledgeable in security and compliance standards, including GDPR regulations. Experience in real-time, batch, and ETL data conversions. Knowledge of CI/CD processes and automated deployment. Experience in UAT, migration, and go-live support. Additional Information At Ivy, we know that signing top players requires a great starting package, and plenty of support to inspire peak performance. Join us, and a competitive salary is just the beginning. Depending on your role and location, you can expect to receive benefits like: Safe home pickup and home drop (Hyderabad office only) Group Mediclaim policy Group Critical Illness policy Communication & relocation allowance Annual health check And outside of this, you’ll have the chance to turn recognition from leaders and colleagues into amazing prizes. Join a winning team of talented people and be a part of an inclusive and supportive community where everyone is celebrated for being themselves. Should you need any adjustments or accommodations to the recruitment process, at either application or interview, please contact us. At Ivy, we do what’s right.
It’s one of our core values and that’s why we're taking the lead when it comes to creating a diverse, equitable and inclusive future - for our people, and the wider global sports betting and gaming sector. However you identify, across any protected characteristic, our ambition is to ensure our people across the globe feel valued, respected and their individuality celebrated. We comply with all applicable recruitment regulations and employment laws in the jurisdictions where we operate, ensuring ethical and compliant hiring practices globally.

Posted 1 week ago

Apply

5.0 years

7 - 8 Lacs

Hyderābād

On-site


Database Engineer III Hyderabad, India Information Technology 313379 Job Description About The Role: Grade Level (for internal use): 10 The Team: You will be part of a world-class Database Reliability Engineering (DBRE)/DSE team supporting various product offerings of S&P Global Market Intelligence, powered by various middle-tier, database and OS technologies and used by tens of thousands of users globally. The team is responsible for ensuring greater reliability, high performance, high availability and scalability. You will work with a global team of intelligent and ambitious professionals recruited from top industries. The team is responsible for the database support, administration, scalability, performance tuning, release and architecture of large-scale data processing, aggregation and retrieval systems. The Impact: Delivering an extensive view of credit risk across rated and unrated private and public companies around the globe for esteemed customers (credit analysts, portfolio managers, commercial lenders, risk managers, insurance underwriters, regulatory professionals) Continually enhanced platform content combined with upload capabilities for proprietary data to deliver the coverage customers need. Analytic models (CreditPro, IPREO) that deliver universally comparable credit benchmarks for comparison across rated and unrated public and private companies. Robust workflow solutions that offer speed and efficiency, enabling customers to track a growing number of global exposures. Work on issues where analysis of situations or data requires review of relevant factors. What’s in it for you: This is the place to hone your existing database and leadership skills while having the chance to become exposed to fresh and divergent technologies (Oracle Exadata, Oracle RAC, Golden Gate, Dataguard, ZDS Appliances, WebLogic, Sonic JMS, MySQL, Informatica, MS SQL Server Product Family, NoSQL, SOLR, Vertica, active-active Databases, and petabyte storage levels).
As a critical member of the team, you will have the opportunity to collaborate with a global team of database administrators and engineers who have recently graduated, as well as other team members who are experts in their domain. Responsibilities: As a Database Engineer III, you will support, administer, manage, and tune multi-terabyte database systems that will enable S&P Global Market Intelligence to ingest, manage, process and serve up Market Data Products (CreditPro, IPREO) as well as our highly transactional and dynamic web-based CIQ Pro Platform & Risk Solutions Portal. In-depth knowledge of database administration/management processes and strategies. Manage key database releases for Dev, QA, Alpha, Production and DR as part of the overall SDLC. Maintain databases, including patch applications and enhancements/upgrades of environments. Automate and review routine maintenance tasks using scripting technologies. Solve non-routine problems and determine the best solution through cost/benefit analysis. Work with other teams within operations and peer groups like Development and QA as a lead and/or member. Work with IT Service Management and other groups to make sure that all events, incidents and problems are resolved as per the SLA. Improve efficiency and quality of delivery by automating routine tasks. Contribute to best-of-breed product support tools, including runbooks, monitoring, and knowledge bases, to aid in the product support process. Suggest improvements to the current processes and tools we use, along with implementation ideas. Willingness to work 24/7 shifts. Perform database backups/refreshes in non-prod environments. Participate in 24x7 on-call tier 3 support on a weekly rotating schedule. Contribute to process and troubleshooting documentation.
Should be able to support a virtualized environment. What we are looking for: We are looking for the kind of person who is not only a roll-up-the-sleeves, hard-core database technologist, but also a passionate database team member who can lead by example and motivate their peers to excel. Basic Qualifications: Bachelor's degree in Computer Science, Information Systems or Engineering 5+ years of hands-on database administration, support and performance tuning experience Experience in a start-up, Agile or DevOps environment is a plus. ITIL V3 Foundation certification is a plus. AWS/Azure/GCP certification is a plus. Preferred Qualifications: Good Oracle hands-on experience (experience with additional data platforms like SOLR, Cassandra, Elastic Search, MS SQL Server and PostgreSQL is a huge plus). Good experience with database support, administration, performance tuning, and database monitoring. Moderate experience with Golden Gate, Attunity, NIFI, RAC, EXADATA, ZDS (RMAN) appliance and Data Guard is a must. Good knowledge of PL/SQL, shell programming or any other scripting or programming language is a must. Knowledge of Informatica is a plus. Excellent communication skills, with strong verbal and writing proficiencies. Ability to troubleshoot problems involving client libraries (like JDBC), WebLogic, operating systems, network and storage is a plus. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion.
Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. 
Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Inclusive Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering an inclusive workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and equal opportunity, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. 
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 313379 Posted On: 2025-04-02 Location: Hyderabad, Telangana, India

Posted 1 week ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About The Role Grade Level (for internal use): 09 S&P Global – Corporate About the Role: Software Developer II - Oracle EPM The Team: Join the Corporate Finance IT EPM Team, responsible for developing and managing Oracle Enterprise Performance Management (EPM) applications. Our work supports Financial Reporting, Revenue, Corporate, Statutory, and Tax reporting, as well as Master Data management (EDMCS), Consolidations (FCCS), Reconciliations (ARCS), and Financial Close processes in a techno-functional project environment. Responsibilities and Impact: You will serve as an Administrator for the Oracle EPM suite, working closely with the EPM development team to enhance system processes and the user experience. This role is essential for overseeing accounting period close and consolidation processes, ensuring compliance with SOX policies and procedures. Your expertise in reporting, reconciliation, and audit requests will support our global finance operations effectively. Administer the EPM Production environment, assisting global users with financial analysis. Primary Admin on the Oracle EPM Financial Consolidation and Close Cloud Service (FCCS) application. 
Manage data load schedules from ERP and ensure data integrity through rigorous reconciliation processes Manage the Estimate data flows from Anaplan (Estimating/Budgeting System) to EPM via Informatica Support the categorization, data mapping, and governance for financial account requests, controlling reporting structure changes Conduct UAT testing and approvals for system enhancements Collaborate with internal and external partners to enhance system stability, performance, and functionality Utilize cutting-edge technologies and automation initiatives to enhance system functionality Provide ad-hoc support for timely closure of accounting books and resolve issues efficiently Maintain thorough documentation and work on process enhancements, incorporating automation tools where applicable Maintain data security access in all EPM pods and Anaplan models What We’re Looking For Basic Required Qualifications: Certified Chartered Accountant or Cost Accountant degree or equivalent preferred. Over 3 years of experience in finance and accounting operations, including record-to-report functions. Proficiency in reporting tools and experience with Oracle EPM systems or equivalent. Preferred to have experience with Oracle Enterprise Performance Management (EPM) system or HFM application or equivalent. Strong communication skills for collaboration across teams and management. Ability to manage workload efficiently, meet deadlines, and adapt to changing priorities. Experience in cloud platform transitions and system integration. Assertive problem-solving skills and the ability to work independently. Knowledge of all Microsoft Office Products, specifically Outlook, Excel, and Word. Must be able to work independently, be accountable for processes/tasks performed, and understand when to escalate issues to management. 
Flexible to work in shifting schedules, primarily to match extended US working hours (EST time zone), and render overtime when there is a strong business need, such as monthly closing of financial books or preparation of financial or reporting statements. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. 
We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. 
Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 315305 Posted On: 2025-05-15 Location: Hyderabad, Telangana, India
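The EPM administrator role above centers on data-load reconciliation between the ERP source and the EPM consolidation application. As a minimal illustration of such a tie-out check (not S&P Global's actual process; all account names, amounts, and the tolerance are hypothetical), a sketch in Python:

```python
# Hypothetical reconciliation tie-out: compare account totals from an ERP
# extract against what landed in EPM, flagging differences above a tolerance.

def reconcile(erp_totals, epm_totals, tolerance=0.01):
    """Return {account: (erp, epm, diff)} for accounts that fail the tie-out."""
    breaks = {}
    for account in set(erp_totals) | set(epm_totals):
        erp = erp_totals.get(account, 0.0)
        epm = epm_totals.get(account, 0.0)
        if abs(erp - epm) > tolerance:
            breaks[account] = (erp, epm, erp - epm)
    return breaks

if __name__ == "__main__":
    # Example accounts and balances are made up for illustration.
    erp = {"4000-Revenue": 125_000.00, "5000-COGS": 80_000.00}
    epm = {"4000-Revenue": 125_000.00, "5000-COGS": 79_500.00}
    print(reconcile(erp, epm))  # only 5000-COGS breaks
```

A real close process would pull both sides from the systems of record and log breaks for follow-up rather than printing them.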

Posted 1 week ago

Apply

0 years

4 - 6 Lacs

Hyderābād

On-site


As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world’s leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards optimal Customer and Employee experience. About the role: In this opportunity as Application Support Analyst, you will: Support Informatica development, extractions, and loading. Fix data discrepancies and take care of performance monitoring. Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes. Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs. Apply a thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, and operational health management. Support applications built on modern application architecture and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.JS, TypeScript, jQuery, Docker, AWS/Azure. About You: You're a fit for the role of Application Support Analyst - Informatica if your background includes: 3 to 8+ years of experience as an Informatica Developer and Support engineer, responsible for implementation of ETL methodology in Data Extraction, Transformation and Loading. 
Have knowledge in ETL design of new or changing mappings and workflows with the team, and prepare technical specifications. Should have experience in creating ETL Mappings, Mapplets, Workflows, Worklets using Informatica PowerCenter 10.x and preparing corresponding documentation. Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.). Should be able to perform source system analysis as required. Works with DBAs and Data Architects to plan and implement appropriate data partitioning strategy in the Enterprise Data Warehouse. Implements versioning of the ETL repository and supporting code as necessary. Develops stored procedures, database triggers and SQL queries where needed. Implements best practices and tunes SQL code for optimization. Loads data from SF Power Exchange to a relational database using Informatica. Works with XMLs, XML parser, Java and HTTP transformation within Informatica. Experience in integration of various data sources like Oracle, SQL Server, DB2 and flat files in various formats like fixed width, CSV, Salesforce and Excel. Have in-depth knowledge and experience in implementing best practices for design and development of data warehouses using star schema & snowflake schema design concepts. Experience in performance tuning of sources, targets, mappings, transformations, and sessions. Carried out support and development activities in a relational database environment; designed tables, procedures/functions, packages, triggers and views in relational databases and used SQL proficiently in database programming using Snowflake. #LI-VGA1 What’s in it For You? Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. 
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. 
About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
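The Thomson Reuters role above revolves around the extract-transform-load pattern (source qualifier, expression transformation, target load in Informatica terms). As a plain-Python illustration of that pattern only, not of Informatica itself, with all table and column names hypothetical:

```python
# Plain-Python sketch of an extract-transform-load flow: read a CSV source,
# apply an expression-style transformation (trim/uppercase codes, cast
# amounts), and load the result into a relational target (sqlite3 here).
import csv
import io
import sqlite3

def extract(csv_text):
    """Read source rows as dicts, like a source qualifier would."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Expression-style transformation: clean codes, cast amounts."""
    return [(r["code"].strip().upper(), float(r["amount"])) for r in rows]

def load(conn, rows):
    """Load transformed rows into the (hypothetical) fact table."""
    conn.execute("CREATE TABLE IF NOT EXISTS fact_sales (code TEXT, amount REAL)")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows)

if __name__ == "__main__":
    src = "code,amount\n ab ,10.5\ncd,2\n"
    conn = sqlite3.connect(":memory:")
    load(conn, transform(extract(src)))
    print(conn.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0])  # 12.5
```

An Informatica mapping expresses the same three stages declaratively; support work like the role describes often means tracing a discrepancy back through exactly these steps.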

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru

On-site


Date: 6 Jun 2025 Company: Qualitest Group Country/Region: IN
Expected Core Skills:
- Extensive experience working on IICS Cloud Data Integration, Cloud Application Integration
- Experience with Informatica PowerCenter
- Experience with Snowflake
- Proficient with SQL, PL/SQL and Unix Scripting
- Experience with cloud platforms such as AWS or Azure
Nice to Have:
- Good communication
- Ability to understand complex data models
- Ability to work with multiple stakeholders to understand the bigger scope of the program and contribute towards its milestones
3 Must Haves:
- Experience as an Informatica PowerCenter developer: 4/5
- Expertise in IICS (API & ICS Module): 4/5
- Azure: 3/5

Posted 1 week ago

Apply

5.0 years

5 - 10 Lacs

Bengaluru

On-site


Country/Region: IN Requisition ID: 26145 Work Model: Position Type: Salary Range: Location: INDIA - BENGALURU - BIRLASOFT OFFICE Title: Technical Specialist-Data Engg Description: Area(s) of responsibility
o Job Title – Denodo Developer
o No. of Open Positions – 1
o Experience – 5-9 years
o Location: Bangalore, Noida, Chennai, Mumbai, Hyderabad, Pune
o Shift Time – CET (12:30 to 9:30 IST)
Job Description: We are seeking a highly skilled and experienced Denodo Developer with a strong background in ETL processes and deep knowledge of the Life Sciences domain. The ideal candidate will be responsible for developing data virtualization solutions, integrating complex datasets from multiple sources, and enabling real-time data access for analytics and operational reporting. This role requires close collaboration with data architects, data engineers, and business stakeholders in a regulated environment. Key Proficiency & Responsibilities: Design, develop, and optimize data virtualization solutions using the Denodo Platform. Integrate structured and unstructured data sources into Denodo views and services. Develop custom views, VQL scripts, and data services (REST/SOAP). Build and optimize ETL/ELT pipelines to support data ingestion and transformation. Work closely with Life Sciences business teams to translate domain-specific requirements into data solutions. Implement data governance, security, and compliance practices adhering to GxP and FDA regulations. Provide support for data access, lineage, metadata management, and user training. Collaborate with cross-functional teams in an Agile development environment. Optimize workflows for performance and scalability. Develop and maintain data documentation, including workflow descriptions and data dictionaries. Strong knowledge of data preparation, ETL concepts, and data warehousing. Excellent analytical, problem-solving, and communication skills. Proficient in VQL, JDBC, ODBC, and web services integration. 
Strong expertise in ETL tools (e.g., Informatica, Talend, DataStage, or Azure Data Factory). Deep understanding of the Life Sciences domain – clinical trials, regulatory data, pharmacovigilance, or research & development. Preferred Qualifications: B.Tech. or MCA from a recognized University. Minimum 5+ years of relevant experience as a Denodo Developer. Strong SQL and database skills (Oracle, SQL Server, PostgreSQL, etc.). Knowledge of data modelling, data warehousing, and virtual data layers. Experience with cloud platforms (AWS, Azure, or GCP) is a plus. Experience working in Agile/Scrum environments.
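The Denodo role above mentions publishing views as REST data services and consuming them over web services. As a hedged sketch, a small helper that builds a query URL for such a service; the path segment and the $select/$filter parameter style are modeled on Denodo's RESTful web service conventions, but the endpoint, database, and view names here are invented and should be checked against the actual server documentation:

```python
# Hypothetical URL builder for querying a Denodo RESTful web service view.
# Database/view names and the base URL are illustrative only.
from urllib.parse import urlencode

def denodo_rest_url(base, database, view, select=None, filter_expr=None):
    """Build a query URL like <base>/denodo-restfulws/<db>/views/<view>?..."""
    params = {}
    if select:
        params["$select"] = ",".join(select)   # columns to project
    if filter_expr:
        params["$filter"] = filter_expr        # row-level predicate
    query = urlencode(params)
    url = f"{base}/denodo-restfulws/{database}/views/{view}"
    return f"{url}?{query}" if query else url

# Example (hypothetical server and view):
# denodo_rest_url("https://dv.example.com:9443", "sales_db", "v_orders",
#                 select=["id", "region"], filter_expr="region = 'EU'")
```

The resulting URL would then be fetched with any HTTP client using the Denodo server's authentication scheme.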

Posted 1 week ago

Apply

2.0 years

6 - 8 Lacs

Bengaluru

On-site


Job Description Summary Responsible for developing, testing and implementing data engineering solutions to generate analytical and reporting solutions. Responsible for analyzing and preparing the data needed for data science based outcomes. Also responsible for managing and maintaining metadata data structures besides providing necessary support for post-deployment related activities when needed. Job Description Company Overview Working at GE Aerospace means you are bringing your unique perspective, innovative spirit, drive, and curiosity to a collaborative and diverse team working to advance aerospace for future generations. If you have ideas, we will listen. Join us and see your ideas take flight! Site Overview Established in 2000, the John F. Welch Technology Center (JFWTC) in Bengaluru is our multidisciplinary research and engineering center. Engineers and scientists at JFWTC have contributed to hundreds of aviation patents, pioneering breakthroughs in engine technologies, advanced materials, and additive manufacturing. Roles Overview: In this role, you will: Leverage technical data dictionaries and business glossaries to analyze the datasets Perform data profiling and data analysis for any source systems and the target data repositories Understand metadata and the underlying data structures needed to standardize the data load processes. Develop data mapping specifications based on the results of data analysis and functional requirements Perform a variety of data loads & data transformations using multiple tools and technologies. Build automated Extract, Transform & Load (ETL) jobs based on data mapping specifications Validate the data mapping results and match with the expected results Implement Data Quality (DQ) rules provided Ideal Candidate: Should have experience in data loads & data transformations using multiple tools and technologies. Required Qualification For roles outside USA: Bachelor's Degree in with basic experience. 
For roles in USA: Bachelor's Degree with a minimum of 2 years of experience.
Desired Characteristics
Technical Expertise: Creating & updating our Standard Work documentation, from operations, to PBR migrations, to how we use GitHub. Building / re-building clusters as part of our monthly build process. Create Python scripts to enable future automation [ex: Python script for creating a S3 bucket]. Assisting with our ongoing pursuit of remediating critical vulnerabilities as part of our EVM obligations. Ability to understand logical and physical data models, big data storage architecture, data modeling methodologies, metadata management, master data management & data lineage techniques. Hands-on experience in programming languages like Java, Python or Scala. Hands-on experience in writing SQL scripts for Oracle, MySQL, PostgreSQL or Hive. Experience in handling both Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) data models. Experience with Big Data / Hadoop / Spark / Hive / NoSQL database engines (i.e. Cassandra or HBase). Exposure to unstructured datasets and ability to handle XML, JSON file formats. Exposure to Extract, Transform & Load (ETL) tools like Informatica or Talend.
Domain Expertise: Exposure to handling machine or sensor datasets from industrial businesses. Knowledge of for industrial applications in commercial/finance/industrial/manufacturing settings. Exposure to finance and accounting data domains.
Leadership skills: Partner with other team members to understand the project objectives and resolve technical issues. Communicate project status or challenges in a clear and concise manner to the cross-team members. 
Desired Qualification: Humble: respectful, receptive, agile, eager to learn Transparent: shares critical information, speaks with candor, contributes constructively Focused: quick learner, strategically prioritizes work, committed Leadership ability: strong communicator, decision-maker, collaborative Problem solver: analytical-minded, challenges existing processes, critical thinker At GE Aerospace, we have a relentless dedication to the future of safe and more sustainable flight and believe in our talented people to make it happen. Here, you will have the opportunity to work on really cool things with really smart and collaborative people. Together, we will mobilize a new era of growth in aerospace and defense. Where others stop, we accelerate. Additional Information Relocation Assistance Provided: No
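The GE Aerospace posting above gives "a Python script for creating a S3 bucket" as its example of future automation. A minimal, hedged sketch of that task, assuming boto3 is installed and AWS credentials are configured; the bucket and region values are examples only:

```python
# Sketch of an S3 bucket-creation script. bucket_kwargs is a pure helper so
# the region logic is testable without AWS; create_bucket does the real call.

def bucket_kwargs(name, region):
    """Build create_bucket arguments; us-east-1 must omit the location config."""
    kwargs = {"Bucket": name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

def create_bucket(name, region="us-east-1"):
    """Create the bucket via boto3 (imported lazily to keep the helper pure)."""
    import boto3
    s3 = boto3.client("s3", region_name=region)
    return s3.create_bucket(**bucket_kwargs(name, region))

if __name__ == "__main__":
    # Example values only; no AWS call is made here.
    print(bucket_kwargs("example-data-bucket", "ap-south-1"))
```

The us-east-1 special case is a real S3 API quirk: passing a LocationConstraint for that region is rejected, which is exactly the kind of detail such automation scripts exist to encode once.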

Posted 1 week ago

Apply

7.0 years

3 - 10 Lacs

Bengaluru

On-site


Job ID: 28021 Location: Bangalore, IN Area of interest: Technology Job type: Regular Employee Work style: Office Working Opening date: 28 May 2025 Job Summary Responsible for the design and development of the analytical data model, reports and dashboards Provide design specifications to both onsite & offshore teams Develop functional specifications and document the requirements clearly Develop ETL, reports, data mapping documents and technical design specifications for the project deliverables Analyse the business requirements and come up with the end-to-end design, including technical implementation Key Responsibilities Strategy Responsible for the design and development of the analytical data model, reports and dashboards Provide design specifications to both onsite & offshore teams Develop functional specifications and document the requirements clearly Develop data mapping documents and technical design specifications for the project deliverables Analyse the business requirements and come up with the end-to-end design, including technical implementation Expert in Power BI, MSTR, Informatica, Hadoop Platform Ecosystem, SQL, Java, R, Python, JavaScript, Hive, Spark, Linux scripts Good knowledge of installation, upgrade, administration, and troubleshooting of ETL & reporting tools such as Power BI, MSTR, Informatica, Oracle and Hadoop Implement performance tuning techniques for reports, ETL and data migration Develop ETL procedures to ensure conformity and compliance with standards, and translate business rules and functionality requirements into ETL procedures Assess and review report performance, and come up with performance optimization techniques using VLDB settings and explain plans Develop scripts to automate production deployments Conduct product demonstrations and user training sessions for business users Work with testing teams to improve the quality of testing by adopting automated testing tools and managing the application environments. 
Business Collaborate and partner with product owners, business users and senior business stakeholders to understand the data and reporting requirements of the business and clearly document them for further analysis Work closely with architects and infrastructure teams and review the solutions Interact with support teams periodically and get input on the various business users’ needs Provide all the required input and assistance to the business users in performing the data validation and ensure that the data and reports are delivered with accurate numbers. Processes Process oriented and experienced in onsite-offshore project delivery, using agile methodology and best practices Well versed with agile-based project delivery methodology. Should have successfully implemented or delivered projects using best practices on technology delivery and project release automation in the banking and financing industry Deployment automation for Oracle Database and Hadoop, Informatica workflows, integration and BI layers including MicroStrategy and PBI components, to the feasible extent Actively participate in discussions with business users and seek endorsements and approvals wherever necessary w.r.t. technology project delivery. People & Talent Minimum 7 years of experience in the business intelligence and data warehouse domain. Create project estimations, solution and design documentation, operational guidelines and production handover documentation Should have excellent technical, analytical, interpersonal and delivery capabilities in the areas of complex reporting for the banking domain, especially in the area of Client Analytics and CRM. Full life-cycle Business Intelligence (BI) and Data Warehousing project experience, starting with requirements analysis, proof-of-concepts, design, development, testing, deployment and administration Shall be a good team player with excellent written and verbal communications. 
Process oriented and experienced in onsite-offshore project delivery, using agile methodology and best practices Should be able to play an Individual Contributor role Risk Management Assess and evaluate the risks that are related to the project delivery and update the stakeholders with an appropriate remediation and mitigation approach. Review the technical solutions and deliverables with architects and key technology stakeholders and ensure that the deliverables adhere to the risk governance rules Governance Work with Technology Governance and support teams and establish standards for simplifying the existing MicroStrategy reports and Informatica batch programs Take end-to-end ownership of managing and administering Informatica, Hadoop, MSTR and Power BI. Regulatory & Business Conduct Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. [Fill in for regulated roles] Lead the [country / business unit / function/XXX [team] to achieve the outcomes set out in the Bank’s Conduct Principles: [Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.] * [Insert local regulator e.g. PRA/FCA prescribed responsibilities and Rationale for allocation]. 
[Where relevant - Additionally, for subsidiaries or relevant non-subsidiaries] Serve as a Director of the Board of [insert name of entities] Exercise authorities delegated by the Board of Directors and act in accordance with Articles of Association (or equivalent) Key stakeholders Business and Operations (Product Owners) Sales Enablement Client Coverage Reporting Technology Services Teams Production Support Teams Skills and Experience Design and development of ETL procedures using Informatica PowerCenter Performance tuning of star schemas to optimize load and query performance of SQL queries Hive, HiveQL, HDFS, Scala, Spark, Sqoop, HBase, YARN, Presto, Dremio Experience in Oracle 11g, 19c Strong knowledge and understanding of SQL and ability to write SQL, PL/SQL BI and analytical dashboards, reporting design and development using PBI tools and the MicroStrategy Business Intelligence product suite (MicroStrategy Intelligence Server, MicroStrategy Desktop, MicroStrategy Web, MicroStrategy Architect, MicroStrategy Object Manager, MicroStrategy Command Manager, MicroStrategy Integrity Manager, MicroStrategy Office, Visual Insight, Mobile Development) Design of dimensional modelling like star and snowflake schema Setting up connections to Hadoop big data (data lake) clusters through Kerberos authentication mechanisms Banking and Finance specific to financial markets, Collateral, Trade Life Cycle, Operational CRM, Analytical CRM and client-related reporting Design and implementation of Azure Data Solutions and Microsoft Azure Cloud Qualifications About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. 
If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together we:
- Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What we offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill, and access to physical, virtual and digital learning.
Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential. www.sc.com/careers

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru

On-site

GlassDoor logo

Bengaluru, Karnataka | Job ID 30181703 | Job Category: Digital Technology | Country: India | Location: Ecospace Campus 3A, 4th Floor, Outer Ring Road, Bellandur, Bengaluru - 560103

Job Title: AI Data Analyst
Preferred Location: Bangalore
Full Time/Part Time: Full Time

Build a career with confidence
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Job Summary
We are looking for a skilled Data Analyst with over 5 years of experience to join our team. The ideal candidate will have a strong background in data management principles, data engineering, and modeling, with hands-on expertise in building data models and data catalogs. This role requires knowledge of data marketplaces, data products, and Agile frameworks, along with a solid understanding of a specific business process (e.g., Supply Chain, Finance, Operations). The Data Analyst will play a key role in transforming raw data into actionable insights, ensuring data quality, and supporting data-driven decision-making across the organization.

Key Responsibilities
- Analyze and interpret complex datasets to provide actionable insights that support business objectives in areas such as Supply Chain, Finance, or Operations.
- Apply data management principles, including data governance, data quality, and metadata management, to ensure reliable and trustworthy data assets.
- Collaborate with data engineering teams to design and optimize ETL processes, ensuring efficient data extraction, transformation, and loading.
- Build and maintain data models using industry-standard data modeling tools to support reporting, analytics, and data integration needs.
- Develop and manage data catalogs using data catalog tools to enhance data discoverability, lineage, and governance.
- Contribute to the design and implementation of data marketplaces and data products, enabling self-service access to high-quality datasets.
- Work within an Agile framework, participating in sprint planning, stand-ups, and retrospectives to deliver iterative data solutions.
- Partner with business stakeholders to understand specific processes (e.g., Supply Chain, Finance, Operations) and translate requirements into data-driven solutions.
- Monitor data quality, identify anomalies, and recommend improvements to enhance data integrity and usability.
- Create visualizations, reports, and dashboards to communicate findings effectively to technical and non-technical audiences.

Required Skills and Qualifications
- Bachelor's degree in Data Science, Computer Science, Statistics, Business Analytics, or a related field.
- 5+ years of experience as a Data Analyst or in a similar role, with a proven track record of delivering data solutions.
- Strong experience in data management principles, including data governance, data quality, and metadata management.
- Proficiency in data engineering concepts, such as ETL processes and data analysis, with hands-on experience in related tools (e.g., SQL, Python, Informatica).
- Demonstrated expertise in building data models using data modeling tools (e.g., ER/Studio, dbt, PowerDesigner).
- Practical experience in developing and maintaining data catalogs with tools like Atlan, Collibra, or Informatica Data Catalog.
- Knowledge of data marketplaces and data products, with an understanding of their role in enabling data accessibility and value creation.
- Experience working in an Agile framework, delivering incremental value through iterative development cycles.
- Good understanding of at least one specific business process (e.g., Supply Chain, Finance, Operations) and its data requirements.
- Strong analytical skills, with the ability to work with large datasets and derive meaningful insights.
- Excellent communication skills to collaborate with cross-functional teams and present findings to diverse stakeholders.

Preferred Qualifications
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
- Familiarity with BI tools like Tableau, Power BI, or Looker for data visualization.
- Exposure to advanced analytics techniques, such as predictive modeling or machine learning.

Our commitment to you
Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Company Description
IntegriChain is the data and application backbone for market access departments of Life Sciences manufacturers. We deliver the data, the applications, and the business process infrastructure for patient access and therapy commercialization. More than 250 manufacturers rely on our ICyte Platform to orchestrate their commercial and government payer contracting, patient services, and distribution channels. ICyte is the first and only platform that unites the financial, operational, and commercial data sets required to support therapy access in the era of specialty and precision medicine. With ICyte, Life Sciences innovators can digitalize labor-intensive market access operations, freeing up their best talent to identify and resolve coverage and availability hurdles, to manage pricing and forecasting complexity, and to focus on data-driven decision support. We are headquartered in Philadelphia, PA (USA), with offices in Ambler, PA (USA); Pune, India; and Medellín, Colombia. For more information, visit www.integrichain.com, or follow us on Twitter @IntegriChain and LinkedIn.

Job Description
The MDM Configuration Expert will be responsible for developing and optimizing MDM configuration, ensuring data quality, consistency, and reliability of the mastered data.
Key Responsibilities
- Perform in-depth data profiling, cleansing, and validation to ensure high-quality master data
- Design and implement MDM configurations using Reltio or Informatica MDM platforms
- Develop and maintain data models that support commercial business processes in the life sciences industry
- Design data integration strategies across the multiple sources required for MDM
- Troubleshoot and resolve complex data integration and synchronization challenges
- Develop and maintain technical documentation for MDM configurations and processes
- Provide technical guidance and support to team members and stakeholders

Required Qualifications
- Bachelor's degree in Computer Science, Information Systems, Life Sciences, or a related field
- Expert-level knowledge of Reltio or Informatica MDM platforms
- Deep understanding of Master Data Management principles and best practices
- Comprehensive knowledge of life sciences data structures and regulatory requirements
- Strong background in data modeling, data integration, and data quality management
- Proficiency in SQL and database technologies
- Experience with data governance frameworks
- Knowledge of data privacy regulations (HIPAA) specific to life sciences

Professional Experience
- 7+ years of experience in data management technologies
- 3+ years of experience in Master Data Management, with a focus on life sciences or healthcare
- Proven track record of successful MDM implementations
- Experience configuring and managing complex data management solutions
- Demonstrated ability to work in cross-functional teams

Preferred Qualifications
- Certifications in MDM technologies (Reltio, Informatica)
- Experience with additional enterprise data management tools
- Excellent analytical and problem-solving abilities
- Strong communication skills
- Ability to translate technical concepts to non-technical stakeholders
- Detail-oriented with high attention to data accuracy
- Adaptable and a quick learner in a dynamic technology environment

Additional Information
What does IntegriChain have to offer?
- Mission driven: work with the purpose of helping to improve patients' lives!
- Excellent and affordable medical benefits plus non-medical perks, including Flexible Paid Time Off and much more!
- Robust Learning & Development opportunities, including 700+ development courses free to all employees.

IntegriChain is committed to equal treatment and opportunity in all aspects of recruitment, selection, and employment without regard to race, color, religion, national origin, ethnicity, age, sex, marital status, physical or mental disability, gender identity, sexual orientation, veteran or military status, or any other category protected under the law. IntegriChain is an equal opportunity employer, committed to creating a community of inclusion and an environment free from discrimination, harassment, and retaliation.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description

REQUIREMENTS:
- Total experience of 5+ years.
- Hands-on working experience in data engineering.
- Strong working experience in SQL, Python or Scala.
- Deep understanding of cloud design patterns and their implementation.
- Experience working with Snowflake as a data warehouse solution.
- Experience with Power BI data integration.
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Work with structured and unstructured data from multiple sources (APIs, databases, flat files, cloud platforms).
- Strong understanding of data modelling, warehousing (e.g., star/snowflake schema), and relational database systems (PostgreSQL, MySQL, etc.).
- Hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar.
- Strong problem-solving skills and a passion for continuous improvement.
- Strong communication skills and the ability to collaborate effectively with cross-functional teams.

RESPONSIBILITIES:
- Writing and reviewing great quality code.
- Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements.
- Mapping decisions with requirements and translating the same to developers.
- Identifying different solutions and narrowing down the best option that meets the client's requirements.
- Defining guidelines and benchmarks for NFR considerations during project implementation.
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design on aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
- Understanding and relating technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and justifying the decisions taken.
- Carrying out POCs to make sure that the suggested design/technologies meet the requirements.

Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.

Posted 1 week ago

Apply

Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies based on experience and expertise:

- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:

- Junior Developer
- Informatica Developer
- Senior Developer
- Informatica Tech Lead
- Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:

- SQL
- Data warehousing
- ETL tools
- Data modeling
- Data analysis
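Since SQL and ETL skills appear on nearly every Informatica job description above, it helps to be able to explain ETL logic outside the tool itself. Below is a minimal, illustrative sketch of a typical cleansing transform in plain Python; the field names (`customer_id`, `name`, `balance`) and the rules are hypothetical examples, not tied to any specific posting, and a real pipeline would implement them in an ETL tool such as Informatica PowerCenter.

```python
# A minimal ETL-style transform in plain Python (hypothetical field names).
# Mirrors common Informatica patterns: a filter rule rejecting rows with a
# missing natural key, plus expression-style standardization of the rest.

def transform(rows):
    """Cleanse and standardize raw source rows before loading."""
    cleaned = []
    for row in rows:
        # Filter-transformation analogue: reject rows missing the key
        if not row.get("customer_id"):
            continue
        cleaned.append({
            "customer_id": row["customer_id"],
            # Expression analogue: trim whitespace and upper-case names
            "name": row.get("name", "").strip().upper(),
            # Expression analogue: default null balances to 0
            "balance": row.get("balance") or 0,
        })
    return cleaned

raw = [
    {"customer_id": "C1", "name": " alice ", "balance": 120},
    {"customer_id": None, "name": "bob", "balance": 50},   # rejected row
    {"customer_id": "C3", "name": "carol", "balance": None},
]
print(transform(raw))
```

Being able to describe such rules both in tool terms (filter and expression transformations) and in plain code is a common interview expectation.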

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
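The slowly changing dimensions question above is usually answered in Informatica terms (a lookup plus an update-strategy transformation), but interviewers often also want the underlying Type 2 logic explained. Here is a minimal Python sketch of that logic, using a hypothetical customer dimension; the field names (`cust_key`, `city`, `valid_from`, `valid_to`, `is_current`) are illustrative, not a prescribed schema.

```python
from datetime import date

def apply_scd2(dimension, incoming, today=date(2024, 1, 1)):
    """Type 2 SCD: expire the current version of a changed record and
    insert a new current version, preserving history.
    Hypothetical schema: natural key, tracked attribute, validity columns."""
    for new in incoming:
        current = next((r for r in dimension
                        if r["cust_key"] == new["cust_key"] and r["is_current"]), None)
        if current is None:
            # Brand-new key: insert as the current version
            dimension.append({**new, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif current["city"] != new["city"]:
            # Changed attribute: close the old row, open a new one
            current["valid_to"] = today
            current["is_current"] = False
            dimension.append({**new, "valid_from": today,
                              "valid_to": None, "is_current": True})
        # Unchanged rows are left alone
    return dimension

dim = [{"cust_key": 1, "city": "Pune", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"cust_key": 1, "city": "Mumbai"}])
# dim now holds two versions of customer 1: the expired Pune row
# and a new current Mumbai row.
```

In Informatica this is typically built with a lookup on the target to detect changes and an update strategy (DD_UPDATE on the expiring row, DD_INSERT on the new version), which is the mapping-level equivalent of the two branches above.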

Closing Remark

As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies