
3523 Informatica Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Req ID: 324638

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficient coding skills in languages such as Python, Java, and Scala to move solutions into production efficiently while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations that communicate complex technical concepts effectively.
- Produce comprehensive solution documentation, including sequence diagrams, class hierarchies, and logical system views.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.

Basic Qualifications:
- 4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- 2+ years of experience leading a team on data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.

Preferred Skills:
- Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, and Hadoop.
- Hands-on knowledge of cloud and distributed data storage, including HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, including Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Data Migration Services, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to convey complex technical concepts effectively.
- An undergraduate or graduate degree is preferred.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 8 hours ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Req ID: 324631

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficient coding skills in languages such as Python, Java, and Scala to move solutions into production efficiently while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations that communicate complex technical concepts effectively.
- Produce comprehensive solution documentation, including sequence diagrams, class hierarchies, and logical system views.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.

Basic Qualifications:
- 4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- 2+ years of experience leading a team on data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.

Preferred Skills:
- Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, and Hadoop.
- Hands-on knowledge of cloud and distributed data storage, including HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, including Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Data Migration Services, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to convey complex technical concepts effectively.
- An undergraduate or graduate degree is preferred.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 8 hours ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Req ID: 324632

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficient coding skills in languages such as Python, Java, and Scala to move solutions into production efficiently while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations that communicate complex technical concepts effectively.
- Produce comprehensive solution documentation, including sequence diagrams, class hierarchies, and logical system views.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.

Basic Qualifications:
- 4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- 2+ years of experience leading a team on data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.

Preferred Skills:
- Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, and Hadoop.
- Hands-on knowledge of cloud and distributed data storage, including HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, including Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Data Migration Services, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to convey complex technical concepts effectively.
- An undergraduate or graduate degree is preferred.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 8 hours ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Hyderabad

Work from Office

This role involves the development and application of engineering practice and knowledge in defining, configuring and deploying industrial digital technologies (including but not limited to PLM and MES) for managing continuity of information across the engineering enterprise, including design, industrialization, manufacturing and supply chain, and for managing manufacturing data.

Job Description - Grade Specific:
- Focus on digital continuity and manufacturing.
- Develops competency in own area of expertise.
- Shares expertise and provides guidance and support to others.
- Interprets clients' needs.
- Completes own role independently or with minimum supervision.
- Identifies problems and relevant issues in straightforward situations and generates solutions.
- Contributes to teamwork and interacts with customers.

Posted 8 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Company: Our client is a global IT, consulting, and business process services company headquartered in Bengaluru, India. It offers end-to-end IT services, including application development, infrastructure management, and digital transformation, and serves clients across industries such as banking, healthcare, retail, energy, and manufacturing. It specializes in modern technologies like cloud computing, AI, data analytics, and cybersecurity. The company has a strong global presence, operating in over 66 countries and employing more than 250,000 people worldwide. It is known for helping enterprises modernize their IT infrastructure and adopt agile practices. Its divisions include consulting, software engineering, and managed services, and the company integrates automation and AI into its services to boost efficiency and innovation.

Job Title: DataStage Developer
Location: Pune (Hybrid)
Experience: 6+ years
Job Type: Contract to hire
Notice Period: Immediate joiners
Mandatory Skills: DataStage

DataStage Developer Responsibilities:
- Reviewing and discussing briefs with key personnel assigned to projects.
- Designing and building scalable DataStage solutions.
- Configuring clustered and distributed scalable parallel environments.
- Updating data within repositories, data marts, and data warehouses.
- Assisting project leaders in determining project timelines and objectives.
- Monitoring jobs and identifying bottlenecks in the data processing pipeline.
- Testing and troubleshooting problems in ETL system designs and processes.
- Improving existing ETL approaches and solutions used by the company.
- Providing support to customers on issues relating to the storage, handling, and access of data.

DataStage Developer Requirements:
- Bachelor's degree in computer science, information systems, or a similar field.
- Demonstrable experience as a DataStage developer.
- IBM DataStage certification or a similar qualification.
- Proficiency in SQL or another relevant coding language.
- Experience with or understanding of other ETL tools, such as Informatica, Oracle ETL, or Xplenty.
- Knowledge of data modeling, database design, and the data warehousing ecosystem.
- Skilled at the ideation, design, and deployment of DataStage solutions.
- Excellent analytical and problem-solving skills.
- The ability to work within a multidisciplinary team.

Posted 8 hours ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Senior Data Modeller – Telecom Domain

About the Role:
We are seeking an experienced senior telecom data modeller to join our team. In this role, you will be responsible for the design and standardization of enterprise-wide data models across multiple domains such as Customer, Product, Billing, and Network. The ideal candidate will work closely with cross-functional teams to translate business needs into scalable and governed data structures. You will work closely with customers and technology partners to deliver data solutions that address complex telecommunications business requirements, including customer experience management, network optimization, revenue assurance, and digital transformation initiatives.

Responsibilities:
- Design logical and physical data models aligned with enterprise and industry standards.
- Develop comprehensive data models aligned with TM Forum guidelines for telecommunications domains such as Customer, Product, Service, Resource, and Partner management.
- Create and maintain data models for the Customer, Product, Usage, and Service domains.
- Align models with TM Forum SID, telecom standards, and data mesh principles.
- Translate business requirements into normalized and analytical schemas (Star/Snowflake).
- Define and maintain entity relationships, hierarchy levels (Customer - Account - MSISDN), and attribute lineage.
- Standardize attribute definitions across systems and simplify legacy structures.
- Collaborate with engineering teams to implement models in cloud data platforms (e.g., Databricks).
- Collaborate with domain stewards to simplify and standardize legacy data structures.
- Work with governance teams to tag attributes for privacy, compliance, and data quality.
- Document metadata and lineage, and maintain version control of data models.
- Support analytics, reporting, and machine learning teams by enabling standardized data access.
- Design solutions leveraging Microsoft Azure and Databricks for telecom data processing and analytics.

Qualifications:
- Bachelor's or master's degree in Computer Science, Telecommunications Engineering, Data Science, or a related technical field.
- 7+ years of experience in data modelling roles, with at least 3-4 years in the telecommunications industry.
- Hands-on experience building data models and platforms aligned with TM Forum standards and telecommunications business processes.
- Excellent understanding of TM Forum SID/eTOM/ODA.
- Strong experience with data modeling tools (Azure Analysis Services, SSAS, dbt, Informatica).
- Hands-on experience with modern cloud data platforms (Databricks, Azure Synapse, Snowflake).
- Deep understanding of data warehousing concepts and normalized/denormalized models.
- Proven experience in telecom data modeling (CRM, billing, network usage, campaigns).
- Expertise in SQL, data profiling, schema design, and metadata documentation.
- Familiarity with domain-driven design, data mesh, and modular architecture.
- Experience in large-scale transformation or modernization programs.
- Knowledge of regulatory frameworks such as GDPR or data privacy-by-design.
- Background in telecom, networking, or other data-rich industries.
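
For context, here is a minimal, self-contained sketch of the Customer - Account - MSISDN hierarchy feeding a star-style usage schema, the kind of structure this role would formalize. All table and column names are hypothetical illustrations, not TM Forum SID entities.

```python
import sqlite3

# Toy dimensional model: a normalized Customer -> Account -> MSISDN dimension
# chain plus a usage fact table. Names are hypothetical, not TM Forum SID.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id   INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    segment       TEXT
);
CREATE TABLE dim_account (
    account_id    INTEGER PRIMARY KEY,
    customer_id   INTEGER NOT NULL REFERENCES dim_customer(customer_id),
    billing_cycle TEXT
);
CREATE TABLE dim_msisdn (
    msisdn        TEXT PRIMARY KEY,   -- subscriber number
    account_id    INTEGER NOT NULL REFERENCES dim_account(account_id),
    activation_dt TEXT
);
CREATE TABLE fact_usage (
    usage_id  INTEGER PRIMARY KEY,
    msisdn    TEXT NOT NULL REFERENCES dim_msisdn(msisdn),
    event_dt  TEXT NOT NULL,
    mb_used   REAL,
    call_secs INTEGER
);
""")

# Roll usage up the hierarchy: MSISDN -> Account -> Customer.
rollup = """
SELECT c.customer_name, SUM(f.mb_used) AS total_mb
FROM fact_usage f
JOIN dim_msisdn   m ON m.msisdn      = f.msisdn
JOIN dim_account  a ON a.account_id  = m.account_id
JOIN dim_customer c ON c.customer_id = a.customer_id
GROUP BY c.customer_name;
"""
print(conn.execute(rollup).fetchall())
```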

Posted 10 hours ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: PySpark Data Engineer
Location: Pune (Hybrid)
Contract: 6 to 11 months

Job Description:
We are looking for a skilled PySpark Data Engineer with 2 to 4 years of experience. The ideal candidate should have strong expertise in building and optimizing data pipelines using PySpark and experience working on cloud platforms such as Azure or AWS.

Mandatory skills:
- Strong expertise in PySpark
- Experience with PySpark on Azure or AWS
- Good understanding of Informatica PowerCenter
- Good understanding of data warehousing concepts and SQL
- Excellent analytical and troubleshooting skills
- Excellent verbal and written communication skills

Good-to-have skills:
- Basic Unix commands and shell scripting
- Version control tools like Git
- DevOps experience
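
For a flavor of the work, here is a minimal PySpark pipeline sketch of the read-transform-write pattern such a role involves. The paths and column names are hypothetical; in practice they would point at ADLS (abfss://...) or S3 (s3a://...) locations depending on the cloud platform.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Hypothetical input location and schema.
raw = (spark.read
       .option("header", True)
       .csv("/data/raw/orders.csv"))

cleaned = (raw
           .dropDuplicates(["order_id"])                       # de-duplicate on the business key
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0))                       # basic data-quality rule

daily = (cleaned
         .groupBy("order_date")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("order_count")))

# Partitioned Parquet keeps downstream reads cheap.
daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_orders")
```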

Posted 10 hours ago

Apply

10.0 years

0 Lacs

Greater Kolkata Area

On-site

Hiring for " Data Governance Lead" in a leading Bank. Location-Kolkata The Data Governance Lead will be responsible for establishing, implementing, and maintaining data governance policies, frameworks, and processes to ensure data quality, compliance, and security within the bank. Require any Bachelor’s/Master’s degree in Computer Science, Information Management, Data Science, Engineering, or a related field with Overall 10+ years of experience in data governance, data management, or related roles within the banking or financial services sector with experience in enterprise data integration and management and working with Data warehouse technologies and Data governance solutions. Having 3+ years of practical experience configuring business glossaries, making dashboards, creating policies etc and Executing at least 2 large Data Governance, Quality and Master Data Management projects from inception to production, working as technology expert. Also with 3+ years’ experience in Data Standardization, Cleanse, transform and parse data. Preferred Certifications-CDMP, DAMA, EDM Council, IQINT Proficiency in Data Governance/Quality tools(e.g., Collibra, Informatica, Alation etc.). Proficient in Data warehousing and pipelining&Good to have Data Engineering experience Experience in working on Python, Big Data, Spark etc. &Cloud platforms (AWS, GCP etc.) Show more Show less

Posted 10 hours ago

Apply

15.0 years

0 Lacs

Greater Hyderabad Area

On-site

HCL Job Level: DGM - Data Management (Centre of Excellence)
Domain: Multi Tower
Role: Center of Excellence (Data Management)
Role Location: Hyderabad (Noida or Chennai as secondary locations)
Positions: 1
Experience: 15+ years

Job Profile:
- Support the Global Shared Services Strategy for Multi Tower Finance (P2P, O2C, R2R and FP&A) and Procurement tracks.
- Understand all processes in detail, their inter-dependence, the current technology landscape, and the organization structure.
- Ensure end-to-end data lifecycle management including ingestion, transformation, storage, and consumption, while maintaining data reliability, accuracy, and availability across enterprise systems, with a strong focus on the Enterprise Data Platform (EDP) as the central data repository.
- Collaborate with cross-functional teams to understand data requirements, identify gaps, and implement scalable solutions.
- Define and enforce data quality standards, validation rules, and monitoring mechanisms, while leading the architecture and deployment of scalable, fault-tolerant, and high-performance data pipelines to ensure consistent and trustworthy data delivery.
- Partner with IT and business teams to define and implement data access controls, ensuring compliance with data privacy and security regulations (e.g., GDPR, HIPAA).
- Understand governance and interaction models with client SMEs and drive discussions on project deliverables.
- Collaborate with business stakeholders to define data SLAs (Service Level Agreements) and ensure adherence through proactive monitoring and alerting.
- Act as a bridge between business and IT, translating business needs into technical solutions and ensuring alignment with strategic goals.
- Establish and maintain metadata management practices, including data lineage, cataloging, and business glossary development.
- Propose feasible solutions, both interim and long-term, to resolve problem statements and address key priorities; solutioning must be at a strategic level and at L2/L3 level.
- Drive alignment of processes, people, technology and best practices, enabling optimization, breaking silos, eliminating redundant methods, and standardizing processes and controls across the entire engagement for data management.
- Identify process variations across regions and businesses and evaluate standardization opportunities by defining golden processes for data collection and data management.

Required Profile/Experience:
- Deep understanding of all Finance towers and Procurement.
- Strong understanding of data management principles, data architecture, and data governance.
- Hands-on experience with data integration tools, ETL/ELT processes, and cloud-based data platforms.
- A proven track record in managing tool integrations and ensuring accurate, high-performance data flow, with strong expertise in data quality frameworks, monitoring tools, and performance optimization techniques, and a solid foundation in data modeling, metadata management, and master data management (MDM) concepts.
- Leadership capability: relevant leadership experience running large delivery operations and driving multiple enterprise-level initiatives and programs with high business impact.
- BPO experience: relevant experience in BPO services, especially in the Americas, is desired.
- Transformation: should have led and delivered at least 2-3 data transformation projects covering application integrations and master data management.
- Tools and industry benchmarks: knowledge of industry-wide trends in F&A tools, platforms and benchmarks (Azure Data Lake, AWS, GCP).
- Customer-facing skills: proficient in leading meetings and presentations with customers using strong product-level material.

Education Requirement:
- B.E./B.Tech/MCA or equivalent in Computer Science, Information Systems, or a related field.
- Certifications in data management tools or platforms (e.g., Informatica, Talend, Azure Data Engineer) are preferred.

Posted 10 hours ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Us

Location: Hyderabad, India
Department: Product R&D
Level: Professional
Working Pattern: Work from office
Benefits: Benefits at Ideagen
DEI: DEI strategy
Salary: this will be discussed at the next stage of the process; if you have any questions, please feel free to reach out!

We are seeking a Technical Business Analyst who will play a crucial role in ensuring smooth and efficient data migration and integration between diverse systems with varying architectures, databases, and APIs. This role is primarily responsible for translating complex business requirements into actionable specifications for data engineers to build and implement data pipelines.

Responsibilities:
- Conduct thorough business analysis of source and target systems involved in data migrations and integrations.
- Develop a deep understanding of the functional and technical aspects of both systems, including their operational workflows and data structures.
- Identify and document system modules and the corresponding relationships between the two systems.
- Prepare migration/integration scoping documents that outline the system objects to be migrated or integrated.
- Define and document detailed field-to-field data mapping for various objects, specifying how data fields from the source system map to the target system.
- Identify, analyze, and document migration criteria, considerations, limitations, and required data transformations.
- Collaborate with system owners, business stakeholders, and the data operations team to ensure migration requirements are fully captured and aligned with business objectives.
- Work closely with data engineers to facilitate automation of migration/integration processes.
- Support data validation and reconciliation efforts post-migration to ensure data accuracy and integrity.
- Maintain clear and structured documentation to support future migrations and integrations.

The ideal candidate will bridge the gap between business and technical teams, ensuring successful and seamless data transfers.

Competencies, Characteristics & Traits

Mandatory experience:
- Minimum 4 years of experience preparing specifications and liaising on data engineering and data migration projects.
- Experience documenting technical requirements from business needs to assist data engineers in building pipelines.
- Good knowledge of data migration and engineering processes and concepts.
- Proficiency in SQL and data analysis tools.
- Understanding of cloud and on-premises database technologies and application services.
- Experience with agile project practices.
- Excellent written and verbal communication skills to effectively interact with both technical and non-technical stakeholders.
- Critical thinking and collaboration skills.
- Ability to analyze complex data issues, identify root causes, and propose solutions.

Essential skills and experience:
- Experience liaising on data engineering and data migration projects.
- Experience documenting technical requirements from business needs to assist data engineers in building pipelines.
- Proven experience working with relational databases (e.g., SQL Server, Oracle, MySQL), data structures, and APIs.
- Good knowledge of data migration and engineering processes and concepts.
- Experience with data modeling documentation and related tools.
- Proficiency in SQL and data analysis tools.
- Excellent written and verbal communication skills to effectively interact with both technical and non-technical stakeholders.

Desirable:
- Understanding of cloud and on-premises database technologies and application services.
- Experience with migration tools such as SnapLogic, Talend, Informatica, Fivetran, or similar.
- Industry-specific knowledge in Audit, Healthcare and Aviation is a plus.
- Experience with agile project practices.
- Business Analysis certifications (CBAP, CCBA, PMI-PBA) are a plus.

About Ideagen
Ideagen is the invisible force behind many things we rely on every day, from keeping airplanes soaring in the sky, to ensuring the food on our tables is safe, to helping doctors and nurses care for the sick. So, when you think of Ideagen, think of it as the silent teammate that's always working behind the scenes to help those people who make our lives safer and better. Every day, millions of people are kept safe using Ideagen software. We have offices all over the world, including America, Australia, Malaysia and India, with people doing lots of different and exciting jobs.

What is next?
If your application meets the requirements for this role, our Talent Acquisition team will be in touch to guide you through the next steps. To ensure a flexible and inclusive process, please let us know if you require any reasonable adjustments by contacting us at recruitment@ideagen.com. All matters will be treated with strict confidence. At Ideagen, we value the importance of work-life balance and welcome candidates seeking flexible or part-time working arrangements. If this is something you are interested in, please let us know during the application process. Enhance your career and make the world a safer place!

Posted 10 hours ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are looking for an SSIS Developer with experience maintaining ETL solutions using Microsoft SQL Server Integration Services (SSIS). The candidate should have extensive hands-on experience in data migration, data transformation, and integration workflows between multiple systems, with preferred exposure to Oracle Cloud Infrastructure (OCI).

No. of Resources Required: 2 (1 resource with 5+ years' experience and 1 resource with 3+ years' experience). Please find below the JD for the data migration role requirement.

Job Description:
We are looking for a highly skilled and experienced Senior SSIS Developer to design, develop, deploy, and maintain ETL solutions using Microsoft SQL Server Integration Services (SSIS). The candidate should have extensive hands-on experience in data migration, data transformation, and integration workflows between multiple systems, with preferred exposure to Oracle Cloud Infrastructure (OCI).

Job Location: Corporate Office, Gurgaon

Key Responsibilities:
- Design, develop, and maintain complex SSIS packages for ETL processes across different environments.
- Perform end-to-end data migration from legacy systems to modern platforms, ensuring data quality, integrity, and performance.
- Work closely with business analysts and data architects to understand data integration requirements.
- Optimize ETL workflows for performance and reliability, including incremental loads, batch processing, and error handling.
- Schedule and automate SSIS packages using SQL Server Agent or other tools.
- Conduct root cause analysis and provide solutions for data-related issues in production systems.
- Develop and maintain technical documentation, including data mapping, transformation logic, and process flow diagrams.
- Support integration of data between on-premises systems and Oracle Cloud (OCI) using SSIS and/or other middleware tools.
- Participate in code reviews, unit testing, and deployment support.

Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent practical experience).

Required Skills:
- 3-7 years of hands-on experience developing SSIS packages for complex ETL workflows.
- Strong SQL/T-SQL skills for querying, data manipulation, and performance tuning.
- Solid understanding of data migration principles, including historical data load, data validation, and reconciliation techniques.
- Experience working with various source/target systems such as flat files, Excel, Oracle, DB2, and SQL Server.
- Good knowledge of job scheduling and automation techniques.

Preferred Skills:
- Exposure to or working experience with Oracle Cloud Infrastructure (OCI), especially in data transfer, integration, and schema migration.
- Familiarity with on-premises-to-cloud and cloud-to-cloud data integration patterns.
- Knowledge of Azure Data Factory, Informatica, or other ETL tools is a plus.
- Experience in .NET or C# for custom script components in SSIS is advantageous.
- Understanding of data warehousing and data lake concepts.

If interested, kindly revert with your resume along with the details below to amit.ranjan@binarysemantics.com:
- Total experience
- Years of experience in SSIS development
- Years of experience maintaining ETL solutions using SSIS
- Years of experience in data migration/data transformation and integration workflows between multiple systems
- Years of experience in Oracle Cloud Infrastructure (OCI)
- Current location
- Home town
- Reason for change
- Minimum joining time

Regards,
Amit Ranjan
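
As a rough illustration of the incremental-load pattern this listing mentions, the sketch below drives a T-SQL MERGE from Python via pyodbc, the same upsert logic an SSIS data flow with a lookup/conditional split implements. The connection string, schema, table, and watermark names are hypothetical.

```python
import pyodbc

# Hypothetical connection string and table names, for illustration only.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=etl-db;DATABASE=dw;Trusted_Connection=yes;"
)

# Incremental upsert: pull only rows changed since the last watermark,
# then MERGE them into the target fact table.
incremental_merge = """
DECLARE @last_load datetime2 =
    (SELECT MAX(loaded_at) FROM dw.etl_watermark WHERE table_name = 'orders');

MERGE dw.fact_orders AS tgt
USING (
    SELECT order_id, customer_id, amount, updated_at
    FROM staging.orders
    WHERE updated_at > @last_load
) AS src
ON tgt.order_id = src.order_id
WHEN MATCHED THEN
    UPDATE SET tgt.customer_id = src.customer_id,
               tgt.amount      = src.amount,
               tgt.updated_at  = src.updated_at
WHEN NOT MATCHED THEN
    INSERT (order_id, customer_id, amount, updated_at)
    VALUES (src.order_id, src.customer_id, src.amount, src.updated_at);
"""

cur = conn.cursor()
cur.execute(incremental_merge)
conn.commit()
cur.close()
```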

Posted 10 hours ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Greetings from Infosys BPM Ltd.

Exclusive Women's Walk-in Drive

We are hiring for Content and Technical Writer, ETL DB Testing, ETL Testing Automation, .NET, and Python Developer skills. Please walk in for an interview on 20th June 2025 at our Chennai location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application. Please mention your Candidate ID on top of the resume.

https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140

Interview details
Interview Date: 20th June 2025
Interview Time: 10 AM till 1 PM
Interview Venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu

Please find below the job descriptions for your reference:
- Work from Office
- Rotational shifts
- Minimum 2 years of project experience is mandatory

Job Description: Content and Technical Writer
- Develop high-quality technical documents, including user manuals, guides, and release notes.
- Collaborate with cross-functional teams to gather requirements and create accurate documentation.
- Conduct functional testing and manual testing to ensure compliance with FDA regulations.
- Ensure adherence to ISO standards and maintain a clean, organized document management system.
- Strong understanding of the Infra domain.
- A technical writer who can convert complex technical concepts into easy-to-consume documents for the targeted audience, and who will also mentor the team on technical writing.

Job Description: ETL DB Testing
- Strong experience in ETL testing, data warehousing, and business intelligence.
- Strong proficiency in SQL.
- Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
- Solid understanding of data warehousing concepts, database systems, and quality assurance.
- Experience with test planning, test case development, and test execution.
- Experience writing complex SQL queries and using SQL tools is a must, with exposure to various data analytical functions.
- Familiarity with defect tracking tools (e.g., Jira).
- Experience with cloud platforms like AWS, Azure, or GCP is a plus.
- Experience with Python or other scripting languages for test automation is a plus.
- Experience with data quality tools is a plus.
- Experience in testing large datasets.
- Experience in agile development is a must.
- Understanding of Oracle Database and UNIX/VMC systems is a must.

Job Description: ETL Testing Automation
- Strong experience in ETL testing and automation.
- Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
- Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
- Hands-on experience in developing and maintaining test automation frameworks.
- Proficiency in at least one programming language (e.g., Python, Java).
- Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
- Strong understanding of data warehousing concepts and methodologies.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Experience with cloud-based data warehouses like Snowflake, Redshift, BigQuery is a plus.
- Experience with data quality tools is a plus.

Job Description: .NET
- Should have worked on a .NET development/implementation/support project.
- Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, Azure.
- Must have experience in web services, Web API, REST services, HTML, CSS3.
- Understand architecture requirements and ensure effective design, development, validation and support activities.

REGISTRATION PROCESS:
The Candidate ID & SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to complete the registration. (Candidates without registration & assessment will not be allowed for the interview.)

Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to the registered email ID.

SHL Test (AMCAT ID) registration process:
This assessment is proctored; candidates are evaluated on basic analytics, English comprehension and writex (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots in the top right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
- Personal details: name, email address, mobile number, PAN number.
- Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
- Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
- Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
- Interview mode: walk-in.

Attempt all questions in the SHL assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more toggles, multiple faces detected, face not detected, or any malpractice will result in rejection. Once you've finished, submit the assessment and make a note of the 15-digit AMCAT ID used for the assessment.

Documents to carry:
- A note of your Candidate ID & AMCAT ID along with the registered email ID.
- 2 sets of your updated resume/CV (hard copy).
- Original ID proof for security clearance.
- An individual headphone/Bluetooth headset for the interview.

Pointers to note:
- Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
- An original government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team

Posted 11 hours ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary:
We are looking for a skilled ETL Tester with hands-on experience validating data pipelines and data transformations in an AWS-based ecosystem. The ideal candidate should have a strong background in ETL testing, a solid understanding of data warehousing concepts, and proficiency with AWS tools and services such as S3, Redshift, Glue, Athena, and Lambda.

Key Responsibilities:
- Design and execute ETL test cases for data ingestion, transformation, and loading processes.
- Perform data validation and reconciliation across source systems, staging, and target layers (e.g., S3, Redshift, RDS).
- Understand data mappings and business rules; write SQL queries to validate transformation logic.
- Conduct end-to-end testing, including functional, regression, and performance testing of ETL jobs.
- Work closely with developers, data engineers, and business analysts to identify and troubleshoot defects.
- Validate data pipelines orchestrated through AWS Glue, Step Functions, and Lambda functions.
- Utilize Athena and Redshift Spectrum for testing data stored in S3.
- Collaborate using tools like JIRA, Confluence, Git, and CI/CD pipelines.
- Prepare detailed test documentation, including test plans, test cases, and test summary reports.

Required Skills:
- 3-4 years of experience in ETL/data warehouse testing.
- Strong SQL skills for data validation across large datasets.
- Working knowledge of AWS services such as S3, Redshift, Glue, Athena, Lambda, and CloudWatch.
- Experience testing batch and streaming data pipelines.
- Familiarity with Python or PySpark is a plus for data transformation or test automation.
- Experience using ETL tools (e.g., Informatica, Talend, or AWS Glue ETL scripts).
- Knowledge of Agile/Scrum methodology.
- Understanding of data quality frameworks and test automation practices.

Good to Have:
- Exposure to BI tools like QuickSight, Tableau, or Power BI.
- Basic understanding of data lake and data lakehouse architectures.
- Experience working with JSON, Parquet, and other semi-structured data formats.
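
To make the reconciliation responsibility concrete, here is a minimal sketch of a source-to-target check run through Athena using the awswrangler library. The database, table, and column names are hypothetical placeholders.

```python
import awswrangler as wr

# Compare row counts and an amount checksum between the staging layer and
# the curated target, both queryable through Athena over S3.
# staging_db/curated_db and the table names are hypothetical.
RECON_SQL = """
SELECT 'staging' AS layer, COUNT(*) AS row_count, SUM(amount) AS amount_sum
FROM staging_db.orders
UNION ALL
SELECT 'curated' AS layer, COUNT(*) AS row_count, SUM(amount) AS amount_sum
FROM curated_db.fact_orders
"""

df = wr.athena.read_sql_query(RECON_SQL, database="staging_db")

staging = df[df.layer == "staging"].iloc[0]
curated = df[df.layer == "curated"].iloc[0]

# Fail loudly if the layers diverge; a real suite would log this to a report.
assert staging.row_count == curated.row_count, "row counts diverge between layers"
assert abs(staging.amount_sum - curated.amount_sum) < 0.01, "amount checksum mismatch"
print("reconciliation passed:", dict(curated))
```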

Posted 11 hours ago

Apply

0.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

Experience: 6+ years
Work Mode: Hybrid

Job Summary:
We are seeking a skilled Informatica ETL Developer with 5+ years of experience in ETL and business intelligence projects. The ideal candidate will have a strong background in Informatica PowerCenter, a solid understanding of data warehousing concepts, and hands-on experience in SQL, performance tuning, and production support. This role involves designing and maintaining robust ETL pipelines to support digital transformation initiatives for clients in the manufacturing, automotive, transportation, and engineering domains.

Key Responsibilities:
- Design, develop, and maintain ETL workflows using Informatica PowerCenter.
- Troubleshoot and optimize ETL jobs for performance and reliability.
- Analyze complex data sets and write advanced SQL queries for data validation and transformation.
- Collaborate with data architects and business analysts to implement data warehousing solutions.
- Apply SDLC methodologies throughout the ETL development lifecycle.
- Support production environments by identifying and resolving data and performance issues.
- Work with Unix shell scripting for job automation and scheduling.
- Contribute to the design of technical architectures that support digital transformation.

Required Skills:
- 3-5 years of hands-on experience with Informatica PowerCenter.
- Proficiency in SQL and familiarity with NoSQL platforms.
- Experience in ETL performance tuning and troubleshooting.
- Solid understanding of Unix/Linux environments and scripting.
- Excellent verbal and written communication skills.

Preferred Qualifications:
- AWS certification or experience with cloud-based data integration is a plus.
- Exposure to data modeling and data governance practices.

Job Type: Full-time
Pay: From ₹1,000,000.00 per year
Location Type: In-person
Schedule: Monday to Friday
Ability to commute/relocate: Hyderabad, Telangana: reliably commute or plan to relocate before starting work (required).

Application Questions:
- What is your current CTC?
- What is your expected CTC?
- What is your current location?
- What is your notice period/LWD?
- Are you comfortable attending an L2 face-to-face interview in Hyderabad?

Experience:
- Informatica PowerCenter: 5 years (required)
- Total work: 6 years (required)

Work Location: In person
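
As a rough sketch of the Unix job-automation side of this role, the snippet below wraps PowerCenter's pmcmd command-line tool in Python. The domain, service, folder, and workflow names are placeholders, and exact pmcmd options can vary by PowerCenter version, so treat this as illustrative only.

```python
import subprocess

# Placeholder PowerCenter connection details; adjust for your environment.
PMCMD_ARGS = [
    "pmcmd", "startworkflow",
    "-sv", "IS_PROD",          # integration service name (hypothetical)
    "-d", "Domain_ETL",        # domain name (hypothetical)
    "-u", "etl_user",
    "-p", "etl_password",      # in practice, prefer an encrypted password variable
    "-f", "SALES_DW",          # repository folder (hypothetical)
    "-wait",                   # block until the workflow finishes
    "wf_load_daily_sales",     # workflow name (hypothetical)
]

result = subprocess.run(PMCMD_ARGS, capture_output=True, text=True)
print(result.stdout)
if result.returncode != 0:
    # A non-zero exit means the workflow failed or could not be started;
    # surface stderr so the scheduler (cron, Autosys, etc.) records the cause.
    raise SystemExit(f"workflow failed (rc={result.returncode}): {result.stderr}")
```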

Posted 12 hours ago

Apply

15.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Title: Technical Architect - Data Governance & MDM
Experience: 15+ years
Location: Mumbai/Pune/Bangalore

Role Overview:
The Technical Architect specializing in Data Governance and Master Data Management (MDM) designs, implements, and optimizes enterprise data solutions. The jobholder has expertise in tools like Collibra, Informatica, InfoSphere, Reltio, and other MDM platforms, ensuring data quality, compliance, and governance across the organization.

Responsibilities:
- Design and implement data governance frameworks and MDM solutions using tools like Collibra, Informatica, InfoSphere, and Reltio.
- Architect and optimize strategies for data quality, metadata management, and data stewardship.
- Collaborate with cross-functional teams to integrate MDM solutions with existing systems.
- Establish best practices for data governance, security, and compliance.
- Monitor and troubleshoot MDM environments for performance and reliability.
- Provide technical leadership and guidance to data teams.
- Stay updated on advancements in data governance and MDM technologies.

Key Technical Skills:
- 10+ years of experience working on DG/MDM projects.
- Strong grasp of data governance concepts.
- Hands-on experience with different DG tools/services.
- Hands-on experience with reference data and taxonomy.
- Strong understanding of data governance, data quality, data profiling, data standards, regulations, and security.
- Match-and-merge strategy.
- Design and implementation of MDM architecture and data models.
- Usage of Spark capabilities.
- Statistics to deduce meaning from vast enterprise-level data.
- Different data visualization means of analyzing huge data sets.
- Good-to-have knowledge of Python/R/Scala.
- Experience with data governance on-premises and on-cloud.
- Understanding of MDM and the Customer, Product, and Vendor domains and related artifacts.
- Experience working on proposals, customer workshops, assessments, etc. is preferred.
- Good communication and presentation skills are a must.
- Technology stack: Collibra, IBM MDM, Reltio, InfoSphere.

Eligibility Criteria:
- 15+ years of total experience.
- Bachelor's degree in Computer Science, Data Management, or a related field.
- Proven experience as a Technical Architect in Data Governance and MDM.
- Certifications in relevant MDM tools (e.g., Collibra Data Governance; Informatica, InfoSphere, or Reltio MDM).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Proficiency in tools like Collibra, Informatica, InfoSphere, Reltio, and similar platforms.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.

Interested candidates can apply directly. Alternatively, you can also send your resume to ansari.m@atos.net

Posted 12 hours ago

Apply

3.0 years

0 Lacs

India

Remote

Title: Data Engineer
Location: Remote
Employment type: Full-time with BayOne

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do:
- Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory.
- Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable.
- Work on modern data lakehouse architectures and contribute to data governance and quality frameworks.

Tech Stack: Azure | Databricks | PySpark | SQL

What We're Looking For:
- 3+ years of experience in data engineering or analytics engineering.
- Hands-on experience with cloud data platforms and large-scale data processing.
- Strong problem-solving mindset and a passion for clean, efficient data design.

Job Description:
- Minimum 3 years of experience with modern data engineering/data warehousing/data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design, and dimensional data modelling.
- Solid knowledge of data warehouse best practices, development standards, and methodologies.
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL.
- An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced, dynamic environment.
- Excellent communication and teamwork abilities.

Nice-to-Have Skills:
- Knowledge of Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB.
- SAP ECC/S/4 and HANA knowledge.
- Intermediate knowledge of Power BI.
- Azure DevOps and CI/CD deployments, cloud migration methodologies and processes.

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, or basis of disability or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
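
To give a concrete flavor of the lakehouse work described above, here is a minimal PySpark sketch of an idempotent upsert into a Delta table, the bread-and-butter pattern on Databricks. The paths and column names are illustrative assumptions.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer_upsert").getOrCreate()

# Hypothetical locations: an incoming batch in cloud storage and a Delta target.
updates = spark.read.parquet("/mnt/landing/customers_batch")
target_path = "/mnt/lakehouse/silver/customers"

if DeltaTable.isDeltaTable(spark, target_path):
    # Idempotent upsert: match on the business key, update changed rows,
    # insert new ones. Re-running the same batch yields the same result.
    (DeltaTable.forPath(spark, target_path).alias("t")
        .merge(updates.alias("s"), "t.customer_id = s.customer_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())
else:
    # First load: create the Delta table from the initial batch.
    updates.write.format("delta").save(target_path)
```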

Posted 12 hours ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Greeting from Infosys BPM Ltd., We are hiring for Content and Technical writer, ETL DB Testing, ETL Testing Automation, .NET, Python Developer skills. Please walk-in for interview on 18th & 19th June 2025 at Chennai location Note: Please carry copy of this email to the venue and make sure you register your application before attending the walk-in. Please use below link to apply and register your application. Please mention Candidate ID on top of the Resume *** https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140 Interview details Interview Date: 18th & 19th June 2025 Interview Time: 10 AM till 1 PM Interview Venue:TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, TamilNadu Please find below Job Description for your reference: Work from Office*** Rotational Shifts Min 2 years of experience on project is mandate*** Job Description: Content and Technical writer Develop high-quality technical documents, including user manuals, guides, and release notes. Collaborate with cross-functional teams to gather requirements and create accurate documentation. Conduct functional testing and manual testing to ensure compliance with FDA regulations. Ensure adherence to ISO standards and maintain a clean, organized document management system. Strong understanding of Infra domain Technical writer that can convert complex technical concepts into easy to consume documents for the targeted audience. In addition, will also be a mentor to the team with technical writing. Job Description: ETL DB Testing Strong experience in ETL testing, data warehousing, and business intelligence. Strong proficiency in SQL. Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory). Solid understanding of Data Warehousing concepts, Database Systems and Quality Assurance. Experience with test planning, test case development, and test execution. Experience writing complex SQL Queries and using SQL tools is a must, exposure to various data analytical functions. Familiarity with defect tracking tools (e.g., Jira). Experience with cloud platforms like AWS, Azure, or GCP is a plus. Experience with Python or other scripting languages for test automation is a plus. Experience with data quality tools is a plus. Experience in testing of large datasets. Experience in agile development is must Understanding of Oracle Database and UNIX/VMC systems is a must Job Description: ETL Testing Automation Strong experience in ETL testing and automation. Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server). Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark). Hands-on experience in developing and maintaining test automation frameworks. Proficiency in at least one programming language (e.g., Python, Java). Experience with test automation tools (e.g., Selenium, PyTest, JUnit). Strong understanding of data warehousing concepts and methodologies. Experience with CI/CD pipelines and version control systems (e.g., Git). Experience with cloud-based data warehouses like Snowflake, Redshift, BigQuery is a plus. Experience with data quality tools is a plus. 
Job Description: .Net Should have worked on .Net development/implementation/Support project Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, Azure Must have experience in Web services, Web API, REST services, HTML, CSS3 Understand Architecture Requirements and ensure effective Design, Development, Validation and Support activities. REGISTRATION PROCESS: The Candidate ID & SHL Test(AMCAT ID) is mandatory to attend the interview. Please follow the below instructions to successfully complete the registration. (Talents without registration & assessment will not be allowed for the Interview). Candidate ID Registration process: STEP 1: Visit: https://career.infosys.com/joblist STEP 2: Click on "Register" and provide the required details and submit. STEP 3: Once submitted, Your Candidate ID(100XXXXXXXX) will be generated. STEP 4: The candidate ID will be shared to the registered Email ID. SHL Test(AMCAT ID) Registration process: This assessment is proctored, and talent gets evaluated on Basic analytics, English Comprehension and writex (email writing). STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0 STEP 2: Click on "Start new test" and follow the instructions to complete the assessment. STEP 3: Once completed, please make a note of the AMCAT ID( Access you Amcat id by clicking 3 dots on top right corner of screen). NOTE: During registration, you'll be asked to provide the following information: Personal Details: Name, Email Address, Mobile Number, PAN number. Availability: Acknowledgement of work schedule preferences (Shifts, Work from Office, Rotational Weekends, 24/7 availability, Transport Boundary) and reason for career change. Employment Details: Current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example). Candidate Information: 10-digit candidate ID starting with 100XXXXXXX, Gender, Source (e.g., Vendor name, Naukri/LinkedIn/Found it, or Direct), and Location Interview Mode: Walk-in Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. 5 or above toggles, multi face detected, face not detected, or any malpractice will be considered rejected Once you've finished, submit the assessment and make a note of the AMCAT ID (15 Digit) used for the assessment. Documents to Carry: Please have a note of Candidate ID & AMCAT ID along with registered Email ID. 
Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
Please carry 2 sets of your updated resume/CV (hard copy).
Please carry original ID proof for security clearance.
Please carry individual headphones/a Bluetooth headset for the interview.

Pointers to note:
An original government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team.
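For candidates preparing for the ETL DB Testing and ETL Testing Automation profiles above, the snippet below is a minimal sketch of a source-to-target reconciliation test written with PyTest, one of the automation tools the listing names. The src_orders/tgt_orders tables and the in-memory SQLite database are illustrative assumptions; a real suite would connect to the actual source and target systems (e.g., Oracle) instead.

```python
# A minimal sketch, assuming hypothetical src_orders/tgt_orders tables
# seeded into an in-memory SQLite database for illustration.
import sqlite3

import pytest


@pytest.fixture
def conn():
    # Stand-in warehouse with matching source and target rows.
    c = sqlite3.connect(":memory:")
    c.executescript(
        """
        CREATE TABLE src_orders (order_id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5);
        """
    )
    yield c
    c.close()


def test_row_counts_match(conn):
    # Completeness check: the load must not drop or duplicate rows.
    src = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt


def test_no_null_business_keys(conn):
    # Data-quality rule: the business key must never be NULL in the target.
    nulls = conn.execute(
        "SELECT COUNT(*) FROM tgt_orders WHERE order_id IS NULL"
    ).fetchone()[0]
    assert nulls == 0
```

The same pattern extends naturally to aggregate and checksum comparisons over much larger datasets.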

Posted 13 hours ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Greetings from Infosys BPM Ltd.

Exclusive Women's Walk-in Drive

We are hiring for Content and Technical Writer, ETL DB Testing, ETL Testing Automation, .NET, and Python Developer skills. Please walk in for an interview on 20th June 2025 at our Chennai location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application. Please mention your Candidate ID on top of your resume. ***

https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140

Interview details
Interview Date: 20th June 2025
Interview Time: 10 AM till 1 PM
Interview Venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu

Please find the job descriptions below for your reference:
Work from Office ***
Rotational Shifts
A minimum of 2 years of project experience is mandatory ***

Job Description: Content and Technical Writer
Develop high-quality technical documents, including user manuals, guides, and release notes.
Collaborate with cross-functional teams to gather requirements and create accurate documentation.
Conduct functional and manual testing to ensure compliance with FDA regulations.
Ensure adherence to ISO standards and maintain a clean, organized document management system.
Strong understanding of the Infra domain.
A technical writer who can convert complex technical concepts into easy-to-consume documents for the target audience, and who will also mentor the team on technical writing.

Job Description: ETL DB Testing
Strong experience in ETL testing, data warehousing, and business intelligence.
Strong proficiency in SQL.
Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
Solid understanding of data warehousing concepts, database systems, and quality assurance.
Experience with test planning, test case development, and test execution.
Experience writing complex SQL queries and using SQL tools is a must, with exposure to various data analytical functions.
Familiarity with defect-tracking tools (e.g., Jira).
Experience with cloud platforms like AWS, Azure, or GCP is a plus.
Experience with Python or other scripting languages for test automation is a plus.
Experience with data quality tools is a plus.
Experience in testing large datasets.
Experience in agile development is a must.
Understanding of Oracle Database and UNIX/VMC systems is a must.

Job Description: ETL Testing Automation
Strong experience in ETL testing and automation.
Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
Hands-on experience in developing and maintaining test automation frameworks.
Proficiency in at least one programming language (e.g., Python, Java).
Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
Strong understanding of data warehousing concepts and methodologies.
Experience with CI/CD pipelines and version control systems (e.g., Git).
Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus.
Experience with data quality tools is a plus.
Job Description: .NET
Should have worked on a .NET development/implementation/support project.
Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, and Azure.
Must have experience in web services, Web API, REST services, HTML, and CSS3.
Understand architecture requirements and ensure effective design, development, validation, and support activities.

REGISTRATION PROCESS:
The Candidate ID & SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to successfully complete the registration. (Candidates without registration & assessment will not be allowed for the interview.)

Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to the registered email ID.

SHL Test (AMCAT ID) registration process:
This assessment is proctored, and talent is evaluated on basic analytics, English comprehension, and WriteX (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots at the top right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
Personal Details: Name, Email Address, Mobile Number, PAN number.
Availability: Acknowledgement of work-schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
Employment Details: Current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
Candidate Information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/foundit, or direct), and location.
Interview Mode: Walk-in.

Attempt all questions in the SHL assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more tab toggles, multiple faces detected, face not detected, or any other malpractice will lead to rejection. Once you have finished, submit the assessment and make a note of the 15-digit AMCAT ID used for the assessment.

Documents to Carry:
Please have a note of your Candidate ID & AMCAT ID along with your registered email ID.
Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
Please carry 2 sets of your updated resume/CV (hard copy).
Please carry original ID proof for security clearance.
Please carry individual headphones/a Bluetooth headset for the interview.

Pointers to note:
An original government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team.

Posted 13 hours ago

Apply

0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site


Key Responsibilities
Develop and execute test cases for ETL processes and data migration across large datasets.
Perform data validation and verify source-to-target data mappings using advanced SQL.
Collaborate with developers, business analysts (BAs), and quality assurance (QA) teams to ensure data quality and integrity.
Report and track defects, ensuring timely resolution to maintain data accuracy and quality.
Automate and optimize data workflows and ETL pipelines.
Monitor the performance of data pipelines and troubleshoot data issues as needed.
Maintain detailed documentation for data processes, workflows, and system architecture.
Ensure data quality, integrity, and security.

Skills & Qualifications
Experience in ETL/data warehouse testing or a similar role.
Strong proficiency in SQL with a solid understanding of database concepts.
Hands-on experience with ETL tools like Informatica, Talend, SSIS, or similar.
Experience with data warehousing platforms (e.g., Snowflake) and performance tuning.
Experience in defect tracking and issue management using tools like Jira.
Familiarity with version control systems (e.g., Git) and CI/CD practices.
Good communication, collaboration, and documentation skills.
Solid understanding of data warehousing principles and ETL process design.
(ref:hirist.tech)
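As a hedged illustration of the source-to-target mapping validation this listing describes, the sketch below uses a set-difference query to surface source rows that never reached the target. The customer tables and the SQLite connection are hypothetical stand-ins; against Oracle the same check is usually written with MINUS instead of EXCEPT.

```python
# Illustrative source-to-target validation via set difference,
# using hypothetical src_customers/tgt_customers tables in SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE src_customers (id INTEGER, email TEXT);
    CREATE TABLE tgt_customers (id INTEGER, email TEXT);
    INSERT INTO src_customers VALUES (1, 'a@x.com'), (2, 'b@x.com');
    INSERT INTO tgt_customers VALUES (1, 'a@x.com');
    """
)

# Rows present in the source but missing (or transformed incorrectly)
# in the target; an empty result means the mapping reconciled cleanly.
missing = conn.execute(
    """
    SELECT id, email FROM src_customers
    EXCEPT
    SELECT id, email FROM tgt_customers
    """
).fetchall()

if missing:
    # In a real suite this finding would be logged as a defect (e.g., in Jira).
    print(f"{len(missing)} source rows not reconciled: {missing}")
conn.close()
```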

Posted 16 hours ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Major Duties
Monitor the production environment.
Identify and implement opportunities to improve production stability.
Ensure incidents are prioritized and worked in the proper order, and review backlog items.
Investigate, diagnose, and solve application issues.
Resolve problems in an analytical and logical manner, troubleshooting root causes and resolving production incidents.
Follow up on cross-team incidents to drive them to resolution.
Develop and deliver product changes and enhancements in a collaborative, agile team environment.
Build solutions to fix production issues and participate in ongoing software maintenance activities.
Understand, define, estimate, develop, test, deploy, and support change requests.
Monitor and attend to all alerts, and escalate production issues as needed to relevant teams and management.
Operate independently; have in-depth knowledge of the business unit/function.
Communicate with stakeholders and the business on escalated items.
As a subject-area expert, provide comprehensive, in-depth consulting to the team and partners at a high technical level.
Develop periodic goals, organize the work, set short-term priorities, monitor all activities, and ensure timely and accurate completion of the work.
Periodically engage with business partners to review progress and priorities, and develop and maintain rapport through professional interactions with clear, concise communications.
Ensure cross-functional duties, including bug fixes and scheduling changes, are scheduled and completed by the relevant teams.
Work with the team to resolve problems and improve production reliability, stability, and availability.
Follow the ITIL processes of Incident, Problem & Change Management.
Ability to solve complex technical issues.

Must Have:
8-12 years of professional experience in software maintenance/support/development with a strong programming/technical background.
80% technical and 20% managerial skills.
Proficient in working with ITIL/ITSM (ServiceNow) and data analysis.
Expert in Unix commands and scripting.
Working knowledge of SQL (preferably Oracle, MS SQL).
Experience in supporting ETL/EDM/MDM platforms using tools like SSIS, Informatica, Markit EDM, IBM InfoSphere DataStage. ETL experience is mandatory if EDM experience is not present.
Understanding of batch scheduling system usage and implementation concepts.
Trigger solutions using external schedulers (Control-M), services (process launchers & event watchers), and UI.
Well versed with the change management process and tools.
Experience in incident management, understanding of ticket workflows, and use of escalation.
Good understanding of MQ/Kafka (both consumer and producer solutions).
Good understanding of REST/SOAP.

Good to Have:
Proficient in Java and able to go into code to investigate and fix issues.
Understanding of DevOps, CI/CD & Agile techniques preferred.
Basic understanding of front-end technologies, such as React JS, JavaScript, HTML5, and CSS3.
Banking and financial services knowledge is preferred. More importantly, the candidate should have a strong technical background.
(ref:hirist.tech)
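For a flavor of the scripting side of this role, below is a minimal, assumption-laden sketch of a batch-log sweep a production-support engineer might automate. The log path and the ERROR line format are hypothetical, and the escalation step is a stub where a real ITSM call (e.g., a ServiceNow incident) would go.

```python
# A minimal sketch of a production log sweep, assuming a hypothetical
# log location and an "ERROR <job_name>" line format.
import re
from collections import Counter
from pathlib import Path

LOG_PATH = Path("/var/log/etl/batch.log")  # hypothetical location
ERROR_RE = re.compile(r"\bERROR\b\s+(?P<job>\S+)")


def scan_errors(path: Path) -> Counter:
    """Count ERROR lines per job so repeat offenders surface first."""
    counts: Counter = Counter()
    with path.open() as fh:
        for line in fh:
            m = ERROR_RE.search(line)
            if m:
                counts[m.group("job")] += 1
    return counts


if __name__ == "__main__":
    for job, n in scan_errors(LOG_PATH).most_common():
        # Placeholder for raising or updating an incident in the ITSM tool.
        print(f"escalate: job={job} errors={n}")
```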

Posted 16 hours ago

Apply

14.0 years

0 Lacs

Greater Kolkata Area

On-site


Position Overview
We are seeking a dynamic and experienced Program Manager to lead and oversee the Data Governance Program for a large banking organization. The Program Manager will be responsible for the successful execution of data governance initiatives, ensuring compliance with regulatory requirements, promoting data quality, and fostering a culture of data stewardship across the enterprise. This role requires a strategic thinker with exceptional leadership, communication, and organizational skills to align cross-functional teams and drive the adoption of governance frameworks.

Key Responsibilities

Program Leadership:
Develop and execute a comprehensive data governance strategy aligned with the organization's objectives and regulatory requirements.
Act as a liaison between senior leadership, stakeholders, and cross-functional teams to ensure program alignment and success.
Drive organizational change to establish a culture of data governance and stewardship.
Maintain a strong focus on program risk identification, timely reporting, and devising actions to address risks.
Perform cost-benefit analysis and justification of investments.

Planning and Project Management:
Project planning, scheduling, and tracking.
Work prioritization and resource planning.
Risk identification and reporting.
Team planning and management.
Status reporting.

Governance Framework Implementation:
Establish and manage a robust data governance framework, including policies, standards, roles, and responsibilities.
Implement data cataloging, metadata management, and data lineage tools to enhance data visibility and accessibility.
Oversee the creation of workflows and processes to ensure adherence to governance policies.

Stakeholder Engagement:
Report to CXO-level executives with program status updates, risk management, and outcomes.
Collaborate with business units, IT teams, and compliance officers to identify governance priorities and resolve data-related challenges.
Facilitate Data Governance Council meetings and ensure effective decision-making.
Serve as a point of contact for internal and external auditors regarding data governance-related queries.

Compliance and Risk Management:
Ensure adherence to industry regulations and banking-specific compliance requirements.
Identify and mitigate risks related to data usage, sharing, and security.

Monitoring and Reporting:
Develop key performance indicators (KPIs) and metrics to measure the effectiveness of the Data Governance Program.
Provide regular updates to CXO-level executive leadership on program status, risks, and outcomes.
Prepare and present audit and compliance reports as required.

Team Leadership and Mentorship:
Lead cross-functional teams, including data stewards, analysts, and governance professionals.
Provide training and mentoring to promote awareness and understanding of data governance practices.

Technical Expertise:
Understanding of data engineering principles and practices: a good understanding of data pipelines, data storage solutions, data quality concepts, and data security is crucial.
Familiarity with data engineering tools and technologies: this may include knowledge of ETL/ELT tools, Informatica IDMC, MDM, data warehousing solutions, Collibra data quality, cloud platforms (AWS, Azure, GCP), and data governance frameworks.

Qualifications
Bachelor's degree in computer science, data management, business administration, or a related field; MBA or equivalent experience preferred.
14+ years of experience in program management, with at least 6+ years focused on data governance or data management with MDM in the banking or financial services sector.
Strong knowledge of data governance frameworks, principles, and tools (e.g., Collibra, Informatica, Alation).
Experience with regulatory compliance requirements for the banking industry, such as GDPR, CCPA, BCBS 239, and AML/KYC regulations.
Proven track record of successfully managing large, complex programs with cross-functional teams.
Excellent communication and stakeholder management skills, with the ability to influence and align diverse groups.
Familiarity with data analytics, data quality management, and enterprise architecture concepts.
Certification in program or project management (e.g., PMP, PRINCE2) or data governance (e.g., DGSP, CDMP) is a plus.

Key Competencies
Strong strategic thinking and problem-solving skills.
Ability to work under pressure and manage multiple priorities.
Exceptional leadership and interpersonal skills.
Proficiency in program management tools and methodologies.
Strong analytical and decision-making capabilities.
(ref:hirist.tech)
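To make the KPI responsibility in this listing concrete, here is a small, hedged sketch of one metric governance programs commonly track: column-level completeness. The data frame and the column names are invented purely for illustration.

```python
# Hedged illustration of a data-quality KPI: completeness per column,
# computed with pandas over a toy frame with invented columns.
import pandas as pd

df = pd.DataFrame(
    {
        "customer_id": [1, 2, 3, None],
        "kyc_status": ["done", None, "done", "done"],
    }
)

# Completeness = share of non-null values per column, as a percentage.
completeness = df.notna().mean().mul(100).round(1)
print(completeness.to_string())
# A governance dashboard would trend these figures and alert on thresholds.
```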

Posted 17 hours ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Key Responsibilities
Set up and maintain monitoring dashboards for ETL jobs using Datadog, including metrics, logs, and alerts.
Monitor daily ETL workflows and proactively detect and resolve data pipeline failures or performance issues.
Create Datadog monitors for job status (success/failure), job duration, resource utilization, and error trends.
Work closely with data engineering teams to onboard new pipelines and ensure observability best practices.
Integrate Datadog with adjacent tools.
Conduct root cause analysis of ETL failures and performance bottlenecks.
Tune thresholds, baselines, and anomaly detection settings in Datadog to reduce false positives.
Document incident-handling procedures and contribute to improving overall ETL monitoring maturity.
Participate in on-call rotations or scheduled support windows to manage ETL health.

Required Skills & Qualifications
3+ years of experience in ETL/data pipeline monitoring, preferably in a cloud or hybrid environment.
Proficiency in using Datadog for metrics, logging, alerting, and dashboards.
Strong understanding of ETL concepts and tools (e.g., Airflow, Informatica, Talend, AWS Glue, or dbt).
Familiarity with SQL and querying large datasets.
Experience working with Python, shell scripting, or Bash for automation and log parsing.
Understanding of cloud platforms (AWS/GCP/Azure) and services like S3, Redshift, BigQuery, etc.
Knowledge of CI/CD and DevOps principles related to data infrastructure monitoring.

Preferred Qualifications
Experience with distributed tracing and APM in Datadog.
Prior experience monitoring Spark, Kafka, or streaming pipelines.
Familiarity with ticketing tools (e.g., Jira, ServiceNow) and incident management workflows.
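As a sketch of the monitor-creation work this listing describes, the snippet below uses the datadog Python client to create a job-duration metric alert. The metric name, tag, threshold, and notification handle are illustrative assumptions rather than anything specified in the listing.

```python
# A sketch of creating an ETL job-duration monitor with the datadog
# Python library; metric name, tag, and threshold are assumptions.
from datadog import initialize, api

initialize(api_key="<DD_API_KEY>", app_key="<DD_APP_KEY>")

api.Monitor.create(
    type="metric alert",
    # Alert when the (hypothetical) nightly load averages over an hour.
    query="avg(last_30m):avg:etl.job.duration{pipeline:nightly_load} > 3600",
    name="ETL nightly_load runtime too long",
    message="nightly_load exceeded 1h. @slack-data-eng please investigate.",
    tags=["team:data-eng", "service:etl"],
    # Also alert if the job stops reporting the metric at all.
    options={"notify_no_data": True, "no_data_timeframe": 60},
)
```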

Posted 18 hours ago

Apply

8.0 years

0 Lacs

Andhra Pradesh, India

On-site


Sr. Developer with special emphasis on, and 8 to 10 years of experience in, PySpark and Python, along with ETL tools (Talend / Ab Initio / Informatica / similar). Should also have good exposure to ETL tools in order to understand existing flows, rewrite them in Python and PySpark, and execute the test plans.

8-10 years of experience in designing and developing PySpark applications and ETL jobs using ETL tools.
5+ years of sound knowledge of PySpark to implement ETL logic.
Strong understanding of front-end technologies such as HTML, CSS, React, and JavaScript.
Proficiency in data modeling and design, including PL/SQL development.
Creating test plans to understand the current ETL flow and rewriting it in PySpark.
Providing ongoing support and maintenance for ETL applications, including troubleshooting and resolving issues.
Expertise in practices like Agile, peer reviews, and continuous integration.
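Below is a minimal PySpark sketch of the kind of ETL rewrite this listing describes: the extract, transform, and load steps of an Ab Initio or Informatica graph expressed as DataFrame operations. The paths, column names, and CSV source format are hypothetical assumptions.

```python
# A minimal ETL rewrite sketch in PySpark; paths and columns are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read the raw feed (location and schema are illustrative).
raw = spark.read.option("header", True).csv("s3://raw-zone/orders/")

# Transform: the cleansing an ETL-tool graph would express visually.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("order_id").isNotNull())
    .dropDuplicates(["order_id"])
)

# Load: write the curated output partitioned by business date.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-zone/orders/"
)
```

Test plans for such rewrites typically reconcile row counts and key aggregates between the legacy tool's output and the PySpark output before cutover.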

Posted 18 hours ago

Apply

4.0 years

0 Lacs

Andhra Pradesh, India

On-site


At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives.

As a Salesforce consulting generalist at PwC, you will possess a broad range of consulting skills and experience across various Salesforce applications. You will provide consulting services to clients, analysing their needs, implementing software solutions, and offering training and support for effective utilisation of Salesforce applications. Your versatile knowledge will allow you to assist clients in optimising operational efficiency and achieving their strategic objectives.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Apply a learning mindset and take ownership for your own development.
Appreciate diverse perspectives, needs, and feelings of others.
Adopt habits to sustain high performance and develop your potential.
Actively listen, ask questions to check understanding, and clearly express ideas.
Seek, reflect, act on, and give feedback.
Gather information from a range of sources to analyse facts and discern patterns.
Commit to understanding how the business works and building commercial awareness.
Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.

Job Title: Salesforce Lightning, LWC Developer
Job Level: Sr. Associate
Years of Experience: 4 to 8 years
Educational Qualifications: BE / B.Tech / MCA / M.Sc / M.E / M.Tech
Key Skills: Salesforce, Lightning, LWC

Job Description
4+ years of total IT experience.
4+ years of SFDC experience.
Extensive experience on the Force.com platform using Apex and Visualforce.
Solid implementation experience using Sales / Service / Custom Cloud.
Experience in working with HTML, CSS, Ajax, JavaScript, and jQuery.
Must have Field Service Lightning tool configuration experience.
Must have Salesforce Field Service Lightning technical/functional skills.
Must have hands-on customization experience: Apex, Visualforce, Workflow/Process Builder, triggers, Batch and Schedulable Apex, Visualforce components, test classes, web services/Apex/REST, etc.

Additional Desired Skills
Good working knowledge of object-oriented programming languages like Java, Ruby, and C++.
Experience in working with Bootstrap and Angular JS.
Experience in working with Lightning and design components.
Experience in marketing tools like Marketing Cloud, ExactTarget, Eloqua.
Experience in products like Apttus, Veeva, nCino, Adobe Flex.
Able to handle data management, inclusive of data load, data translation, data hygiene, data migration, and integration.
Proven ability to look at technical processes from a strategic standpoint and understand the inter-relationships.
Recommend to team members or customers the appropriate and optimal use/configuration of a custom-built solution.
Exemplary enthusiast for code honesty, code modularity, code cleanliness, and version control.
Familiarity building custom solutions on SAP, Oracle, MS SQL Server, or other RDBMSs.
Understanding of integration platforms such as, but not limited to, Cast Iron, Boomi, Informatica, Tibco, and Fusion.
Able to translate customer requirements and gap/fit analysis into a comprehensible functional configuration of Salesforce.com.
Proven track record of writing, interpreting, and managing deliverables of a consulting engagement.
Must be able to think independently and creatively.
Aptitude for taking on technical challenges.
Awareness of the changing cloud ecosystem and ability to adjust to new technologies, methods, and apps.
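Since the listing calls out web services and REST alongside Apex, here is a hedged sketch of querying Salesforce over its REST API from Python. The instance URL, API version, and SOQL query are illustrative assumptions, and the bearer token would come from an OAuth flow in practice.

```python
# Hedged sketch of a Salesforce REST API query; the org URL, token,
# API version, and SOQL below are illustrative assumptions.
import requests

INSTANCE = "https://example.my.salesforce.com"  # hypothetical org
TOKEN = "<ACCESS_TOKEN>"  # obtained via an OAuth flow in practice

resp = requests.get(
    f"{INSTANCE}/services/data/v59.0/query",
    params={"q": "SELECT Id, Name FROM Account LIMIT 5"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Each record is a dict keyed by the selected SOQL fields.
for record in resp.json()["records"]:
    print(record["Id"], record["Name"])
```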

Posted 22 hours ago

Apply

3.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Summary

Position Summary
Strategy & Analytics: AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions.
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements.

Snowflake Consultant
The position is suited for individuals who have a demonstrated ability to work effectively in a fast-paced, high-volume, deadline-driven environment.

Education and Experience
Education: B.Tech/M.Tech/MCA/MS/MBA
3-6 years of experience in the design and implementation of database migration and integration solutions for any data warehousing project.

Required Skills
Good knowledge of DBMS concepts, SQL, and PL/SQL.
Good knowledge of the Snowflake system hierarchy.
Good knowledge of Snowflake schemas/tables/views/stages, etc.
Strong problem-solving and analytical capabilities.
Hands-on experience in the following: data validation, writing custom SQL code, and managing the Snowflake account, users, roles, and privileges.
Experience in integrating an ETL tool like DataStage or Informatica with Snowflake.
Experience in integrating a BI tool like Tableau or Power BI with Snowflake.
Experience in fine-tuning and troubleshooting performance issues.
Well versed with design documents like HLD, LLD, etc.
Well versed with data migration and integration concepts.
Self-starter in solution implementation with inputs from design documents.
Should have participated in different kinds of testing, like unit testing, system testing, and user acceptance testing.

Preferred Skills
Exposure to data modelling concepts is desirable.
Exposure to advanced Snowflake features like data sharing, cloning, and export/import is desirable.
Participation in client interactions/meetings is desirable.
Participation in code tuning is desirable.

Recruiting tips
From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively.
It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300082
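As a sketch of the account, user, role, and privilege administration named in this listing's required skills, the snippet below uses the snowflake-connector-python package to create and grant a read-only role. All object names are illustrative, and credentials would normally come from a secrets manager rather than literals.

```python
# Hedged sketch of Snowflake role/privilege administration; the account,
# warehouse, database, and user names below are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<ACCOUNT_LOCATOR>",
    user="<ADMIN_USER>",
    password="<PASSWORD>",
    role="SECURITYADMIN",  # role with privileges to manage roles/grants
)
cur = conn.cursor()
try:
    # Create a read-only analyst role and grant it the needed privileges.
    cur.execute("CREATE ROLE IF NOT EXISTS ANALYST_RO")
    cur.execute("GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST_RO")
    cur.execute("GRANT USAGE ON DATABASE SALES TO ROLE ANALYST_RO")
    cur.execute("GRANT USAGE ON SCHEMA SALES.PUBLIC TO ROLE ANALYST_RO")
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA SALES.PUBLIC TO ROLE ANALYST_RO")
    cur.execute("GRANT ROLE ANALYST_RO TO USER JDOE")
finally:
    cur.close()
    conn.close()
```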

Posted 23 hours ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies