
439 ICC Jobs - Page 18

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Kochi, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description

About the role: As a Senior Data Engineer, you will be responsible for building and supporting large-scale data architectures that provide information to downstream systems and business users. We are seeking an innovative and experienced individual who can aggregate and organize data from multiple sources to streamline business decision-making. You will collaborate closely with Data Engineer Leads and partners to establish and maintain data platforms that support front-end analytics, and your contributions will inform Takeda’s dashboards and reporting, providing insights to stakeholders throughout the business.

You will be part of the Digital Insights and Analytics team, which drives business insights through IT data analytics techniques such as pattern recognition, AI/ML, and data modelling to analyse and interpret the organization’s data and draw conclusions about information and trends. The role works closely with the Tech Delivery Lead and a Junior Data Engineer, both located in India, aligns to the Data & Analytics chapter of the ICC, and sits within the PDT Business Intelligence pod, reporting to the Data Engineering Lead.

How you will contribute:
- Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations using AWS/Azure native technologies to support continuing increases in data source, volume, and complexity.
- Define data requirements, gather and mine data, and validate the efficiency of data tools in the big data environment.
- Lead the evaluation, implementation, and deployment of emerging tools and processes to improve productivity.
- Implement processes and systems that provide accurate, available data to key stakeholders, downstream systems, and business processes.
- Partner with Business Analysts and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives.
- Coordinate with Data Scientists to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modelling.
- Mentor and coach junior Data Engineers on data standards and practices, promoting the values of learning and growth.
- Foster a culture of sharing, re-use, design for scale and stability, and operational efficiency of data and analytical solutions.

Minimum Requirements/Qualifications:
- Bachelor's degree in Engineering, Computer Science, Data Science, or a related field.
- 5-9 years of experience in software development, data science, data engineering, ETL, and analytics reporting development.
- Experience building and maintaining data and system integrations using dimensional data modelling and optimized ETL pipelines.
- Experience designing and developing ETL pipelines using tools such as IICS, DataStage, Ab Initio, or Talend.
- Proven track record of designing and implementing complex data solutions.
- Demonstrated understanding and experience using:
  - Data engineering programming languages (e.g., Python, SQL)
  - Distributed data frameworks (e.g., Spark)
  - Cloud platform services (AWS/Azure preferred)
  - Relational databases
  - DevOps and continuous integration
  - AWS services such as Lambda, DMS, Step Functions, S3, EventBridge, CloudWatch, Aurora RDS, or related AWS ETL services
  - Azure services such as ADF and ADLS
  - Data lakes and data warehouses
  - Databricks/Delta Lakehouse architecture
  - Code management platforms such as GitHub or GitLab
  - Database architecture, data modelling concepts, and administration
- Hands-on experience with Spark Structured Streaming for building real-time ETL pipelines.
- Uses the principles of continuous integration and delivery to automate the deployment of code changes to higher environments, improving code quality, test coverage, and automation of resilient test cases.
- Proficient in programming languages (e.g., SQL, Python, PySpark) to design, develop, maintain, and optimize data architectures/pipelines that fit business goals.
- Strong organizational skills, with the ability to work on multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions.
- Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners.
- Strong problem-solving and troubleshooting skills.
- Ability to work in a fast-paced environment and adapt to changing business priorities.

Preferred requirements:
- Master's degree in Engineering, specialized in Computer Science, Data Science, or a related field.
- Knowledge of CDK.
- Experience with the IICS data integration tool.
- Experience with job orchestration tools such as Tidal, Airflow, or similar.
- Knowledge of NoSQL databases.
- Proficiency in leveraging the Databricks Unity Catalog for effective data governance and implementing robust access control mechanisms is highly advantageous.
- Databricks Certified Data Engineer Associate.
- AWS/Azure Certified Data Engineer.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
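For candidates unfamiliar with the "dimensional data modelling and optimized ETL pipelines" requirement above, here is a minimal, hypothetical sketch of the idea: raw records are transformed into a toy star schema (one dimension table, one fact table). It uses only Python's standard library and invented table and column names; a production pipeline would target a warehouse or lakehouse rather than in-memory SQLite.

```python
# Illustrative ETL into a toy star schema (all names are invented for this sketch).
import sqlite3

raw_records = [  # stand-in for data extracted from a source system
    {"order_id": 1, "product": "Widget", "category": "Hardware", "amount": 120.0},
    {"order_id": 2, "product": "Gasket", "category": "Hardware", "amount": 35.5},
    {"order_id": 3, "product": "Manual", "category": "Docs", "amount": 12.0},
]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        product TEXT UNIQUE,
        category TEXT
    );
    CREATE TABLE fact_sales (
        order_id INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        amount REAL
    );
""")

for rec in raw_records:
    # Transform: resolve (or create) the surrogate key in the product dimension.
    conn.execute(
        "INSERT OR IGNORE INTO dim_product (product, category) VALUES (?, ?)",
        (rec["product"], rec["category"]),
    )
    (product_key,) = conn.execute(
        "SELECT product_key FROM dim_product WHERE product = ?", (rec["product"],)
    ).fetchone()
    # Load: write the fact row keyed by the dimension's surrogate key.
    conn.execute(
        "INSERT INTO fact_sales (order_id, product_key, amount) VALUES (?, ?, ?)",
        (rec["order_id"], product_key, rec["amount"]),
    )

# Downstream analytics query over the star schema.
total_by_category = dict(conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category
""").fetchall())
print(total_by_category)
```

The surrogate-key lookup is the step that dimensional modelling adds over a flat load: facts reference stable keys rather than repeating descriptive attributes.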
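The "Spark Structured Streaming for real-time ETL" requirement refers to Spark's micro-batch model: events arrive in small batches and a running aggregate state is updated incrementally instead of being recomputed over all history. The following is a plain-Python simulation of that pattern only, with invented names throughout; an actual pipeline would use PySpark's readStream/writeStream APIs, not this code.

```python
# Plain-Python sketch of the micro-batch pattern behind Structured Streaming.
from collections import defaultdict

def process_micro_batch(state, batch):
    """Update running per-key totals with one micro-batch of events."""
    for event in batch:
        state[event["key"]] += event["value"]
    return state

# Simulated stream: three micro-batches, as a streaming trigger would deliver them.
micro_batches = [
    [{"key": "clicks", "value": 3}, {"key": "views", "value": 10}],
    [{"key": "clicks", "value": 2}],
    [{"key": "views", "value": 5}, {"key": "clicks", "value": 1}],
]

state = defaultdict(int)
for batch in micro_batches:
    state = process_micro_batch(state, batch)  # incremental, stateful update

print(dict(state))  # {'clicks': 6, 'views': 15}
```

The key property, which Spark manages for you with checkpointed state, is that each batch touches only new data while the aggregate stays continuously up to date.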

Posted 2 months ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role description identical to the Senior Data Engineer listing above.

Posted 2 months ago

Apply

0 years

0 Lacs

Kanayannur, Kerala, India

On-site

Role description identical to the Senior Data Engineer listing above.

Posted 2 months ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Role description identical to the Senior Data Engineer listing above.

Posted 2 months ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Role description identical to the Senior Data Engineer listing above.

Posted 2 months ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description About the role: As a Senior Data Engineer, you will be responsible for building and supporting large-scale data architectures that provide information to downstream systems and business users. We are seeking an innovative and experienced individual who can aggregate and organize data from multiple sources to streamline business decision-making. In your role, you will collaborate closely with Data Engineer Leads and partners to establish and maintain data platforms that support front-end analytics. Your contributions will inform Takeda’s dashboards and reporting, providing insights to stakeholders throughout the business. In this role, you will be a part of the Digital Insights and Analytics team. This team drives business insights through IT data analytics techniques such as pattern recognition, AI/ML, and data modelling, to analyse and interpret the organization’s data with the purpose of drawing conclusions about information and trends. This role will work closely with the Tech Delivery Lead and Data Engineer Junior, both located in India. This role will align to the Data & Analytics chapter of the ICC. This position will be part of PDT Business Intelligence pod and will report to Data Engineering Lead. How you will contribute: Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations, using AWS/Azure native technologies, to support continuing increases in data source, volume, and complexity. Define data requirements, gather, and mine data, while validating the efficiency of data tools in the Big Data Environment. 
Lead the evaluation, implementation and deployment of emerging tools and processes to improve productivity. Implement processes and systems to provide accurate and available data to key stakeholders, downstream systems, and business processes. Partner with Business Analysts and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives. Coordinate with Data Scientists to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling. Mentor and coach junior Data Engineers on data standards and practices, promoting the values of learning and growth. Foster a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions.

Minimum Requirements/Qualifications:
Bachelor's degree in Engineering, Computer Science, Data Science, or a related field
5-9 years of experience in software development, data science, data engineering, ETL, and analytics reporting development
Experience in building and maintaining data and system integrations using dimensional data modelling and optimized ETL pipelines
Experience in designing and developing ETL pipelines using tools like IICS, DataStage, Ab Initio, Talend, etc.
Proven track record of designing and implementing complex data solutions
Demonstrated understanding and experience using:
Data engineering programming languages (e.g., Python, SQL)
Distributed data frameworks (e.g., Spark)
Cloud platform services (AWS/Azure preferred)
Relational databases
DevOps and continuous integration
AWS services such as Lambda, DMS, Step Functions, S3, EventBridge, CloudWatch, Aurora RDS, or related AWS ETL services
Azure services such as ADF, ADLS, etc.
Data lakes and data warehouses
Databricks/Delta Lakehouse architecture
Code management platforms like GitHub, GitLab, etc.
Understanding of database architecture, data modelling concepts, and administration
Hands-on experience with Spark Structured Streaming for building real-time ETL pipelines. Utilizes the principles of continuous integration and delivery to automate the deployment of code changes to elevated environments, fostering enhanced code quality, test coverage, and automation of resilient test cases. Proficient in programming languages (e.g., SQL, Python, PySpark) to design, develop, maintain, and optimize data architecture/pipelines that fit business goals. Strong organizational skills with the ability to work on multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions. Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners. Strong problem-solving and troubleshooting skills. Ability to work in a fast-paced environment and adapt to changing business priorities.

Preferred requirements:
Master's degree in Engineering specialized in Computer Science, Data Science, or a related field
Demonstrated understanding and experience using:
CDK
IICS data integration tooling
Job orchestration tools like Tidal, Airflow, or similar
NoSQL databases
Proficiency in leveraging the Databricks Unity Catalog for effective data governance and implementing robust access control mechanisms is highly advantageous
Databricks Certified Data Engineer Associate
AWS/Azure Certified Data Engineer

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
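The Senior Data Engineer posting above calls for Spark Structured Streaming experience for real-time ETL. Purely as an illustration (not part of the posting), the core of such a pipeline is often a tumbling-window aggregation; the plain-Python sketch below shows that concept without a Spark dependency, using hypothetical event data. In PySpark the same logic would be expressed declaratively with `groupBy(window(...), key).count()`.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per key -- the aggregation a Structured
    Streaming tumbling-window query computes incrementally."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds  # floor to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical click events: (epoch seconds, product id)
events = [(0, "a"), (30, "a"), (59, "b"), (60, "a"), (125, "b")]
print(tumbling_window_counts(events))
# {(0, 'a'): 2, (0, 'b'): 1, (60, 'a'): 1, (120, 'b'): 1}
```

A streaming engine maintains this state across micro-batches and handles late data via watermarks; the batch version above shows only the windowing arithmetic itself.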

Posted 2 months ago

Apply

0 - 3 years

0 Lacs

Banjara Hills, Hyderabad, Telangana

Work from Office

Job Title: Infection Control Nurse (ICN)

Job Summary: The Infection Control Nurse is responsible for the surveillance, prevention, and control of infection within the healthcare facility. The role involves monitoring infection data, educating healthcare staff, implementing infection control protocols, and ensuring compliance with national and international standards and regulations.

Key Responsibilities:
1. Infection Surveillance and Reporting: Monitor and collect data on healthcare-associated infections (HAIs). Conduct regular rounds in all hospital departments to identify potential infection risks. Maintain infection surveillance systems and records. Report infections to internal leadership and, where required, to public health authorities.
2. Policy Development and Implementation: Develop, implement, and review infection control policies and procedures. Ensure policies align with CDC, WHO, and local public health guidelines. Assist in developing protocols for sterilization, disinfection, isolation, and waste management.
3. Staff Training and Education: Conduct regular training and orientation programs for healthcare staff on infection prevention and control. Provide education on proper hand hygiene, PPE usage, and safe clinical practices. Organize drills and workshops related to outbreak preparedness and containment.
4. Outbreak Investigation and Management: Investigate infection outbreaks or clusters within the facility. Work with the infection control committee and hospital administration to implement corrective actions. Prepare detailed reports and conduct root cause analysis.
5. Compliance and Auditing: Perform infection control audits and inspections of all departments. Monitor and ensure adherence to hand hygiene and PPE protocols. Maintain documentation for accreditation and regulatory inspections (e.g., NABH, JCI).
6. Collaboration: Coordinate with the Infection Control Committee (ICC), microbiology department, and other clinical staff.
Work with housekeeping, CSSD (Central Sterile Services Department), and maintenance teams to ensure infection control in all environments. Participate in quality improvement initiatives related to infection control.

Qualifications:
Education: BSc Nursing or GNM; certification in Infection Control is preferred.
Experience: 2–3 years in clinical nursing; 1+ year in infection control or quality assurance (preferred).
Skills: Knowledge of infection prevention practices and epidemiology. Strong observational, analytical, and communication skills. Ability to train and influence staff behavior.
Workplace: Hospitals
Reporting to: Infection Control Officer / Quality Head / Nursing Superintendent
Job Types: Full-time, Permanent, Fresher
Pay: From ₹25,000.00 per month
Benefits: Health insurance, Life insurance, Paid sick time, Provident Fund
Schedule: Day shift, Fixed shift, Morning shift
Supplemental Pay: Performance bonus, Yearly bonus
Work Location: In person

Posted 2 months ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka

Work from Office

TFL – Trade Associate
Job ID: R0381626 Full/Part-Time: Full-time Regular/Temporary: Regular Listed: 2025-05-13 Location: Bangalore

Position Overview
Job Title: TFL – Trade Associate Location: Bangalore, India

Role Description: It’s a popular perception that ‘if you have experience in Trade Finance Operations, you are never out of a job’. We handle multiple products like Letters of Credit, Collections, Bank Guarantees, etc. Depending on your appetite to learn, you will get enough opportunities to learn multiple products/processes. The learning never ends in Trade Finance Operations. Our subject matter experts will ensure that you get the necessary training on the products and processes.

What we’ll offer you: As part of our flexible scheme, here are just some of the benefits that you’ll enjoy: best-in-class leave policy; gender-neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complimentary health screening for those 35 yrs. and above.

Your key responsibilities: Day-to-day management of process service delivery. Serve as internal/external escalation point. Serve as technical operational expert for queries from clients/team members, etc. Monitor and control workflows. Spearhead various process improvement initiatives. Ensure closure of tasks within prescribed SLAs. Identify operational risks proactively and mitigate them appropriately.
Offer guidance with individual and team challenges. Multitask effectively. Delegate effectively to direct reports. Ensure structured upward and downward communication. Assist with the recruitment and training process. Supervise and manage development of team members. Drive projects and efficiency initiatives. Ensure adequate back-ups are created for all critical positions and assist other teams during contingencies (staff shortage, high volumes, etc.). Work very closely with the process owners/stakeholders and other internal clients for overall growth of the bank’s business.

Your skills and experience: Adequate understanding of trade-related rules and guidelines as issued by the ICC (UCP/URC, etc.). Good understanding of legal, credit and operational risks in the handling of trade products/services. Good communication skills (oral and written). Flexibility to work late-night shifts. CDCS certification is an added advantage.

How we’ll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka

Work from Office

TFL – Trade NCT
Job ID: R0381623 Full/Part-Time: Full-time Regular/Temporary: Regular Listed: 2025-05-13 Location: Bangalore

Position Overview
Job Title: TFL – Trade NCT Location: Bangalore, India

Role Description: It’s a popular perception that ‘if you have experience in Trade Finance Operations, you are never out of a job’. We handle multiple products like Letters of Credit, Collections, Bank Guarantees, etc. Depending on your appetite to learn, you will get enough opportunities to learn multiple products/processes. The learning never ends in Trade Finance Operations. Our subject matter experts will ensure that you get the necessary training on the products and processes.

What we’ll offer you: As part of our flexible scheme, here are just some of the benefits that you’ll enjoy: best-in-class leave policy; gender-neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complimentary health screening for those 35 yrs. and above.

Your key responsibilities: Handle the day-to-day processing of Collections, Letters of Credit and Bank Guarantees as part of the trade operations team in the Delivery Hub, to meet agreed customer service level agreements and review outstanding transactions. Manage and ensure compliance (KOP, Ops manual, etc.) with internal policies and audit and regulatory requirements. Support and achieve excellent partnership with branch operations and respective sales staff.

Your skills and experience: Adequate understanding of trade-related rules and guidelines as issued by the ICC (UCP/URC, etc.). Good understanding of legal, credit and operational risks in the handling of trade products/services. Good communication skills (oral and written). Flexibility to work late-night shifts.
How we’ll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka

Work from Office

TFL – Trade Associate
Job ID: R0381628 Full/Part-Time: Full-time Regular/Temporary: Regular Listed: 2025-05-13 Location: Bangalore

Position Overview
Job Title: TFL – Trade Associate Location: Bangalore, India

Role Description: It’s a popular perception that ‘if you have experience in Trade Finance Operations, you are never out of a job’. We handle multiple products like Letters of Credit, Collections, Bank Guarantees, etc. Depending on your appetite to learn, you will get enough opportunities to learn multiple products/processes. The learning never ends in Trade Finance Operations. Our subject matter experts will ensure that you get the necessary training on the products and processes.

What we’ll offer you: As part of our flexible scheme, here are just some of the benefits that you’ll enjoy: best-in-class leave policy; gender-neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complimentary health screening for those 35 yrs. and above.

Your key responsibilities: Day-to-day management of process service delivery. Serve as internal/external escalation point. Serve as technical operational expert for queries from clients/team members, etc. Monitor and control workflows. Spearhead various process improvement initiatives. Ensure closure of tasks within prescribed SLAs. Identify operational risks proactively and mitigate them appropriately.
Offer guidance with individual and team challenges. Multitask effectively. Delegate effectively to direct reports. Ensure structured upward and downward communication. Assist with the recruitment and training process. Supervise and manage development of team members. Drive projects and efficiency initiatives. Ensure adequate back-ups are created for all critical positions and assist other teams during contingencies (staff shortage, high volumes, etc.). Work very closely with the process owners/stakeholders and other internal clients for overall growth of the bank’s business.

Your skills and experience: Adequate understanding of trade-related rules and guidelines as issued by the ICC (UCP/URC, etc.). Good understanding of legal, credit and operational risks in the handling of trade products/services. Good communication skills (oral and written). Flexibility to work late-night shifts. CDCS certification is an added advantage.

How we’ll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

3 - 8 years

25 - 30 Lacs

Pune

Work from Office

Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both individual end users and enterprise customers. Our employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $5.00 billion in fiscal 2024. For more information, visit www.amdocs.com

In one sentence: Immerse yourself in the design, development, modification, debugging and maintenance of our clients' software systems! Engage with specific modules, applications or technologies, and look after sophisticated assignments during the software development process.

What will your job look like?
Proficiency in shell scripting: capable of configuring application product builds using C/C++ and COBOL through advanced shell scripting techniques.
Expertise in ICC Compiler: skilled in utilizing the ICC compiler to compile and configure application files for product builds.
Installation and configuration of build tools: experienced in installing and configuring third-party software such as Ant and Make for building application products.
C/C++ build knowledge: proficient in building C/C++ files to generate objects, libraries, and binaries.
Tivoli Workload Scheduler (TWS): expertise and experience in GitHub repository management.
System integration: experience in integrating C/C++ and vCOBOL applications with other systems, software and technologies (e.g., CORBA, WebSphere MQ).
Installation and configuration of third-party software like Ant/Maven, Make, WebSphere MQ, ICC Compiler, vCOBOL and Tivoli Workload Scheduler.
Advanced debugging skills: excellent debugging skills for identifying and resolving issues in code.
Support software configuration management and third-party software integration.

Secondary skills: Good experience with middleware technologies (WebLogic, WildFly, Apache Tomcat, IBM WebSphere MQ). Experience in WildFly and WebLogic administration, which includes setting up clustered WildFly and WebLogic application servers, JDBC/database/driver configuration, Node Manager, transactions, JVM memory tuning, deployments and SSL configuration. WebSphere MQ expertise: proficient in installing WebSphere MQ and configuring queue managers, listeners, and channels. Manage and maintain various environments including production, training, and load testing. Monitor systems, analyze issues, and provide performance measurement. Assist in error resolution across all environments and support during incident resolution.

All you need is...
Bachelor's degree in Science/IT/Computing or equivalent.
Proficiency in shell scripting, Jenkins and automation.
Good knowledge of Unix, Linux and Solaris platforms.
Good troubleshooting and communication skills.
Knowledge of batch job schedulers.
Added advantage: DevOps tools like Jenkins (CI/CD), Ansible, Maven, Git, Packer; automation experience; Azure.
Candidates should have good communication skills and be able to provide 24x7 support on a rotational basis.

Why you will love this job: You will be responsible for the integration between a major product infrastructure system and the Amdocs infrastructure system, driving automation and helping teams work smarter and faster. Be a key member of an international, highly skilled and encouraging team with various possibilities for personal and professional development! You will have the opportunity to work in a multinational environment for the global market leader in its field. We are a dynamic, multi-cultural organization that constantly innovates and empowers our employees to grow. Our people are passionate, daring, and phenomenal teammates who stand by each other with a dedication to creating a diverse, inclusive workplace!
We offer a wide range of stellar benefits including health, dental, vision, and life insurance as well as paid time off, sick time, and parental leave!

Posted 2 months ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

Company Qualcomm India Private Limited Job Area Engineering Group, Engineering Group > Hardware Engineering General Summary As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Hardware Engineer, you will plan, design, optimize, verify, and test electronic systems, bring-up yield, circuits, mechanical systems, Digital/Analog/RF/optical systems, equipment and packaging, test systems, FPGA, and/or DSP systems that launch cutting-edge, world class products. Qualcomm Hardware Engineers collaborate with cross-functional teams to develop solutions and meet performance requirements. Minimum Qualifications Bachelor's degree in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 3+ years of Hardware Engineering or related work experience. OR Master's degree in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 2+ years of Hardware Engineering or related work experience. OR PhD in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 1+ year of Hardware Engineering or related work experience. 
5+ years of experience in static timing analysis, constraints and other physical implementation aspects. Solid understanding of industry-standard tools such as PrimeTime (PT), Tempus, Genus, Innovus, ICC, etc. Solid grip on STA fixing aspects to solve extremely critical timing bottleneck paths. Should have experience preparing complex ECOs for timing convergence (across a huge set of corners) through Tweaker / Tempus / Physical PT ECOs. Should be aware of the tricks for minimizing power. Experience in deep submicron process technology nodes is strongly preferred. Knowledge of high-performance and low-power implementation methods is preferred. Willing to push PPA to the best possible extent. Strong fundamentals. Expertise in Perl and Tcl.

Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries). Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law. To all Staffing and Recruiting Agencies: Our Careers Site is only for individuals seeking a job at Qualcomm.
Staffing and recruiting agencies and individuals being represented by an agency are not authorized to use this site or to submit profiles, applications or resumes, and any such submissions will be considered unsolicited. Qualcomm does not accept unsolicited resumes or applications from agencies. Please do not forward resumes to our jobs alias, Qualcomm employees or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers. 3074893

Posted 2 months ago

Apply

3 years

0 Lacs

Gurugram, Haryana, India

On-site

Company Description: WNS Global Services Inc. (NYSE: WNS) is a global Business Process Management (BPM) leader. WNS offers business value to 400+ global clients by combining operational excellence with deep domain expertise in key industry verticals, including Banking and Financial Services, Consulting and Professional Services, Healthcare, Insurance, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Telecommunications, Travel and Utilities. Globally, the group's over 44,000 professionals serve across 60 delivery centers in 16 countries worldwide. Our mission as an organization is guided by our CIRCLE of Values: Client First, Integrity, Respect, Collaboration, Learning, Excellence.

Job Description: Transaction monitoring on all payment requests. Monitoring payments to sanctioned countries. Generating suspicious transaction reports and suspicious activity reports. Assist banking customers who are victims of fraud, theft or identity theft. Complete understanding of TBML (Trade-Based Money Laundering). Ability to comply with EU/OFAC and UN sanctions. Well versed with the ICC rules (UCP 600, ISBP, URC, URR, ISP 98, Incoterms and URDG). Thorough knowledge of SWIFT messages in the MT4xx and MT7xx series, along with MT103 and MT202.

Qualifications: Graduate. Minimum 2 years of experience in AML (transaction monitoring), or minimum 3 years of experience in Trade Finance with knowledge of UCP 600, MT103, MT203, issuance of LCs, and guarantees.
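The listing above assumes working knowledge of SWIFT MT messages such as MT103. Purely as an illustrative sketch (not part of the posting): the text block of an MT message is a sequence of `:tag:value` fields, some spanning multiple lines. A minimal Python parser for that field syntax might look like this; the sample message fragment and its values are fabricated.

```python
import re

def parse_mt_fields(text_block):
    """Split a SWIFT MT text block into (tag, value) pairs.
    Fields start with ':tag:' at the beginning of a line; lines without
    a tag are continuations of the preceding field's value."""
    fields = []
    for line in text_block.strip().splitlines():
        m = re.match(r":(\w+):(.*)", line)
        if m:
            fields.append([m.group(1), m.group(2)])
        elif fields:
            fields[-1][1] += "\n" + line  # continuation line
    return [(tag, value) for tag, value in fields]

# Fabricated MT103-style fragment (reference, amount, parties are made up)
sample = """:20:REF0001
:32A:250101EUR1000,00
:50K:ORDERING CUSTOMER
SOME STREET 1
:59:/DE00000000000000000000
BENEFICIARY NAME"""
for tag, value in parse_mt_fields(sample):
    print(tag, "->", value.splitlines()[0])
```

Real MT handling is far stricter (per-field format rules, block 1-5 envelopes, character-set checks); this only shows the tag/value structure a transaction monitoring analyst reads daily.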

Posted 2 months ago

Apply

0 years

0 Lacs

India

Remote

🚨 Join us as a Women Safety Educator 🚨
💡 Be the Voice. Lead the Change. Empower Women at Workplaces!

About the Concept of Living Charitable Trust
Concept of Living Charitable Trust is one of India's leading NGOs, committed to spreading awareness and ensuring the safety and security of women at workplaces. We strongly believe in the power of education and legal awareness to build safer, harassment-free environments across corporate and government sectors. To strengthen this mission, we are looking for passionate and knowledgeable Women Safety Educators who can train, guide, and support organizations in implementing the POSH Act and workplace safety policies. This is a freelance role with attractive remuneration and the flexibility to work at your convenience while making a real difference in society!

🌟 Your Role as a Women's Safety Educator 🌟
🔹 Conduct free-of-cost training sessions on the POSH Act (Prevention of Sexual Harassment at Workplace) and workplace safety.
🔹 Offer free awareness training to government and non-government organizations to ensure compliance.
🔹 Educate organizations, HR teams, and employees on their legal responsibilities and best practices for women's safety.
🔹 Guide employers on POSH compliance and help them establish an Internal Complaints Committee (ICC).
🔹 Maintain professional relationships with government officials and organizations to strengthen the network for women's safety.
🔹 Act as a change leader in driving safer workplace cultures across industries.

✅ Why Should You Apply?
🌍 Make a real social impact: be part of a national movement for women's safety.
💰 Handsome remuneration for each training session: earn well while creating change.
⏳ Freelance and flexible schedule: train at your convenience.
🏢 Work with top government and corporate organizations: gain valuable exposure.
📈 Opportunity to grow your professional network in legal, HR, and corporate sectors.
🎯 Be recognized as an expert in workplace safety and compliance: enhance your credentials.
🎯 Who Should Apply?
We are looking for individuals with:
✔️ A passion for empowering women and ensuring safer workplaces.
✔️ Excellent communication and training skills to engage diverse audiences.
✔️ The ability to guide organizations and HR professionals on legal responsibilities.

If you are passionate about women's safety, workplace security, and legal compliance, we invite you to become a Women's Safety Educator with us!
💌 Drop your resume & application now!

📢 Tag & Share! Let's spread the word and work together to create a safer, harassment-free workplace culture across India!
#WomenSafety #POSHAct #CareerOpportunity #NGOJobs #LegalAwareness #SocialImpact #WorkplaceSafety #Compliance #FreelanceOpportunity #MakeAnImpact

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies