
1656 ADF Jobs - Page 24

Set up a Job Alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0 years

0 Lacs

Yamunanagar, Haryana, India

On-site

We’re Hiring: Sr. Data Analytics & Quality Engineer | Hyderabad (Work from Office) 🚨 Are you an experienced Data Analytics & Quality Engineer with a passion for delivering high-quality, validated data in a fast-paced environment? We’re looking for a strong professional to join our dynamic team in Hyderabad!
💡 Role: Sr. Data Analytics and Quality Engineer
📍 Location: Hyderabad (Work from Office)
🔧 Must-Have Skills:
• Programming: SQL (T-SQL or PL/SQL)
• Experience: QA processes, ETL testing, data validation, data quality, RCM knowledge, Power BI, DAX
• Testing Tools (Nice to Have): Great Expectations, Deequ, dbt (with tests), Pytest
• Domain Expertise: US Healthcare (preferred)
• Tools: SSMS, TOAD, SSIS, ADF, BI tools (Power BI, Tableau), Snowflake (nice to have)
• Processes: Agile methodology, CI/CD integration, test strategy & planning, data reconciliation
• Bonus: Python knowledge, mentoring experience
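For context on the ETL-testing and data-reconciliation work this role describes, here is a minimal Pytest sketch of a source-to-target check; the claim table and figures are hypothetical stand-ins, not part of the posting:

```python
import pandas as pd
import pytest


@pytest.fixture
def source_and_target():
    # Hypothetical extracts standing in for a staging query and a warehouse query.
    source = pd.DataFrame({"claim_id": [1, 2, 3], "amount": [100.0, 250.5, 75.0]})
    target = pd.DataFrame({"claim_id": [1, 2, 3], "amount": [100.0, 250.5, 75.0]})
    return source, target


def test_row_counts_match(source_and_target):
    source, target = source_and_target
    assert len(source) == len(target), "ETL dropped or duplicated rows"


def test_amounts_reconcile(source_and_target):
    source, target = source_and_target
    # Totals should agree to the cent after the load.
    assert source["amount"].sum() == pytest.approx(target["amount"].sum())


def test_no_null_keys(source_and_target):
    _, target = source_and_target
    assert target["claim_id"].notna().all(), "NULL business keys in target"
```

In practice the fixture would run the actual source and target queries; the assertions stay the same.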

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

About the Role: We're hiring 2 Cloud & Data Engineering Specialists to join our fast-paced, agile team. These roles are focused on designing, developing, and scaling modern, cloud-based data engineering solutions using tools like Azure, AWS, GCP, Databricks, Kafka, PySpark, SQL, Snowflake, and ADF.
Key Responsibilities:
Develop and manage cloud-native solutions on Azure or AWS
Build real-time streaming apps with Kafka (see the sketch below)
Engineer services using Java and Python
Deploy and manage Kubernetes-based containerized applications
Process big data using Databricks
Administer SQL Server and Snowflake databases, write advanced SQL
Utilize Unix/Linux for system operations
Must-Have Skills:
Azure or AWS cloud experience
Kafka, Java, Python, Kubernetes
Databricks, SQL Server, Snowflake
Unix/Linux commands
Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
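As an illustration of the Kafka streaming work listed above, here is a minimal PySpark Structured Streaming sketch that consumes a Kafka topic and appends to a Delta table; the broker, topic, schema, and paths are placeholders, and it assumes a cluster with the Kafka and Delta connectors available (e.g., Databricks):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Hypothetical event schema for the illustration.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders")                     # placeholder topic
    .load()
)

# Kafka delivers bytes; decode the value column and parse the JSON payload.
events = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")

(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/orders")  # placeholder path
    .outputMode("append")
    .start("/tmp/tables/orders")                              # placeholder path
    .awaitTermination()
)
```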

Posted 1 month ago

Apply

0 years

4 - 12 Lacs

Thanjāvūr

On-site

Company Description
AGF Tecnik, a division of AGF Group Inc., was established in India in the spring of 2018. Our team excels in technical drawing (reinforcement steel drawing) and provides estimation and detailing services for diverse construction projects. AGF Group specializes in reinforcing steel, post-tensioning, and scaffolding & access solutions, with over 40 business units in multiple countries including Canada, the United States, and the UAE. AGF Tecnik's headquarters is based in Longueuil. Follow us on LinkedIn to stay updated with AGF Group’s latest news.
Role Description
An Oracle Technical Consultant designs, develops, implements, and supports Oracle applications, focusing on technical aspects such as database management, integrations, and customizations to meet business needs.
Key Responsibilities:
Technical Expertise: Proficiency in Oracle database technologies (e.g., SQL, PL/SQL). Experience with Oracle applications (e.g., Oracle Fusion, Oracle E-Business Suite). Knowledge of Oracle Cloud solutions (e.g., PaaS, IaaS). Understanding of data modeling, data administration, and software development principles.
Implementation and Support: Participate in the entire project lifecycle, from requirements gathering to deployment and post-implementation support. Develop and customize Oracle applications to meet specific business requirements. Design and implement interfaces, integrations, and workflows. Troubleshoot and resolve technical issues. Conduct testing and quality assurance.
Consulting and Collaboration: Work closely with functional consultants, business users, and other stakeholders. Provide technical guidance and expertise to clients and internal teams. Document technical solutions and processes. Stay updated with the latest Oracle technologies and best practices.
Specific Skills: Experience with Oracle Forms & Reports, XML/BI Publisher reporting tools, interfaces (outbound/inbound), and Workflow. Knowledge of Oracle Fusion modules (Finance, SCM, Manufacturing) and tools. Familiarity with APEX development processes. Experience with FRICEW components. Knowledge of OIC, ADF, ODI, Informatica, OBIEE, SOA, or Oracle Retail Cloud Infrastructure will be an added advantage.
Soft Skills: Strong communication and interpersonal skills. Problem-solving and analytical skills. Ability to work independently and as part of a team. Ability to manage multiple projects and priorities.
Job Types: Full-time, Permanent
Pay: ₹469,520.39 - ₹1,219,504.42 per year
Benefits: Health insurance, leave encashment, Provident Fund
Schedule: Day shift, Monday to Friday
Supplemental Pay: Overtime pay, performance bonus, yearly bonus
Work Location: In person
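To give a flavour of the PL/SQL integration work described, here is a hedged sketch using the python-oracledb driver; the DSN, credentials, package, and table are hypothetical:

```python
import oracledb

# Placeholder credentials/DSN; in practice these come from a vault or config.
conn = oracledb.connect(user="app_user", password="secret", dsn="dbhost/ORCLPDB1")

with conn.cursor() as cur:
    # Call a hypothetical PL/SQL procedure that posts a batch of detail lines.
    cur.callproc("detailing_pkg.post_batch", [42])

    # Read back a status flag through a plain bound SQL query.
    cur.execute(
        "SELECT status FROM batch_runs WHERE batch_id = :id",
        {"id": 42},
    )
    row = cur.fetchone()
    print("batch status:", row[0] if row else "not found")

conn.commit()
conn.close()
```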

Posted 1 month ago

Apply

5.0 years

14 Lacs

Ahmedabad

On-site

Required Experience: 5+ years
Roles and responsibilities:
● Design and implement end-to-end data solutions using Microsoft Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure Databricks, and SSIS.
● Develop complex transformation logic using SQL Server, SSIS, and ADF, and develop ETL jobs/pipelines to execute those mappings concurrently.
● Maintain and enhance existing ETL pipelines, warehouses, and reporting leveraging the traditional MS SQL stack.
● Understand REST API principles and create ADF pipelines to handle HTTP requests for APIs (see the sketch below).
● Follow best practices for developing and deploying SSIS packages, SQL jobs, and ADF pipelines.
● Implement and manage source control practices using Git within Azure DevOps to ensure code integrity and facilitate collaboration.
● Participate in the development and maintenance of CI/CD pipelines for automated testing and deployment of BI solutions.
Preferred skills, but not required:
● Understanding of the Azure environment and developing Azure Logic Apps and Azure Function Apps.
● Understanding of code deployment, Git, CI/CD, and deployment of developed ETL code (SSIS, ADF).
Job Types: Contractual / Temporary, Freelance
Contract length: 6 months
Pay: Up to ₹120,000.00 per month
Work Location: In person
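As a sketch of programmatic pipeline orchestration alongside the ADF work above, the Azure Python SDK (azure-identity and azure-mgmt-datafactory) can trigger and poll a pipeline run; the subscription, resource group, factory, pipeline name, and parameter below are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# All names below are placeholders for illustration.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-data"
FACTORY = "adf-prod"
PIPELINE = "pl_ingest_sales"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Trigger the pipeline with a runtime parameter.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY, PIPELINE, parameters={"load_date": "2024-01-01"}
)

# Poll the run status by run id.
status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY, run.run_id)
print(status.status)  # e.g. Queued / InProgress / Succeeded / Failed
```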

Posted 1 month ago

Apply

7.0 years

1 - 9 Lacs

Noida

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
Ingest data from multiple on-prem and cloud data sources using various tools and capabilities in Azure
Design and develop Azure Databricks processes using PySpark/Spark SQL
Design and develop orchestration jobs using ADF and Databricks Workflows
Analyze data engineering processes under development and act as an SME to troubleshoot performance issues and suggest improvements
Build a test framework for Databricks notebook jobs for automated testing before code deployment (see the sketch below)
Design and build POCs to validate new ideas, tools, and architectures in Azure
Continuously explore new Azure services and capabilities; assess their applicability to business needs
Prepare case studies and technical write-ups to showcase successful implementations and lessons learned
Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture
Contribute to full-lifecycle project implementations, from design and development to deployment and monitoring
Ensure solutions adhere to security, compliance, and governance standards
Monitor and optimize data pipelines and cloud resources for cost and performance efficiency
Identify solutions to non-standard requests and problems
Mentor and support existing on-prem developers in the cloud environment
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications:
Undergraduate degree or equivalent experience
7+ years of overall experience in Data & Analytics engineering
5+ years of experience working with Azure, Databricks, ADF, and Data Lake
5+ years of experience working with a data platform or product using PySpark and Spark SQL
Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions
Highly proficient in Python and SQL
Proven excellent communication skills
Key Skills: Azure Data Engineer - Azure Databricks, Azure Data Factory, Python/PySpark, Terraform
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
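One way to make Databricks notebook jobs testable, as the responsibilities above call for, is to keep transformations in plain functions and exercise them with Pytest on a local SparkSession; the member table and business rule below are hypothetical:

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window


def dedupe_latest(df: DataFrame) -> DataFrame:
    """Keep the most recent record per member_id (hypothetical business rule)."""
    w = Window.partitionBy("member_id").orderBy(F.col("updated_at").desc())
    return df.withColumn("rn", F.row_number().over(w)).filter("rn = 1").drop("rn")


def test_dedupe_latest():
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    df = spark.createDataFrame(
        [(1, "2024-01-01"), (1, "2024-02-01"), (2, "2024-01-15")],
        ["member_id", "updated_at"],
    )
    out = dedupe_latest(df).collect()
    assert len(out) == 2
    assert {r["updated_at"] for r in out} == {"2024-02-01", "2024-01-15"}
```

The notebook then just imports and calls `dedupe_latest`, so the logic is verified in CI before deployment.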

Posted 1 month ago

Apply

5.0 - 8.0 years

0 Lacs

Bihar

On-site

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
Mandatory Qualifications:
Oracle ADF: Proven expertise in developing applications using ADF, ADF Faces, and ADF Task Flows.
Java/J2EE: Strong foundation in Java programming and J2EE technologies for building enterprise-level applications.
Web Technologies: Proficiency in JavaScript, HTML5, and CSS for front-end development.
Web Services: Experience with building and consuming web services.
MVC Architecture: Understanding of and experience with implementing Model-View-Controller (MVC) patterns using ADF.
JDeveloper: Proficiency in using JDeveloper for development and debugging.
Application Server: Experience with the WebLogic application server.
Experience with systems support.
Degree in Information Technology.
Wipro is an egalitarian company that offers employment opportunities to all people, running a selection process that does not consider race, gender, nationality, ancestry, disability, sexual orientation or any other status protected by applicable law.
Mandatory Skills: Oracle Public Sector Revenue Management. Experience: 5-8 Years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

9.0 - 14.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Hi all, please find the JD below.
Position: Oracle Fusion Technical
Location: Hyderabad
Experience: 8 to 15 years
Employment Type: Full-time / C2H
Education: B.Tech / M.Tech / MCA / B.Sc / BCA
Job Description: Technical Consultant
Should have worked on FBDI templates
Should have worked on either SCM or CRM
Oracle SOA/BPEL/OIC is a must
Should have knowledge of Data Integrator
Working experience with OTBI/BI Publisher
Working experience with data loading, data migration, or conversions
Knowledge of business units, legal entities, and chart of accounts

Posted 1 month ago

Apply

0 years

20 - 25 Lacs

Pune, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands - delivering compliant, high-impact solutions at enterprise scale.
Role & Responsibilities
Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads (see the sketch below).
Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.
Skills & Qualifications
Must-Have
7+ yrs data-engineering / warehousing experience, incl. 4+ yrs hands-on Snowflake design & development.
Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
Proficiency in Python (or similar) for automation, API integrations, and orchestration.
Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred
Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.
Skills: Snowflake, AWS, analytics, sales, SQL, data, ETL/ELT optimization, Python, data warehousing, Azure, data modeling, data governance, cloud
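As a hedged sketch of the Streams-and-Tasks pattern mentioned above, using the snowflake-connector-python driver; the account details, tables, and schedule are placeholders:

```python
import snowflake.connector

# Connection details are placeholders; use a vault or key-pair auth in practice.
conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="secret",
    warehouse="ELT_WH", database="COMMERCIAL", schema="RAW",
)

cur = conn.cursor()

# Capture inserts on a raw sales table with a stream...
cur.execute("CREATE STREAM IF NOT EXISTS raw_sales_stream ON TABLE raw_sales")

# ...and drain it into a curated table on a schedule with a task.
cur.execute("""
    CREATE TASK IF NOT EXISTS load_sales
      WAREHOUSE = ELT_WH
      SCHEDULE = '15 MINUTE'
    AS
      INSERT INTO CURATED.SALES
      SELECT * FROM raw_sales_stream
      WHERE METADATA$ACTION = 'INSERT'
""")

# Tasks are created suspended, so resume it to start the schedule.
cur.execute("ALTER TASK load_sales RESUME")
```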

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Looking for a dynamic, energetic, and self-driven Azure Data Engineer to design, implement, and maintain scalable data solutions. The role focuses on integrating data from multiple sources across all business units within India for Saint-Gobain to enable seamless analytics and decision-making.
Key Responsibilities: Designing, building, and managing data pipelines using Azure Data Factory and Azure Databricks. Creating and managing tables in Azure Databricks Unity Catalog schemas and writing complex SQL queries for ETL and unit testing (see the sketch below). Monitoring and troubleshooting pipeline failures and fixing issues.
Key Technical Skills: Azure Databricks (preferably with knowledge of Unity Catalog, foreign catalogs, SQL warehouses, and PySpark), ADLS Gen2, ADF. Microsoft DP-203 and/or DP-900 certification is good to have (not mandatory).
Key Soft Skills: Strong team player with excellent interpersonal skills; business-process depth in a manufacturing set-up is preferred.
Experience: 3-5 years of relevant experience in Azure data engineering or similar roles.
Women candidates are encouraged to apply.
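To illustrate the Unity Catalog work described, a minimal sketch of creating and sanity-checking a governed table with PySpark; it assumes a Databricks cluster with Unity Catalog enabled, and the catalog, schema, and storage path are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Unity Catalog uses three-level names: catalog.schema.table (all hypothetical here).
spark.sql("CREATE SCHEMA IF NOT EXISTS india_ops.sales")

df = spark.read.format("parquet").load(
    "abfss://raw@storageacct.dfs.core.windows.net/sales/"  # placeholder ADLS path
)
df.write.mode("overwrite").saveAsTable("india_ops.sales.daily_orders")

# Unit-test-style check with SQL against the governed table.
bad_rows = spark.sql("""
    SELECT COUNT(*) AS n
    FROM india_ops.sales.daily_orders
    WHERE order_id IS NULL
""").first()["n"]
assert bad_rows == 0, "NULL order_id values reached the curated layer"
```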

Posted 1 month ago

Apply

0.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Job Profile – Lead Data Engineer
Does working with data on a day-to-day basis excite you? Are you interested in building robust data architecture to identify data patterns and optimise data consumption for our customers, who will forecast and predict what actions to undertake based on data? If this is what excites you, then you’ll love working in our intelligent automation team. Schneider AI Hub is leading the AI transformation of Schneider Electric by building AI-powered solutions. We are looking for a savvy Data Engineer to join our growing team of AI and machine learning experts. You will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software engineers, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.
Responsibilities
Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional requirements.
Design the right schema to support the functional requirements and consumption patterns.
Design and build production data pipelines from ingestion to consumption.
Create the necessary preprocessing and postprocessing for various forms of data for training/retraining and inference ingestion as required.
Create data visualization and business intelligence tools for stakeholders and data scientists for necessary business and solution insights.
Identify, design, and implement internal process improvements: automating manual data processes, optimizing data delivery, etc.
Ensure our data is separated and secure across national boundaries through multiple data centers.
Requirements and Skills
You should have a bachelor's or master's degree in computer science, information technology or another quantitative field.
You should have at least 8 years of experience as a data engineer supporting large data transformation initiatives related to machine learning, with experience in building and optimizing pipelines and data sets.
Strong analytic skills related to working with unstructured datasets.
Experience with Azure cloud services: ADF, ADLS, HDInsight, Databricks, App Insights, etc.
Experience in handling ETLs using Spark.
Experience with object-oriented/object-function scripting languages: Python, PySpark, etc.
Experience with big data tools: Hadoop, Spark, Kafka, etc.
Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (see the Airflow sketch below).
You should be a good team player, committed to the success of the team and the overall project.
About Us
Schneider Electric™ creates connected technologies that reshape industries, transform cities and enrich lives. Our 144,000 employees thrive in more than 100 countries. From the simplest of switches to complex operational systems, our technology, software and services improve the way our customers manage and automate their operations. Great people make Schneider Electric a great company. We seek out and reward people for putting the customer first, being disruptive to the status quo, embracing different perspectives, continuously learning, and acting like owners. We want our employees to reflect the diversity of the communities in which we operate. We welcome people as they are, creating an inclusive culture where all forms of diversity are seen as a real value for the company. We’re looking for people with a passion for success — on the job and beyond.
Primary Location: IN-Karnataka-Bangalore
Schedule: Full-time
Unposting Date: Ongoing
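Among the workflow tools named above, Airflow is a common choice; here is a minimal DAG sketch with placeholder tasks (the dag_id and callables are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw files from the landing zone")  # placeholder step


def transform():
    print("clean and conform the data")  # placeholder step


with DAG(
    dag_id="example_ingest",           # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)

    # Declare the dependency: extract must succeed before transform runs.
    t_extract >> t_transform
```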

Posted 1 month ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

About The Role
We are seeking a highly skilled and experienced Senior Power BI Engineer to join our Data & Analytics team. You will play a pivotal role in designing, developing, and optimizing advanced Power BI reports and dashboards that empower business users with actionable insights. The ideal candidate will have deep expertise in Power BI, data modelling, and visualization, combined with a strong understanding of modern cloud data platforms, especially within the Azure ecosystem.
Key Responsibilities
Design, develop, and maintain scalable and high-performing Power BI reports and dashboards based on the Global Data Warehouse and Lakehouse environment.
Collaborate closely with data engineers, data scientists, and business stakeholders to translate business requirements into technical BI solutions.
Optimize Power BI datasets leveraging Azure Synapse Analytics and Databricks-processed data to ensure efficient query performance (see the refresh sketch below).
Develop and maintain robust data models, DAX calculations, and custom visualizations in Power BI to deliver actionable insights.
Implement best practices for report design, data security (Row-Level Security), and governance to ensure compliance and data integrity.
Troubleshoot, debug, and resolve performance and data quality issues in Power BI reports and datasets.
Mentor and provide technical guidance to junior BI developers and analysts.
Stay up to date with the latest Power BI features and Azure Synapse ecosystem enhancements to continuously improve BI solutions.
Support end-user training and documentation to promote self-service BI adoption.
Required Qualifications
Bachelor's or master's degree in computer science, information systems, data science, or a related field.
5+ years of experience in Power BI report development and data visualization.
Strong proficiency in Power BI Desktop, Power Query (M), DAX, and Power BI Service.
Hands-on experience with Azure Synapse Analytics, Azure Data Factory (ADF), and Databricks.
Deep understanding of data warehousing concepts, dimensional modelling, and ETL/ELT processes.
Experience optimizing performance of Power BI reports connected to large-scale data warehouses and lakehouses.
Knowledge of security implementations within Power BI, including Row-Level Security (RLS) and workspace permissions.
Strong SQL skills for data querying and debugging.
Excellent problem-solving skills and ability to work effectively with cross-functional teams.
Strong communication skills to engage with business users and technical teams.
Preferred Qualifications
Microsoft Power BI certification (e.g., DA-100 / PL-300).
Experience with other Azure data services (Azure Data Lake Storage, Azure Synapse Pipelines).
Familiarity with Python or Spark for data processing in Databricks.
Exposure to Agile development methodologies and CI/CD pipelines for BI.
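DAX and data modelling are best shown in Power BI itself, but dataset operations can also be automated; as one hedged example, the documented Power BI REST API can queue a dataset refresh from Python (the tenant, service principal, and GUIDs below are placeholders, and the principal must have access to the workspace):

```python
import requests
from azure.identity import ClientSecretCredential

# Service-principal details and IDs below are placeholders.
credential = ClientSecretCredential(
    tenant_id="TENANT_ID", client_id="CLIENT_ID", client_secret="CLIENT_SECRET"
)
token = credential.get_token("https://analysis.windows.net/powerbi/api/.default").token

GROUP_ID = "WORKSPACE_GUID"
DATASET_ID = "DATASET_GUID"
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)

resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()  # 202 Accepted means the refresh was queued
print("refresh queued:", resp.status_code)
```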

Posted 1 month ago

Apply

1.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Science Analyst, S&C GN
Management Level: Analyst
Location: Bangalore / Gurugram / Mumbai / Hyderabad / Chennai
Must-have skills: Gen AI / ML, SQL, Python, Azure / AWS, MLOps
Good-to-have skills: Experience in data science projects focused on Utilities, Oil & Gas, Mining, Manufacturing, Chemicals and Forest Products
Job Summary: We are seeking a highly skilled and motivated Data Science Analyst to work on innovative projects and drive impactful solutions in domains such as Utilities, Oil & Gas, Mining, Manufacturing, Chemicals and Forest Products. This role requires hands-on technical expertise and client delivery management skills to execute cutting-edge projects in Generative AI, data science, and cloud-based analytics.
Key Responsibilities
1. Data Science and Engineering
Perform advanced analytics using Python, SQL, and PySpark with machine learning frameworks.
Develop predictive models, recommendation systems, and optimization solutions tailored to business needs (see the sketch below).
Manage and preprocess large, complex datasets, ensuring efficient pipelines and advanced feature engineering across structured and unstructured data.
Build MLOps pipelines for model training/retraining, monitoring, and scalability.
2. Dashboards and Reporting
Develop dashboards, reports, and insights in Power BI/Tableau to track the impact of deployed models on business outcomes.
Present results and recommendations to stakeholders, leveraging data storytelling to drive decision-making.
3. Cloud Platform Expertise
Design and implement end-to-end data science workflows on cloud platforms (e.g., AWS, Azure, GCP) for business-critical projects.
Leverage cloud-native tools and services (e.g., Databricks, ADF, Lambda, Glue, Azure ML) for training, deploying, and monitoring machine learning models at scale.
4. Generative AI Expertise
Lead the development of Generative AI based applications and solutions leveraging frameworks like LangChain and LlamaIndex.
Drive model evaluation strategies using advanced metrics (e.g., BLEU, ROUGE, FID) and iteratively optimize performance for production-grade applications.
Architect deployment solutions, including API development and seamless integration with existing systems.
Required Qualifications
Experience: 1-5 years in data science
Education: Bachelor's or master's degree in computer science, statistics, applied mathematics, or a related field
Industry Knowledge: Preferred experience in Utilities, Oil & Gas, Mining, Manufacturing, Chemicals and Forest Products
Technical Skills
Programming: Proficiency in Python, SQL and PySpark.
GenAI Expertise: Hands-on experience in building GenAI based applications and solutions, and in deploying GenAI applications in production.
Cloud Platforms: Experience with Azure / AWS / GCP.
Visualization Tools: Power BI / Tableau.
Preferred Skills
Strong analytical and problem-solving skills with a results-oriented mindset.
Good communication and stakeholder management capabilities.
Very good at generating business insights and presenting them to stakeholders.
About Our Company | Accenture
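A minimal sketch of the predictive-modeling work described in the responsibilities, using scikit-learn on synthetic data (the features and target are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for, say, asset-sensor features and an output target.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X @ np.array([1.5, -2.0, 0.7, 0.0]) + rng.normal(scale=0.3, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

# Hold-out error of the kind tracked in an MLOps monitoring dashboard.
print("MAE:", mean_absolute_error(y_te, model.predict(X_te)))
```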

Posted 1 month ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.
Your Key Responsibilities
As a Senior AI/ML Engineer you will:
Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights that build comprehensive Risk & Controls monitoring mechanisms.
Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables.
Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data led Risk technology platform.
Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance the decision-making process across multiple business lines.
Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, and Gradient Boosting) to complex problems.
Ensure adherence to ethical AI guidelines and data governance policies.
Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services.
Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the team remains at the forefront of innovation and maintains a competitive edge in the industry.
Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution which can then be delivered.
Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives.
Willingness to learn and apply new, cutting-edge AI technologies to deliver insight-driven business solutions.
To qualify for the role you must have
4+ years of working experience in large-scale AI/ML models and data science.
Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
Proficiency in AI/ML programming languages like Python, R, and SQL.
Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras.
Expertise in machine learning algorithms and data mining techniques (such as SVM, Decision Trees, and deep learning algorithms).
Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
Strong programming skills in Python, including machine learning libraries such as scikit-learn, Pandas, and NumPy.
Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc.
Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps, and SharePoint.
Strong understanding of large enterprise applications like SAP, Oracle ERP, and Microsoft Dynamics.
Ideally, you'll also have
A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines. Relevant certifications are considered a plus.
A self-driven and creative problem-solver mindset: someone who enjoys the fast-paced world of software development and can perform well in a team.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
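To ground the modeling techniques this role lists (Random Forests, Gradient Boosting, and the scikit-learn stack), a minimal classification sketch on synthetic data; nothing here is specific to EY's platform:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic binary-classification data standing in for a risk-flagging task.
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

clf = Pipeline([
    ("scale", StandardScaler()),
    ("model", RandomForestClassifier(n_estimators=200, random_state=42)),
]).fit(X_tr, y_tr)

# Monitoring-style report of the kind fed into model-performance logging.
print(classification_report(y_te, clf.predict(X_te)))
```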

Posted 1 month ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.
Your Key Responsibilities
As a Senior AI/ML Engineer you will:
Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights that build comprehensive Risk & Controls monitoring mechanisms.
Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables.
Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data led Risk technology platform.
Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance the decision-making process across multiple business lines.
Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, and Gradient Boosting) to complex problems.
Ensure adherence to ethical AI guidelines and data governance policies.
Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services.
Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the team remains at the forefront of innovation and maintains a competitive edge in the industry.
Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution which can then be delivered.
Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives.
Willingness to learn and apply new, cutting-edge AI technologies to deliver insight-driven business solutions.
To qualify for the role you must have
4+ years of working experience in large-scale AI/ML models and data science.
Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
Proficiency in AI/ML programming languages like Python, R, and SQL.
Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras.
Expertise in machine learning algorithms and data mining techniques (such as SVM, Decision Trees, and deep learning algorithms).
Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
Strong programming skills in Python, including machine learning libraries such as scikit-learn, Pandas, and NumPy.
Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc.
Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps, and SharePoint.
Strong understanding of large enterprise applications like SAP, Oracle ERP, and Microsoft Dynamics.
Ideally, you'll also have
A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines. Relevant certifications are considered a plus.
A self-driven and creative problem-solver mindset: someone who enjoys the fast-paced world of software development and can perform well in a team.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
EY Consulting - Microsoft Fabric - Manager
As part of our EY-DnA team, you will be responsible for designing, developing, and maintaining distributed systems using Microsoft Fabric, including OneLake, Azure Data Factory (ADF), Azure Synapse, Notebooks, Data Warehouse, and Lakehouse. You will play a crucial role in architecting and implementing enterprise data platforms and data management practices, ensuring the delivery of high-quality solutions that meet business requirements. You will collaborate with system architects, business analysts, and stakeholders to understand their requirements and convert them into technical designs. Your role will involve designing, building, testing, deploying, and maintaining robust integration architectures, services, and workflows.
Your key responsibilities:
Design, develop, and implement ETL pipelines using Azure Data Factory to extract, transform, and load data from various sources into target systems.
Architect and implement Azure Synapse, Data Warehouse, and Lakehouse solutions, ensuring scalability, performance, and reliability.
Utilize Notebooks and Spark for data analysis, processing, and visualization to derive actionable insights from large datasets.
Define and implement enterprise data platform architecture, including the creation of gold, silver, and bronze datasets for downstream use (see the medallion sketch below).
Hands-on development experience in cloud-based big data technologies, including Azure, Power Platform, and Microsoft Fabric/Power BI, leveraging languages such as SQL, PySpark, DAX, Python, and Power Query.
Design and develop BI reports and dashboards by understanding the business requirements, designing the data model, and developing visualizations that provide actionable insights.
Collaborate effectively with key stakeholders and other developers to understand business requirements, provide technical expertise, and deliver solutions that meet project objectives.
Mentor other developers in the team, sharing knowledge, best practices, and emerging technologies to foster continuous learning and growth.
Stay updated on industry trends and advancements in Microsoft Fabric and related technologies, incorporating new tools and techniques to enhance development processes and outcomes.
Skills and attributes for success:
7-11 years of experience in developing data solutions using the Microsoft Azure cloud platform.
Strong experience with Azure Data Factory and ETL pipelines.
Strong experience with Azure Synapse, Data Warehouse, and Lakehouse implementations.
Strong experience with Notebooks and Spark.
Background in architecting and implementing enterprise data platforms and data management practices, including gold, silver, and bronze datasets for downstream use.
Hands-on experience in cloud-based big data technologies including Azure, Power Platform, and Microsoft Fabric/Power BI, using languages such as SQL, PySpark, DAX, Python, and Power Query.
Experience creating Business Intelligence (BI) reports and crafting complex Data Analysis Expressions (DAX) for metrics.
Ideally, you’ll also have:
Exceptional communication skills and the ability to articulate ideas clearly and concisely.
Capability to work independently as well as lead a team effectively.
EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
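A minimal PySpark sketch of the bronze/silver/gold (medallion) pattern referenced above; the table paths and columns are placeholders, and it assumes a Spark runtime with Delta support (e.g., Fabric or Databricks notebooks):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw landed data (path is a placeholder).
bronze = spark.read.format("delta").load("Tables/bronze_orders")

# Silver: typed, de-duplicated, conformed.
silver = (
    bronze.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("Tables/silver_orders")

# Gold: aggregated for reporting and downstream Power BI models.
gold = silver.groupBy("customer_id").agg(F.sum("amount").alias("lifetime_value"))
gold.write.format("delta").mode("overwrite").save("Tables/gold_customer_value")
```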

Posted 1 month ago

Apply

4.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.
Your Key Responsibilities
As a Senior AI/ML Engineer you will:
Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights that build comprehensive Risk & Controls monitoring mechanisms.
Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables.
Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data led Risk technology platform.
Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance the decision-making process across multiple business lines.
Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, and Gradient Boosting) to complex problems.
Ensure adherence to ethical AI guidelines and data governance policies.
Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services.
Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the team remains at the forefront of innovation and maintains a competitive edge in the industry.
Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution which can then be delivered.
Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives.
Willingness to learn and apply new, cutting-edge AI technologies to deliver insight-driven business solutions.
To qualify for the role you must have
4+ years of working experience in large-scale AI/ML models and data science.
Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
Proficiency in AI/ML programming languages like Python, R, and SQL.
Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras.
Expertise in machine learning algorithms and data mining techniques (such as SVM, Decision Trees, and deep learning algorithms).
Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
Strong programming skills in Python, including machine learning libraries such as scikit-learn, Pandas, and NumPy.
Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc.
Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps, and SharePoint.
Strong understanding of large enterprise applications like SAP, Oracle ERP, and Microsoft Dynamics.
Ideally, you'll also have
A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines. Relevant certifications are considered a plus.
A self-driven and creative problem-solver mindset: someone who enjoys the fast-paced world of software development and can perform well in a team.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.
Your Key Responsibilities
As a Senior AI/ML Engineer you will:
Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights that build comprehensive Risk & Controls monitoring mechanisms.
Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables.
Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data led Risk technology platform.
Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance the decision-making process across multiple business lines.
Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, and Gradient Boosting) to complex problems.
Ensure adherence to ethical AI guidelines and data governance policies.
Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services.
Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the team remains at the forefront of innovation and maintains a competitive edge in the industry.
Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution which can then be delivered.
Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives.
Willingness to learn and apply new, cutting-edge AI technologies to deliver insight-driven business solutions.
To qualify for the role you must have
4+ years of working experience in large-scale AI/ML models and data science.
Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
Proficiency in AI/ML programming languages like Python, R, and SQL.
Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras.
Expertise in machine learning algorithms and data mining techniques (such as SVM, Decision Trees, and deep learning algorithms).
Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
Strong programming skills in Python, including machine learning libraries such as scikit-learn, Pandas, and NumPy.
Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc.
Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps, and SharePoint.
Strong understanding of large enterprise applications like SAP, Oracle ERP, and Microsoft Dynamics.
Ideally, you'll also have
A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines. Relevant certifications are considered a plus.
A self-driven and creative problem-solver mindset: someone who enjoys the fast-paced world of software development and can perform well in a team.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.
Your Key Responsibilities
As a Senior AI/ML Engineer you will:
Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights that build comprehensive Risk & Controls monitoring mechanisms.
Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables.
Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data led Risk technology platform.
Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance the decision-making process across multiple business lines.
Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, and Gradient Boosting) to complex problems.
Ensure adherence to ethical AI guidelines and data governance policies.
Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services.
Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the team remains at the forefront of innovation and maintains a competitive edge in the industry.
Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution which can then be delivered.
Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives.
Willingness to learn and apply new, cutting-edge AI technologies to deliver insight-driven business solutions.
To qualify for the role you must have
4+ years of working experience in large-scale AI/ML models and data science.
Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
Proficiency in AI/ML programming languages like Python, R, and SQL.
Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras.
Expertise in machine learning algorithms and data mining techniques (such as SVM, Decision Trees, and deep learning algorithms).
Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
Strong programming skills in Python, including machine learning libraries such as scikit-learn, Pandas, and NumPy.
Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc.
Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps, and SharePoint.
Strong understanding of large enterprise applications like SAP, Oracle ERP, and Microsoft Dynamics.
Ideally, you'll also have
A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines. Relevant certifications are considered a plus.
A self-driven and creative problem-solver mindset: someone who enjoys the fast-paced world of software development and can perform well in a team.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

8.0 - 10.0 years

5 - 50 Lacs

Pune, Maharashtra, India

On-site

We are looking for a Senior Oracle EBS Technical Consultant to join our team. In this role, you will be responsible for providing technical expertise in Oracle E-Business Suite (EBS) and related technologies. You will work closely with clients to understand their business requirements and provide solutions to meet their needs.

Responsibilities
Collaborating with clients to gather and document technical requirements.
Designing, developing, and implementing customizations and extensions to Oracle EBS.
Providing technical support and troubleshooting for Oracle EBS.
Conducting code reviews and ensuring best practices are followed.
Leading technical workshops and training sessions for clients and internal team members.
Keeping up to date with the latest technologies and industry trends in Oracle EBS.
Experience in the preparation of Technical Design documents.
Able to work independently and progress the build of a CEMLI/RICE object from a technical design document.

Technical Skills
Hands-on experience with Data Conversions/Migrations, Inbound/Outbound interfaces, Reports, Forms, and Customizations (an illustrative interface call is sketched at the end of this listing).
Experience in Implementation and RICE Customizations of Oracle Applications 11i/R12.
Expertise in SQL, PL/SQL, and performance tuning.
Expertise in Oracle Forms (development and personalization), BI Publisher Reports, and Oracle Workflows.
OAF experience is preferable.
Sound knowledge of using Oracle APIs for interfaces to Oracle Financials and AOL/Sys-Admin components.
Good knowledge of functional flows in FIN (GL, Fixed Assets, Cash Management, AP/AR) and SCM (Procurement, Inventory, Order Management).
Solid understanding of Oracle EBS database/table structures and the integration and impacts between modules.
Ability to design and document solutions for complex problems.

Qualifications
Bachelor's or master's degree in computer science, Information Technology, or a related field.
Minimum of 8-10 years of experience in Oracle EBS technical development.
Strong technical skills in Oracle EBS R12 and related technologies such as Oracle Forms, Oracle Reports, PL/SQL, Oracle Workflow, OAF, ADF, and XML Publisher.
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills.
Ability to work independently and in a team environment.
Capable of working in a fast-paced, dynamic, team-oriented environment.

If you are a motivated individual with a passion for Oracle EBS technology and a desire to work in a dynamic, fast-paced environment, we encourage you to apply for this position. We offer competitive salary, comprehensive benefits, and opportunities for career growth and advancement.

Skills: Oracle EBS, Technical Support
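As an illustration of the interface work described above, here is a minimal Python sketch that invokes a custom PL/SQL interface package via python-oracledb. The connection details, package name (xx_ap_invoice_intf_pkg), procedure, and bind values are hypothetical stand-ins for the kind of RICE interface component a build like this produces; they are not from the listing.

import oracledb

# Hypothetical connection details for an EBS database.
conn = oracledb.connect(user="apps", password="***",
                        dsn="ebs-db.example.com:1521/EBSPROD")
cur = conn.cursor()

# Bind an OUT variable to receive the interface status from the package.
status = cur.var(str)

# Call a hypothetical custom inbound-interface procedure, e.g. one that
# validates and loads AP invoice staging rows via standard Oracle APIs.
# 1001 is an illustrative batch id.
cur.callproc("xx_ap_invoice_intf_pkg.process_batch", [1001, status])
conn.commit()
print("Interface status:", status.getvalue())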

Posted 1 month ago

Apply

3.0 - 8.0 years

3 - 40 Lacs

Pune, Maharashtra, India

On-site

Responsibilities
We are seeking an experienced Oracle EBS Technical Consultant to join our team. In this role, you will be responsible for providing technical expertise in Oracle E-Business Suite (EBS) and related technologies. You will work closely with clients to understand their business requirements and provide solutions to meet their needs. Your primary responsibilities will include:
Designing, developing, and implementing customizations and extensions to Oracle EBS.
Providing technical support and troubleshooting for Oracle EBS.
Ensuring best practices are followed.
Keeping up to date with the latest technologies and industry trends in Oracle EBS.
Experience in the preparation of Technical Design documents.
Able to work independently and progress the build of a CEMLI/RICE object from a technical design document.

Technical Skills
Hands-on experience with Data Conversions/Migrations, Inbound/Outbound interfaces, Reports, Forms, and Customizations.
Experience in Implementation and RICE Customizations of Oracle Applications 11i/R12.
Expertise in SQL, PL/SQL, and performance tuning.
Expertise in Oracle Forms (development and personalization), BI Publisher Reports, and Oracle Workflows.
OAF (Oracle Application Framework) experience is preferable.
Sound knowledge of using Oracle APIs for interfaces to Oracle Financials and AOL/Sys-Admin components.
Good knowledge of functional flows in FIN (GL, Fixed Assets, Cash Management, AP/AR) and SCM (Procurement, Inventory, Order Management).
Solid understanding of Oracle EBS database/table structures and the integration and impacts between modules.
Ability to design and document solutions for complex problems.

Qualifications
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Minimum of 3-8 years of experience in Oracle EBS technical development.
Strong technical skills in Oracle EBS R12 and related technologies such as Oracle Forms, Oracle Reports, PL/SQL, Oracle Workflow, OAF, ADF, and XML Publisher.
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills.
Ability to work independently and in a team environment.

If you are a motivated individual with a passion for Oracle EBS technology and a desire to work in a dynamic, fast-paced environment, we encourage you to apply for this position. We offer competitive salary, comprehensive benefits, and opportunities for career growth and advancement.

Skills: Oracle EBS, Technical Support

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Azure Data Engineer

Experience Range: 5 to 10 years
Location: Pune
Skills: Azure Data Engineering, C#, Fabric, ADF, Synapse, Python, SQL

Should have good hands-on experience building ETL flows using an ETL tool such as ADF.
Should be able to understand the requirements and design the data flow diagram of the ETL process end to end.
Should have good hands-on experience writing complex SQL queries and using advanced SQL concepts such as indexes, partitions, filegroups, and transactions (an illustrative transactional load is sketched at the end of this listing).
Should be able to understand the business requirements and develop end-to-end data pipelines using the required tools/technologies.
Excellent troubleshooting and good communication skills, with good attention to detail.
Should have knowledge of designing optimized data processing based on the volume of data.
Able to create documentation that clearly explains the purpose of the data flow and its intended use.
Able to make regular modifications to existing production code for error correction and adding new features.
Experience using Visual Studio and SQL Server Management Studio.
Strong understanding of data warehousing concepts such as dimensions, facts, schemas, data loading processes, dimensional modeling, and data mining.
Flexible to learn and adopt the tools/technologies used in the project.
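To illustrate the advanced SQL concepts mentioned above (transactions in particular), here is a minimal sketch of a transactional staging load against SQL Server via pyodbc. The connection string, table names, columns, and rows are hypothetical; a real ADF pipeline would typically drive this pattern at scale.

import pyodbc

# Hypothetical connection details; autocommit off so the load is atomic.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;"
    "DATABASE=etl_db;UID=etl_user;PWD=***", autocommit=False)
cur = conn.cursor()
try:
    cur.execute("TRUNCATE TABLE stg.orders")
    # Illustrative rows; in practice these come from the source extract.
    cur.executemany(
        "INSERT INTO stg.orders (order_id, amount) VALUES (?, ?)",
        [(1, 100.0), (2, 250.5)])
    # Upsert from staging into the target table in the same transaction.
    cur.execute("""
        MERGE dbo.orders AS t
        USING stg.orders AS s ON t.order_id = s.order_id
        WHEN MATCHED THEN UPDATE SET t.amount = s.amount
        WHEN NOT MATCHED THEN INSERT (order_id, amount)
            VALUES (s.order_id, s.amount);
    """)
    conn.commit()   # the whole load succeeds or rolls back as one unit
except Exception:
    conn.rollback()
    raise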

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Data Engineer - ETL
Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL's executive leadership team to maximize benefits and facilitate a sustained competitive advantage.

Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics team (IDA), is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team's efforts towards creating, enhancing, and stabilizing the Enterprise data lake through the development of data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner.

What You'll Be Doing
What will your essential responsibilities include?
Act as a data engineering expert and partner to Global Technology and data consumers in controlling the complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate.
Understand current and future data consumption patterns and architecture (at a granular level), and partner with Architects to ensure optimal design of the data layers.
Apply best practices in data architecture: for example, the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, and performance tuning.
Lead and execute hands-on research into new technologies, formulating frameworks for assessing new technology against business benefit and the implications for data consumers.
Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform.
Design prototypes and work in a fast-paced, iterative solution delivery model.
Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with delta tables (see the sketch at the end of this listing).
Use Harness for the deployment pipeline.
Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
Diagnose system performance issues related to data processing and implement solutions to address them.
Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
Maintain integrity and quality across all pipelines and environments.
Understand and follow secure coding practices to ensure code is not vulnerable.
You will report to the Application Manager.

What You Will Bring
We're looking for someone who has these abilities and skills:

Required Skills and Abilities
Effective communication skills.
Bachelor's degree in computer science, mathematics, statistics, finance, a related technical field, or equivalent work experience.
Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills.
Relevant years of programming experience using Databricks.
Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
Solid knowledge of network and firewall concepts.
Solid experience writing, optimizing, and analyzing SQL.
Relevant years of experience with Python.
Ability to break complex data requirements and architect solutions into achievable targets.
Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile.
Experience using Harness.
Technical lead responsible for both individual and team deliveries.

Desired Skills and Abilities
Worked in big data migration projects.
Worked on performance tuning at both the database and big data platform levels.
Ability to interpret complex data requirements and architect solutions.
Distinctive problem-solving and analytical skills combined with robust business acumen.
Excellent grasp of the basics of parquet files and delta files.
Effective knowledge of the Azure cloud computing platform.
Familiarity with reporting software - Power BI is a plus.
Familiarity with DBT is a plus.
Passion for data and experience working within a data-driven organization.
You care about what you do, and what we do.

Who We Are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What We Offer

Inclusion
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That's why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. It's about helping one another — and our business — to move forward and succeed.
Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 Chapters around the globe.
Robust support for Flexible Working Arrangements.
Enhanced family-friendly leave benefits.
Named to the Diversity Best Practices Index.
Signatory to the UK Women in Finance Charter.
Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards
AXA XL's Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We're committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability
At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called "Roots of resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars
Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We're committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's "Hearts in Action" programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving.
For more information, please see axaxl.com/sustainability.
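As a rough illustration of the PySpark/Delta pattern referenced in the responsibilities above, here is a minimal merge-based incremental load. The path, table names, and policy_id key are hypothetical, and a Databricks-style runtime with Delta Lake available is assumed.

from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

# On Databricks, `spark` is predefined; getOrCreate() keeps this runnable
# elsewhere as long as the delta-spark package is installed.
spark = SparkSession.builder.getOrCreate()

# Read a hypothetical landing-zone extract.
raw = spark.read.format("parquet").load("/mnt/raw/policies/")

clean = (raw.dropDuplicates(["policy_id"])
            .withColumn("load_ts", F.current_timestamp()))

# Incremental upsert (merge) into a Delta table.
target = DeltaTable.forName(spark, "silver.policies")
(target.alias("t")
       .merge(clean.alias("s"), "t.policy_id = s.policy_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())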

Posted 1 month ago

Apply

7.0 - 12.0 years

25 - 40 Lacs

Dubai, Hyderabad

Work from Office

TechECS Hiring: Oracle EBS HRMS Technical Consultant - Dubai!

Location: Dubai, UAE (On-site)
Experience: 7+ years of Oracle EBS (11i & R12) technical experience, with strong HRMS (Core HR, Payroll, OTL) expertise.

Key Responsibilities
Design, develop, and implement Oracle EBS R12 solutions for HRMS (Core HR, Payroll, OTL).
Create and maintain RICEW components (Reports, Interfaces, Conversions, Extensions, Workflows).
Customize Oracle Forms, Reports, BI Publisher/XML, and OAF/ADF pages.
Build and personalize Oracle Workflow and Advanced Workflow (AME).
Develop robust PL/SQL: packages, procedures, and triggers; tune performance.
Implement Web ADI, XML, shell scripting, APIs, and interfaces for data integration.
Lead technical assessments, debugging, system migrations (legacy to R12), and production support.
Mentor junior developers and document technical specifications and user guides.
Collaborate with functional EHCM teams and business stakeholders to translate requirements.
Optionally, handle Web Services/SOA integrations.

Must-Have Technical Skills
Oracle EBS 11i & R12: Core HR, Payroll, OTL
Oracle PL/SQL (advanced queries, procedures, triggers)
Oracle Forms & Reports, BI Publisher/XML
Oracle Workflow / Advanced Workflow (AME)
RICEW development
Oracle Application Framework (OAF) & ADF
Web ADI, XML, API/interface development
UNIX / shell scripting
Performance tuning and production debugging

Why Join Tech ECS in Dubai?
Work on high-impact Oracle EBS HRMS projects in a vibrant, international environment.
Competitive UAE-based compensation and benefits package.
Opportunity to lead and innovate in system customizations, migrations, and support.

If interested, share your updated resume with mounika.paladugula@techecs.com

Regards,
Mounika Paladugula - TAG || Tech ECS

Check out the job opportunity on LinkedIn: https://www.linkedin.com/posts/activity-7346065109640716288-x2JU?utm_source=share&utm_medium=member_desktop&rcm=ACoAABr1upkBsyV2seyDBpl4tvLVvGKgRzqTL44

Posted 1 month ago

Apply

15.0 - 17.0 years

0 Lacs

Mulshi, Maharashtra, India

On-site

Area(s) of responsibility
Experience: 15 to 17 years

Cloud Architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing.
Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation, on AWS and Azure Cloud.
Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services.
Participate in pre-sales activities, including RFP and proposal writing.
Experience with integration of different data sources with a Data Warehouse and Data Lake is required (an illustrative Snowflake load is sketched at the end of this listing).
Experience in creating data warehouses and data lakes for Reporting, AI, and Machine Learning.
Understanding of data modelling and data architecture concepts.
Participate in proposal and capability presentations.
Ability to clearly articulate the pros and cons of various technologies and platforms.
Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms.
Define and implement cloud governance and best practices.
Identify and implement automation opportunities to increase operational efficiency.
Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies.
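For illustration, here is a minimal sketch of loading a cleansed extract into Snowflake with the official Python connector, the kind of source-to-warehouse integration mentioned above. The account, credentials, file path, and table names are hypothetical.

import snowflake.connector

# Hypothetical connection details.
conn = snowflake.connector.connect(
    user="svc_loader", password="***", account="myorg-myaccount",
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING")
cur = conn.cursor()
try:
    # Stage a local Parquet extract on the table's internal stage...
    cur.execute("PUT file:///tmp/customers.parquet @%CUSTOMERS")
    # ...then copy it into the target table, matching columns by name.
    cur.execute(
        "COPY INTO CUSTOMERS "
        "FILE_FORMAT = (TYPE = PARQUET) "
        "MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE")
finally:
    cur.close()
    conn.close()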

Posted 1 month ago

Apply