
1655 ADF Jobs - Page 35

Set up a job alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Title: Consultant / Senior Consultant – Azure Data Engineering
Location: India – Gurgaon preferred
Industry: Insurance Analytics & AI Vertical

Role Overview:
We are seeking a hands-on Consultant / Senior Consultant with strong expertise in Azure-based data engineering to support end-to-end development and delivery of data pipelines for our insurance clients. The ideal candidate will have a deep understanding of Azure Data Factory, ADLS, Databricks (preferably with DLT and Unity Catalog), SQL, and Python, and be comfortable working in a dynamic, client-facing environment. This is a key offshore role requiring both technical execution and solution-oriented thinking to support modern data platform initiatives.
o Collaborate with data scientists, analysts, and stakeholders to gather requirements and define data models that effectively support business requirements.
o Demonstrate decision-making, analytical, and problem-solving abilities.
o Strong verbal and written communication skills to manage client discussions.
o Familiarity with Agile methodologies: daily scrum, sprint planning, backlog refinement.

Key Responsibilities & Skillsets:
o Design and develop scalable and efficient data pipelines using Azure Data Factory (ADF) and Azure Data Lake Storage (ADLS).
o Build and maintain Databricks notebooks for data ingestion, transformation, and quality checks, using Python and SQL.
o Work with Delta Live Tables (DLT) and Unity Catalog (preferred) to improve pipeline automation, governance, and performance.
o Collaborate with data architects, analysts, and onshore teams to translate business requirements into technical specifications.
o Troubleshoot data issues, ensure data accuracy, and apply best practices in data engineering and DevOps.
o Support the migration of legacy SQL pipelines to modern Python-based frameworks.
o Ensure adherence to data security, compliance, and performance standards, especially within insurance domain constraints.
o Provide documentation, status updates, and technical insights to stakeholders as required.
o Excellent communication skills and stakeholder management.

Required Skills & Experience:
o 3–7 years of strong hands-on experience in data engineering with a focus on Azure cloud technologies.
o Proficient in Azure Data Factory, Databricks, and ADLS Gen2, with working knowledge of Unity Catalog.
o Strong programming skills in SQL and Python, especially within Databricks notebooks. PySpark expertise is good to have.
o Experience in Delta Lake / Delta Live Tables (DLT) is a plus.
o Good understanding of ETL/ELT concepts, data modeling, and performance tuning.
o Exposure to Insurance or Financial Services data projects is highly preferred.
o Strong communication and collaboration skills in an offshore delivery model.
o Experience working in Agile/Scrum teams.
o Familiarity with Azure DevOps, Git, and CI/CD practices.
o Certifications in Azure Data Engineering (e.g., DP-203) or Databricks.
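As a hedged illustration of the Databricks notebook work this posting describes (ingestion, transformation, and quality checks with Python and SQL), a minimal PySpark sketch might look like the following; the storage path, table, and column names are assumptions for illustration, not part of the posting.

```python
# Minimal sketch of a Databricks ingestion/quality-check step (hypothetical names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

# Ingest raw policy data from ADLS (path is illustrative)
raw = spark.read.format("json").load(
    "abfss://raw@examplelake.dfs.core.windows.net/policies/"
)

# Basic transformation and quality checks
clean = (
    raw.dropDuplicates(["policy_id"])
       .filter(F.col("premium").isNotNull() & (F.col("premium") >= 0))
       .withColumn("ingest_date", F.current_date())
)

# Persist to a Delta table for downstream use
clean.write.format("delta").mode("append").saveAsTable("insurance.policies_clean")
```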

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Pune

Work from Office

AZURE DATA ENGINEER
Skills - Strong technical experience in Azure, SQL, Azure Data Factory, ETL, Databricks
Graduation is a must
Experience - 5-10 years
CTC - Up to 14 - 20 LPA
21st June - F2F Interview only (Pune)
Contact - 7742324144

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Before applying for a job, select your preferred language from the options available at the top right of this page. Discover your next opportunity within an organization that ranks among the world's 500 largest companies. Envision innovative opportunities, discover our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

Job Description

Job Title: Senior Business Analyst
Experience Range: 8-12 Years
Location: Chennai, Hybrid
Employment Type: Full-Time

About UPS
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About The Role
We are seeking an experienced Senior Business Analyst to join our project team responsible for delivering a Microsoft Azure-hosted web application with Angular as the frontend and .NET 8 as the backend framework. The solution follows a micro-frontend and microservices architecture integrated with an Azure SQL database. Additionally, the data engineering component involves Azure Data Factory (ADF), Databricks, and Cosmos DB.
The Senior Business Analyst will play a pivotal role in bridging the gap between business stakeholders, development teams, and data engineering teams. This role involves eliciting and analyzing requirements, defining business processes, and ensuring alignment of project objectives with strategic goals. The candidate will also work closely with architects, developers, and testers to ensure comprehensive requirements coverage and successful project delivery.

Key Responsibilities
Requirements Elicitation and Analysis:
- Gather and document business and technical requirements through stakeholder interviews, workshops, and document analysis.
- Analyze complex data flows and business processes to define clear and concise requirements.
- Create detailed requirement specifications, user stories, and acceptance criteria for both web application and data engineering components.
Business Process Design and Improvement:
- Define and document business processes, workflows, and data models.
- Identify areas for process optimization and automation within web and data solutions.
- Collaborate with stakeholders to design solutions that align with business objectives.
Stakeholder Communication and Collaboration:
- Serve as a liaison between business stakeholders, development teams, and data engineering teams.
- Facilitate communication and collaboration to ensure stakeholder alignment and understanding.
- Conduct requirement walkthroughs, design reviews, and user acceptance testing sessions.
Solution Validation and Quality Assurance:
- Ensure requirements traceability throughout the project lifecycle.
- Validate and test solutions to ensure they meet business needs and objectives.
- Collaborate with QA teams to define testing strategies and acceptance criteria.

Primary Skills
- Business Analysis: Requirement gathering, process modeling, and gap analysis.
- Documentation: User stories, functional specifications, and acceptance criteria.
- Agile Methodologies: Experience in Agile/Scrum environments.
- Stakeholder Management: Effective communication and collaboration with cross-functional teams.
- Data Analysis: Ability to analyze and interpret complex data flows and business processes.

Secondary Skills
- Cloud Platform: Familiarity with Microsoft Azure services.
- Data Engineering: Understanding of data pipelines, ETL processes, and data modeling.
- UX/UI Collaboration: Experience collaborating with UX/UI teams for optimal user experience.
- Communication Skills: Excellent verbal and written communication for stakeholder engagement.

Soft Skills
- Strong problem-solving abilities and attention to detail.
- Excellent communication skills, both verbal and written.
- Effective time management and organizational capabilities.
- Ability to work independently and within a collaborative team environment.
- Strong interpersonal skills to engage with cross-functional teams.

Educational And Preferred Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Relevant certifications such as: Certified Business Analysis Professional (CBAP), PMI Professional in Business Analysis (PMI-PBA), Microsoft Certified: Azure Fundamentals.
- Experience in cloud-native solutions and microservices architecture.
- Familiarity with Angular and .NET frameworks for web applications.

About The Team
As a Senior Business Analyst, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.

Contract Type: Permanent (CDI)
At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.

Posted 1 month ago

Apply

7.0 - 10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Microsoft Sustainability Manager Senior Developer – Consulting
As a developer working in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale in the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
- Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
- Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives, such as carbon footprint tracking, resource management, and supply chain optimization.
- Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
- Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
- Develop and maintain KPI models to measure and track key performance indicators for our sustainability initiatives.
- Collaborate with data analysts, scientists, and other stakeholders to understand complex data models and ensure accurate and reliable data visualization.
- Stay updated on the latest trends and technologies in sustainable software development and apply them to our solutions.
- Understanding of the Microsoft Cloud for Sustainability Common Data Model.

Skills And Attributes For Success
- Proven experience as a Microsoft Cloud for Sustainability industry cloud developer or equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
- In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse.
- Experience in Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Custom Pages, Power Portals/Power Pages), Dynamics CRM / 365.
- Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and XRM APIs.
- Strong and proven experience with Power Automate, with an efficiency/performance-driven solution approach.
- Strong and proven experience in creating custom forms with validations using JavaScript.
- Experience in developing PCF components is an added advantage.
- Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
- Proficiency in Power Automate for workflow automation and logic implementation.
- Experience in designing cloud-based solutions using Microsoft Azure technologies including Azure Synapse, ADF, Azure Functions, and Data Lake.
- Experience with integration techniques, including connectors and custom APIs (Application Program Interface).
- Experience in Power BI, including advanced functions and DAX scripting, advanced Power Query, and data modelling on CDM.
- Experience in Power Fx is an added advantage.
- Strong knowledge of Azure DevOps & CI/CD pipelines and their setup for automated build and release management.
- Experience in leading teams to execute high-quality deliverables within stipulated timelines.
- Excellent written and communication skills.
- Ability to deliver technical demonstrations.
- Quick learner with a “can do” attitude.
- Demonstrating and applying strong project management skills, inspiring teamwork, and responsibility with engagement team members.

To qualify for the role, you must have
- A bachelor's or master's degree.
- A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
- Excellent communication skills, with consulting experience preferred.

Ideally, you will also have
- Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- Ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Exp: 5 - 12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, Informatica, Architect, Azure Data Factory

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: Azure Data Factory, requirement gathering, data analysis, SQL, ETL, Snowflake, data modeling, Azure, Power BI, Python, business intelligence, Informatica, Fivetran, DBT, pipelines, data warehousing, DWH
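As a hedged illustration of the SCD Type-2 modeling this posting mentions, the Python sketch below uses the snowflake-connector-python library to expire the current version of changed rows and insert brand-new keys in a hypothetical dimension table; connection parameters, table, and column names are assumptions, and in dbt this pattern is normally expressed declaratively as a snapshot rather than a hand-written MERGE.

```python
# Minimal sketch: part of an SCD Type-2 load into a Snowflake dimension (hypothetical names).
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    warehouse="ANALYTICS_WH", database="DWH", schema="DIM",
)

merge_sql = """
MERGE INTO dim_customer AS tgt
USING stg_customer AS src
  ON tgt.customer_id = src.customer_id AND tgt.is_current = TRUE
WHEN MATCHED AND tgt.address <> src.address THEN
  UPDATE SET tgt.is_current = FALSE, tgt.valid_to = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
  INSERT (customer_id, address, valid_from, valid_to, is_current)
  VALUES (src.customer_id, src.address, CURRENT_TIMESTAMP(), NULL, TRUE)
"""

with conn.cursor() as cur:
    cur.execute(merge_sql)  # expire changed rows / insert brand-new keys
    # A second step (not shown) inserts the new current version of changed rows.
conn.close()
```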

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Exp: 5 - 12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, Informatica, Architect, Azure Data Factory

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: Azure Data Factory, requirement gathering, data analysis, SQL, ETL, Snowflake, data modeling, Azure, Power BI, Python, business intelligence, Informatica, Fivetran, DBT, pipelines, data warehousing, DWH

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description
Title: Business Analyst and Business Intelligence Developer (Digital Solution Team) - Husky (India), Chennai
Id: 20036
Type: Full-Time
Location: Chennai, India

Job Purpose
The DST Business Analyst and Business Intelligence Developer for Husky will be responsible for building the business intelligence system for the company, based on the internal and external data structures. Responsible for leading the design and support of enterprise-wide business intelligence applications and architecture. Works with enterprise-wide business and IT senior management to understand and prioritize data and information requirements. Solves complex technical problems. Optimizes the performance of enterprise business intelligence tools by defining data elements which contribute to data insights that add value to the user. Creates testing methodology and criteria. Designs and coordinates a curriculum for coaching and training customers in the use of business intelligence tools to enhance business decision-making capability. Develops standards, policies, and procedures for the form, structure, and attributes of the business intelligence tools and systems. Develops data/information quality metrics. Researches new technology and develops business cases to support enterprise-wide business intelligence solutions.

Key Responsibilities & Key Success Metrics
- Leading BI software development, deployment and maintenance
- Perform data profiling and data analysis activities to understand data sources
- Report curation, template definition and analytical data modeling
- Work with cross-functional teams to gather and document reporting requirements
- Translate business requirements into specifications that will be used to implement the required reports and dashboards, created from potentially multiple data sources
- Identify and resolve data reporting issues in a timely fashion, while looking for continuous improvement opportunities
- Build solutions that create value and resolve business problems
- Provide technical guidance to designers and other stakeholders
- Work effectively with members of the Digital Solutions Team
- Troubleshoot analytics tool problems and tune for performance
- Develop semantic layers and analytics query objects for end users
- Translate business questions and requirements into reports, views, and analytics query objects
- Ensure that quality standards are met
- Support the Master Data Management Strategy

Qualifications
- Understanding of ERP and operational systems databases, knowledge of database programming
- Highly skilled at writing SQL queries against large-scale, complex datasets
- Experience in data visualization and data storytelling
- Experience designing, debugging and deploying software in an ADO (Azure DevOps) development environment
- Experience with the Microsoft BI stack - Power BI and SQL Server Analysis Services
- Experience working in an international business environment
- Experience with Azure Data Platform resources (ADLS, ADF, Azure Synapse, Power BI Services)
- Basic manufacturing and sales business process knowledge
- Strong communication & presentation skills
- Ability to moderate meetings and constructive design sessions for effective decision making
- English language skills are a requirement; German & French are considered an asset

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Hyderabad, Chennai

Work from Office

Type: Contract | Duration: 6 Months

We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

Key Responsibilities
- Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory
- Monitor and support production ETL jobs
- Develop and maintain data lineage documentation for all systems
- Design data mapping and documentation to aid QA/UAT testing
- Evaluate and recommend modern data integration tools
- Optimize shared data workflows and batch schedules
- Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows
- Participate in performance tuning and improvement recommendations
- Support BI/MDM initiatives including Data Vault and Data Lakes

Required Skills
- 7+ years of experience in data engineering roles
- Strong command of SQL, with 5+ years of hands-on development
- Deep experience with Snowflake, Azure Data Factory, dbt
- Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.)
- Bachelor's in CS, Engineering, Math, or related field
- Experience in the healthcare domain (working with PHI/PII data)
- Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments)
- Excellent communication and documentation skills
- Experience with BI tools like Power BI, Cognos, etc.
- Organized, self-starter with strong time-management and critical thinking abilities

Nice To Have
- Experience with Data Lakes and Data Vaults
- QA & UAT alignment with clear development documentation
- Multi-cloud experience (especially Azure, AWS)

Location Options: Hyderabad / Chennai (remote flexibility available)
Apply to: navaneeta@suzva.com
Contact: 9032956160

Posted 1 month ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Exp: 5 - 12 Yrs Work Mode: Hybrid Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon Primary Skills: Snowflake, SQL, DWH, Power BI, ETL and Informatica,Architect,Azure Datafactory We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions. Key Responsibilities Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies. Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources. SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis. Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures. Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions. Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards. Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines. Skills & Qualifications Expertise in Snowflake for data warehousing and ELT processes. Strong proficiency in SQL for relational databases and writing complex queries. Experience with Informatica PowerCenter for data integration and ETL development. Experience using Power BI for data visualization and business intelligence reporting. Experience with Fivetran for automated ELT pipelines. Familiarity with Sigma Computing, Tableau, Oracle, and DBT. Strong data analysis, requirement gathering, and mapping skills. Familiarity with cloud services such as Azure (RDBMS, Data Bricks, ADF), with AWS or GCP Experience with workflow management tools such as Airflow, Azkaban, or Luigi. Proficiency in Python for data processing (other languages like Java, Scala are a plus). Education- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field. Skills: skills,azure datafactory,requirement gathering,data analysis,sql,etl,snowflake,data modeling,azure,power bi,python,business intelligence,informatica,fivetran,dbt,pipelines,data warehousing,data,dwh Show more Show less

Posted 1 month ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. A primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.

Career Level - IC4

Responsibilities
Education & Experience:
- BE, BTech, MCA, CA or equivalent preferred. Other qualifications with adequate experience may be considered.
- 5+ years of relevant working experience.

Functional/Technical Knowledge & Skills:
Must have a good understanding of the following Oracle Cloud Financials version 12+ capabilities. We are looking for a techno-functional person who has real-time hands-on functional/product and/or technical experience; and/or has worked with L2 or L3 level support; and/or has equivalent knowledge. We expect the candidate to have:
- Strong business process knowledge and concepts.
- Implementation/Support experience in any of the following areas: ERP - Cloud Financial Modules like GL, AP, AR, FA, IBY, PA, CST, ZX and PSA; or HCM - Core HR, Benefits, Absence, T&L, Payroll, Compensation, Talent Management; or SCM - Inventory, OM, Procurement. The candidate must have hands-on experience in a minimum of any 5 modules across the above pillars.
- Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios using Oracle Cloud Financials.
- Technically strong, with expert skills in SQL, PL/SQL, OTBI/BIP/FRS reports, FBDI, ADFDI, BPM workflows, ADF Faces, BI Extract for FTP, Payment Integration and Personalisation.
- Strong problem-solving skills.
- Strong customer interaction and service orientation, so you can understand a customer's critical situations, provide the appropriate response, and mobilise organisational resources, while setting realistic expectations with customers.
- Strong operations management and innovation orientation, so you can continually improve the processes, methods, tools, and utilities.
- Strong team player, so you leverage each other's strengths. You will often be engaged in collaboration with peers within and across teams.
- Strong learning orientation, so you keep abreast of emerging business models/processes, applications product solutions, product features, and technology features, and use this learning to deliver value to customers on a daily basis.
- High flexibility, so you remain agile in a fast-changing business and organisational environment.
- Create and maintain appropriate documentation for architecture, design, technical, implementation, support and test activities.

Personal Attributes:
- Self-driven and result oriented
- Strong problem solving/analytical skills
- Strong customer support and relation skills
- Effective communication (verbal and written)
- Focus on relationships (internal and external)
- Strong willingness to learn new things and share them with others
- Influencing/negotiating
- Team player
- Customer focused
- Confident and decisive
- Values expertise (maintaining professional expertise in own discipline)
- Enthusiasm
- Flexibility
- Organizational skills
- Values and enjoys coaching/knowledge transfer
- Values and enjoys teaching technical courses

Note: Shift working is mandatory. The candidate should be open to working evening and night shifts on a rotation basis.

Career Level - IC3/IC4/IC5

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role/Designation: Power BI Developer (Data Engineer)
Experience: 3+ years in Power BI
Location: Hyderabad / Ahmedabad

Role Objective
We are looking for a highly motivated and experienced Senior Power BI Engineer to join our team of data experts. The ideal candidate will have a strong background in designing, developing, and maintaining Power BI dashboards and reports. As a Power BI Engineer, you will work closely with the Lead Data Engineer and Data Architect to implement end-to-end data solutions, build and maintain data pipelines, and ensure the quality and integrity of our organization's data.

Roles & Responsibilities
- Study, analyze and understand business requirements in the context of business intelligence.
- Design and map data models to turn raw data into meaningful insights.
- Utilize Power BI to build interactive and visually appealing dashboards and reports.
- Identify key performance indicators with apt objectives.
- Analyze previous and present data for better decision making.
- Transform business requirements into technical publications.
- Build multi-dimensional data models.
- Develop strong data documentation about algorithms, parameters, models.
- Perform detailed analysis on tested and deployed Power BI scripts.
- Run DAX queries and functions in Power BI.
- Define and design new systems.
- Take care of data warehouse development.
- Build Analysis Services reporting models.
- Develop visual reports, KPI scorecards, and dashboards using Power BI Desktop.
- Connect data sources, import data, and transform data for business intelligence.
- Apply analytical thinking to translate data into informative reports and visuals.
- Implement row-level security on data, along with an understanding of application security layer models in Power BI.
- Make essential technical and strategic changes to improve present business intelligence systems.
- Identify requirements and develop custom charts accordingly.
- SQL querying for better results.

Skills & Experience Required
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Minimum of 3 years of hands-on experience with Power BI Desktop as well as Power BI Service.
- Preferred: PL-300 certification.
- Must be proficient with DAX (Data Analysis Expressions).
- Familiarity with MS SQL Server BI stack tools and technologies, such as SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Should be well versed with Power Query.
- Should have knowledge of SQL (Structured Query Language).
- Should be good with data modelling and ETL operations.
- Should have experience with the MSBI (Microsoft Business Intelligence) stack - SSIS (SQL Server Integration Services), SSAS (SQL Server Analysis Services), SSRS (SQL Server Reporting Services).
- Experience working with Azure (ADF, Synapse, AAS).
- Expertise in Power BI Service management.
- Proficient in advanced-level computations on data sets.
- Excellent communication skills are required to communicate needs with clients and internal teams successfully.

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Hybrid

• Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience.
• Expert proficiency in Spark Scala, Python, Spark, ADF & SQL.
• Design & develop applications on Databricks.

Notice Period: Immediate
Email: sachin@assertivebs.com

Posted 1 month ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description
We are seeking a highly skilled Data Architect to design, develop, and maintain end-to-end data architecture solutions, leveraging leading-edge platforms such as Snowflake, Azure, and Azure Data Factory (ADF). The role involves translating complex business requirements into scalable, secure, and high-performance data solutions while enabling analytics, business intelligence (BI), and machine learning (ML) initiatives.

Key Responsibilities

Data Architecture & Design:
- Design and develop end-to-end data architectures for integration, storage, processing, and analytics using Snowflake and Azure services.
- Build scalable, reliable, and high-performing data pipelines to handle large volumes of data, utilizing Azure Data Factory (ADF) and Snowflake.
- Create and maintain data models (dimensional and relational) optimized for query performance and analytics using Azure Synapse Analytics and Azure Analysis Services (AAS).
- Define and implement data governance standards, data quality processes, and security protocols across all data solutions.

Cloud Data Platform Management:
- Architect and manage data solutions on Azure Cloud, ensuring seamless integration with services like Azure Blob Storage, Azure SQL, and Azure Synapse.
- Leverage Snowflake for data warehousing to ensure high availability, scalability, and performance.
- Design data lakes and data warehouses using Azure Synapse, creating architecture patterns for large-scale data storage and retrieval.

Data Integration & ETL Development:
- Lead the design and development of ETL/ELT pipelines using Azure Data Factory (ADF) to integrate data from various sources into Snowflake and other Azure-based data stores.
- Develop data transformation workflows using Python and ADF to process raw data into analytics-ready formats.
- Design and implement efficient ETL strategies using a combination of Python, ADF, and Snowflake.

Analytics & Business Intelligence (BI):
- Design and implement data models for BI and reporting solutions using Azure Analysis Services (AAS) and Power BI.
- Create efficient data pipelines and aggregation strategies to support real-time and historical reporting across the organization.
- Implement best practices for data modeling to support business decision-making with tools like Power BI, AAS, and Synapse.

Advanced Data Solutions (AI/ML Integration):
- Collaborate with data scientists and engineers to integrate machine learning (ML) and AI models into the data pipeline architecture.
- Ensure that the data architecture is optimized for AI-driven insights and large-scale, real-time analytics.

Collaboration & Stakeholder Engagement:
- Work with cross-functional teams, including business analysts, data engineers, data scientists, and IT teams, to understand data requirements and align with business goals.
- Provide technical leadership, guiding development teams and ensuring adherence to architectural standards and best practices.
- Effectively communicate complex data architecture concepts to non-technical stakeholders, translating business needs into actionable solutions.

Performance & Optimization:
- Continuously monitor and optimize data solutions, ensuring fast, scalable data queries, transformations, and reporting functions.
- Troubleshoot and resolve performance bottlenecks in data pipelines and architecture, ensuring minimal downtime and high availability.
- Implement strategies for data archiving, partitioning, and optimization in Snowflake and Azure Synapse environments.

Security & Compliance:
- Design and implement robust security frameworks to protect sensitive data across Snowflake, Azure Synapse, and other cloud platforms.
- Ensure data privacy and compliance with industry regulations (e.g., GDPR, CCPA) through necessary security controls and access policies.

Skills: Snowflake, Azure Databricks, Python

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Title: Data Engineer

Job Summary
Data Engineers will be responsible for the design, development, testing, maintenance, and support of data assets, including Azure Data Lake and data warehouse development, modeling, package creation, SQL script creation, stored procedure development, and integration services support, among other responsibilities. Candidates must have at least 3-5 years of hands-on Azure experience as a Data Engineer, must be expert in SQL, and must have extensive expertise building data pipelines. The candidate will be accountable for meeting deliverable commitments, including schedule and quality compliance, and must be able to plan and schedule their own work activities and coordinate with other cross-functional team members to meet project goals.

Basic Understanding Of
- Scheduling and workflow management, with working experience in ADF, Informatica, Airflow or similar
- Enterprise data modelling and semantic modelling, with working experience in ERwin, ER/Studio, PowerDesigner or similar
- Logical/physical models on big data sets or a modern data warehouse, with working experience in ERwin, ER/Studio, PowerDesigner or similar
- Agile process (Scrum cadences, roles, deliverables), with a basic understanding of Azure DevOps, JIRA or similar
- Architecture and data modelling for a data lake on cloud, with working experience in Amazon Web Services (AWS), Microsoft Azure or Google Cloud Platform (GCP)
- Build and release management, with working experience in Azure DevOps, AWS CodeCommit or similar

Strong In
- Writing code in a programming language, with working experience in Python, PySpark, Scala or similar
- Big data frameworks, with working experience in Spark, Hadoop or Hive (incl. derivatives like PySpark (preferred), Spark Scala or Spark SQL) or similar
- Data warehouse concepts and development using SQL on single-node (SQL Server, Oracle or similar) and parallel platforms (Azure SQL Data Warehouse or Snowflake)
- Code management, with working experience in GitHub, Azure DevOps or similar
- End-to-end architecture and ETL processes, with working experience in an ETL tool or similar
- Reading data formats, with working experience in JSON, XML or similar
- Data integration processes (batch & real time), with working experience in Informatica PowerCenter and/or Cloud, Microsoft SSIS, MuleSoft, DataStage, Sqoop or similar
- Writing requirement, functional & technical documentation, with working experience in integration design documents, architecture documentation, data testing plans or similar
- SQL queries, with working experience in SQL code, stored procedures, functions, views or similar
- Databases, with working experience in any database such as MS SQL, Oracle or similar
- Analytical problem-solving skills, with working experience resolving complex problems
- Communication (reading & writing in English), collaboration & presentation skills, with working experience as a team player

Good To Have
- Stream processing, with working experience in Databricks Streaming, Azure Stream Analytics, HDInsight, Kinesis Data Analytics or similar
- Analytical warehouse, with working experience in SQL Data Warehouse, Amazon Athena, AWS Redshift, BigQuery or similar
- Real-time store, with working experience in Azure Cosmos DB, Amazon DynamoDB, Cloud Bigtable or similar
- Batch ingestion, with working experience in Data Factory, Amazon Kinesis, Lambda, Cloud Pub/Sub or similar
- Storage, with working experience in Azure Data Lake Storage Gen1/Gen2, Amazon S3, Cloud Storage or similar
- Batch data processing, with working experience in Azure Databricks, HDInsight, Amazon EMR, AWS Glue or similar
- Orchestration, with working experience in Data Factory, HDInsight, Data Pipeline, Cloud Composer or similar

Posted 1 month ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Summary
We are seeking a highly skilled and experienced Databricks Developer to join our data engineering team. The ideal candidate will have over 4 years of experience working with Databricks on Azure, along with a strong background in data pipelines and performance optimization. The candidate will be responsible for developing scalable data processing solutions, ensuring data quality, and enabling advanced analytics initiatives.

Key Responsibilities
- Design, develop, and optimize data pipelines and ETL processes using Azure Databricks (PySpark / Spark SQL).
- Collaborate with data architects, analysts, and other developers to deliver data solutions aligned with business requirements.
- Perform data wrangling, cleansing, transformation, and aggregation from multiple sources.
- Implement and maintain data lake and data warehouse solutions using Azure services (ADLS, Synapse, Delta Lake).
- Monitor pipeline performance, troubleshoot issues, and ensure data integrity and reliability.
- Use Delta Lake for building robust and scalable data architectures.
- Develop and maintain CI/CD pipelines for Databricks workflows and jobs.
- Participate in code reviews, unit testing, and documentation of data processes.

Required Skills & Experience
- 4+ years of hands-on experience with Databricks on Azure.
- Strong expertise in PySpark, Spark SQL, and Delta Lake.
- Solid understanding of Azure Data Services: Azure Data Lake Storage (ADLS), Azure Data Factory (ADF).
- Proficiency in Python for data processing tasks.
- Experience with data ingestion from various sources (on-prem, cloud, APIs).
- Knowledge of data modeling, data governance, and performance tuning.
- Familiarity with CI/CD tools (Azure DevOps, Git) and job orchestration in Databricks.
- Strong problem-solving skills and ability to work independently or as part of a team.
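To illustrate the Delta Lake work described in this posting, here is a small, hedged PySpark sketch of an idempotent upsert (MERGE) into a Delta table on Databricks; the storage path, table, and column names are hypothetical.

```python
# Minimal sketch: upsert incremental records into a Delta table (hypothetical names).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # available by default on Databricks

# Incremental batch loaded earlier in the pipeline (e.g., from ADLS via an ADF-triggered job)
updates = spark.read.format("parquet").load(
    "abfss://staging@examplelake.dfs.core.windows.net/orders/"
)

target = DeltaTable.forName(spark, "sales.orders")

(
    target.alias("t")
    .merge(updates.alias("u"), "t.order_id = u.order_id")
    .whenMatchedUpdateAll()      # refresh existing orders
    .whenNotMatchedInsertAll()   # add new orders
    .execute()
)
```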

Posted 1 month ago

Apply

5.0 - 8.0 years

3 - 6 Lacs

Hyderābād

On-site

Snowflake Data Engineering (Snowflake, DBT & ADF) – Lead Programmer Analyst (Experience: 5 to 8 Years)

We are looking for a highly self-motivated individual for the Snowflake Data Engineering (Snowflake, DBT & ADF) – Lead Programmer Analyst role:
- At least 5 years of experience in designing and developing data pipelines and assets.
- Must have at least 5 years of experience with at least one columnar MPP cloud data warehouse (Snowflake/Azure Synapse/Redshift).
- 4 years of experience in ETL tools like Azure Data Factory, Fivetran / DBT.
- Experience with Git and Azure DevOps.
- Experience in Agile, Jira, and Confluence.
- Solid understanding of programming SQL objects (procedures, triggers, views, functions) in SQL Server. Experience optimizing SQL queries is a plus.
- Working knowledge of Azure architecture and Data Lake.
- Willingness to contribute to documentation (e.g., mapping, defect logs).
- Generate functional specs for code migration, or ask the right questions thereof.
- Hands-on programmer with a thorough understanding of performance tuning techniques.
- Handling large data volume transformations (on the order of 100 GBs monthly).
- Able to create solutions / data flows to suit requirements.
- Produce timely documentation, e.g., mapping, UTR, defect / KEDB logs.
- Self-starter and learner; able to understand and probe for requirements.
- Tech experience expected - Primary: Snowflake, DBT (development & testing); Secondary: Python, ETL or any data processing tool.
- Nice to have: domain experience in Healthcare.
- Should have good oral and written communication.
- Should be a good team player.
- Should be proactive and adaptive.

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 6 Lacs

Hyderābād

On-site

Snowflake Data Engineering (Snowflake, DBT & ADF) – Senior Programmer Analyst (Experience: 3 to 5 Years)

We are looking for a highly self-motivated individual for the Snowflake Data Engineering (Snowflake, DBT & ADF) – Senior Programmer Analyst role:
- 3 to 5 years of data engineering experience with Snowflake and DBT.
- Experience (internships, academic projects, or entry-level roles) in designing and developing data pipelines.
- Exposure to a cloud data warehouse (Snowflake) through coursework, projects, or training.
- Basic understanding of ETL tools like Azure Data Factory, Fivetran, or DBT (hands-on experience in academic settings or internships preferred).
- Familiarity with Git and Azure DevOps for version control and CI/CD processes.
- Understanding of Agile methodologies, Jira, and Confluence.
- Knowledge of SQL programming (views, functions, stored procedures); ability to write and optimize basic SQL queries.
- Exposure to Azure architecture and Data Lake concepts.
- Eager to learn, with a proactive approach to problem-solving.
- Ability to understand business requirements and ask the right questions to clarify tasks.
- Basic understanding of performance tuning and handling moderate-sized data transformations.
- Ability to create simple data flows and assist in solution design under guidance.
- Willingness to contribute to documentation (e.g., mapping, defect logs).
- Bachelor's degree in Computer Science, Statistics, or a related field.
- Strong foundational knowledge of SQL for querying databases.
- Primary: Snowflake, DBT (development & testing); Secondary: Python, ETL tools, or any data processing framework.
- Nice to have: basic understanding of healthcare data and domain-specific concepts.
- Should have good oral and written communication.
- Should be a good team player.
- Should be proactive and adaptive.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site

Job Description
Job Location: Hyderabad
Job Duration: Full time
Hours: 9:00am to 5:00pm

We are seeking a hands-on Data Engineer with a strong focus on data ingestion to support the delivery of high-quality, reliable, and scalable data pipelines across our Data & AI ecosystem. This role is essential in enabling downstream analytics, machine learning, and business intelligence solutions by ensuring robust and automated data acquisition from various internal and external sources.

Key Responsibilities
- Design, build, and maintain scalable and reusable data ingestion pipelines to onboard structured and semi-structured data from APIs, flat files, databases, and external systems.
- Work with Azure-native services (e.g., Data Factory, Azure Data Lake, Event Hubs) and tools like Databricks or Apache Spark for data ingestion and transformation.
- Develop and manage metadata-driven ingestion frameworks to support dynamic and automated onboarding of new sources.
- Collaborate closely with source system owners, analysts, and data stewards to define data ingestion specifications and implement monitoring/alerting on ingestion jobs.
- Ensure data quality, lineage, and governance principles are embedded into ingestion processes.
- Optimize ingestion processes for performance, reliability, and cloud cost efficiency.
- Support batch and real-time ingestion needs, including streaming data pipelines where applicable.

Technical Experience
- 3+ years of hands-on experience in data engineering (bonus: with a specific focus on data ingestion or integration).
- Hands-on experience with Azure Data Services (e.g., ADF, Databricks, Synapse, ADLS) or equivalent cloud-native tools.
- Experience in Python (PySpark) for data processing tasks (bonus: SQL knowledge).
- Experience with ETL frameworks, orchestration tools, and API-based data ingestion.
- Familiarity with data quality and validation strategies, including schema enforcement and error handling.
- Good understanding of CI/CD practices, version control, and infrastructure-as-code (e.g., Terraform, Git).
- Bonus: Experience with streaming ingestion (e.g., Kafka, Event Hubs, Spark Structured Streaming).
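As a hedged illustration of the metadata-driven ingestion framework mentioned in this posting, the Python sketch below loops over a simple source-configuration list and lands each source as a bronze Delta table; the configuration values, paths, and table names are assumptions for illustration only, and a real framework would typically read this metadata from a control table or config file.

```python
# Minimal sketch of metadata-driven ingestion (hypothetical sources and paths).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# In practice this config would come from a control table or a JSON/YAML file.
sources = [
    {"name": "customers", "format": "csv",
     "path": "abfss://landing@examplelake.dfs.core.windows.net/customers/",
     "options": {"header": "true"}},
    {"name": "orders", "format": "json",
     "path": "abfss://landing@examplelake.dfs.core.windows.net/orders/",
     "options": {}},
]

for src in sources:
    df = spark.read.format(src["format"]).options(**src["options"]).load(src["path"])
    # Land each source as a bronze Delta table; downstream jobs handle cleansing.
    df.write.format("delta").mode("overwrite").saveAsTable(f"bronze.{src['name']}")
```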

Posted 1 month ago

Apply

6.0 years

0 Lacs

India

On-site

Company Description
Beyond Key specializes in driving Digital Transformation and Enterprise Modernization, leveraging deep technical expertise and AI capabilities. We serve industries such as Insurance, Non-Profit, Financial Services, Healthcare, and Manufacturing, focusing on customized growth and efficiency. Our commitment to delivering the right solutions has earned us prestigious awards, solidifying our position as a trusted technology partner. Recognized as a Great Place to Work, Beyond Key also boasts multiple awards for innovation, inclusivity, and excellence. We are dedicated to redefining possibilities with technology and innovation to help clients achieve their digital goals.

Experience: 6+ years preferred

Job Summary
We’re looking for a hands-on Azure DevOps & Data Engineer who can bridge the gap between platform automation and data engineering. You’ll work on automating and optimizing our Azure data pipelines and deployments using Azure DevOps, Logic Apps, Data Factory, and SQL-based solutions. The role requires strong command over T-SQL and experience managing workflows and releases in a modern Azure setup.

Key Responsibilities

Azure DevOps
- Build and maintain CI/CD pipelines for deploying ADF, SQL scripts, Logic Apps, and other data components.
- Manage Azure DevOps Repos, Pipelines, and Releases for consistent deployments.
- Set up deployment automation and rollback mechanisms across dev, test, and prod.

Azure Data Services
- Design and manage data pipelines using Azure Data Factory (ADF): linked services, triggers, and parameterized workflows.
- Develop and maintain Azure SQL Database and Azure SQL Managed Instance objects.
- Leverage Azure Logic Apps to orchestrate workflows, alerting, approvals, and integrations with other systems.

SQL
- Write and optimize complex SQL queries, stored procedures, and functions.
- Perform query tuning, indexing, and data integrity checks.
- Work with large datasets and troubleshoot performance issues.

Monitoring & Maintenance
- Set up monitoring and alerting using Azure Monitor, Log Analytics, or custom alerts in ADF and Logic Apps.
- Handle data job failures, pipeline errors, and CI/CD release troubleshooting.

Collaboration & Documentation
- Collaborate with data analysts, business users, and platform engineers.
- Maintain up-to-date documentation of pipeline workflows, release notes, and known issues.

Required Skills
- Solid experience with Azure DevOps (Pipelines, Repos, Releases).
- Hands-on expertise in Azure Data Factory, Azure Logic Apps, Azure SQL Database, and SQL Managed Instance.
- Strong command over SQL (SPs, UDFs, performance tuning, query plans).
- Good understanding of Git-based source control and branching models.
- Experience in troubleshooting integration flows and ETL/ELT processes.

Nice-to-Have (Not Mandatory)
- Exposure to Power BI, Data Lake.
- Basic scripting in PowerShell or Python.
- Understanding of RBAC, resource tagging, and cost monitoring in Azure.

Soft Skills
- Strong analytical and debugging skills.
- Proactive communicator and collaborator.
- Able to handle multiple deployments and shifting priorities.

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Are you passionate about building resilient, scalable, and secure cloud platforms? Join the Platform Engineering team at Xebia, where we are transforming enterprise data landscapes with cutting-edge cloud-native architectures and DevOps-driven delivery. This role is ideal for engineers who thrive at the intersection of Python, Azure, Big Data, and DevOps, and are ready to lead by design and automation.

What You’ll Do:
Design, build, and automate robust cloud platforms on Azure
Enable data-driven architectures using Azure PaaS and Cloudera stack
Ensure performance, security, and reliability across scalable systems
Drive infrastructure automation and deployment with modern DevOps tooling
Collaborate with cross-functional teams to deliver platform solutions at scale

Your Tech Superpowers:
We’re looking for engineers with hands-on expertise in:

🔹 Programming & Platform Services:
Python
Azure PaaS: Event Hub, ADF, Azure Functions, Databricks, Synapse, Cosmos DB, ADLS Gen2

🔹 Cloud & Big Data:
Microsoft Azure
Cloudera ecosystem

🔹 Data Tools & Visualizations:
Dataiku, Power BI, Tableau

🔹 DevOps & Infrastructure as Code (IaC):
CI/CD, GitOps, Terraform
Docker & Kubernetes

🔹 Security & Networking:
Inter-service communication & resilient Azure architecture
Identity & Access Management (IAM)

🔹 Ways of Working:
Agile mindset
Clean code, automation-first, test-driven practices

Why Join Xebia?
Competitive compensation & world-class benefits
Work with global clients on modern engineering challenges
Upskill through structured learning, certifications & mentorship
A culture built on trust, innovation & ownership
Freedom to build, lead, and grow without limits
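For the Event Hub and Python platform services named in the listing above, here is a small, hedged sketch of publishing events with the azure-eventhub Python SDK; the connection string, hub name, and payload fields are placeholders.

# Sketch: sending events to Azure Event Hubs from Python (azure-eventhub SDK).
# The connection string, hub name, and payload are hypothetical placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="<event-hub-connection-string>",   # placeholder
    eventhub_name="telemetry",                  # hypothetical hub
)

with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps({"device_id": "sensor-01", "reading": 21.7})))
    producer.send_batch(batch)                  # publish the batch to the hub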

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description

Job Title: Senior Business Analyst
Experience Range: 8-12 Years
Location: Chennai, Hybrid
Employment Type: Full-Time

About UPS
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About The Role
We are seeking an experienced Senior Business Analyst to join our project team responsible for delivering a Microsoft Azure-hosted web application with Angular as the frontend and .NET 8 as the backend framework. The solution follows a micro-frontend and microservices architecture integrated with an Azure SQL database. The data engineering component involves Azure Data Factory (ADF), Databricks, and Cosmos DB.
The Senior Business Analyst will play a pivotal role in bridging the gap between business stakeholders, development teams, and data engineering teams. This role involves eliciting and analyzing requirements, defining business processes, and ensuring alignment of project objectives with strategic goals. The candidate will also work closely with architects, developers, and testers to ensure comprehensive requirements coverage and successful project delivery.

Key Responsibilities

Requirements Elicitation and Analysis:
Gather and document business and technical requirements through stakeholder interviews, workshops, and document analysis.
Analyze complex data flows and business processes to define clear and concise requirements.
Create detailed requirement specifications, user stories, and acceptance criteria for both the web application and data engineering components.

Business Process Design and Improvement:
Define and document business processes, workflows, and data models.
Identify areas for process optimization and automation within web and data solutions.
Collaborate with stakeholders to design solutions that align with business objectives.

Stakeholder Communication and Collaboration:
Serve as a liaison between business stakeholders, development teams, and data engineering teams.
Facilitate communication and collaboration to ensure stakeholder alignment and understanding.
Conduct requirement walkthroughs, design reviews, and user acceptance testing sessions.

Solution Validation and Quality Assurance:
Ensure requirements traceability throughout the project lifecycle.
Validate and test solutions to ensure they meet business needs and objectives.
Collaborate with QA teams to define testing strategies and acceptance criteria.

Primary Skills
Business Analysis: Requirement gathering, process modeling, and gap analysis.
Documentation: User stories, functional specifications, and acceptance criteria.
Agile Methodologies: Experience in Agile/Scrum environments.
Stakeholder Management: Effective communication and collaboration with cross-functional teams.
Data Analysis: Ability to analyze and interpret complex data flows and business processes.

Secondary Skills
Cloud Platform: Familiarity with Microsoft Azure services.
Data Engineering: Understanding of data pipelines, ETL processes, and data modeling.
UX/UI Collaboration: Experience collaborating with UX/UI teams for optimal user experience.
Communication Skills: Excellent verbal and written communication for stakeholder engagement.

Soft Skills
Strong problem-solving abilities and attention to detail.
Excellent communication skills, both verbal and written.
Effective time management and organizational capabilities.
Ability to work independently and within a collaborative team environment.
Strong interpersonal skills to engage with cross-functional teams.

Educational And Preferred Qualifications
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
Relevant certifications such as:
Certified Business Analysis Professional (CBAP)
PMI Professional in Business Analysis (PMI-PBA)
Microsoft Certified: Azure Fundamentals
Experience in cloud-native solutions and microservices architecture.
Familiarity with Angular and .NET frameworks for web applications.

About The Team
As a Senior Business Analyst, you will be working with a dynamic, cross-functional team that includes developers, product managers, and quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Vadodara, Gujarat, India

On-site

Xylem is a Fortune 500 global water solutions company dedicated to advancing sustainable impact and empowering the people who make water work every day. As a leading water technology company with 23,000 employees operating in over 150 countries, Xylem is at the forefront of addressing the world's most critical water challenges. We invite passionate individuals to join our team, dedicated to exceeding customer expectations through innovative and sustainable solutions.

As a Data Engineer, you will design, develop, and optimize scalable data pipelines and workflows to support advanced analytics and business intelligence needs. You will collaborate with cross-functional teams to ensure data accessibility, integrity, and security.

Core Responsibilities
Design, develop, and implement robust data pipelines for data collection, transformation, and integration.
Collaborate with senior engineers to architect scalable data solutions using Azure services, including Azure Data Factory and Databricks.
Integrate data from SAP ERP systems and other enterprise platforms into modern cloud-based data ecosystems.
Leverage Databricks for big data processing and workflow optimization.
Work with stakeholders to understand data requirements, ensuring data quality and consistency.
Maintain data governance practices to support compliance and security protocols.
Support analytics teams by providing well-structured, reliable data for reporting and machine learning projects.
Troubleshoot and resolve data pipeline and workflow issues.

Qualifications
Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
3–5 years of experience in data engineering or a related role.
Proficiency in Azure technologies, including Azure Data Factory, Azure SQL Database, and Databricks.
Experience with SAP data integration is a plus.
Strong SQL and Python programming skills for data engineering tasks.
Familiarity with data modeling concepts (e.g., star and snowflake schemas) and best practices.
Experience with CI/CD pipelines for deploying data workflows and infrastructure.
Knowledge of distributed file systems like Azure Data Lake or equivalent cloud storage solutions.
Basic understanding of Apache Spark for distributed data processing.
Strong problem-solving skills and a collaborative mindset.

Technical Knowledge
Deep understanding of Azure cloud infrastructure and services, particularly those related to data management (e.g., Azure Data Lake, Azure Blob Storage, Azure SQL Database).
Experience with Azure Data Factory (ADF) for orchestrating ETL pipelines and automating data workflows.
Familiarity with Azure Databricks for big data processing, machine learning, and collaborative analytics.
Expertise in Apache Spark for distributed data processing and large-scale analytics.
Familiarity with Databricks, including managing clusters and optimizing performance for big data workloads.
Understanding of the Databricks Bronze, Silver, and Gold (medallion) model.
Understanding of distributed file systems like HDFS and cloud-based equivalents such as Azure Data Lake.
Proficiency in SQL and NoSQL databases, including designing schemas, query optimization, and managing large datasets.
Experience with data warehousing solutions like Databricks, Azure Synapse Analytics, or Snowflake.
Familiarity with connecting data lakehouses to Power BI.
Understanding of OLAP (Online Analytical Processing) and OLTP (Online Transaction Processing) systems.
Strong grasp of data modeling techniques, including conceptual, logical, and physical data models.
Experience with star schema, snowflake schema, and normalization for designing scalable, performant databases.
Knowledge of data architecture best practices, ensuring efficient data flow, storage, and retrieval.
Knowledge of CI/CD pipelines for automating the deployment of data pipelines, databases, and infrastructure.
Experience with infrastructure-as-code tools like Terraform or Azure Resource Manager to manage cloud resources.

Preferred Qualifications
Familiarity with tools like Apache Airflow or other workflow orchestration tools.
Knowledge of Azure Monitor or similar tools for system performance tracking.
Certifications in Azure Data Engineering or related cloud platforms.

Join the global Xylem team to be a part of innovative technology solutions transforming water usage, conservation, and re-use. Our products impact public utilities, industrial sectors, residential areas, and commercial buildings, with a commitment to providing smart metering, network technologies, and advanced analytics for water, electric, and gas utilities. Partner with us in creating a world where water challenges are met with ingenuity and dedication; where we recognize the power of inclusion and belonging in driving innovation and allowing us to compete more effectively around the world.
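For the Databricks Bronze/Silver/Gold (medallion) model referenced in the listing above, here is a minimal PySpark sketch of promoting raw bronze data to a cleansed silver Delta table; the table names, columns, and data-quality rules are illustrative assumptions only.

# Sketch: promoting raw (bronze) data to a cleansed silver Delta table.
# Table names, columns, and quality rules below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

bronze = spark.read.table("bronze.meter_readings")        # hypothetical bronze table

silver = (
    bronze
    .dropDuplicates(["meter_id", "reading_ts"])            # basic data-quality rule
    .filter(F.col("reading_value").isNotNull())             # drop incomplete records
    .withColumn("reading_date", F.to_date("reading_ts"))    # derived partition column
)

(
    silver.write.format("delta")
    .mode("overwrite")
    .partitionBy("reading_date")                             # partitioning for query performance
    .saveAsTable("silver.meter_readings")                    # hypothetical silver table
)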

Posted 1 month ago

Apply

5.0 years

0 Lacs

In, Tandjilé, Chad

On-site

Job Description

Job Title – Azure Data Engineer
Candidate Specification – 5+ years, Notice Period – Immediate to 30 days, Hybrid.

Strong in Azure Data Factory (ADF) and Azure Databricks.
Experience in Azure Synapse Analytics and Azure Data Lake Storage (Gen2).
Database experience – Azure SQL Database / SQL Server.
Proficiency in writing complex SQL queries and working with large datasets.
Experience with Python, Scala, PySpark for data transformations.
Knowledge of DevOps practices and tools (e.g., Azure DevOps, CI/CD for data pipelines).

Skills Required
Role: Azure Data Engineer
Industry Type: IT / Computers - Software
Functional Area:
Required Education: Bachelor Degree
Employment Type: Full Time, Permanent

Key Skills
AZURE DATA FACTORY, AZURE DATABRICKS, PYTHON

Other Information
Job Code: GO/JC/186/2025
Recruiter Name:
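As an illustration of the PySpark and complex-SQL skills listed above, here is a short sketch that keeps only the latest record per key using a window function run through Spark SQL; the schema, table, and column names are hypothetical.

# Sketch: a typical SQL-on-Spark transformation — latest record per key.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dedup_latest").getOrCreate()

latest_per_policy = spark.sql("""
    SELECT *
    FROM (
        SELECT s.*,
               ROW_NUMBER() OVER (
                   PARTITION BY policy_id
                   ORDER BY updated_at DESC
               ) AS rn
        FROM staging.policy_updates s
    ) ranked
    WHERE rn = 1
""").drop("rn")   # drop the helper ranking column before persisting

latest_per_policy.write.mode("overwrite").saveAsTable("curated.policy_latest")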

Posted 1 month ago

Apply

3.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Oracle Cloud Visual Builder
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and troubleshooting to guarantee that the applications function seamlessly within the business environment, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to be an SME.
- Analyze requirements, determine the technical level of effort, and prepare technical designs and specifications.
- Conversant in deploying, troubleshooting, analyzing, and resolving technical problems.
- Conduct design reviews to provide guidance and quality assurance around best practices and frameworks.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Oracle Cloud Visual Builder.
- Overall 4+ years of experience in web app development (Oracle ADF).
- 2 to 3 years of experience in Oracle VBCS (Visual Builder Cloud Service).
- Knowledge of Oracle database and PL/SQL.
- Experience in GitHub, Oracle Developer Cloud, and UCD tools for build and deployment.
- Good hands-on knowledge of JavaScript, CSS3, XML/JSON/WSDL, consuming web services (SOAP/REST), and testing tools (Postman/SoapUI/JMeter).
- Experience building different types of applications in VBCS using Business Objects and ORDS.
- Knowledge and experience in integration with other Oracle PaaS services.
- Experience integrating VBCS applications with Oracle SaaS applications.
- Work experience in developing SaaS extensions using VBCS.
- Experience with various web-service-related technologies such as WSDL/XML/SOAP/REST/JSON standards.
- Hands-on experience writing SQL queries.
- Good communication and interpersonal skills. Good analytical and debugging skills.

Additional Information:
- The candidate should have a minimum of 4 years of experience in Oracle Cloud Visual Builder.
- A 15 years full time education is required.
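For the REST-consumption skills listed above, here is a minimal Python sketch of calling an ORDS-style REST endpoint with the requests library; the base URL, resource path, credentials, and field names are hypothetical and only meant to show the general pattern.

# Sketch: consuming a REST endpoint such as one exposed via ORDS.
# The base URL, resource path, credentials, and fields are placeholders.
import requests

BASE_URL = "https://example-apex.oraclecloud.com/ords/hr"   # hypothetical ORDS base URL

resp = requests.get(
    f"{BASE_URL}/employees/",                # hypothetical collection resource
    params={"limit": 25},
    auth=("api_user", "api_password"),       # placeholder credentials
    timeout=30,
)
resp.raise_for_status()

# ORDS collection responses typically return rows under an "items" key.
for item in resp.json().get("items", []):
    print(item.get("employee_id"), item.get("last_name"))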

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Data Engineer
Location: Hyderabad, Kochi, Trivandrum
Experience Required: 10-19 Yrs
Skills: Primary - Scala, PySpark, Python / Secondary - ETL, SQL, Azure

Role Proficiency
The role demands expertise in building robust, scalable data pipelines that support ingestion, wrangling, transformation, and integration of data from multiple sources. The ideal candidate should have hands-on experience with ETL tools (e.g., Informatica, AWS Glue, Databricks, GCP DataProc), and strong programming skills in Python, PySpark, SQL, and optionally Scala. Proficiency across various data domains and familiarity with modern data warehouse and lakehouse architectures (Snowflake, BigQuery, Delta Lake, Lakehouse) is essential. A solid understanding of DevOps and infrastructure cost optimization is required.

Key Responsibilities & Outcomes

Technical Development
Develop high-performance data pipelines and applications.
Optimize development using design patterns and reusable solutions.
Create and tune code using best practices for performance and scalability.
Develop schemas, data models, and data storage solutions (SQL/NoSQL/Delta Lake).
Perform debugging, testing, and validation to ensure solution quality.

Documentation & Design
Produce high-level and low-level design (HLD, LLD, SAD) and architecture documentation.
Prepare infra costing, source-target mappings, and business requirement documentation.
Contribute to and govern documentation standards/templates/checklists.

Project & Team Management
Support the Project Manager in planning, delivery, and sprint execution.
Estimate effort and provide input on resource planning.
Lead and mentor junior team members, define goals, and monitor progress.
Monitor and manage the defect lifecycle, including RCA and proactive quality improvements.

Customer Interaction
Gather and clarify requirements with customers and architects.
Present design alternatives and conduct product demos.
Ensure alignment with customer expectations and solution architecture.

Testing & Release
Design and review unit/integration test cases and execution strategies.
Provide support during system/integration testing and UAT.
Oversee and execute release cycles and configurations.

Knowledge Management & Compliance
Maintain compliance with configuration management plans.
Contribute to internal knowledge repositories and reusable assets.
Stay updated and certified on relevant technologies/domains.

Measures of Success (KPIs)
Adherence to engineering processes and delivery schedules.
Number of post-delivery defects and non-compliance issues.
Reduction in recurring defects and faster resolution of production bugs.
Timeliness in detecting, responding to, and resolving pipeline/data issues.
Improvements in pipeline efficiency (e.g., runtime, resource utilization).
Team engagement and upskilling; completion of relevant certifications.
Zero or minimal data security/compliance breaches.

Expected Deliverables
Code: High-quality data transformation scripts and pipelines; peer-reviewed, optimized, and reusable code.
Documentation: Design documents, technical specifications, test plans, and infra cost estimations.
Configuration & Testing: Configuration management plans and test execution results.
Knowledge Sharing: Contributions to SharePoint, internal wikis, and client university platforms.

Skill Requirements

Mandatory Technical Skills
Languages: Python, PySpark, Scala
ETL Tools: Apache Airflow, Talend, Informatica, AWS Glue, Databricks, DataProc
Cloud Platforms: AWS, GCP, Azure (esp. BigQuery, DataFlow, ADF, ADLS)
Data Warehousing: Snowflake, BigQuery, Delta Lake, Lakehouse architecture
Performance Tuning: For large-scale distributed systems and pipelines

Additional Skills
Experience in data model design and optimization.
Good understanding of data schemas, window functions, and data partitioning strategies.
Awareness of data governance, security standards, and compliance.
Familiarity with DevOps, CI/CD, and infrastructure cost estimation.

Certifications (Preferred)
Cloud certifications (e.g., AWS Data Analytics, GCP Data Engineer)
Informatica or Databricks certification
Domain-specific certifications based on project/client need

Soft Skills
Strong analytical and problem-solving capabilities
Excellent communication and documentation skills
Ability to work independently and collaboratively in cross-functional teams
Stakeholder management and customer interaction
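For the orchestration tooling named in the listing above, here is a brief sketch of a daily extract-then-transform flow in Apache Airflow; the DAG id, schedule, and task bodies are illustrative placeholders rather than a prescribed implementation.

# Sketch: a minimal Airflow DAG orchestrating an ingest-then-transform flow.
# DAG id, schedule, and task callables are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_sales():
    # Placeholder: pull data from a source system or API into a staging area.
    print("extracting sales data")


def transform_sales():
    # Placeholder: run a Spark/Databricks job or SQL transformation.
    print("transforming sales data")


with DAG(
    dag_id="sales_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_sales", python_callable=extract_sales)
    transform = PythonOperator(task_id="transform_sales", python_callable=transform_sales)

    extract >> transform   # transform runs only after extraction succeeds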

Posted 1 month ago

Apply