5.0 years
6 - 22 Lacs
Noida
On-site
Job Title: ETL Lead – Azure Data Factory (ADF)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 5+ years

About the Role
We are seeking an experienced ETL Lead with strong expertise in Azure Data Factory (ADF) to lead and oversee data ingestion and transformation projects across the organization. The role demands a mix of technical proficiency and leadership to design scalable data pipelines, manage a team of engineers, and collaborate with cross-functional stakeholders to ensure reliable data delivery.

Key Responsibilities
- Lead the design, development, and implementation of end-to-end ETL pipelines using Azure Data Factory (ADF).
- Architect scalable ingestion solutions from multiple structured and unstructured sources (e.g., SQL Server, APIs, flat files, cloud storage).
- Define best practices for ADF pipeline orchestration, performance tuning, and cost optimization.
- Mentor, guide, and manage a team of ETL engineers, ensuring high-quality deliverables and adherence to project timelines.
- Work closely with business analysts, data modelers, and source system owners to understand and translate data requirements.
- Establish data quality checks, monitoring frameworks, and alerting mechanisms.
- Drive code reviews, CI/CD integration (using Azure DevOps), and documentation standards.
- Own delivery accountability across multiple ingestion and data integration workstreams.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related discipline.
- 5+ years of hands-on ETL development experience, with 3+ years of experience in Azure Data Factory.
- Deep understanding of data ingestion, transformation, and warehousing best practices.
- Strong SQL skills and experience with cloud-native data storage (ADLS Gen2, Blob Storage).
- Proficiency in orchestrating complex data flows, parameterized pipelines, and incremental data loads (see the sketch after this listing).
- Experience handling large-scale data migration or modernization projects.

Preferred Skills
- Familiarity with modern data platforms such as Azure Synapse, Snowflake, and Databricks.
- Exposure to Azure DevOps pipelines for CI/CD of ADF pipelines and linked services.
- Understanding of data governance, security (RBAC), and compliance requirements.
- Experience leading Agile teams and sprint-based delivery models.
- Excellent communication, leadership, and stakeholder management skills.

Job Type: Full-time
Pay: ₹671,451.97 - ₹2,218,396.67 per year
Benefits: Health insurance
Schedule: Day shift
Application Question(s): 5+ years of hands-on ETL development experience, with 3+ years of experience in Azure Data Factory.
Experience: ETL: 5 years (Required); Azure: 3 years (Required)
Work Location: In person
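For illustration, a minimal sketch of the incremental-load pattern this role calls for, written in PySpark. All paths, table names, and the watermark column are hypothetical, and first-run seeding of the watermark is omitted.

```python
# Hypothetical watermark-based incremental load, sketched in PySpark.
# Paths and column names are illustrative, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("incremental_load").getOrCreate()

# 1. Read the high-water mark recorded by the previous run.
watermark_df = spark.read.parquet("/lake/control/watermark_orders")
last_loaded = watermark_df.agg(F.max("loaded_until")).first()[0]

# 2. Pull only rows changed since that mark from the staged source extract.
source = spark.read.parquet("/lake/bronze/orders")
delta = source.filter(F.col("modified_at") > F.lit(last_loaded))

# 3. Append the delta to the curated zone and advance the watermark.
delta.write.mode("append").parquet("/lake/silver/orders")
(delta.agg(F.max("modified_at").alias("loaded_until"))
      .write.mode("overwrite").parquet("/lake/control/watermark_orders"))
```

In ADF itself, the same pattern is typically expressed as a lookup activity that reads the watermark, a parameterized copy activity filtered on it, and a final update of the watermark store.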
Posted 2 weeks ago
0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description
Role Proficiency: Independently develops error-free code with high-quality validation of applications; guides other developers and assists Lead 1 – Software Engineering.

Outcomes
- Understand and provide input to application/feature/component designs, developing them in accordance with user stories/requirements.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components.
- Optimize efficiency, cost, and quality by identifying opportunities for automation/process improvements and agile delivery models.
- Mentor Developer 1 – Software Engineering and Developer 2 – Software Engineering to effectively perform in their roles.
- Identify problem patterns and improve the technical design of the application/system.
- Proactively identify issues/defects/flaws in module/requirement implementation.
- Assist Lead 1 – Software Engineering on technical design; review activities and begin demonstrating Lead 1 capabilities in making technical decisions.

Measures of Outcomes
- Adherence to engineering process and standards (coding standards)
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Meeting the defined productivity standards for the project
- Number of reusable components created
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements

Outputs Expected
- Code: develop code independently for the above.
- Configure: implement and monitor the configuration process.
- Test: create and review unit test cases, scenarios, and execution.
- Domain relevance: develop features and components with a good understanding of the business problem being addressed for the client.
- Manage project: manage module-level activities.
- Manage defects: perform defect RCA and mitigation.
- Estimate: estimate time, effort, and resource dependencies for one's own work and others' work, including modules.
- Document: create documentation for own work and perform peer review of documentation of others' work.
- Manage knowledge: consume and contribute to project-related documents, SharePoint libraries, and client universities.
- Status reporting: report status of assigned tasks; comply with project-related reporting standards/processes.
- Release: execute the release process.
- Design: LLD for multiple components.
- Mentoring: mentor juniors on the team; set FAST goals and provide feedback on mentees' FAST goals.

Skill Examples
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Develop user interfaces, business software components, and embedded software components.
- Manage and guarantee high levels of cohesion and quality.
- Use data models.
- Estimate effort and resources required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Team player with good written and verbal communication abilities.
- Proactively ask for help and offer help.

Knowledge Examples
- Appropriate software programs/modules
- Technical designing
- Programming languages
- DBMS, operating systems, and software platforms
- Integrated development environments (IDE)
- Agile methods
- Knowledge of the customer domain and sub-domain where the problem is solved

Additional Comments
The resource needs sound technical know-how on Azure Databricks, along with knowledge of SQL querying and managing Databricks notebooks. Hands-on experience with SQL, including constraints, operators, and modifying and querying table data (see the sketch below).

Skills: Azure Databricks, ADF, SQL
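A minimal Databricks-notebook-style sketch of the SQL tasks listed above (constraints, operators, modifying and querying a table), assuming a notebook where `spark` is predefined and Delta is the table format. Schema and table names are illustrative.

```python
# Illustrative Delta SQL: create a table, add a CHECK constraint, modify
# and query data. All names here are hypothetical.
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales.orders (
        order_id   BIGINT NOT NULL,
        amount     DECIMAL(12, 2),
        status     STRING
    ) USING DELTA
""")

# Delta Lake supports CHECK constraints, enforced on every write.
spark.sql("ALTER TABLE sales.orders "
          "ADD CONSTRAINT positive_amount CHECK (amount > 0)")

# Modify and query data with standard SQL operators.
spark.sql("UPDATE sales.orders SET status = 'CLOSED' "
          "WHERE amount BETWEEN 100 AND 500")
spark.sql("""
    SELECT status, COUNT(*) AS orders, SUM(amount) AS total
    FROM sales.orders
    WHERE status IN ('OPEN', 'CLOSED')
    GROUP BY status
""").show()
```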
Posted 2 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: ETL Ingestion Engineer (Azure Data Factory)
Department: Data Engineering / Analytics
Employment Type: Full-time
Experience Level: 2–5 years
Count: ETL Ingestion Engineer - 1 | Sr. ETL Ingestion Engineer - 1

About the Role
We are looking for a talented Data Engineer with hands-on experience in Azure Data Factory (ADF) to join our Data Engineering team. The individual will be responsible for building, orchestrating, and maintaining robust data ingestion pipelines from various source systems into our data lake or data warehouse environments.

Key Responsibilities
- Design and implement scalable data ingestion pipelines using Azure Data Factory (ADF).
- Extract data from a variety of sources such as SQL Server, flat files, APIs, and cloud storage.
- Develop ADF pipelines and data flows to support both batch and incremental loads.
- Ensure data quality, consistency, and reliability throughout the ETL process.
- Optimize ADF pipelines for performance, cost, and scalability.
- Monitor pipeline execution, troubleshoot failures, and ensure data availability meets SLAs.
- Document pipeline logic, source-target mappings, and operational procedures.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2+ years of experience in ETL development and data pipeline implementation.
- Strong hands-on experience with Azure Data Factory (ADF), including linked services, datasets, pipelines, and triggers.
- Proficiency in SQL and working with structured and semi-structured data (CSV, JSON, Parquet); see the ingestion sketch after this listing.
- Experience with Azure storage systems (ADLS Gen2, Blob Storage) and data movement.
- Familiarity with job monitoring and logging mechanisms in Azure.

Preferred Skills
- Experience with Azure Data Lake, Synapse Analytics, or Databricks.
- Exposure to Azure DevOps for CI/CD in data pipelines.
- Understanding of data governance, lineage, and compliance requirements (GDPR, HIPAA, etc.).
- Knowledge of RESTful APIs and API-based ingestion.
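A hedged sketch of the file-format side of this role: reading the CSV, JSON, and Parquet data the posting names from ADLS Gen2 with PySpark and landing it in a staging zone. The storage account, containers, and paths are hypothetical; in ADF the equivalent step would usually be a copy activity or mapping data flow.

```python
# Illustrative ADLS Gen2 ingestion in PySpark. Account and paths are made up.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls_ingestion").getOrCreate()
base = "abfss://raw@mydatalake.dfs.core.windows.net"  # hypothetical account

csv_df = (spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv(f"{base}/customers/*.csv"))

json_df = spark.read.json(f"{base}/events/*.json")
parquet_df = spark.read.parquet(f"{base}/orders/")

# Land everything in a consistent columnar format for downstream loads.
for name, df in [("customers", csv_df), ("events", json_df), ("orders", parquet_df)]:
    df.write.mode("overwrite").parquet(f"{base.replace('raw', 'staged')}/{name}/")
```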
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Description
Role Overview
We are looking for a highly skilled Senior Data Engineer with strong hands-on experience in Azure Databricks, PySpark, and SQL who can creatively design and develop high-performance data applications. The ideal candidate should be capable of optimizing solutions, mentoring team members, and collaborating with stakeholders to deliver quality software aligned with business goals.

Roles & Responsibilities
Technical Development
- Develop application components based on design specifications (HLD/LLD).
- Code, debug, test, and document across the development lifecycle.
- Ensure adherence to coding standards and reuse best practices and design patterns.
- Optimize for performance, scalability, and cost efficiency (see the join-optimization sketch after this listing).

Testing & Quality
- Create and review unit test cases, test scenarios, and execution plans.
- Collaborate with QA teams to review test plans and provide technical clarifications.
- Conduct root cause analysis (RCA) for defects and implement mitigation strategies.

Design & Architecture
- Contribute to the design and architecture of components/features (HLD, LLD, SAD).
- Participate in interface and data model design discussions.
- Evaluate and suggest technical options including reuse, reconfiguration, or new implementation.

Project & Delivery
- Manage and deliver assigned modules/user stories within agile sprints.
- Provide effort estimates and assist in sprint planning.
- Execute and monitor the release and deployment process.

Documentation & Configuration
- Create/review development checklists, design templates, and technical documentation.
- Define and enforce configuration management practices.

Domain Understanding
- Understand business needs and translate them into technical requirements.
- Advise developers by applying domain knowledge to product features.
- Pursue domain certifications to enhance project contributions.

Team & Stakeholder Management
- Set and review FAST goals for self and team members.
- Provide mentorship and address people-related issues in the team.
- Interface with customers to gather requirements, present solutions, and conduct demos.
- Maintain motivation and positive dynamics within the team.

Knowledge Management
- Contribute to internal documentation repositories and client-specific knowledge bases.
- Review and promote reusable components and best practices.

Must-Have Skills
- Azure Databricks: hands-on experience with Spark-based analytics on Azure.
- PySpark: proficient in developing and optimizing large-scale ETL pipelines.
- SQL: strong command of writing, optimizing, and troubleshooting complex queries.
- Agile methodology: experience working in Scrum/Kanban environments.
- Software Development Life Cycle (SDLC): solid understanding from requirement to release.
- Experience in effort estimation, defect triage, and client communication.

Good-to-Have Skills
- Azure Data Lake, Azure Data Factory, Synapse Analytics
- CI/CD pipelines using Azure DevOps or GitHub Actions
- Data modeling and interface definition techniques
- Domain experience in healthcare, finance, or other data-heavy industries
- Experience with unit testing frameworks (e.g., PyTest)
- Familiarity with business intelligence tools (Power BI, Tableau)

Certifications (Preferred)
- Microsoft Certified: Azure Data Engineer Associate
- Databricks Certified Developer – Apache Spark using Python
- Agile/Scrum certification (CSM, PSM, etc.)

Soft Skills
- Strong analytical and problem-solving abilities
- Ability to work under pressure and handle multiple tasks
- Excellent communication and client-facing skills
- Proactive attitude and team mentorship capabilities
- Ability to drive technical discussions and solution demos

Skills: Azure Databricks (ADB), ADF, MongoDB, PySpark
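As an example of the "optimize for performance" responsibility above, a minimal PySpark sketch of a broadcast join plus partitioned output. Paths, column names, and table shapes are hypothetical.

```python
# Illustrative PySpark join optimization: broadcasting a small dimension
# avoids shuffling both sides when joining a large fact table.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("join_optimization").getOrCreate()

facts = spark.read.parquet("/lake/silver/claims")    # large table (made-up path)
dims = spark.read.parquet("/lake/silver/providers")  # small lookup table

# Hint Spark to ship the small table to every executor instead of shuffling.
enriched = facts.join(broadcast(dims), on="provider_id", how="left")

# Partition the output by a commonly filtered column so later reads can prune.
(enriched.write.mode("overwrite")
         .partitionBy("claim_year")
         .parquet("/lake/gold/claims_enriched"))
```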
Posted 2 weeks ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description
Years of experience: 8–10 years

Role Proficiency
Act creatively to develop applications, selecting appropriate technical options and optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes
- Interpret the application/feature/component design and develop it in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimize efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on team members' FAST goals.

Measures of Outcomes
- Adherence to engineering process and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during project execution
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected
- Code: code as per design; follow coding standards, templates, and checklists; review code for team and peers.
- Documentation: create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, requirements, and test cases/results.
- Configure: define and govern the configuration management plan; ensure compliance from the team.
- Test: review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
- Domain relevance: advise software developers on design and development of features and components with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to provide valuable additions to customers; complete relevant domain certifications.
- Manage project: manage delivery of modules and/or user stories.
- Manage defects: perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
- Estimate: create and provide input for effort estimation for projects.
- Manage knowledge: consume and contribute to project-related documents, SharePoint libraries, and client universities; review reusable documents created by the team.
- Release: execute and monitor the release process.
- Design: contribute to the creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
- Interface with customer: clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
- Manage team: set FAST goals and provide feedback; understand team members' aspirations and provide guidance and opportunities; ensure the team is engaged in the project.
- Certifications: take relevant domain/technology certifications.

Skill Examples
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Break down complex problems into logical components.
- Develop user interfaces and business software components.
- Use data models.
- Estimate time and effort required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Make quick decisions on technical/project-related challenges.
- Manage a team; mentor and handle people-related issues in the team; maintain high motivation levels and positive dynamics in the team.
- Interface with other teams, designers, and other parallel practices.
- Set goals for self and team; provide feedback to team members.
- Create and articulate impactful technical presentations.
- Follow a high level of business etiquette in emails and other business communication.
- Drive conference calls with customers, addressing customer questions.
- Proactively ask for and offer help.
- Work under pressure; determine dependencies and risks; facilitate planning; handle multiple tasks.
- Build confidence with customers by meeting deliverables on time with quality.
- Make appropriate utilization of software and hardware.
- Strong analytical and problem-solving abilities.

Knowledge Examples
- Appropriate software programs/modules
- Functional and technical designing
- Programming languages: proficient in multiple skill clusters
- DBMS, operating systems, and software platforms
- Software Development Life Cycle
- Agile methods (Scrum or Kanban)
- Integrated development environments (IDE)
- Rapid application development (RAD)
- Modelling technologies and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved

Additional Comments
We are seeking an experienced Oracle SCM Cloud Functional/Technical Consultant with strong techno-functional expertise in Oracle EBS SCM modules. The ideal candidate will play a key role in bridging the gap between business users and technology, enabling smooth implementation, migration, and support of Oracle SCM applications (Cloud and/or EBS).

Key Responsibilities:
- Analyze business requirements and translate them into functional and technical solutions in Oracle SCM Cloud and/or Oracle E-Business Suite (EBS).
- Configure and support Oracle SCM Cloud modules such as Procurement, Inventory, Order Management, Product Hub, Manufacturing, etc.
- Work on data migration, integrations, and extensions using Oracle tools (e.g., FBDI, ADF, REST/SOAP APIs, BI Publisher); see the REST sketch after this listing.
- Collaborate with cross-functional teams during implementations, upgrades, and support phases.
- Provide techno-functional support, including issue resolution, patch testing, and performance optimization.
- Develop technical components such as interfaces, reports, conversions, and customizations.
- Assist in UAT, training, and documentation for end-users.
- Stay up to date with Oracle SCM Cloud updates and EBS R12 enhancements.

Required Skills & Experience:
- 5+ years of experience with Oracle EBS SCM modules (R12.1/12.2).
- At least 1–2 years of hands-on experience in Oracle SCM Cloud preferred.
- Strong knowledge of modules such as PO, INV, OM, WMS, BOM, ASCP, iProcurement, etc.
- Proficiency in SQL, PL/SQL, Oracle Workflow, BI Publisher, Web ADI, and Oracle Forms/Reports.
- Experience with integration tools such as Oracle Integration Cloud (OIC) or middleware (SOA, Dell Boomi, MuleSoft) is a plus.
- Ability to conduct requirement gathering sessions, CRP, SIT, and UAT.
- Excellent problem-solving, communication, and interpersonal skills.

Preferred Qualifications:
- Oracle Cloud SCM or Oracle EBS certifications.
- Experience with Agile/Waterfall methodologies.
- Exposure to global implementation/support environments.
- Familiarity with DevOps practices for Oracle deployments.

Skills: Oracle SCM, Oracle EBS Functional, Oracle EBS Technical
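A hedged sketch of the API-based integration work mentioned above, calling an Oracle SCM Cloud REST resource from Python. The pod URL, resource path, query fields, and credentials are all illustrative placeholders; real resource names and versions come from Oracle's REST API documentation.

```python
# Hypothetical REST call against an Oracle SCM Cloud endpoint.
# Host, path, fields, and credentials are placeholders.
import requests

BASE = "https://example-fa.oraclecloud.com"  # hypothetical pod URL
resource = f"{BASE}/fscmRestApi/resources/latest/purchaseOrders"

resp = requests.get(
    resource,
    params={"q": "Status='OPEN'", "limit": 25},
    auth=("integration.user", "secret"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()

for po in resp.json().get("items", []):
    print(po.get("OrderNumber"), po.get("Status"))
```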
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description
DigitalXgeeks specializes in scaling business revenue and helping startups by optimizing all departments through innovative digital solutions. The company's mission is to empower businesses by providing advanced technological support and expertise. DigitalXgeeks is committed to fostering growth and efficiency in a dynamic work environment.

Role Description
This is a freelance/part-time role for a Senior Data Engineer (Training) at DigitalXgeeks, requiring good experience with the skills below.

Azure Data Engineer + Databricks Developer
➢ Python
➢ Databricks
➢ Spark
➢ PySpark
➢ Spark SQL
➢ Delta Lake
➢ Azure Data Factory
➢ Azure ADF & Databricks projects
➢ Power BI
Posted 2 weeks ago
4.0 years
0 Lacs
Thane, Maharashtra
On-site
202503666 Thane, Maharashtra, India

Description
Experience: 4+ years of experience

Key Responsibilities
- Help define and improve actionable, decision-driving management information (MI).
- Ensure streamlining, consistency, and standardization of MI within the handled domain.
- Build and operate flexible processes/reports that meet changing business needs.
- Prepare detailed documentation of schemas.
- Any other duties commensurate with the position or level of responsibility.

Desired Profile
- Prior experience in insurance companies/the insurance sector would be an added advantage.
- Experience with Azure technologies (including SSAS, SQL Server, Azure Data Lake, Synapse).
- Hands-on experience with SQL and Power BI.
- Excellent understanding of developing stored procedures, functions, views, and T-SQL programs (see the sketch after this listing).
- Experience developing and maintaining ETL (data extraction, transformation, and loading) mappings using ADF to extract data from multiple source systems.
- Ability to analyze existing SQL queries for performance improvements.
- Excellent written and verbal communication skills.
- Ability to create and design MI for the team.
- Expected to handle multiple projects with stringent timelines.
- Good interpersonal skills.
- Actively influences strategy through creative and unique ideas.
- Exposure to documentation activities.

Key Competencies
- Technical Learning: can learn new skills and knowledge as per business requirements.
- Action Oriented: enjoys working; is action oriented and full of energy for challenging work; not fearful of acting with a minimum of planning; seizes more opportunities than others.
- Decision Quality: makes good decisions based upon a mixture of analysis, wisdom, experience, and judgment.
- Detail Orientation: high attention to detail, especially in data quality, documentation, and reporting.
- Communication & Collaboration: strong interpersonal skills; effective in team discussions and stakeholder interactions.

Qualifications
B.Com/BE/BTech/MCA from a reputed college/institute
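A minimal sketch of the stored-procedure work described above, executed from Python via pyodbc. The server, database, procedure, and table names are hypothetical; the T-SQL itself would live in the database.

```python
# Hypothetical T-SQL stored-procedure call for an MI refresh, via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mi_reporting;"
    "UID=report_user;PWD=secret"  # placeholder credentials
)
cursor = conn.cursor()

# Run a procedure that rebuilds a reporting snapshot for a given date,
# then pull a small summary to validate the refresh.
cursor.execute("EXEC dbo.usp_refresh_policy_snapshot @as_of_date = ?", "2024-03-31")
conn.commit()

cursor.execute(
    "SELECT TOP 5 policy_type, COUNT(*) AS n "
    "FROM dbo.policy_snapshot GROUP BY policy_type"
)
for row in cursor.fetchall():
    print(row)
conn.close()
```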
Posted 2 weeks ago
5.0 years
0 Lacs
India
On-site
Job Summary:
We are looking for an experienced Database & Data Engineer who can own the full lifecycle of our cloud data systems, from database optimization to building scalable data pipelines. This hybrid role demands deep expertise in SQL performance tuning, cloud-native ETL/ELT, and modern Azure data engineering using tools like Azure Data Factory, Databricks, and PySpark. Ideal candidates will be comfortable working across Medallion architecture, transforming raw data into high-quality assets ready for analytics and machine learning.

Key Responsibilities:
🔹 Database Engineering
- Implement and optimize indexing, partitioning, and sharding strategies to improve performance and scalability.
- Tune and refactor complex SQL queries, stored procedures, and triggers using execution plans and profiling tools.
- Perform database performance benchmarking, query profiling, and resource usage analysis.
- Address query bottlenecks, deadlocks, and concurrency issues using diagnostic tools and SQL optimization.
- Design and implement read/write splitting and horizontal/vertical sharding for distributed systems.
- Automate backup, restore, high availability, and disaster recovery using native Azure features.
- Maintain schema versioning and enable automated deployment via CI/CD pipelines and Git.

🔹 Data Engineering
- Build and orchestrate scalable data pipelines using Azure Data Factory (ADF), Databricks, and PySpark.
- Implement Medallion architecture with Bronze, Silver, and Gold layers in Azure Data Lake.
- Process and transform data using PySpark, Pandas, and NumPy.
- Create and manage data integrations from REST APIs, flat files, databases, and third-party systems.
- Develop and manage incremental loads, SCD Type 1 & 2 (see the merge sketch after this listing), and advanced data transformation workflows.
- Leverage Azure services like Synapse, Azure SQL DB, Azure Blob Storage, and Azure Data Lake Gen2.
- Ensure data quality, consistency, and lineage across environments.

🔹 Collaboration & Governance
- Work with cross-functional teams including data science, BI, and business analysts.
- Maintain standards around data governance, privacy, and security compliance.
- Contribute to internal documentation and the team knowledge base using tools like JIRA, Confluence, and SharePoint.
- Participate in Agile workflows and help define sprint deliverables for data engineering tasks.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience in data engineering and SQL performance optimization in cloud environments.
- Expertise in Azure Data Factory, Azure Data Lake, Azure SQL, Azure Synapse, and Databricks.
- Proficient in SQL, Python, PySpark, Pandas, and NumPy.
- Strong experience in query performance tuning, indexing, and partitioning.
- Familiar with PostgreSQL (PGSQL) and handling NoSQL databases like Cosmos DB or Elasticsearch.
- Experience with REST APIs, flat files, and real-time integrations.
- Working knowledge of version control (Git) and CI/CD practices in Azure DevOps or equivalent.
- Solid understanding of Medallion architecture, lakehouse concepts, and data reliability best practices.

Preferred Qualifications:
- Microsoft Certified: Azure Data Engineer Associate or equivalent.
- Familiarity with Docker, Kubernetes, or other containerization tools.
- Exposure to streaming platforms such as Kafka, Azure Event Hubs, or Azure Stream Analytics.
- Industry experience in supply chain, logistics, or finance is a plus.
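A simplified sketch of the SCD Type 2 pattern named above, using the Delta Lake MERGE API, assuming a Databricks notebook where `spark` is predefined and the dimension table already exists. Table and column names are hypothetical, and a production version would also restrict the insert step to changed keys and handle late-arriving records.

```python
# Simplified two-step SCD Type 2 on Delta Lake. Names are illustrative.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

updates = spark.read.parquet("/lake/staged/customers")  # today's extract
dim = DeltaTable.forName(spark, "gold.dim_customer")

# Step 1: close out current rows whose tracked attributes changed.
(dim.alias("d")
    .merge(updates.alias("u"),
           "d.customer_id = u.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.address <> u.address",
        set={"is_current": "false", "valid_to": "current_date()"})
    .execute())

# Step 2: append incoming rows as the new current versions (a fuller
# version would first anti-join to keep only changed keys).
(updates.withColumn("is_current", F.lit(True))
        .withColumn("valid_from", F.current_date())
        .withColumn("valid_to", F.lit(None).cast("date"))
        .write.mode("append").saveAsTable("gold.dim_customer"))
```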
Posted 2 weeks ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

United Health Group is a leading health care company serving more than 85 million people worldwide. The organization is ranked 5th among Fortune 500 companies. UHG serves its customers through two different platforms: United Health Care (UHC) and Optum. UHC is responsible for providing healthcare coverage and benefits services, while Optum provides information- and technology-enabled health services. India operations of UHG are aligned to Optum. The Optum Global Analytics Team, part of Optum, is involved in developing broad-based and targeted analytics solutions across different verticals for all lines of business.

Primary Responsibility
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Bachelor's degree or 4-year university degree
- 5+ years of experience
- Development experience on Azure and Databricks
- Hands-on experience in developing ETL pipelines using Databricks and ADF (Azure Data Factory)
- Hands-on experience in Python, PySpark, and Spark SQL
- Hands-on experience migrating on-premises environments to Azure Cloud
- Azure cloud exposure (Azure services like Virtual Machines, Load Balancer, SQL Database, Azure DNS, Blob Storage, Azure AD, etc.)
- Good to have: hands-on experience with big data platforms (Hadoop, Hive) and SQL scripting
- Good to have: experience with Scala, Snowflake, and the healthcare domain
- Good to have: experience with CI/CD tools such as GitHub Actions
- Ability to develop innovative approaches to performance optimization and automation
- Proven excellent verbal communication and presentation skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description

Key Responsibilities
- Lead the design, development, and deployment of Oracle Fusion SaaS solutions, particularly in Supply Chain and Finance.
- Build and maintain integrations using Oracle Integration Cloud (OIC), REST/SOAP web services, and middleware tools.
- Customize and extend Fusion applications using BI Publisher, OTBI, FBDI, HDL, and ADF.
- Translate business requirements into technical specifications and detailed solution designs.
- Support the full development lifecycle including change management, documentation, testing, and deployment.
- Participate in formal design/code reviews and ensure adherence to coding standards.
- Collaborate with IT service providers to ensure quality, performance, and scalability of outsourced work.
- Provide Level 3 support for critical technical issues.
- Stay current with emerging Oracle technologies and contribute to continuous improvement initiatives.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, Business, or a related field (or equivalent experience).
- Relevant certifications in Oracle Fusion or related technologies are a plus.
- Compliance with export controls or sanctions regulations may be required.

Core Competencies
- Customer Focus: builds strong relationships and delivers customer-centric solutions.
- Global Perspective: applies a broad, global lens to problem-solving.
- Manages Complexity: navigates complex information to solve problems effectively.
- Manages Conflict: resolves conflicts constructively and efficiently.
- Optimizes Work Processes: continuously improves processes for efficiency and effectiveness.
- Values Differences: embraces diverse perspectives and cultures.

Technical Competencies
- Solution Design & Configuration: designs scalable, secure, and maintainable solutions.
- Solution Functional Fit Analysis: evaluates how well components interact to meet business needs.
- Solution Modeling & Validation Testing: creates models and tests solutions to ensure they meet requirements.
- Performance Tuning & Data Modeling: optimizes application and database performance.

Skills & Technical Expertise
- Strong knowledge of Oracle SaaS architecture, data models, and PaaS extensions.
- Proficiency in Oracle Integration Cloud (OIC) and REST/SOAP APIs.
- Experience with Oracle tools: BI Publisher, OTBI, FBDI, HDL, ADF.
- Ability to analyze and revise existing systems for improvements.
- Familiarity with SDLC, version control, and automation tools.

Experience
- 5+ years of hands-on experience in Oracle Fusion SaaS development and technical implementation.
- Proven experience with Oracle Fusion Supply Chain and Finance modules.
- Intermediate level of relevant work experience (3–5 years minimum).

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Hybrid
Job Type: Exempt - Experienced
ReqID: 2415054
Relocation Package: No
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description

Key Responsibilities
- Lead the design, development, and deployment of Oracle Fusion SaaS solutions, particularly in Supply Chain and Finance.
- Build and maintain integrations using Oracle Integration Cloud (OIC), REST/SOAP web services, and middleware tools.
- Customize and extend Fusion applications using BI Publisher, OTBI, FBDI, HDL, and ADF.
- Translate business requirements into technical specifications and detailed solution designs.
- Support the full development lifecycle including change management, documentation, testing, and deployment.
- Participate in formal design/code reviews and ensure adherence to coding standards.
- Collaborate with IT service providers to ensure quality, performance, and scalability of outsourced work.
- Provide Level 3 support for critical technical issues.
- Stay current with emerging Oracle technologies and contribute to continuous improvement initiatives.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, Business, or a related field (or equivalent experience).
- Relevant certifications in Oracle Fusion or related technologies are a plus.
- Compliance with export controls or sanctions regulations may be required.

Core Competencies
- Customer Focus: builds strong relationships and delivers customer-centric solutions.
- Global Perspective: applies a broad, global lens to problem-solving.
- Manages Complexity: navigates complex information to solve problems effectively.
- Manages Conflict: resolves conflicts constructively and efficiently.
- Optimizes Work Processes: continuously improves processes for efficiency and effectiveness.
- Values Differences: embraces diverse perspectives and cultures.

Technical Competencies
- Solution Design & Configuration: designs scalable, secure, and maintainable solutions.
- Solution Functional Fit Analysis: evaluates how well components interact to meet business needs.
- Solution Modeling & Validation Testing: creates models and tests solutions to ensure they meet requirements.
- Performance Tuning & Data Modeling: optimizes application and database performance.

Experience
- 5+ years of hands-on experience in Oracle Fusion SaaS development and technical implementation.
- Proven experience with Oracle Fusion Supply Chain and Finance modules.
- Intermediate level of relevant work experience (3–5 years minimum).

Skills & Technical Expertise
- Strong knowledge of Oracle SaaS architecture, data models, and PaaS extensions.
- Proficiency in Oracle Integration Cloud (OIC) and REST/SOAP APIs.
- Experience with Oracle tools: BI Publisher, OTBI, FBDI, HDL, ADF.
- Ability to analyze and revise existing systems for improvements.
- Familiarity with SDLC, version control, and automation tools.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Hybrid
Job Type: Exempt - Experienced
ReqID: 2415084
Relocation Package: No
Posted 2 weeks ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

About UHG
United Health Group is a leading health care company serving more than 85 million people worldwide. The organization is ranked 5th among Fortune 500 companies. UHG serves its customers through two different platforms: United Health Care (UHC) and Optum. UHC is responsible for providing healthcare coverage and benefits services, while Optum provides information- and technology-enabled health services. India operations of UHG are aligned to Optum. The Optum Global Analytics Team, part of Optum, is involved in developing broad-based and targeted analytics solutions across different verticals for all lines of business.

Primary Responsibilities
- Gather, analyze, and document business requirements while leveraging knowledge of claims, clinical, and other healthcare systems.
- Develop ETL jobs using Talend, Python, a cloud-based data warehouse, Jenkins, Kafka, and an orchestration tool.
- Write advanced SQL queries.
- Create and interpret functional and technical specifications and design documents.
- Understand the business and how various data elements and subject areas are utilized in order to develop and deliver reports to the business.
- Be an SME on the claims, member, or provider module.
- Provide regular status updates to higher management.
- Design, develop, and implement scalable and high-performing data models and solutions using Snowflake and Oracle.
- Manage and optimize data replication and ingestion processes using Oracle and Snowflake.
- Develop and maintain ETL pipelines using Azure Data Factory (ADF) and Databricks.
- Optimize query performance and reduce latency by leveraging pre-aggregated tables and efficient data processing techniques.
- Collaborate with cross-functional teams to understand data requirements and deliver high-quality data solutions.
- Implement data security measures and ensure compliance with industry standards.
- Automate data governance and security controls to maintain data integrity and compliance.
- Develop and maintain comprehensive documentation for data architecture, data flows, ETL processes, and configurations.
- Continuously optimize the performance of data pipelines and queries to improve efficiency and reduce costs.
- Bring a basic, structured, standard approach to work.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Bachelor's degree or 4-year university degree
- 5+ years of experience
- Experience in developing ETL jobs using Snowflake, ADF, Databricks, and Python
- Experience in writing efficient and advanced SQL queries
- Experience in both producing and consuming data using Kafka (see the streaming sketch after this listing)
- Experience working on large-scale cloud-based data warehouses: Snowflake, Databricks
- Good experience in building data pipelines using ADF
- Knowledge of Agile methodologies, roles, responsibilities, and deliverables
- Proficiency in Python for data processing and automation
- Demonstrated ability to learn and adapt to new data technologies

Preferred Qualifications
- Certified in Azure data engineering (e.g., DP-203: Azure Data Engineer Associate)
- Extensive experience with Azure cloud services (Azure Data Factory, Azure Databricks, Azure SQL Database, etc.)
- Solid understanding of CI/CD principles and tools (e.g., Jenkins, GitLab CI/CD)
- Knowledge of SQL and NoSQL databases
- Proven excellent time management, communication, decision making, and presentation skills
- Proven good problem-solving skills
- Proven good communication skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
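A hedged sketch of Kafka consumption with Spark Structured Streaming, in the spirit of the "producing and consuming data utilizing Kafka" requirement. The broker address, topic, and paths are hypothetical, and the Spark-Kafka connector package must be available on the cluster.

```python
# Illustrative Kafka-to-data-lake stream in PySpark. Names are made up.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims_stream").getOrCreate()

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")
       .option("subscribe", "claims-events")
       .option("startingOffsets", "latest")
       .load())

# Kafka delivers key/value as binary; cast the payload before parsing.
events = raw.select(F.col("value").cast("string").alias("payload"),
                    F.col("timestamp"))

(events.writeStream
       .format("parquet")
       .option("path", "/lake/bronze/claims_events")
       .option("checkpointLocation", "/lake/_checkpoints/claims_events")
       .start())
```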
Posted 2 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will:
- Design, develop, and optimize data pipelines using Azure Databricks, PySpark, and Prophecy.
- Implement and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Apache Airflow for orchestration (see the DAG sketch after this listing).
- Develop and optimize complex SQL queries and Python-based data transformation logic.
- Work with version control systems (GitHub, Azure DevOps) to manage code and deployment processes.
- Automate deployment of data pipelines using CI/CD practices in Azure DevOps.
- Ensure data quality, security, and compliance with best practices.
- Monitor and troubleshoot performance issues in data pipelines.
- Collaborate with cross-functional teams to define data requirements and strategies.

Requirements
To be successful in this role, you should meet the following requirements:
- 5+ years of experience in data engineering, working with Azure Databricks, PySpark, and SQL.
- Hands-on experience with Prophecy for data pipeline development.
- Proficiency in Python for data processing and transformation.
- Experience with Apache Airflow for workflow orchestration.
- Strong expertise in Azure Data Factory (ADF) for building and managing ETL processes.
- Familiarity with GitHub and Azure DevOps for version control and CI/CD automation.
- Solid understanding of data modelling, warehousing, and performance optimization.
- Ability to work in an agile environment and manage multiple priorities effectively.
- Excellent problem-solving skills and attention to detail.
- Experience with Delta Lake and Lakehouse architecture.
- Hands-on experience with Terraform or Infrastructure as Code (IaC).
- Understanding of machine learning workflows in a data engineering context.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSDI
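A hedged sketch of the Airflow orchestration this role mentions, assuming Airflow 2.4 or later: a small DAG that runs a transformation after an ingestion step. The task bodies are placeholders; in practice they might trigger ADF pipelines or Databricks jobs via the corresponding provider operators.

```python
# Minimal Airflow DAG sketch with two dependent tasks. Names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull source extracts into the landing zone")

def transform():
    print("run the Databricks/PySpark transformation job")

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    # Transformation only runs after ingestion succeeds.
    ingest_task >> transform_task
```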
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
The role is based in Pune, India. Compensation will be paid in INR.
- Define and develop data sets, models, and cubes.
- Build simple to complex pipelines and data flows in ADF, including extracting data from the source system into the data warehouse staging area and ensuring data validation, data accuracy, data type conversion, and business rule application.
- Strong visualization skills in Power BI; expert knowledge of writing advanced DAX in Power BI, Power Query, Power Automate, M language, and R programming.
- Advanced knowledge of Azure SQL DB, Synapse Analytics, Azure Databricks, and Power BI.
- Ability to analyse and understand complex data.
- Knowledge of Azure Data Lake Storage and services like Azure Analysis Services and SQL databases is required.
Posted 2 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Please find the job details below:

Role: Senior Associate - Tax Data Solution Consultants (Data Engineer)
Experience: 4+ years
Location: Gurgaon
Mode: Hybrid
Mandatory Skills: Alteryx, Azure Data Factory (ADF), PowerApps

Job Description:
- Own the end-to-end development of scalable, secure, and high-performance data pipelines using Azure Data Factory. This includes sourcing data from various systems (on-premises and cloud), transforming it efficiently, handling large volumes, and ensuring data integrity across pipelines.
- Write highly optimized, production-grade SQL queries for complex data transformations, analysis, and reporting. You will work with relational databases (e.g., Azure SQL, SQL Server) to design schemas, write stored procedures and subqueries, and develop solutions that scale and perform reliably.
- Build business-focused applications, workflows, and dashboards using Power BI, Power Apps, and Power Automate. This includes automating manual processes and connecting to diverse data sources using connectors and APIs.
- Understand and work with data from ERP systems like SAP, Oracle, etc.
- Serve as a trusted advisor to clients by clearly articulating solution approaches, gathering evolving requirements, and presenting results. You will lead discussions, facilitate workshops, and ensure alignment between technical outputs and business needs.
- Take full ownership of solution components, from idea to deployment. You're expected to drive initiatives independently, troubleshoot issues, and lead development efforts without heavy supervision. You will manage timelines, identify risks, and ensure high-quality delivery.
- Maintain documentation of data processes, system configurations, and any modifications made to automation workflows.
- Provide routine system checks and generate performance reports to ensure optimal platform functionality.

Required Qualifications & Skills:
- Bachelor's degree in information systems, engineering, or a related discipline.
- 4–7 years of hands-on experience with SQL. Confident writing complex queries, joins, and transformations.
- Hands-on experience with data management platforms and automation tools, using no-code solutions like Azure Data Factory or Alteryx. Able to build and manage end-to-end pipelines.
- Strong problem-solving skills with the ability to troubleshoot data issues effectively.
- Excellent communication and client liaison skills.
- Ability to work in a fast-paced environment and manage multiple tasks simultaneously.
- Attention to detail and a strong commitment to data accuracy and quality.

Preferred Experience:
- Certifications in Power Platform, Azure, SQL, or Alteryx.
- Knowledge of additional Azure services (e.g., Synapse, Logic Apps).
- Familiarity with third-party tax solutions and data integration processes is a plus.

What We Offer:
- A vibrant, flexible work culture focused on innovation, excellence, and support.
- Opportunity to build solutions from scratch and make real impact.
- Opportunities for career development and professional training in advanced data and automation technologies.
- Competitive salary, benefits, and a supportive team environment.

Interested candidates can share their updated CV at shivani.sah@promaynov.com
Posted 2 weeks ago
8.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
This role is for one of Weekday's clients.
Salary range: Rs 2600000 - Rs 2800000 (i.e., INR 26-28 LPA)
Min Experience: 8 years
Location: Mumbai
Job Type: full-time

Requirements

About the Role
We are seeking a highly skilled and experienced Data Architect to lead the design, development, and optimization of our enterprise data infrastructure. This is a strategic role for an individual passionate about modern data platforms, cloud technologies, and data governance. The ideal candidate will bring deep expertise in Data Engineering, Azure Data Services, Databricks, Power BI, and ETL frameworks to drive scalable and secure data architecture for enterprise analytics and reporting.

As a Data Architect, you will collaborate with cross-functional teams including business analysts, data scientists, and application developers to ensure the delivery of high-quality, actionable data solutions. Your work will directly influence data-driven decision-making and operational efficiency across the organization.

Key Responsibilities
- Architect scalable data solutions: design and implement robust, secure, and scalable data architectures using Microsoft Azure, Databricks, and ADF for large-scale enterprise environments.
- Data engineering leadership: provide technical leadership and mentoring to data engineering teams on ETL/ELT best practices, data pipeline development, and optimization.
- Cloud-based architecture: build and optimize data lakes and data warehouses on Azure, leveraging Azure Data Lake, Synapse Analytics, and Azure SQL services.
- Databricks expertise: use Azure Databricks for distributed data processing, real-time analytics, and machine learning data pipelines.
- ETL frameworks: design and maintain ETL workflows using Azure Data Factory (ADF), ensuring efficient movement and transformation of data from multiple sources.
- Visualization & reporting: collaborate with business stakeholders to deliver intuitive and insightful dashboards and reports using Power BI.
- Data governance & quality: enforce data quality standards, lineage, and governance across all data assets, ensuring compliance and accuracy.
- Collaboration & integration: work with application developers and DevOps teams to integrate data systems with other enterprise applications.
- Documentation & standards: maintain detailed architecture diagrams, data dictionaries, and standard operating procedures for all data systems.

Required Skills & Qualifications
- 8+ years of experience in data engineering, data architecture, or related fields.
- Proven experience designing and implementing cloud-based data solutions using Microsoft Azure.
- Hands-on expertise in Azure Databricks, ADF, Azure SQL, Data Lake Storage, and Power BI.
- Strong proficiency in ETL/ELT development, pipeline orchestration, and performance optimization.
- Solid understanding of data modeling, warehousing concepts (Kimball/Inmon), and big data technologies.
- Proficiency in scripting languages such as Python, SQL, and Spark.
- Experience in managing data security, compliance, and governance in large enterprises.
- Strong problem-solving skills and a collaborative mindset.

Preferred Qualifications
- Azure certifications such as Azure Data Engineer Associate or Azure Solutions Architect Expert.
- Experience with CI/CD pipelines for data workloads.
- Exposure to MDM (Master Data Management) and data catalog tools.
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters.

The Position
In Roche Informatics, we build on Roche's 125-year history as one of the world's largest biotech companies, globally recognized for providing transformative innovative solutions across major disease areas. We combine human capabilities with cutting-edge technological innovations to do now what our patients need next. Our commitment to our patients' needs motivates us to deliver technology that evolves the practice of medicine. Be part of our inclusive team at Roche Informatics, where we're driven by a shared passion for technological novelties and optimal IT solutions.

Position Overview
We are seeking an experienced ETL Architect to design, develop, and optimize data extraction, transformation, and loading (ETL) solutions, and to work closely with multi-disciplinary and multi-cultural teams to build structured, high-quality data solutions. The person may lead technical squads. These solutions will be leveraged across Enterprise, Pharma, and Diagnostics solutions to help our teams fulfill our mission: to do now what patients need next. This role requires deep expertise in Python, AWS Cloud, and ETL tools to build and maintain scalable data pipelines and architectures. The ETL Architect will work closely with cross-functional teams to ensure efficient data integration, storage, and accessibility for business intelligence and analytics.
Key Responsibilities
- ETL design & development: architect and implement high-performance ETL pipelines using AWS cloud services, Snowflake, and ETL tools such as Talend, dbt, Informatica, ADF, etc.
- Data architecture: design and implement scalable, efficient, and cloud-native data architectures.
- Data integration & flow: ensure seamless data integration across multiple source systems, leveraging AWS Glue, Snowflake, and other ETL tools (see the Glue sketch after this listing).
- Performance optimization: monitor and tune ETL processes for performance, scalability, and cost-effectiveness.
- Governance & security: establish and enforce data quality, governance, and security standards for ETL processes.
- Collaboration: work with data engineers, analysts, and business stakeholders to define data requirements and ensure effective solutions.
- Documentation & best practices: maintain comprehensive documentation and promote best practices for ETL development and data transformation.
- Troubleshooting & support: diagnose and resolve performance issues, failures, and bottlenecks in ETL processes.

Required Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
- Experience: 6+ years of experience in ETL development, with 3+ years in an ETL architecture role.
- Expertise in Snowflake or any MPP data warehouse (including Snowflake data modeling, optimization, and security best practices).
- Strong experience with AWS Cloud services, especially AWS Glue, AWS Lambda, S3, Redshift, and IAM, or Azure/GCP cloud services.
- Proficiency in ETL tools such as Informatica, Talend, Apache NiFi, SSIS, or DataStage.
- Strong SQL skills and experience with relational and NoSQL databases.
- Experience in API integrations.
- Proficiency in scripting languages (Python, Shell, PowerShell) for automation.
- Prior experience in the pharmaceutical, diagnostics, or healthcare domain is a plus.

Soft Skills
- Strong analytical and problem-solving abilities
- Excellent communication and documentation skills
- Ability to work collaboratively in a fast-paced, cloud-first environment

Preferred Qualifications
- Certifications in AWS, Snowflake, or ETL tools
- Experience in real-time data streaming, microservices-based architectures, and DevOps for data pipelines
- Knowledge of data governance, compliance (GDPR, HIPAA), and security best practices

Who we are
A healthier future drives us to innovate. Together, more than 100,000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact. Let's build a healthier future, together.

Roche is an Equal Opportunity Employer.
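A hedged sketch of the AWS Glue automation side of this role: starting a Glue job from Python with boto3 and polling it to completion. The job name, argument keys, and region are hypothetical.

```python
# Illustrative boto3 script that triggers and polls an AWS Glue job.
import time

import boto3

glue = boto3.client("glue", region_name="us-east-1")

run = glue.start_job_run(
    JobName="nightly_claims_etl",            # hypothetical job name
    Arguments={"--load_date": "2024-03-31"},  # hypothetical job argument
)
run_id = run["JobRunId"]

# Poll until the run leaves its transitional states.
while True:
    state = glue.get_job_run(JobName="nightly_claims_etl",
                             RunId=run_id)["JobRun"]["JobRunState"]
    if state not in ("STARTING", "RUNNING", "STOPPING"):
        break
    time.sleep(30)

print(f"Glue run {run_id} finished with state {state}")
```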
Posted 2 weeks ago
8.0 years
0 Lacs
Hyderābād
On-site
We are looking for a highly self-motivated individual for a Technical Lead role in Azure Data Engineering (Synapse, ADF, T-SQL):

Work Experience:
- 8 to 12 years of Azure data engineering experience.
- 7+ years of experience in ETL and data warehousing development.
- Experience with data modelling and ETL.
- Strong hands-on experience in ETL processes, preferably on Microsoft technologies.
- Knowledge and experience in Azure Synapse Analytics and the Azure Synapse DWH, building pipelines on the Azure Synapse platform, T-SQL, and Azure Data Factory.
- Experience in data warehouse design and maintenance is a plus.
- Experience in agile development processes using Jira and Confluence.
- Understanding of the SDLC and Agile methodologies.
- Communication with the customer and producing the daily status report.
- Good oral and written communication; proactive and adaptive.

Skills and languages: Proficiency in written and spoken English; working knowledge of another UN language would be an asset.

Expected Deliverables:
- Help IST in the data extraction process from the ERP to the data warehouse.
- Strong knowledge of T-SQL and writing stored procedures.
- Help build data models across different areas of ERP-based data sets.
- Build pipelines to integrate data flows among different sources of IT applications.
Posted 2 weeks ago
0 years
3 - 6 Lacs
Gurgaon
On-site
#freepost
Designation: Middleware Administrator (L1)
Location: Gurugram
Qualification: B.E. / B.Tech / BCA
Roles and Responsibilities
Monitor application response times from the end-user perspective in real time and alert the organization when performance is unacceptable. By alerting the user to problems and intelligently segmenting response times, the monitoring tool should quickly expose problem sources and minimize the time necessary for resolution.
It should allow specific application transactions to be captured and monitored separately. This allows administrators to select the most important operations within business-critical applications to be measured and tracked individually.
It should use baseline-oriented thresholds to raise alerts when application response times deviate from acceptable levels (a minimal monitoring sketch follows this listing). This allows IT administrators to respond quickly to problems and minimize the impact on service delivery.
Shutdown and start-up of applications, generation of MIS reports, monitoring of application load, user account management, script execution, analysing system events, monitoring of error logs, etc.
Monitoring of applications, including Oracle Forms 10g, Oracle SSO 10g, OID 10g, Oracle Portal 10g, Oracle Reports 10g, Internet Application Server (OAS) 10.1.2.2.0, Oracle Web Server (OWS) 10.1.2.2.0, Oracle WebCenter Portal 12.2.1.3, Oracle Access Manager 12.2.1.3, Oracle Internet Directory 12.2.1.3, Oracle WebLogic Server 12.2.1.3, Oracle HTTP Server 12.2.1.3, Oracle ADF 12.2.1.3 (Fusion Middleware), Oracle Forms 12.2.1.3, Oracle Reports 12.2.1.3, mobile apps, Windows IIS, portal, web cache, BizTalk applications, DNS applications, Tomcat, etc.
Job Type: Full-time
Work Location: In person
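For illustration only, here is a minimal sketch of the baseline-oriented response-time alerting described above, written in Python. The endpoint URL, sample size, and deviation threshold are hypothetical placeholders; a real deployment would use the organization's monitoring suite rather than a hand-rolled script.

```python
# Minimal sketch of baseline-oriented response-time alerting: build a
# baseline, then flag responses that deviate beyond a threshold.
# The URL, sample size, and threshold are hypothetical placeholders.
import statistics
import time
import requests

URL = "https://portal.example.com/health"  # hypothetical endpoint
BASELINE_SAMPLES = 20
SIGMA_THRESHOLD = 3.0  # alert beyond 3 standard deviations

def measure() -> float:
    resp = requests.get(URL, timeout=10)
    resp.raise_for_status()
    return resp.elapsed.total_seconds()

def main() -> None:
    # Build a simple baseline from an initial set of measurements.
    baseline = [measure() for _ in range(BASELINE_SAMPLES)]
    mean = statistics.mean(baseline)
    stdev = statistics.pstdev(baseline) or 0.001
    while True:
        elapsed = measure()
        if elapsed > mean + SIGMA_THRESHOLD * stdev:
            # In production this would page an operator or raise a ticket.
            print(f"ALERT: response {elapsed:.2f}s deviates from baseline {mean:.2f}s")
        time.sleep(60)

if __name__ == "__main__":
    main()
```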
Posted 2 weeks ago
3.0 years
6 - 8 Lacs
Chennai
Remote
Job Title: Azure Data Engineer
Experience: 3+ years
Location: Remote
Job Description:
3+ years of experience as a Data Engineer with strong Azure expertise
Proficiency in Azure Data Factory (ADF) and Azure Blob Storage
Working knowledge of SQL and data modeling principles
Experience working with REST APIs for data integration (see the sketch after this listing)
Hands-on experience with Snowflake data warehouse
Exposure to GitHub and Azure DevOps for CI/CD and version control
Understanding of DevOps concepts as applied to data workflows
Azure certification (e.g., DP-203) is highly desirable
Strong problem-solving and communication skills
Speak with Employer:
Mobile Number: 96293 11599
Email-ID: aswini.u@applogiq.org
Job Types: Full-time, Permanent
Benefits: Health insurance
Schedule: Day shift
Work Location: In person
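To illustrate the REST-API-to-Blob-Storage integration pattern this role describes, here is a minimal Python sketch. The API URL, container name, and blob path are hypothetical placeholders, and landing raw JSON for an ADF storage-event trigger is one common design, not the employer's stated one.

```python
# Minimal sketch: pull records from a REST API and land them in Azure Blob
# Storage for downstream ADF/Snowflake processing. The API URL, container,
# and connection string are hypothetical placeholders.
import json
import os
from datetime import datetime, timezone

import requests
from azure.storage.blob import BlobServiceClient

API_URL = "https://api.example.com/v1/orders"  # hypothetical source

def ingest() -> None:
    records = requests.get(API_URL, timeout=30).json()

    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    )
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    blob = service.get_blob_client(
        container="raw", blob=f"orders/orders_{stamp}.json"
    )
    # Landing raw JSON lets ADF pick the file up with a storage event trigger.
    blob.upload_blob(json.dumps(records), overwrite=True)

if __name__ == "__main__":
    ingest()
```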
Posted 2 weeks ago
3.0 years
12 Lacs
India
Remote
Dear Candidates,
Greetings from ADAPTON!
Role: Azure Data Engineer
Working Hours: 01.00 PM to 10.00 PM (Hybrid)
Working Days: 5 days a week
Salary: As per your current CTC
Work Location: Guindy, Chennai
Job Summary:
We are seeking a skilled Azure Data Engineer to design, implement, and optimize data solutions on Microsoft Azure. The ideal candidate will have strong expertise in Azure data services, data pipelines, and data integration while ensuring high availability, performance, and security.
Key Responsibilities:
Design and develop data pipelines using Azure Data Factory (ADF) and Azure Databricks (see the sketch after this listing).
Implement data storage solutions using Azure Data Lake, Azure SQL Database, and Synapse Analytics.
Optimize and manage ETL/ELT processes to ensure data integrity and performance.
Build and maintain data models, ensuring scalability and efficiency.
Work with Azure Stream Analytics for real-time data processing.
Implement and manage Azure Blob Storage, Cosmos DB, and other Azure data services.
Develop and maintain Azure SQL, Synapse Analytics, and Power BI dashboards as needed.
Ensure data security, compliance, and governance (GDPR, HIPAA, etc.).
Collaborate with data scientists, analysts, and business teams to provide data solutions.
Monitor and troubleshoot Azure data services for performance and reliability.
Required Skills & Qualifications:
3+ years of experience in Azure data engineering.
Strong knowledge of Azure Data Factory, Azure Synapse, Azure Data Lake, and Databricks.
Proficiency in SQL, Python, or Scala for data processing.
Experience with ETL/ELT frameworks and data transformation techniques.
Understanding of Azure DevOps, CI/CD, and Infrastructure as Code (Terraform, Bicep).
Knowledge of big data technologies (Spark, Delta Lake, etc.).
Experience with real-time and batch processing solutions.
Strong problem-solving skills and ability to optimize query performance.
Perks & Benefits: Competitive salary | Flexible work environment (hybrid options) | Career growth
--
Thanks and Regards,
Prathipa V
Senior HR
prathipahr03@gmail.com
Job Types: Full-time, Permanent
Pay: Up to ₹1,200,000.00 per year
Benefits:
Flexible schedule
Health insurance
Work from home
Schedule: Day shift, Monday to Friday, morning shift
Supplemental Pay: Yearly bonus
Education: Bachelor's (Required)
Experience:
ADF: 3 years (Required)
Databricks: 3 years (Required)
Data Lake: 3 years (Required)
Work Location: In person
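As a sketch of the Databricks-plus-Delta-Lake pipeline work this role lists, here is a minimal PySpark batch transform. The storage paths and column names are hypothetical placeholders, and the raw-to-silver layering shown is a common convention rather than this employer's stated architecture (Delta support is assumed, as on Databricks runtimes).

```python
# Minimal sketch: a Databricks-style batch transform with PySpark and
# Delta Lake. Storage paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-silver").getOrCreate()

RAW_PATH = "abfss://raw@datalake.dfs.core.windows.net/orders/"       # hypothetical
SILVER_PATH = "abfss://silver@datalake.dfs.core.windows.net/orders/"

# Read the raw landing zone, standardise types, and drop obvious bad rows.
raw = spark.read.json(RAW_PATH)
silver = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("order_id").isNotNull())
       .dropDuplicates(["order_id"])
)

# Delta format gives ACID writes and time travel for downstream consumers.
silver.write.format("delta").mode("overwrite").save(SILVER_PATH)
```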
Posted 2 weeks ago
0 years
9 - 10 Lacs
Bengaluru
On-site
Location: Bengaluru, KA, IN
Company: ExxonMobil
About us
At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world's largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by the pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society's evolving needs. Learn more about our What and our Why and how we can work together.
ExxonMobil's affiliates in India
ExxonMobil's affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil's affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil's LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil's operations around the globe. ExxonMobil strives to make a positive contribution to the communities where we operate and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.
What role you will play in our team
Design, build, and maintain data systems, architectures, and pipelines to extract insights and drive business decisions. Collaborate with stakeholders to ensure data quality, integrity, and availability.
What you will do
Support the development and ownership of ETL pipelines within cloud data platforms
Automate data extraction and transformation pipelines using Python/Airflow/Azure Data Factory/Qlik/Fivetran (see the orchestration sketch after this posting)
Deliver a task monitoring and notification system for data pipeline status
Support data cleansing, enrichment, and curation activities to enable ongoing business use cases
Develop and deliver data pipelines through a CI/CD delivery methodology
Develop monitoring around pipelines to ensure uptime of data flows
Optimize and refine current queries against Snowflake
Work with Snowflake, MSSQL, Postgres, Oracle, Azure SQL, and other relational databases
Work with different cloud databases such as Azure SQL, Azure PostgreSQL, etc.
Work with Change-Data-Capture ETL software, such as Qlik and Fivetran, to populate Snowflake
Identify and remediate failed and long-running queries
Develop large aggregate queries across a multitude of schemas
About You
Skills and Qualifications
Experience with data processing/analytics and ETL data transformation
Proficient in ingesting data to/from Snowflake and Azure storage accounts
Proficiency in at least one of the following languages: Python, C#, C++, F#, Java
Proficiency in SQL and NoSQL databases
Knowledge of SQL query development and optimization
Demonstrated experience with Snowflake, Qlik Replicate, Fivetran, and Azure Data Explorer
Azure cloud experience (current/future) with ADX, ADF, and Databricks
Expertise with Airflow, Qlik, Fivetran, and Azure Data Factory
Management of Snowflake through dbt scripting
Solid understanding of data strategies, including data management, data curation, and data governance
Ability to quickly build relationships and credibility with business customers and agile teams
A passion for learning about and experimenting with new technologies
Confidence in creating and delivering technical presentations and training
Excellent organization and planning skills
Your benefits
An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you:
Competitive compensation
Medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits
Retirement benefits
Global networking & cross-functional opportunities
Annual vacations & holidays
Day care assistance program
Training and development program
Tuition assistance program
Workplace flexibility policy
Relocation program
Transportation facility
Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company's eligibility guidelines.
Stay connected with us
Learn more about ExxonMobil in India: visit ExxonMobil India and Energy Factor India.
Follow us on LinkedIn and Instagram
Like us on Facebook
Subscribe to our channel on YouTube
EEO Statement
ExxonMobil is an Equal Opportunity Employer: All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.
Business solicitation and recruiting scams
ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the LINK to understand more about recruitment scams in the name of ExxonMobil.
Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil.
For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.
Job Segment: Sustainability, Cloud, Database, SQL, Embedded, Energy, Technology
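Since the role above calls for Airflow-driven pipeline automation with failure notification, here is a minimal Airflow 2.x-style DAG sketch in Python. The DAG id, connection logic, and alert hook are hypothetical placeholders (the `schedule` argument assumes Airflow 2.4+), and this is an illustration of the pattern rather than ExxonMobil's implementation.

```python
# Minimal sketch: an Airflow DAG orchestrating a daily Snowflake load with
# a failure notification, in the spirit of the monitoring duties above.
# Task logic and the alert hook are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load(**context):
    # Placeholder: pull the daily increment and load it into Snowflake,
    # e.g. via the Snowflake provider or by triggering an external CDC tool.
    print("loading increment for", context["ds"])

def notify_failure(context):
    # Placeholder: push to Teams/Slack/email when any task fails.
    print("pipeline failed:", context["task_instance"].task_id)

with DAG(
    dag_id="daily_snowflake_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=10),
        "on_failure_callback": notify_failure,
    },
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```

Retries plus an `on_failure_callback` give the "task monitoring and notification" behaviour the posting asks for without a separate watcher process.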
Posted 2 weeks ago
0 years
5 - 8 Lacs
Bengaluru
On-site
Company Description
WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.
Job Description
Experience in Microsoft SQL Server database development (T-SQL)
Experience in building SSIS packages
Good experience in creating LLDs
Experience delivering solutions utilizing the entire Microsoft BI stack (SSAS, SSIS)
Experience with SQL Server/T-SQL programming in the creation and optimization of stored procedures, triggers and user-defined functions
Experience working in a data warehouse environment and a strong understanding of dimensional data modeling concepts
Must be able to build Business Intelligence solutions in a collaborative, agile development environment
Strong understanding of data ingestion, data processing, orchestration, parallelization, transformation and ETL fundamentals
Sound knowledge of data analysis using any SQL tools
Experience in ADF, Synapse and other Azure components
Designs, develops, automates, and supports complex applications to extract, transform, and load data
Should have knowledge of error handling and performance tuning for data pipelines (a minimal error-handling sketch follows this listing)
Skills: SQL, SSIS, ADF, T-SQL, ETL & DW, good communication
Qualifications
Graduate
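To illustrate the error-handling requirement above, here is a minimal Python sketch of a retry wrapper with structured logging around an ETL step. The step name, retry counts, and backoff policy are hypothetical placeholders; in practice the step might run an SSIS package or a T-SQL stored procedure.

```python
# Minimal sketch: defensive error handling around an ETL step, with retries
# and structured logging. The step and retry policy are hypothetical.
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def load_fact_sales() -> None:
    # Placeholder for the real work, e.g. executing an SSIS package via
    # dtexec, or running a T-SQL stored procedure through pyodbc.
    raise ConnectionError("transient source outage")  # simulate a failure

def run_with_retries(step, attempts: int = 3, backoff_s: float = 30.0) -> None:
    for attempt in range(1, attempts + 1):
        try:
            step()
            log.info("step %s succeeded on attempt %d", step.__name__, attempt)
            return
        except Exception:
            log.exception("step %s failed (attempt %d/%d)", step.__name__, attempt, attempts)
            if attempt == attempts:
                raise  # surface to the orchestrator so the run is marked failed
            time.sleep(backoff_s * attempt)  # linear backoff between retries

if __name__ == "__main__":
    run_with_retries(load_fact_sales)
```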
Posted 2 weeks ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Primary skills: Technology->AWS->DevOps; Technology->Cloud Integration->Azure Data Factory (ADF); Technology->Cloud Platform->AWS Database; Technology->Cloud Platform->Azure DevOps->Azure Pipelines; Technology->DevOps->Continuous Integration - Mainframe
A day in the life of an Infoscion
As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design and deployment.
You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs.
You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements.
You will support configuring solution requirements on the products; identify any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives.
You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
Awareness of the latest technologies and trends
Logical thinking and problem-solving skills, along with an ability to collaborate
Ability to assess current processes, identify improvement areas and suggest technology solutions
Knowledge of one or two industry domains
Posted 2 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
DATA ENGINEER II
3+ years of experience in building data pipelines with Python/PySpark (a minimal example follows below)
Professional experience with the Azure ETL stack (e.g., ADLS, ADF, ADB, Azure SQL/Synapse)
3+ years of experience with SQL
Proficient understanding of code versioning tools such as Git and project management tools such as Jira
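Here is a minimal sketch of the kind of PySpark pipeline this role describes: read from ADLS, aggregate with Spark SQL, and write the results back. The storage paths, view name, and columns are hypothetical placeholders.

```python
# Minimal sketch: a PySpark batch job — read from ADLS, aggregate with
# Spark SQL, write results back. Paths and columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("daily-sales-rollup").getOrCreate()

events = spark.read.parquet(
    "abfss://curated@lakehouse.dfs.core.windows.net/sales_events/"  # hypothetical
)
events.createOrReplaceTempView("sales_events")

rollup = spark.sql("""
    SELECT region,
           to_date(event_ts)        AS sale_date,
           SUM(amount)              AS total_amount,
           COUNT(DISTINCT order_id) AS order_count
    FROM sales_events
    GROUP BY region, to_date(event_ts)
""")

# Partitioning by date keeps downstream incremental reads cheap.
rollup.write.mode("overwrite").partitionBy("sale_date").parquet(
    "abfss://marts@lakehouse.dfs.core.windows.net/daily_sales/"
)
```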
Posted 2 weeks ago
The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.
Here are 5 major cities in India where there is a high demand for ADF professionals:
- Bangalore
- Hyderabad
- Pune
- Chennai
- Mumbai
The estimated salary range for ADF professionals in India varies based on experience levels:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.
In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.
Here are sample interview questions for ADF roles, categorized by difficulty level:
Basic:
- What is ADF and what are its key features?
- What is the difference between ADF Faces and ADF Task Flows?
Medium:
- Explain the lifecycle of an ADF application.
- How do you handle exceptions in ADF applications?
Advanced:
- Discuss the advantages of using ADF Business Components.
- How would you optimize performance in an ADF application?
As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!