5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview

Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.

- Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
- Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
- Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
- Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
- Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
- Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
- Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
- Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
- Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities

- Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications

- 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
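The self-healing and automated issue remediation mentioned above typically starts with retrying transient failures before escalating to incident response. A minimal sketch in plain Python (the flaky extraction task and its failure mode are invented for illustration, not part of the posting):

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=1.0):
    """Run a pipeline task, retrying transient failures with
    exponential backoff -- a minimal form of 'self-healing'."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == max_attempts:
                raise  # retries exhausted: escalate to incident response
            delay = base_delay * 2 ** (attempt - 1)
            print(f"attempt {attempt} failed ({exc}); retrying in {delay:.2f}s")
            time.sleep(delay)

# Hypothetical flaky extraction step: fails twice, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return ["row1", "row2"]

print(run_with_retries(flaky_extract, base_delay=0.01))  # -> ['row1', 'row2']
```

In a real DataOps setup the retry policy would live in the orchestrator (e.g., ADF activity retry settings) rather than application code; the sketch only shows the underlying idea.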
Posted 3 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description

Senior Engineer, Data Modeling
Gurgaon/Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics team (IDA), is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market.

As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team's efforts towards creating, enhancing, and stabilizing the enterprise data lake through the development of data pipelines. This role requires a team player who can work well with members from other disciplines to deliver data in an efficient and strategic manner.

What You'll Be Doing

What will your essential responsibilities include?

- Act as a data engineering expert and partner to Global Technology and data consumers in controlling the complexity and cost of the data platform, while enabling performance, governance, and maintainability of the estate.
- Understand current and future data consumption patterns and architecture (at a granular level), and partner with Architects to ensure optimal design of data layers.
- Apply best practices in data architecture: for example, the balance between materialization and virtualization, optimal levels of de-normalization, caching and partitioning strategies, choice of storage and querying technology, and performance tuning.
- Lead hands-on research into new technologies, formulating frameworks for assessing new technology against business benefit and its implications for data consumers.
- Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform.
- Design prototypes and work in a fast-paced, iterative solution-delivery model.
- Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables; use Harness for the deployment pipeline.
- Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
- Diagnose system performance issues related to data processing and implement solutions to address them.
- Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
- Maintain integrity and quality across all pipelines and environments.
- Understand and follow secure coding practices to ensure code is not vulnerable.

You will report to the Application Manager.

What You Will BRING

We're looking for someone who has these abilities and skills:

Required Skills And Abilities

- Effective communication skills.
- Bachelor's degree in computer science, mathematics, statistics, finance, a related technical field, or equivalent work experience.
- Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills.
- Relevant years of programming experience using Databricks.
- Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
- Solid knowledge of network and firewall concepts.
- Solid experience writing, optimizing, and analyzing SQL.
- Relevant years of experience with Python.
- Ability to break down complex data requirements and architect solutions into achievable targets.
- Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile.
- Experience using Harness.
- Technical lead responsible for both individual and team deliveries.

Desired Skills And Abilities

- Experience in big data migration projects.
- Experience with performance tuning at both the database and big data platform levels.
- Ability to interpret complex data requirements and architect solutions.
- Distinctive problem-solving and analytical skills combined with robust business acumen.
- Excellent grasp of the basics of Parquet files and Delta files.
- Effective knowledge of the Azure cloud computing platform.
- Familiarity with reporting software - Power BI is a plus.
- Familiarity with DBT is a plus.
- Passion for data and experience working within a data-driven organization.
- You care about what you do, and what we do.

Who WE Are

AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business - property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What We OFFER

Inclusion

AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That's why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. It's about helping one another - and our business - to move forward and succeed.

- Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 chapters around the globe
- Robust support for Flexible Working Arrangements
- Enhanced family-friendly leave benefits
- Named to the Diversity Best Practices Index
- Signatory to the UK Women in Finance Charter

Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards

AXA XL's Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We're committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability

At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called "Roots of resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars

- Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We're committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
- Addressing climate change: The effects of a changing climate are far reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
- Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
- AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's "Hearts in Action" programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving.

For more information, please see axaxl.com/sustainability.
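The Delta-table ETL work described in this posting typically centers on MERGE-style upserts, so loads are idempotent on reruns. As a stand-in sketch of that pattern, here is the upsert idea expressed with stdlib SQLite rather than the actual Databricks `MERGE INTO` on a Delta table; the table and columns are invented for illustration:

```python
import sqlite3

# Sketch of the MERGE/upsert pattern behind a Delta-table load,
# illustrated with in-memory SQLite (Databricks would instead run
# `MERGE INTO target USING updates ...` against a Delta table).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (policy_id TEXT PRIMARY KEY, premium REAL)")

def upsert_batch(rows):
    """Insert new keys, update existing ones -- rerunning a batch is safe."""
    conn.executemany(
        """INSERT INTO policies (policy_id, premium) VALUES (?, ?)
           ON CONFLICT(policy_id) DO UPDATE SET premium = excluded.premium""",
        rows,
    )

upsert_batch([("P1", 100.0), ("P2", 250.0)])
upsert_batch([("P2", 300.0), ("P3", 50.0)])   # P2 updated, P3 inserted
print(conn.execute("SELECT * FROM policies ORDER BY policy_id").fetchall())
# -> [('P1', 100.0), ('P2', 300.0), ('P3', 50.0)]
```

The design point carries over directly: because the merge is keyed, replaying yesterday's batch after a failed job does not duplicate rows.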
Posted 3 weeks ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description

- Manage Azure Infrastructure: Optimize the Azure cloud environment for high availability, scalability, and performance.
- Administer Azure Services: Manage Azure Portal, Entra ID, and authentication mechanisms.
- Configure Azure PaaS Services: Work with services such as App Service, Function Apps, Azure SQL, CosmosDB, and Azure Service Bus (ASB).
- Azure Storage and Key Vault: Implement and manage Azure Storage solutions and Key Vault for secure data handling.
- API Management: Oversee Azure API Management (APIM), including managing the Developer Portal, Postman/Bruno, and policy configuration.
- Monitoring and Performance: Set up App Insights, Log Analytics, and monitoring tools for performance optimization and troubleshooting.
- Governance & Compliance: Enforce Azure Policies to ensure governance and maintain compliance.
- Secure Application Onboarding: Assist with secure and compliant application onboarding to Azure.
- IaC Development: Develop and maintain Terraform, ARM templates, and PowerShell scripts for automating infrastructure provisioning.
- Cloud Deployment Automation: Use Azure CLI, PowerShell, and templates for automating cloud infrastructure deployments.
- Compliance & Governance Automation: Ensure adherence to Azure policies and governance frameworks via automated solutions.
- Configure Network Components: Set up and maintain VNets, subnets, Private Endpoints, and ExpressRoute circuits.
- Security Implementations: Implement firewalls, NSGs, UDRs, and DNS configurations to secure the Azure environment.
- Route Optimization & Secure Connectivity: Optimize global route management and secure connectivity.
- Authentication & Certificate Management: Manage authentication, authorization, and the lifecycle of certificates to ensure secure cloud operations.
- Use Kusto Query Language (KQL): Leverage KQL for log analysis and troubleshooting.
- Optimize PaaS Resources: Optimize Azure PaaS services for cost, scalability, and performance.
- Certificate & Key Management: Handle platform-level management of certificates and keys (e.g., CMK, development certificates).
- Collaborate with Teams: Work closely with DevOps and application teams to ensure smooth deployments and operations.

Requirements

- Experience: 8-10 years in Azure Cloud Engineering or similar roles, with deep expertise in cloud infrastructure and Azure services.
- Cloud Fundamentals: Strong foundation in cloud networking, security, storage, and authentication.
- Azure PaaS Expertise: Proficient in working with Azure PaaS services like App Services, ASE, ADF, APIM, ASB, CosmosDB, and Azure SQL.
- Automation Skills: Experience with Terraform, ARM templates, and PowerShell scripting for IaC.
- Azure CLI & Policy Enforcement: Hands-on experience with Azure CLI, policy enforcement, and cloud automation.
- Networking Expertise: Skilled in Azure networking components like VNets, NSGs, Private Endpoints, and ExpressRoute.
- Troubleshooting Expertise: Proficient in troubleshooting with tools like App Insights, Log Analytics, and KQL.
- Cloud Security & Certificate Management: Familiarity with certificate management and best practices for cloud security.

Skills: Azure PaaS and IaC
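The governance-and-compliance duties above boil down to checks like "every resource carries the mandated tags". In Azure this is enforced by Azure Policy definitions; as a hedged illustration of the shape of such a check only, here is a plain-Python version over a hypothetical inventory export (the tag names and resources are invented):

```python
# Minimal sketch of an automated governance check of the kind an
# Azure Policy definition would enforce natively: every resource
# must carry a set of required tags.
REQUIRED_TAGS = {"owner", "environment", "cost-center"}  # example policy

def non_compliant(resources):
    """Return names of resources missing any required tag."""
    return [r["name"] for r in resources
            if not REQUIRED_TAGS <= set(r.get("tags", {}))]

inventory = [  # hypothetical inventory export, not a real subscription
    {"name": "vnet-prod-01",
     "tags": {"owner": "netops", "environment": "prod", "cost-center": "cc42"}},
    {"name": "st-dev-data", "tags": {"owner": "dataops"}},
]
print(non_compliant(inventory))  # -> ['st-dev-data']
```

In practice the same rule would be expressed declaratively in a policy definition and surfaced through compliance dashboards rather than a script.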
Posted 3 weeks ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Greetings from Tata Consultancy Services

Job Openings at TCS

Skill: Azure Data Engineer
Experience range: 4-10 years
Role: Permanent
Job location: Pune
Current location: Pune
Mode of interview: Face-to-face (walk-in) interview on Saturday, 31st May 2025, at Hinjewadi Phase 3, Pune

Please find the job description below.

Keywords for search: Azure Databricks, Python, ADF, Data Engineer, Synapse

Job Description - Role and Responsibilities: Senior Data Engineer

- 5+ years of total IT experience
- Minimum 5+ years of development experience in Azure
- Must have data warehouse / data lake development experience
- Must have Azure Data Factory (ADF) and Azure SQL DB experience
- Must have Azure Databricks experience using Python, Spark, or Scala
- Nice to have data modelling and Azure Synapse experience
- Nice to have Azure ML experience
- Nice to have Power BI experience
- Nice to have Azure Data Engineer certifications
- Passion for data quality, with an ability to integrate these capabilities into the deliverables
- Prior use of big data components and the ability to rationalize and align their fit for a business case
- Experience in working with different data sources - flat files, XML, JSON, Avro files, and databases
- Knowledge of Jenkins for continuous integration and end-to-end automation for application build and deployments
- Ability to integrate into a project team environment and contribute to project planning activities
- Experience in developing implementation plans and schedules and preparing documentation for jobs according to business requirements
- Ability to lead ambiguous and complex situations to clear, measurable plans
- Proven experience and ability to work with people across the organization, skilled at managing cross-functional relationships and communicating with leadership across multiple organizations
- Proven capability for strong written and oral communication, with the ability to synthesize, simplify, and explain complex problems to different audiences
Thanks & Regards,
Priyanka
Talent Acquisition Group
Tata Consultancy Services
Posted 3 weeks ago
6.0 - 9.0 years
40 - 45 Lacs
Pune
Work from Office
- 6+ years of experience in data engineering with a focus on Azure cloud technologies.
- Strong expertise in Azure Data Factory, Databricks, ADLS, and Power BI.
- Proficiency in SQL, Python, and Spark for data processing and transformation.
- Experience with IoT data ingestion and processing, handling high-volume, real-time data streams.
- Strong understanding of data modeling, Lakehouse architecture, and medallion frameworks.
- Experience in building and optimizing scalable ETL/ELT processes.
- Knowledge of data governance, security, and compliance frameworks.
- Experience with monitoring, logging, and performance tuning of data workflows.
- Strong problem-solving and analytical skills with a platform-first mindset.
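The medallion framework this role asks for organizes a Lakehouse into bronze (raw), silver (cleansed), and gold (business-level) layers. A toy sketch of that flow, using plain Python dicts in place of Spark DataFrames on ADLS; the IoT schema and values are invented for illustration:

```python
# Toy sketch of the medallion (bronze/silver/gold) flow. Real pipelines
# would use Spark DataFrames over Delta tables, not Python lists.
bronze = [  # raw IoT readings as ingested (hypothetical schema)
    {"device": "d1", "temp_c": "21.5", "ts": "2024-01-01T00:00:00"},
    {"device": "d1", "temp_c": "bad",  "ts": "2024-01-01T00:01:00"},
    {"device": "d2", "temp_c": "19.0", "ts": "2024-01-01T00:00:30"},
]

def to_silver(rows):
    """Silver: typed, validated records; malformed readings dropped."""
    out = []
    for r in rows:
        try:
            out.append({"device": r["device"], "temp_c": float(r["temp_c"]),
                        "ts": r["ts"]})
        except ValueError:
            pass  # a real pipeline would quarantine these for inspection
    return out

def to_gold(rows):
    """Gold: business-level aggregate -- mean temperature per device."""
    agg = {}
    for r in rows:
        agg.setdefault(r["device"], []).append(r["temp_c"])
    return {d: sum(v) / len(v) for d, v in agg.items()}

print(to_gold(to_silver(bronze)))  # -> {'d1': 21.5, 'd2': 19.0}
```

The layering matters because each hop is independently re-runnable: a bad gold aggregate can be rebuilt from silver without re-ingesting raw device data.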
Posted 3 weeks ago
5.0 years
0 Lacs
New Delhi, Delhi, India
On-site
TCS Hiring for Azure Admin + Azure Platform Engineer

Experience: 5 to 8 years only
Job location: New Delhi, Kolkata, Mumbai, Pune, Bangalore

Required technical skill set: deployment through Terraform, Azure administration, Data Factory, Databricks, Active Directory, identity and access management, Unity Catalog, machine learning and AI

- 3+ years of prior product/technical support customer-facing experience
- Must have good knowledge of working in Azure cloud technical support
- Good to have technical skills and hands-on experience in the following areas: deployment through Terraform, PowerShell/CLI, identity management, Azure resource group management, and Azure PaaS services (e.g., ADF, Databricks, Storage Account)
- Understanding of machine learning and AI concepts related to infrastructure
- Unity Catalog end-to-end process to migrate from Hive metastore to Unity Catalog
- Excellent team player with good interpersonal and communication skills
- Experience in the Life Science and Healthcare domain preferred

Roles & Responsibilities:

- Resource group creation along with various component deployments using Terraform templates
- Management of user access in Azure PaaS products such as Azure SQL, WebApp, App Service, Storage Account, Databricks, and Data Factory
- Creation of service principals/AD groups and managing access to various applications using them
- Troubleshooting issues regarding access, data visualizations, and permissions

Kind Regards,
Priyankha M
Posted 3 weeks ago
4.0 - 9.0 years
6 - 14 Lacs
Hyderabad
Remote
Job Description

Job location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai (preferred: Hyderabad)

- At least 4+ years of relevant hands-on development experience in an Azure Data Engineering role
- Proficient in Azure technologies such as ADB (Azure Databricks), ADF, SQL (capable of writing complex SQL queries), PySpark, Python, Synapse, Delta tables, and Unity Catalog
- Hands-on in Python, PySpark, or Spark SQL
- Hands-on in Azure analytics and DevOps
- Taking part in Proof of Concepts (POCs) and pilot solution preparation
- Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows
- Experience in business process mapping of data and analytics solutions
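Data profiling, as listed above, usually means computing per-column statistics (null counts, distinct counts, and so on) before designing a data flow. A minimal hedged sketch in plain Python over a hypothetical extract (real work would profile Spark DataFrames or use a catalog tool):

```python
# Minimal column-level data profiling: null and distinct counts per
# column, of the sort used to inform technical design of data flows.
def profile(rows):
    cols = {k for r in rows for k in r}
    report = {}
    for c in sorted(cols):
        values = [r.get(c) for r in rows]
        non_null = [v for v in values if v is not None]
        report[c] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return report

sample = [  # hypothetical extract, for illustration only
    {"id": 1, "city": "Hyderabad"},
    {"id": 2, "city": None},
    {"id": 3, "city": "Hyderabad"},
]
print(profile(sample))
# -> {'city': {'nulls': 1, 'distinct': 1}, 'id': {'nulls': 0, 'distinct': 3}}
```

Even this tiny report answers design questions directly: `city` has low cardinality (a candidate dimension or partition key) and contains nulls that downstream logic must handle.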
Posted 3 weeks ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Job Description:

- Customize and configure Oracle Fusion modules as per business requirements.
- Develop and modify reports (BIP, OTBI, FRS, Hyperion Smart View), interfaces, extensions (Page Composer, Application Composer with Groovy scripting, Process Composer), workflows (Oracle BPM, AMX), forms (ADF, Java-based), VBCS, and page customizations to enhance functionality.
- Integrate Oracle Fusion applications with other business systems and third-party applications (Oracle Integration Cloud).
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
What is Blend Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com What is the Role As a Senior Data Engineer, your role is to spearhead the data engineering teams and elevate the team to the next level! You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles deploying AGILE if possible and create the foundations for good data stewardship with our new data products !You will also set up a solid code framework that needs to be built to purpose yet have enough flexibility to adapt to new business use cases tough but rewarding challenge! What you’ll be doing? Collaborate with several stakeholders to deeply understand the needs of data practitioners to deliver at scale Lead Data Engineers to define, build and maintain Data Platform Work on building Data Lake in Azure Fabric processing data from multiple sources Migrating existing data store from Azure Synapse to Azure Fabric Implement data governance and access control Drive development effort End-to-End for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards. Present technical solutions, capabilities, considerations, and features in business terms. 
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives, such as Data Discovery, Data Lineage and Data Quality
- Lead the team and mentor junior resources
- Help your team members grow in their role and achieve their career aspirations
- Build data systems, pipelines, analytical tools and programs
- Conduct complex data analysis and report on results

What do we need from you?
- 5+ years of experience as a data engineer or similar role in Azure Synapse, ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or similar field
- Must have experience executing projects end to end; at least one data engineering project should have used Azure Synapse, ADF or Azure Fabric
- Should be experienced in handling multiple data sources
- Technical expertise with data models, data mining, and segmentation techniques
- Deep understanding, both conceptually and in practice, of at least one object-oriented library (Python, PySpark)
- Strong SQL skills and a good understanding of existing SQL warehouses and relational databases
- Strong Spark, PySpark, and Spark SQL skills and a good understanding of distributed processing frameworks
- Experience building large-scale batch and real-time data pipelines
- Ability to work independently and mentor junior resources
- Desire to lead and develop a team of Data Engineers across multiple levels
- Experience or knowledge in Data Governance
- Azure Cloud experience with data modeling, CI/CD, Agile methodologies, Docker/Kubernetes

What do you get in return?
- Competitive Salary: Your skills and contributions are highly valued here, and we make sure your salary reflects that, rewarding you fairly for the knowledge and experience you bring to the table.
- Dynamic Career Growth: Our vibrant environment offers you the opportunity to grow rapidly, providing the right tools, mentorship, and experiences to fast-track your career.
- Idea Tanks: Innovation lives here.
Our "Idea Tanks" are your playground to pitch, experiment, and collaborate on ideas that can shape the future.
- Growth Chats: Dive into our casual "Growth Chats" where you can learn from the best. Whether it's over lunch or during a laid-back session with peers, it's the perfect space to grow your skills.
- Snack Zone: Stay fueled and inspired! In our Snack Zone, you'll find a variety of snacks to keep your energy high and ideas flowing.
- Recognition & Rewards: We believe great work deserves to be recognized. Expect regular Hive-Fives, shoutouts and the chance to see your ideas come to life as part of our reward program.
- Fuel Your Growth Journey with Certifications: We're all about your growth groove! Level up your skills with our support as we cover the cost of your certifications.
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
TCS Hiring for Azure Admin + Azure Platform Engineer

Experience: 5 to 8 Years Only
Job Location: New Delhi, Kolkata, Mumbai, Pune, Bangalore

Required Technical Skill Set: Deployment through Terraform, Azure Administration, Data Factory, Databricks, Active Directory, Identity, Unity Catalog, Machine Learning, AI, and Access Management

- 3+ years of prior product/technical support customer-facing experience
- Must have good knowledge working in Azure cloud technical support
- Good to have technical skills and hands-on experience in the following areas: deployment through Terraform, PowerShell/CLI, Identity Management, Azure Resource Group Management, Azure PaaS services (e.g. ADF, Databricks, Storage Account)
- Understanding of machine learning and AI concepts related to infrastructure
- Unity Catalog end-to-end process to migrate from Hive to UC
- Excellent team player with good interpersonal and communication skills
- Experience in the Life Science and Healthcare domain preferred

Roles & Responsibilities:
- Resource Group creation along with various component deployment using Terraform templates
- Management of user access in Azure PaaS products such as Azure SQL, WebApp, App Service, Storage Account, Databricks, Data Factory
- Creation of Service Principals/AD groups and managing application access through them
- Troubleshoot issues regarding access, data visualizations, and permissions

Kind Regards,
Priyankha M
Posted 3 weeks ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Position: Enterprise Architect / Solution Architect / Cloud Pre-Sales Solution Architect
Job Type: Permanent
Location: Pune
Experience: 10+ Years

Roles and Responsibilities
- Design, develop and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements, implementing solutions that meet those needs, and creating solutions that balance technology risks against business delivery, driving consistency.
- A seasoned IT professional with + years of IT experience, currently working as a Solution Architect in a Technologies COE/Practice Team, understanding business requirements and designing solutions based on multiple clouds: Azure, AWS, Google. AWS Certified Associate/Professional Architect.
- As AWS Architect for Belron-Safelite: produced target designs for 80 applications based on the AS-IS designs, signed off after presenting the target designs to stakeholders. Won appreciation from the customer, particularly on the networking side, for using different AWS networking services.
- As AWS Architect, led a team of developers to configure the infrastructure using CFT; used AWS CFT to spin up the EKS cluster for deploying a critical application.
- Azure Cloud T&T: TDDs for 21 different Azure Data Analytics services approved by the customer and implemented, e.g. ADF, Event Grid, Event Hub, Synapse, etc. Provisioned different Azure Analytics services for customers using IaC tools.
- Transitioned 8-9 Fortune 500 customers in Azure, AWS and Google Cloud.
- Implemented Cloudera Big Data Hadoop/Anaconda software for Malaysia's biggest financial customer.
- Transition Lead for a 220-node Big Data Hadoop cluster for one of the biggest financial customers in the USA.
- Managing an Azure Landing Zone implementation for an EU customer.
- Engaged with GTM/Sales/Pre-Sales teams for technical expertise. Azure Cloud strategy consultant.
- Led T&T (Transitions & Transformations) for Azure PaaS services for customers across the globe.
- Responsible for building the Big Data Hadoop practice team. Expert in Azure PaaS Data & Analytics services.
- Involved in propositions, pre-sales, TDD, HLD, LLD, solutioning and designing, architecture, effort estimation, RFP/RFI, T&T (Transition & Transformation), and a core member of the interview panel for Big Data Analytics and cloud technologies.
- Led the team to implement different Azure PaaS Data & Analytics services.
- Rich experience in preparing deployment plans for different Azure PaaS services and obtaining customer approval to provision the services. Worked closely with the IaC team to execute the deployment plan.
- Rich technical experience in architecting, designing and implementing cloud-based data platform and analytics services. Currently spearheading the delivery of Azure Data Lake and modern data warehouse solutions.
- Developing solutions, planning, and creating and delivering compelling proof-of-concept demonstrations.
- Possess professional IT delivery management experience with strong work ethics, approachability and consistent commitment to team leadership and innovation. Responsible for driving teamwork, communication, collaboration, and commitment within IT and across teams.
- Providing and implementing suggestions on cost optimization of client infrastructure.
- Working on various Microsoft Azure services such as Azure Virtual Machines, Azure Networking, Azure Storage, Azure Migrate, Azure DevOps, Azure Data Lake, Azure Synapse Analytics, Azure Stream Analytics, Azure Databricks, Azure Backup and Azure Active Directory.
- Configuring Azure Firewall, Application Gateway with WAF, load balancers and Traffic Manager to manage the security of the workload virtual network.
- Managing and implementing roles, users, groups, RBAC, MFA, and Conditional Access policies in Azure AD.
- Working with various DevOps tools such as Repos, Dashboards, and GitHub for version control, and with containers: Docker and Kubernetes. Managing pods, ReplicaSets, deployments, and services in a Kubernetes cluster.
- Building POC environments in Google and IBM Cloud. Provisioning different resources/resource groups via Terraform.
- Worked as Mainframe Consultant with Tech Mahindra (Satyam Computers Ltd.) for EU clients to implement ChangeMan/Vision Plus.
- Expertise in troubleshooting production issues, log analysis, and performance monitoring.
- Excellent knowledge of ITIL processes like Incident, Problem, Change, Release, and Availability Management.
- Worked on various Service Management tools like Remedy/ServiceNow for Problem and Incident Management.
- Responsible for Transition and Transformation of Hadoop projects.
- Responsible for various Big Data Analytics and cloud propositions.

Big Data Hadoop with the globe's biggest financial customers: Hadoop | HBase | Hive | Looker | Neo4j | OpenShift | Kubernetes | Docker | Rundeck | Prometheus | AWS | Azure | Shell | Python | Architect | Implementation | Troubleshooting | Solution
Posted 3 weeks ago
12.0 - 20.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
TCS Hiring for Hybrid Cloud Data Solutions Architect (GCP, Azure, AWS) - Kolkata / Hyderabad

Experience: 12 to 20 Years Only
Job Location: Kolkata / Hyderabad Only

Required Technical Skill Set:
- Experience: Minimum of 8 years in data architecture or 10 years in engineering roles related to hybrid/multi-cloud solutions, with a proven track record of successful client engagements.
- Technical Skills: Proficiency in two or more cloud platforms (AWS, Azure, GCP) and related data services: Redshift, BigQuery, Synapse, Databricks, ADF, Glue, AWS EMR, Azure Insights, GCP Dataproc, GCP Dataflow, etc. Experience in architecting applications leveraging containerization (Docker, Kubernetes), cloud-native models (IaaS, PaaS, SaaS), hybrid/multi-cloud, hardware OEMs, network, security, microservices, FinOps, iPaaS and APIs, Infrastructure as Code (IaC) tools (Terraform, CloudFormation), and CI/CD pipelines. Strong knowledge of enterprise architecture principles.
- Communication Skills: Excellent communication abilities to engage effectively with both technical and non-technical stakeholders and to articulate technical concepts.

Desirable Skill Set:
- Knowledge of specific industry verticals (e.g., BFSI, Healthcare, Manufacturing, Telecom/Media).
- Technical certifications related to cloud computing (e.g., AWS Certified Solutions Architect, Microsoft Certified: Azure Solutions Architect Expert). Relevant cloud certifications are preferred; must obtain certification within 90 days of employment.
- Understanding of DevOps concepts.
- Ability to lead cross-functional teams effectively.

Key Responsibilities:
Strategy & Design:
- Develop a comprehensive data strategy for multi/hybrid cloud scenarios aligned with business goals.
- Design scalable, secure, and cost-effective data solutions.
- Evaluate and select cloud platforms (AWS, Azure, GCP, OCI, IBM, Nutanix, Neo Cloud, etc.)
and third-party tools.
- Develop blueprints and roadmaps, and drive implementation of data architecture and framework-related epics/user stories.
- Data modeling based on the business use cases.

Solution Design:
- Design the data ingestion layer and data movement from the ingestion layer to the operational/analytical layers.
- Design the data consumption layer (visualization, analytics, AI/ML, outbound data).
- Design the data governance track: framework design for data quality, data security, metadata, etc.
- Architect tailored cloud solutions that leverage best practices and meet specific client requirements, utilizing native data services from AWS, Azure, and Google Cloud.
- Ability to understand data pipelines and modern ways of automating data pipelines using cloud-based and on-premise technologies.
- Good knowledge of any RDBMS/NoSQL database with strong SQL writing skills.
- Good understanding of ML and AI concepts; propose solutions to automate processes.

Technical Presentations: Conduct workshops and presentations to demonstrate solution feasibility and value, fostering trust and engagement with stakeholders.

Proof of Concept (POC): Lead the design and implementation of POCs to validate proposed solutions and products against features and cost.

Implementation & Management: Guide technical solution development in engagements related to legacy modernization and the migration of applications and infrastructure to hybrid cloud, engineered cloud, etc. Guide and mentor the data development squads and review the deliverables, as required.

Kind Regards,
Priyankha M
Posted 3 weeks ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Role: Azure Data Engineer
Experience: 8-10 yrs
Location: Kolkata

Must-Have: ETL, Azure Data Factory, SSRS, MS Fabric, Python, PowerShell

Responsibility of / Expectations from the Role
1. Azure Data Engineer
2. Develop full SDLC project plans to implement ETL solutions and identify resource requirements; good knowledge of SQL Server (complex queries, joins, etc.)
3. REST API, ADF pipelines, MS Fabric
4. SSIS and Azure Data Factory based ETL architecture
5. Good exposure to client communication and supporting requests from customers
Posted 3 weeks ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: Data Engineer
Location: Bangalore
Type: Full-time
Experience: 10+ yrs
Notice: Immediate

Job Description – Data Engineer (Azure, ADF, Databricks, PySpark, SCD, Unity Catalog, SQL)

Role Overview: Looking for a highly skilled, experienced Data Engineer with expertise in Azure Data Factory (ADF), Azure Databricks, Delta Tables, Unity Catalog, Slowly Changing Dimension Type 2 (SCD2), and PySpark. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines and ETL workflows while ensuring data integrity, scalability, and security within the Azure ecosystem.

Key Responsibilities:
- Develop and optimize data pipelines using Azure Data Factory (ADF) and Azure Databricks for large-scale data processing.
- Implement Slowly Changing Dimension Type 2 (SCD2) in Delta Tables to manage historical data changes effectively.
- Leverage Unity Catalog for secure and organized data governance, cataloging, and access control across Databricks.
- Write efficient PySpark code to process and transform large datasets, ensuring high performance and scalability.
- Design and implement ETL/ELT solutions to integrate data from multiple sources into Delta Lake.
- Monitor, debug, and optimize existing data pipelines to ensure smooth operations and minimal downtime.
- Ensure data quality, consistency, and lineage tracking through best practices and automation.
- Collaborate with data architects, analysts, and business teams to define requirements and implement data-driven solutions.

Required Skills & Qualifications:
- 6+ years of experience in Data Engineering with a focus on Azure technologies.
- Expertise in Azure Data Factory (ADF) and Azure Databricks for ETL/ELT workflows.
- Strong knowledge of Delta Tables and Unity Catalog for efficient data storage and management.
- Experience with Slowly Changing Dimensions (SCD2) implementation in Delta Lake.
- Proficiency in PySpark for large-scale data processing and transformation.
- Hands-on experience with SQL and performance tuning for data pipelines.
- Understanding of data governance, security, and compliance best practices in Azure.
- Knowledge of CI/CD and DevOps practices for data pipeline automation.

Preferred Qualifications:
- Experience with Azure Synapse Analytics, Data Lakes, and Power BI integration.
- Knowledge of Kafka or Event Hub for real-time data ingestion.
- Certifications in Microsoft Azure (DP-203, DP-900) or Databricks are a plus.
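The SCD2 responsibility described above has a simple core idea that can be illustrated without any Azure infrastructure: when a tracked attribute changes, the current dimension row is closed out and a new current row is inserted, preserving history. The sketch below is plain Python with a hypothetical `apply_scd2` helper (not part of any real library) standing in for what a Delta Lake MERGE would do; the field names (`valid_from`, `valid_to`, `is_current`) are illustrative conventions, not a prescribed schema.

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today):
    """Close out changed rows and append new current versions (SCD Type 2)."""
    by_key = {row[key]: row for row in dimension if row["is_current"]}
    for new in incoming:
        current = by_key.get(new[key])
        if current is None:
            # Brand-new key: insert as the current version.
            dimension.append({**new, "valid_from": today, "valid_to": None, "is_current": True})
        elif any(current[c] != new[c] for c in tracked):
            # A tracked attribute changed: expire the old row, insert a new current one.
            current["valid_to"] = today
            current["is_current"] = False
            dimension.append({**new, "valid_from": today, "valid_to": None, "is_current": True})
        # Rows with no tracked changes are left untouched.
    return dimension

dim = [{"id": 1, "city": "Pune", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"id": 1, "city": "Mumbai"}, {"id": 2, "city": "Kolkata"}],
                 key="id", tracked=["city"], today=date(2024, 1, 1))
# dim now holds three rows: the expired Pune row, a current Mumbai row,
# and a current Kolkata row.
```

In Databricks this bookkeeping is typically expressed as a Delta `MERGE INTO` (a `WHEN MATCHED` clause that flips `is_current` and sets `valid_to`, plus an insert of the new version), so the pure-Python version only demonstrates the logic a candidate would be expected to implement at scale.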
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Greetings from TCS.

Role: Azure Data Engineer
Experience: 8 - 12 yrs
Work Location: Noida/Bangalore/Kolkata/Pune/Mumbai/Hyderabad
Interview Mode: Virtual (MS Teams)

Job Description:
1. Lead back-end development and maintenance of a Data Quality product
2. Design and develop data pipelines using ADF and Databricks, and integrate with other Azure services
3. Experience in setting up DevOps pipelines
4. Databricks notebooks/Python programming skills
5. Knowledge of RDBMS databases like SQL Server, Azure SQL

Good-to-Have:
1. Able to take the lead in debugging and resolving infrastructure and engineering issues
2. Experience in Azure cloud build and automation using ARM templates
3. Good communication skills

If interested, please share your contact number and updated CV through DM; further details will be shared over a telephonic discussion.
Posted 3 weeks ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: Data Engineer
Required Technical Skill Set: SQL, Python, Hadoop, Spark, Azure Data Factory, Azure Data Lake Storage, Azure/GCP, Snowflake, Airflow, data pipelines, Jenkins/Jira/Git, CI/CD, Kubernetes/Docker
Desired Experience Range: 5 - 8 Years
Location of Requirement: Bangalore, Chennai, Delhi, Kochi, Hyderabad

Desired Competencies (Technical/Behavioral Competency)

Must-Have
· Strong in SQL, Python, Hadoop, Spark
· Experience with cloud platforms (GCP/Azure/AWS)
· Experience working in an Agile delivery environment
· Experience with orchestration tools like Airflow, ADF
· Experience with real-time and streaming technology (i.e. Azure Event Hubs, Azure Functions, Kafka, Spark Streaming)
· Experience building automated data pipelines
· Experience performing data analysis and data exploration
· Experience working in a multi-developer environment, using version control like Git
· Strong critical thinking, communication, and problem-solving skills

Good-to-Have
· Understanding of DevOps best practices and CI/CD
· Understanding of containerization (i.e. Kubernetes, Docker)
· Healthcare domain knowledge
Posted 3 weeks ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives.

In Oracle human capital at PwC, you will specialise in providing consulting services for Oracle human capital management (HCM) applications. You will analyse client requirements, implement HCM software solutions, and provide training and support for seamless integration and utilisation of Oracle HCM applications. Working in this area, you will enable clients to optimise their human resources processes, enhance talent management, and achieve their strategic objectives.

*Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law.
We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
A career within PwC's Oracle Services Practice will provide you with the opportunity to help organizations use enterprise technology to achieve their digital technology goals and capitalize on business opportunities. We help our clients implement and effectively use Oracle offerings to solve their business problems and fuel success in the areas of finance operations, human capital management, supply chain management, reporting and analytics, and governance, risk and compliance.

*Responsibilities:
- Participate in the implementation of Oracle HCM Cloud modules such as Core HR, Payroll, Benefits, Talent Management, Compensation, and others.
- Configure Oracle HCM Cloud applications to meet client requirements.
- Develop and customize reports using Oracle BI Publisher, OTBI, and other reporting tools.
- Create and modify HCM extracts, HDL (HCM Data Loader) scripts, and other data integration processes.
- Design and develop integrations using Oracle Integration Cloud (OIC) or other middleware solutions.

*Mandatory skill sets
- Design and develop integrations using Oracle Integration Cloud (OIC) or other middleware solutions.
- Modules: Absence, Time and Labour, Payroll, Workforce Planning, HR Helpdesk, Oracle Digital Assistants, Oracle Guided Learning

*Preferred skill sets
- Provide technical support and troubleshooting for Oracle HCM Cloud applications.
- Perform routine maintenance and upgrades to ensure optimal performance of the HCM system.
*Years of experience required: 2 - 4 years
*Education Qualification: BE/BTech/MBA/MCA/CA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Chartered Accountant Diploma, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)

Required Skills: Oracle Integration Cloud (OIC)

Optional Skills: Absence Management, Accepting Feedback, Active Listening, Benefits Administration, Business Analysis, Business Process Improvement, Change Management, Communication, Emotional Regulation, Empathy, Employee Engagement Strategies, Employee Engagement Surveys, Employee Relations Investigations, Human Capital Management, Human Resources (HR) Consulting, Human Resources (HR) Metrics, Human Resources (HR) Policies, Human Resources (HR) Project Management, Human Resources (HR) Transformation, Human Resources Management (HRM), Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF) {+ 21 more}

Desired Languages (If blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date:
Posted 3 weeks ago
9.0 - 14.0 years
30 - 45 Lacs
Bengaluru
Work from Office
Design, deploy, and optimize Azure-based data pipelines and architectures. Ensure scalability, data integrity, and CI/CD automation. Collaborate with analytics teams and lead data engineering initiatives across hybrid data platforms.

Required Candidate Profile
Bachelor's in CS/IT with 7-12 years of experience in Azure data engineering. Strong in ADF, Synapse, Databricks, and CI/CD. Able to mentor junior engineers and optimize large-scale data systems.
Posted 3 weeks ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
Design, develop, troubleshoot and debug software programs for databases, applications, tools, networks, etc. As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. You will be responsible for defining and developing software for tasks associated with the developing, designing and debugging of software applications or operating systems. Work is non-routine and very complex, involving the application of advanced technical/business skills in the area of specialization. Leading contributor individually and as a team member, providing direction and mentoring to others. BS or MS degree or equivalent experience relevant to the functional area. 7 years of software engineering or related experience.

Career Level - IC4

Responsibilities
The Applications Technology Group is responsible for building the framework and technology foundation for Oracle E-Business Suite. Life Cycle Management falls under the Applications Technology Group, and this group is responsible for developing the products that handle installing, upgrading, maintaining, monitoring, bundling the technology stack, cloning, etc. The group carries the same responsibility for deploying Oracle E-Business Suite on Oracle Cloud Infrastructure as well.

RESPONSIBILITIES:
- This position mainly involves developing products that automate the deployment of Oracle E-Business Suite in Oracle Cloud Infrastructure.
- Deliver configuration-level products for Oracle E-Business Suite to use Oracle Application Server with Oracle Databases.
- Work closely with other lines of business, including Applications, Application Server, and Database, to identify and implement solutions for challenging technical problems.
- Contribute to the definition of standard practices and procedures for software development.
- Recommend and explain major changes to existing products, services and processes.
This is purely an individual contributor role.

QUALIFICATIONS:
Mandatory Skill Sets:
- Java back-end programming
- Experience with any of the Java-based UI frameworks (JET, ADF, OAFWK, UIX, Spring UI framework)
- PL/SQL programming
- Perl, Unix shell scripting
- Excellent analytical and problem-solving skills
- Self-motivated, driven, a great teammate and results-oriented

Experience in the following would be helpful:
- Experience in developing system administration tools
- Other scripting languages like Python, Ruby
- Experience and exposure to automation platforms like Chef and Puppet
- Experience in automating deployment using the infrastructure provided by different cloud vendors
- Experience in some or all of the following Oracle technologies: Oracle Application Server, Oracle WebLogic Server, Oracle Database, or knowledge of Oracle E-Business Suite

Academics
Any of the following qualifications with excellent academic credentials:
- BE or MS degree in computer science or equivalent
- M.C.A

Experience: Looking for a minimum of 6+ years of experience. Design, develop, fix and debug software programs for databases, applications, tools, networks, etc.

Diversity and Inclusion: An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work and life. In order to nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that encourages thought leadership and innovation. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business. At Oracle, we believe that innovation starts with diversity and inclusion, and to build the future we need talent from various backgrounds, perspectives, and abilities.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges.
We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role Profile We are looking for a sophisticated and forward-looking Full Stack Jr. Engineer - Oracle Cloud OICS/VBCS/JAVA to join our engineering team. The successful candidate will be responsible for leading Oracle Cloud Fusion, OICS, VBCS, ADF and Java development, including setup, configuration, and management, as well as coordinating OCI support. The candidate must fully grasp the end-to-end configuration, technical dependencies, and overall behavioural characteristics of a large-scale Cloud Fusion implementation. Responsibilities include development and implementation of critical test cases with a focus on security, resiliency, scale, and performance. Partner with development teams and work towards addressing and fixing production issues on cloud, defining and implementing product improvements. Collaborate with various cloud operations teams to understand production issues and work towards building a reproducible test case in the lab environment to present to the development team. Background LSEG has embarked on a Finance Transformation programme to deliver our Finance Vision and transform the way we work to build value and deliver balanced growth for the business. The programme is driving efficiencies and maximising benefits for LSEG by moving to a single cloud-based Enterprise Resource Planning and Performance Management platform (Oracle Fusion/EPMCS/EDMCS/OICS/ORMBCS). As a Full Stack Jr. Engineer - Oracle Cloud Fusion OICS, VBCS, ADF & JAVA you will be responsible for: Business Requirements Analysis: Collaborate with business stakeholders to elicit, analyse, and detail comprehensive reporting requirements, translating them into technical specifications. 
Provide solutions as per business requirements and project plan estimations. Integration Design and Development: Design, develop, and implement sophisticated integrations between Oracle Cloud applications and external systems using Oracle Integration Cloud Service (OICS), SQL, PL/SQL, Python, ADF, Java and VBCS. Integration Architecture: Define and implement integration architectures that align with business requirements, ensuring scalability, performance, and security. API Development: Build and manage APIs using OICS to expose data and functionality from various systems, enabling seamless integration. Data Mapping and Transformation: Design and implement data mapping rules and transformations to ensure accurate data flow between systems. Testing and Quality Assurance: Develop and execute comprehensive test plans to validate the accuracy, performance, and reliability of integrations, identifying and addressing any issues. Deployment and Support: Deploy integrations to production environments and provide ongoing support, resolving issues and implementing improvements as needed. Documentation and Knowledge Transfer: Create clear and concise documentation for integrations, including design specifications, user guides, and maintenance procedures. Support and Maintenance: Provide ongoing support for existing reports, addressing user inquiries, resolving issues, and implementing enhancements as needed to maintain report effectiveness. Manage and work with Oracle support and consulting teams, with a thorough understanding of the process involved in handling Oracle SRs. Work independently, tackle problems, and be willing to do what it takes to get things done. Ability to establish relationships and influence outside of authority while demonstrating Oracle expertise and resources. 
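The data mapping and transformation responsibility described above can be sketched in plain Python. This is a minimal illustration of the rename-plus-convert pattern only; the field names, mapping rules, and `transform` helper are hypothetical and not part of any Oracle OICS API:

```python
# Hypothetical rename-and-convert rules between a source payload
# and a target schema (illustrative field names only).
FIELD_MAP = {
    "invoiceNumber": "invoice_id",
    "invoiceDate": "invoice_date",
    "totalAmount": "amount",
}

def transform(record, field_map=FIELD_MAP, transforms=None):
    """Apply rename rules plus optional per-field conversions to one record."""
    transforms = transforms or {}
    out = {}
    for src, dst in field_map.items():
        value = record.get(src)
        fn = transforms.get(dst)
        # Convert only when a rule exists and the value is present.
        out[dst] = fn(value) if fn and value is not None else value
    return out

raw = {"invoiceNumber": "INV-001", "invoiceDate": "2024-01-31", "totalAmount": "120.50"}
mapped = transform(raw, transforms={"amount": float})
# mapped: {"invoice_id": "INV-001", "invoice_date": "2024-01-31", "amount": 120.5}
```

In OICS itself, mapping rules like these would typically be expressed through the graphical mapper (backed by XSLT) rather than hand-written code; the sketch only illustrates the pattern.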
Effective interpersonal skills (written and spoken) and strong problem-solving skills. Ability to work in a fast-paced Agile development and rapid deployment environment. Strong inclination towards test-driven development. Extensive experience of Oracle Cloud and EBS R12/RMB architecture. Hands-on knowledge of Cloud at Customer (Gen I, Gen II) or similar PaaS/IaaS experience. Phenomenal teammate, able to work with different levels in the organisation across multiple time zones. Naturally inquisitive, able to think creatively and offer solutions or alternative viewpoints. Knowledge/Skills: Solid understanding of Oracle Cloud Fusion, OICS, REST services, SQL, PL/SQL, Python, Java and VBCS. Finance business process knowledge across R2R, P2P, O2C and Oracle RMBCS. Data modelling; designed and implemented ETL interfaces/mappings to load data into a DWH as per business and reporting requirements. Agile, DevOps, and SDLC industry-standard processes/methods. Certifications: Relevant Oracle Cloud Fusion certifications, such as Oracle Cloud Infrastructure (OCI) certifications or Oracle Fusion Applications certifications. Industry Expertise: Specific proven experience in areas such as finance, human capital management, supply chain, or enterprise resource planning. Skill in leading virtual teams, partnering across organisational boundaries, influencing development direction, and the ability to empower, grow and guide developers. Experience: One full life cycle implementation of OICS, REST services, Java, ADF, SOA, SQL, PL/SQL, Python, VBCS, and Oracle Database ADW. Applications Seeded Customization Framework, OATs, PSR tools, and accessibility testing tools preferred for Oracle Cloud Fusion ERP. Experience working within a Finance Technology function, preferably with a large Financial Services organisation. Experience in driving high availability, resiliency, and scalability of a global Oracle ERP Cloud Fusion Applications landscape, delivering continuous improvements. 
Experience in continuous delivery, deployment and monitoring of cloud-based services. Ability to work with multi-functional Directors and global leads. Degree or equivalent experience in Computer Science or Software Engineering. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. We value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. 
In accordance with applicable law, we can reasonably accommodate applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
Posted 3 weeks ago
13.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Director Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
" Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities About the role: As a Director, you'll be taking the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization. Responsibilities: Architecting and designing complex data systems and pipelines. Leading and mentoring junior data engineers and team members. Collaborating with cross-functional teams to define data requirements. Implementing advanced data quality checks and ensuring data integrity. Optimizing data processes for efficiency and scalability. Overseeing data security and compliance measures. Evaluating and recommending new technologies to enhance data infrastructure. Providing technical expertise and guidance for critical data projects. Required skills & experience: Proficiency in designing and building complex data pipelines and data processing systems. Leadership and mentorship capabilities to guide junior data engineers and foster skill development. Strong expertise in data modeling and database design for optimal performance. Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness. Knowledge of data governance principles, ensuring data quality, security, and compliance. 
Familiarity with big data technologies like Hadoop, Spark, or NoSQL. Expertise in implementing robust data security measures and access controls. Effective communication and collaboration skills for cross-functional teamwork and defining data requirements. Skills: Cloud: Azure/GCP/AWS DE Technologies: ADF, BigQuery, AWS Glue, etc. Data Lake: Snowflake, Databricks, etc. Mandatory skill sets: Cloud: Azure/GCP/AWS DE Technologies: ADF, BigQuery, AWS Glue, etc. Data Lake: Snowflake, Databricks, etc. Preferred skill sets: Cloud: Azure/GCP/AWS DE Technologies: ADF, BigQuery, AWS Glue, etc. Data Lake: Snowflake, Databricks, etc. Years of experience required: 13+ years Education qualification: BE/BTECH, ME/MTECH, MBA, MCA Mandatory Skill Sets Technical Delivery Preferred Skill Sets Technical Delivery Years Of Experience Required 13 - 20 Education Qualification B.Tech / M.Tech / MBA / MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering, Master of Business Administration Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Technical Delivery Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 24 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
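One of the responsibilities in the role above is implementing advanced data quality checks and ensuring data integrity. A minimal sketch of a row-level check in Python follows; the field names and rules are illustrative assumptions, not a specific client implementation:

```python
def run_quality_checks(rows, required_fields, non_negative_fields=()):
    """Return (row_index, field, reason) tuples for rows that fail basic
    integrity checks: missing required values, or negative values in
    fields that must be non-negative."""
    failures = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append((i, field, "missing"))
        for field in non_negative_fields:
            value = row.get(field)
            if isinstance(value, (int, float)) and value < 0:
                failures.append((i, field, "negative"))
    return failures

sample = [
    {"order_id": "A1", "amount": 10.0},
    {"order_id": "", "amount": -5.0},  # fails both checks
]
issues = run_quality_checks(sample, required_fields=["order_id"],
                            non_negative_fields=["amount"])
# issues: [(1, "order_id", "missing"), (1, "amount", "negative")]
```

In production, checks of this kind usually run inside the pipeline itself (for example a Spark job or a Great Expectations suite) and feed a quarantine table or alerting, rather than living in a standalone function.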
Posted 3 weeks ago
5.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Key Accountabilities JOB DESCRIPTION The Azure Data Support engineer focuses on data-related tasks in Azure. Manage, monitor, and ensure the security and privacy of data to satisfy business needs. Monitor real-time and batch processes to ensure data accuracy. Monitor Azure pipelines and troubleshoot where required. Enhance existing pipelines and Databricks notebooks as and when required. Be involved in development stages of new pipelines as and when required. Troubleshoot pipelines and real-time replication jobs, ensuring minimum data lag. Available to work on a shift basis to cover monitoring during weekends (one weekend out of three). Act as an ambassador for DP World at all times when working; promoting and demonstrating positive behaviours in harmony with DP World’s Principles, values and culture; ensuring the highest level of safety is applied in all activities; understanding and following DP World’s Code of Conduct and Ethics policies. Perform other related duties as assigned. JOB CONTEXT Responsible for monitoring and enhancing existing data pipelines using the Microsoft stack. Responsible for enhancement of existing data platforms. Experience with cloud platforms such as Azure, AWS, Google Cloud, etc. Experience with Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Azure Databricks and Azure SQL Data Warehouse. Good understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, driver node, worker nodes, stages, executors and tasks. Good understanding of big data Hadoop and YARN architecture along with the various Hadoop daemons such as Job Tracker, Task Tracker, Name Node, Data Node, Resource/Cluster Manager, and Kafka (distributed stream processing). Experience in database design and development with business intelligence using SQL Server 2014/2016, Integration Services (SSIS), DTS packages, SQL Server Analysis Services (SSAS), DAX, OLAP cubes, star schema and snowflake schema. 
Monitoring of pipelines in ADF and experience with Azure SQL, Blob Storage, and Azure SQL Data Warehouse. Experience in a support environment working with real-time data replication will be a plus. Qualification QUALIFICATIONS, EXPERIENCE AND SKILLS Bachelor’s/Master’s in Computer Science/IT or equivalent. Azure certifications will be an added advantage (certification in AZ-900 and/or AZ-204, AZ-303, AZ-304 or AZ-400, DP-200 & DP-201). ITIL certification a plus. Experience: 5 - 8 Years Must Have Skills Azure Data Lake, Data Factory, Azure Databricks Azure SQL Database, Azure SQL Data Warehouse. Hadoop ecosystem. Azure analytics services. Programming: Python, R, Spark SQL Good To Have Skills MSBI (SSIS, SSAS, SSRS), Oracle, SQL, PL/SQL Data Visualization, Power BI Data Migration
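The posting above emphasises monitoring real-time replication jobs and ensuring minimum data lag. The core of such a monitor can be sketched in a few lines of Python; the watermark timestamps and the 300-second threshold below are illustrative assumptions, not values from any real pipeline:

```python
from datetime import datetime

def replication_lag_seconds(source_ts, target_ts):
    """Lag between the newest record seen at the source vs. the target."""
    return (source_ts - target_ts).total_seconds()

def needs_alert(source_ts, target_ts, threshold_seconds=300):
    """True when replication lag breaches the monitoring threshold."""
    return replication_lag_seconds(source_ts, target_ts) > threshold_seconds

# Illustrative watermarks for one replication job.
src_watermark = datetime(2024, 1, 1, 12, 10, 0)
tgt_watermark = datetime(2024, 1, 1, 12, 2, 0)
lag = replication_lag_seconds(src_watermark, tgt_watermark)  # 480.0 seconds
```

In practice, the watermarks would come from pipeline metadata (for example an ADF monitoring query or the replication tool's status output), and the alert would route to whatever monitoring channel the support team uses.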
Posted 3 weeks ago
The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.
Here are 5 major cities in India where there is a high demand for ADF professionals:
- Bangalore
- Hyderabad
- Pune
- Chennai
- Mumbai
The estimated salary range for ADF professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.
In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.
Here are sample interview questions for ADF roles, categorized by difficulty level:

Basic:
- What is ADF and what are its key features?
- What is the difference between ADF Faces and ADF Task Flows?

Medium:
- Explain the lifecycle of an ADF application.
- How do you handle exceptions in ADF applications?

Advanced:
- Discuss the advantages of using ADF Business Components.
- How would you optimize performance in an ADF application?
As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!